
Vuzix Receives and Delivers Significant Follow-on Smart Glasses Order to Fortune 50 Global Retailer to Support Warehousing and Logistics Operations

COVID-19 has continued to create global supply uncertainties, disruptions, and inflationary pressures that are forcing companies of all sizes to better manage their supply chains. Combined with the ongoing growth of online shopping, attaining new productivity levels for product transportation, inventory management and order fulfillment will become a distinct competitive advantage.

Smart glasses are becoming a cost-effective tool to facilitate these objectives and an increasing number of the world’s largest firms are starting to move from trialing them to deploying them.

“Vuzix has spent a fair amount of time and resources honing this technology and we are now seeing growing market adoption of our products that are allowing companies to operate with greater productivity. We are pleased to be working with this client, which represents just one of multiple major retailers either implementing or testing our technology for logistics and warehouse usage,” said Paul Travers, President and Chief Executive Officer at Vuzix.

 




Case Study: Augmented Reality in Construction Planning – Holo-Light

Human ability to imagine objects that are not physically present is limited. It is even more difficult for us to mentally place them in an existing environment. How often, for example, has it happened to you that a newly purchased piece of furniture was too large for the intended space?

In construction planning and architecture, this problem is amplified. Whereas in the case of the previously mentioned piece of furniture, only a single part has to be inserted into an existing space, in architecture we are often dealing with entire buildings in which floors, rooms and objects stand in a relationship to one another; and of course, the building itself as a whole must also fit into its surroundings. In this process, our lack of imagination can lead to mistakes with far-reaching consequences.

This is where technology helps our imagination tremendously. Augmented reality (AR) in combination with Building Information Modeling (BIM) ensures that we can “actually” see all objects and relationships.

What is BIM? What is AR?

Building Information Modeling (BIM) is a digital method used throughout the life cycle of a building. All data and information related to the construction are stored and mapped in BIM-enabled software.

Augmented reality is the computer-aided expansion of reality perception. Specifically in construction planning, the BIM models are “projected” into the real environment.
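As a rough illustration of what that “projection” involves (a minimal sketch with made-up coordinates, not Holo-Light’s AR3S code), placing a BIM model in the real environment amounts to chaining rigid transforms from model space to a world anchor and on into the viewer’s camera frame:

```python
# Minimal sketch: position a BIM vertex relative to the viewer by composing
# model -> world (anchor) and world -> camera transforms. All values illustrative.
import numpy as np

def pose(rotation_z_deg: float, translation_xyz) -> np.ndarray:
    """Build a 4x4 rigid-body transform (rotation about Z plus translation)."""
    a = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation_xyz
    return T

vertex_model = np.array([2.0, 0.5, 3.0, 1.0])        # a wall corner, model coords (m)
anchor_world = pose(30.0, [12.0, 4.0, 0.0])          # where the model sits on site
camera_world = pose(0.0, [10.0, 2.0, 1.7])           # viewer's tracked head pose

vertex_world = anchor_world @ vertex_model                   # model -> world
vertex_camera = np.linalg.inv(camera_world) @ vertex_world   # world -> camera
print(vertex_camera[:3])   # position the renderer would draw relative to the viewer
```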

Benefits of AR in Construction Planning

AR applications for construction planning help our imagination tremendously. They support the entire decision-making process both on the side of the construction planner and on the side of the client. AR glasses can be used to better present and understand the planned building. Thus, decisions in the early planning phases can be made more easily and more correctly, which reduces planning and construction costs accordingly.

“Especially in the case of existing building conversion, it is advantageous if you can visualize the superimposition between the model and reality.”

DI Dr. Timur Uzunoglu, Managing Director, convex ZT GmbH

AR Use Cases in Construction Planning at convex

At convex ZT GmbH, we use AR technology from the design phase to operation. With Holo-Light’s AR3S software, we bring BIM planning closer to clients and enable greater planning transparency. Building owners feel more involved in the planning process during our AR-assisted planning meetings and can make better decisions. We make AR inspections together with the builders directly on site. These AR inspections provide a direct impression on site in real time and help to weigh alternatives against each other. In revitalizations of existing buildings, it is often challenging to bring the new structures into a functioning harmony with the existing buildings, and AR helps very well there, too.

 




ThirdEye Targets EPA Green Goals for Metaverse

The solution is guided by sustainability targets from the United States Environmental Protection Agency (US-EPA), which aims to build a carbon-neutral future for the planet.

Citing EPA figures, ThirdEye said the COVID-19 pandemic sharply reduced global transport traffic, which was the “largest contributor to anthropogenic [US] greenhouse gas emissions at 29 [percent].”

ThirdEye’s AR/MR telepresence solutions allow companies to lower their carbon footprint by reducing the overall need for global transport, and the firm’s RemoteEye platform has cut onsite visits to allow significant cost savings, leading to a major improvement in return on investment (ROI).

Nick Cherukuri, Founder of ThirdEye, said his company’s RemoteEye platform aims to include a Carbon Footprint Score for its users to calculate the organisation’s carbon footprint with AR.
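As a rough indication of the kind of arithmetic such a score could rest on (a simplified sketch using ballpark public emission factors, not ThirdEye’s actual methodology):

```python
# Illustrative only: CO2 avoided when remote AR sessions replace on-site visits.
# Emission factors are approximate public ballpark figures, not from the article.
FACTORS_KG_CO2_PER_KM = {
    "short_haul_flight": 0.15,   # per passenger-km (approximate)
    "car": 0.17,                 # per vehicle-km (approximate)
}

def avoided_emissions_kg(trips: int, km_per_trip: float, mode: str) -> float:
    """CO2 avoided by replacing `trips` round trips of `km_per_trip` km each."""
    return trips * km_per_trip * FACTORS_KG_CO2_PER_KM[mode]

# e.g. 40 expert site visits per year, 1,200 km round trip by short-haul flight
print(f"{avoided_emissions_kg(40, 1200, 'short_haul_flight'):,.0f} kg CO2 avoided per year")
```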

Explaining the benefits of AR technologies further, he stated:

“Not only are AR and MR teleconferencing platforms financially prudent due to traveling less, but by using this technology to share knowledge and operational workflows, there are tremendous carbon emission savings. For example, we can bring education and telehealth to underprivileged areas around the world with augmented and mixed reality.”

The company’s RespondEye, which complies with the US Health Insurance Portability and Accountability Act (HIPAA), also allows doctors to tackle health problems for remote patients “anytime, anywhere.” Doctors can later assign patients and carers medical diagnoses and treatment options.

Enterprises can also benefit from the introduction of 3D digital twins to reduce inventory and other digital assets, ThirdEye said, adding doing so would reduce production emissions and costs.

The news comes as the US firm aims to expand its solutions to the Asia-Pacific with its X2 MR smart glasses and a major partnership with Go VR Immersive, a Hong Kong-based XR startup.

The smart glasses would be deployed to remote workers across China, shortly after the firm inked a major partnership with Microsoft to deploy HoloLens 2 MR head-mounted displays in the Asia-Pacific region.

 




Magic Leap’s New AR Headset Will Debut in 2022

A few things mentioned include:

  • Eye examinations can be done at a fraction of the cost
  • Magic Leap’s next-generation AR glasses are smaller, lighter and faster
  • They have a greater field of view, which has doubled in the next-gen device
  • Vertical examples, e.g. surgery: digital content overlaid across the knee while the surgeon looks at virtual screens
  • Light dimming brings more focus to what needs to be concentrated on (again, a surgical use)

Answering criticism about a lack of progress, Johnson argued that four healthcare companies are testing the devices right now, and other industries are working with Magic Leap at the moment. These include:

  • Health
  • Defense and Public Sector
  • Manufacturing
  • Automotive and Transport
  • Oil and Gas
  • Architecture, Engineering, Construction (AEC)

You can watch the video here 




RealWear Navigator: First Look at the Future of Assisted Reality

This offers a frontline connected worker platform for integrating multiple assisted and augmented reality (SLAM) experiences into a high-performance industrial solution.

The RealWear Navigator™ 500 solution is an all-new head-mounted device product platform specifically designed to engage, empower and elevate the frontline worker for the next several years.

Building on the accumulated experience of the last four years, working with 5000 enterprise customers in 60 countries with solutions based on our HMT-1™ and HMT-1Z1™ platforms, this new product brings targeted innovation in all the key areas that matter most to achieving solid results at scale.

RealWear has been known for establishing and gaining major customer deployments for frontline worker solutions based on “assisted reality”.

The core concept of assisted reality is that it makes a different tradeoff than mixed reality. Assisted reality is better suited to the majority of industrial use cases where user safety is paramount.

The goals of assisted reality are to keep the user’s attention in the real world, with a direct line of sight, for the most part unoccluded by digital objects or “holograms” that require extra cognitive focus for humans to process.

Situational awareness of moving machinery, approaching forklifts or other vehicles, steam escape valves, slip and trip hazards and electrical and chemical hazards is key for RealWear’s customers. These are the same working environments that mandate specific personal protective equipment for safety glasses and goggles, to hard hats, hearing protection, heavy gloves and even respirators. Users in these situations mostly require both hands to be available for the use of tools and equipment, or to hold on to railings, ropework, etc.

In turn the user interface for assisted reality cannot rely on the availability of hands to operate handheld controllers, or to draw gestures in the air.  RealWear’s assisted reality solutions rely on voice recognition that is field proven in very high noise environments, plus the minimal use of head motion detection. The platform uses a single articulated micro-display easily adjusted to sit below the dominant eye that does not obstruct direct vision and provides the user a view similar to a 7-inch tablet screen at arm’s length.
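For a sense of scale, the “7-inch tablet at arm’s length” comparison works out to roughly a 15-degree-wide image, assuming a 16:9 screen and about 0.6 m viewing distance (both assumptions, not published RealWear specifications):

```python
# Back-of-the-envelope check of the "7-inch tablet at arm's length" comparison.
# Screen size and viewing distance are assumptions, not RealWear specs.
import math

diagonal_in, aspect = 7.0, 16 / 9           # assumed 7-inch, 16:9 virtual screen
distance_m = 0.6                            # assumed arm's length

height_in = diagonal_in / math.hypot(aspect, 1)
width_m = height_in * aspect * 0.0254
angle_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
print(f"apparent horizontal angle ≈ {angle_deg:.1f}°")   # ≈ 15° for these numbers
```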

A core concept of mixed reality has been the placement of virtual 3D digital objects overlaid on the physical world – such as 3D models or animations. This requires two stereoscopic see-through displays that are brought to a point of focus that typically is not in the same plane as the real-world object. The resulting vergence-accommodation conflict – where the greater convergence of the eyes when looking at near objects is in conflict with the focal distance, or accommodation of the eye’s lens needed to bring the digital image into focus – is a source of eyestrain, discomfort and in some cases headaches after extended use. In addition, in bright conditions, especially outdoors, mixed reality displays struggle to provide sufficient contrast with the real world and therefore they always either cut a significant amount of light from the real world using darkened glass or have to generate such a bright display that battery life is very short unless tethered with a cord to a separate battery pack. Both situations contribute to eyestrain with extended use.
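The geometry behind that conflict can be stated compactly; the interpupillary distance and focal-plane figures below are typical assumed values, not measurements from any specific headset:

```latex
\theta(d) = 2\arctan\!\left(\frac{p}{2d}\right)
% Vergence demand for an object at distance d, with interpupillary distance p.
% With an assumed p = 63 mm: a virtual object at d = 0.5 m demands roughly
% 7.2 degrees of convergence, while a display focal plane fixed at 2 m
% corresponds to roughly 1.8 degrees. The eyes converge for 0.5 m but must
% accommodate (focus) at 2 m, and that mismatch is the conflict.
```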

However, mixed reality applications do allow information to be overlaid on the real-world asset, which in some use cases can provide an additional boost in productivity in identifying the item to be worked on.

So how could this tradeoff be solved?   Is it possible to tag or overlay information on the real 3D world while also maintaining safety, situational awareness, low eyestrain, hands-free use and full-shift battery life?

We’ve long believed that the answer lies in amping up the amount of “assistance” in assisted reality rather than solely focusing on the amount of reality, with power-hungry, wide field of view, super bright stereoscopic, transparent and ultra-high resolution displays. With advanced camera capabilities and computer-vision processing, key information about real-world assets can be placed on the camera view shown in the single, monocular, non-see-through (opaque) display.
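A minimal sketch of that idea (generic pinhole-camera math with made-up intrinsics, not RealWear’s implementation) is to project a known 3D point on the asset into the camera image and draw a label at that pixel of the camera view:

```python
# Sketch: project a 3D point on a real-world asset into the camera image, then
# place a label at that pixel on the camera view shown in the opaque display.
# Camera intrinsics and the asset position are illustrative values only.
import numpy as np

K = np.array([[1400.0,    0.0, 960.0],    # assumed pinhole intrinsics
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])

def project(point_cam: np.ndarray) -> tuple[int, int]:
    """Project a 3D point in the camera frame (metres) to pixel coordinates."""
    uvw = K @ point_cam
    return int(round(uvw[0] / uvw[2])), int(round(uvw[1] / uvw[2]))

valve_in_camera = np.array([0.4, -0.1, 2.5])   # asset 2.5 m in front of the worker
u, v = project(valve_in_camera)
print(f"draw label 'Valve 17 – inspect' at pixel ({u}, {v})")
```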

Read more. 

 




Samsung and Microsoft May Be Working on Future Augmented Reality Hardware

A report by The Elec suggests that Microsoft and Samsung are working together on future augmented reality hardware. It is not disclosed whether this is for the consumer market, enterprise market, or both. All that is known is that the project is AR-related and may involve some sort of hardware Samsung will be producing (rather than Microsoft). Samsung’s investments in DigiLens, the company behind tech found in AR display devices, may further substantiate the idea that the former will be handling the physical gadgetry in its collaboration with Microsoft.

Apparently, several divisions of Samsung are involved in the project, with Samsung Display, Samsung Electro-Mechanics, and Samsung SDI all tied in. The AR project started in the middle of 2021 and aims to result in a commercially viable product by 2024.

 

 




RealWear Introduces RealWear Navigator™ 500 Industrial-Strength Assisted Reality Wearable for Frontline Workers

Fully optimized for hands-free use, RealWear Navigator 500 is an innovative platform solution that combines hardware, software, and cloud-ready services with a rugged design that is one-third lighter and slimmer than the previous generation, making it easier for frontline workers to wear the device for their entire shift. The hardware is designed as a modular platform with an upgradeable 48 megapixel (MP) camera system, a truly hot-swappable battery, Wi-Fi, and an optional 4G (and soon-to-be-available 5G) modem. The voice-controlled user interface includes unique noise-cancelation technology designed for high-noise environments. RealWear has more than 200 optimized partner apps supporting a variety of use cases, such as remote collaboration, guided workflow, and IoT and AI data visualization.

Assisted reality [infographic available] is a non-immersive experience and has become the preferred Extended Reality (XR) solution for frontline industrial workers, especially where high situational awareness is necessary. Assisted reality experiences are closer to the physical world, compared to virtual reality (VR) and augmented reality (AR) experiences that immerse workers in the metaverse.

With RealWear Navigator 500, RealWear has again raised the bar for how assisted reality and other XR technologies are deployed at the world’s leading industrial companies. Automotive, logistics, manufacturing, food & beverage and energy companies, among others, can use RealWear Navigator 500 to deliver real-time access to online information and expertise to the world’s more than 100 million industrial frontline workers.

“With pandemic concerns continuing to press upon the global economy, how technology is enabling a ‘new way to work’ is very much in focus, particularly for industrial frontline workers,” said Andrew Chrostowski, Chairman and CEO of RealWear. “Today we’re unveiling something far bigger than a product. The RealWear Navigator 500 delivers the next generation of work with a ‘reality-first, digital-second’ enterprise solution for remote collaboration, operational efficiency, and hybrid work in safety-critical industries. Assisted reality – more so than augmented or virtual reality – is designed specifically for the frontline worker who requires both hands for the job, striking the perfect balance of keeping workers 100% present and self-aware with the ability to safely navigate industrial surroundings. After all, nobody wants to be near hazardous equipment with their head stuck into the metaverse.”

Read the rest of the full press release here. 




Ford Technical Assistance Center Using TeamViewer Frontline Augmented Reality Solution to Streamline Customer Vehicle Repairs Worldwide

The new service is offered by Ford’s Technical Assistance Center (TAC), a centralized diagnostic troubleshooting team that provides support to all Ford and Lincoln dealerships’ technicians who diagnose and repair customer vehicles.  Dealer technicians can initially reach out to TAC specialists via a web-based portal or even on a phone.  With the new See What I See program, TAC specialists can now start a remote AR session using TeamViewer Frontline through a pair of onsite RealWear smart glasses to share, in real time, exactly what the repair technician is looking at.  TAC specialists can add on-screen annotations and additional documentation directly in the line of sight of the repair technicians, as well as zoom in, share their screen, record the session and even turn on flashlights remotely.

“My team diagnoses some of the most complex and complicated vehicle issues,” says Bryan Jenkins, TAC powertrain operations manager.  “I would frequently hear my team say that if they could only see what that technician is talking about, or what the technician is doing or how they’re completing a test, then they could solve the problem more accurately.  A picture is worth 1000 words, but sometimes that still wasn’t quite enough, and we needed a way to see something live and in action.  And that’s what really kicked this whole program off.”

Ford’s See What I See program is an additional layer of support that is already used by more than 400 dealers in the U.S., Mexico, South Africa, Thailand, Australia, New Zealand and the U.K.  Currently Ford is promoting the new program to its full network of 3,100 U.S. based dealers, with a positive response. “Feedback from the dealers has been really good,” says Jenkins.  “From the dealer technician perspective, they just turn on their smart glasses and accept an incoming call, then it is like my specialists are there looking over their shoulder to help resolve the problem.”

“We are very excited to add Ford to our growing list of forward-thinking customers that are leveraging AR solutions to improve business processes,” says Patty Nagle, president of TeamViewer Americas.  “The majority of workers globally do not sit in front of a desk.  Our goal is to enable those frontline workers with AR guided solutions to enable them to do their jobs better by digitalizing and streamlining processes.”




Tech trends driving industry to 5.0 – Rockwell Automation

Rarely has industrial automation changed at such an exponential rate. The combination of various technology trends has propelled enterprises into Industry 4.0 so fast that Frost & Sullivan has already delivered an Industry 5.0 blueprint to guide the journey.

Edge-and-cloud integration, converged development environments, artificial intelligence (AI) and autonomous production are far more than conceptual. These technological innovations are already happening.

“This is a unique time in our industry,” explained Cyril Perducat, who shared the automation supplier’s plans for the immediate future at Automation Fair 2021 in Houston. “The future is a trajectory, a path that we are already on. When I think of Industry 4.0, which was first coined in 2011, there is certainly a lot of learning over the past 10 years of what Industry 4.0 can deliver. And COVID has accelerated many of those dimensions.”

Remote connectivity, advanced engineering with multiple digital twins, mixing physical and digital assets, and the change of human-machine interaction are driving industry along that path toward Industry 5.0.

Perducat questioned whether it’s too soon to look at Industry 5.0 when all the promise of Industry 4.0 has not yet been delivered, but he identified five changes that are attainable and impactful in Frost & Sullivan’s comparison of Industry 4.0 to Industry 5.0:

  • delivery of customer experience,
  • hyper customization,
  • responsive and distributed supply chain,
  • experience-activated (interactive) products, and
  • return of manpower to factories.

“We are able to bring more capabilities to people,” said Perducat. “Human resources are scarce. By delivering systems that make the human-machine interaction more efficient, we make it more impactful while remaining safe.”


Rockwell Automation has identified four areas where technology can move companies along that journey:

  • evolution of cloud, edge and software,
  • universal control and converged integrated development environments (IDEs),
  • AI native operation management, including software as a service (SaaS) and digital services, and
  • autonomous systems and augmented workforce.

“We believe in control at the enterprise level,” explained Perducat. “We believe in systems with software-defined architecture and the underlying hardware. It doesn’t mean hardware is becoming obsolete. And it’s not that every piece of the system needs to be smart. The entire system, from the device to the edge and to the cloud, is smart. Edge + cloud architecture is fundamental.”

In the converged environment, control, safety and motion all come together and must work in an integrated fashion. This is especially true with the growth of robotics. “The boundaries between control and robotics are becoming more and more blurred,” said Perducat. “Safety is very fundamental in this more complex architecture. It does not work if it is not safe.”

Operations management becomes more efficient when AI is native to the architecture and is at the level of the enterprise. “A holistic view requires a lot of data and the ability to process that data,” explained Perducat. “Part of this has to be autonomous using the power of applied AI; it’s not just one more tool but is everywhere in the architecture. We can use AI on the machine to translate vibrations into data. We can think of AI in terms of process modeling. And model predictive control is evolving with AI. When you can orchestrate all the elements of the architecture, that is a system.”
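As a simple illustration of “translating vibrations into data” (a generic sketch, not Rockwell code), the dominant frequency of a sampled vibration signal is the kind of feature such a model could monitor:

```python
# Illustrative sketch: extract the dominant vibration frequency from a sampled
# accelerometer signal, a feature an AI model could then watch for anomalies.
import numpy as np

fs = 1000.0                                   # assumed sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 120 * t) + 0.3 * np.random.randn(t.size)  # 120 Hz + noise

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant vibration component ≈ {dominant_hz:.0f} Hz")
```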

FactoryTalk Analytics LogixAI is a modeling engine that enables closed-loop optimization through four steps—observe (sensor), infer (model), decide (controller) and act (actuator).
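In sketch form, that observe-infer-decide-act cycle is simply a closed loop around a process; the toy proportional controller below is illustrative only, not the LogixAI engine:

```python
# Generic closed-loop sketch of the four steps: observe -> infer -> decide -> act.
def observe(sensor_reading: float) -> float:
    return sensor_reading                      # e.g. a temperature measurement

def infer(measurement: float, setpoint: float) -> float:
    return setpoint - measurement              # model step: here just the error

def decide(error: float, gain: float = 0.5) -> float:
    return gain * error                        # controller: proportional action

def act(process_value: float, control_action: float) -> float:
    return process_value + control_action      # actuator moves the process

temperature, setpoint = 60.0, 80.0
for _ in range(10):                            # repeat the four steps in a loop
    error = infer(observe(temperature), setpoint)
    temperature = act(temperature, decide(error))
print(f"after 10 cycles the process sits at {temperature:.1f}")
```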

Finally, transforming from automated systems to autonomous systems enables better decisions and expands human possibility.

AI can also help to simplify a new generation of design. “You can use AI to help to generate blocks of code, like individuals working together peer-to-peer, but one of them is AI, augmenting human possibility,” explained Perducat.

“We see the next step to autonomous manufacturing as an opportunity to deliver value to our customers,” he said. “The autonomous system is reimagining the fundamental principles of autonomous control systems. You don’t need to rip and replace. We have the ability to augment existing systems with new technology.”

Perducat stressed that it cannot be just technology innovation. “Technology only creates possibilities or potential values,” he explained. “It has to be accessible by users, so we have to innovate on the user experience point of view. We want to bring that to all the products, experiences and models. In a digital native world, innovation extends beyond technology and features.”




HP is Using HoloLens to Help Customers Remotely Repair Industrial Printers

While many AR companies are focused on building AR products, HP is making an interesting move in using the technology as an add-on to improve an existing line of its business. The company’s newly announced xRServices program promises to deliver remote AR support for its industrial printer customers.

The program employs Microsoft’s HoloLens 2 headset, which HP’s customers can use to access AR training and live guided instructions to fix issues that arise with complex, commercial-scale printers.

HP is pitching the solution as a way to allow even untrained individuals to fix issues with the help of a specialist on the other end who can guide them step-by-step through troubleshooting and repairs with AR instruction. Further, the company says the service can be used to provide AR training for various workflows and issues that may arise with the company’s industrial printers.

HP hasn’t clearly detailed exactly what software it’s running on HoloLens to facilitate xRServices, but it seems likely that it is leveraging Microsoft’s Dynamics 365 Remote Assist platform which includes many of the AR functions that HP showcased in its xRServices concept video—like augmented annotation, document visualization, and video chatting through the headset.