
PTC Holds On to Top Spot in ABI Research’s Enterprise AR Platforms Competitive Assessment

The Enterprise AR Platforms assessment analyzed eight AR platform players operating today: PTC, Ubimax, RE’FLEKT, Atheer, Upskill, ScopeAR, Librestream, and Fieldbit. Scoring was split between present-day (implementation) and forward-looking (innovation) criteria. Implementation criteria included customers/partnerships/footprint, platform breadth, device support, user experience options, accessibility, and pricing. Innovation criteria included use case applicability, IoT synergy, cloud connectivity, machine vision capabilities, data visualization capabilities, and transformative technology capabilities.

“The most competitive players in augmented reality platforms today are able to serve a few key use cases, while also allowing flexibility in offering throughout the lifetime of a customer engagement,” says Eric Abbruzzese, Research Director at ABI Research. “Over the past few years, showcasing value to a customer was limited to the pilot phase and small-scale engagements. Today, there is still the need to show upfront ROI potential at this scale, but equally important is an ability to scale up with the customer as they grow in number of users and types of use cases. The most competitive operators are able to span these needs, while adding value through differentiated portfolio offerings.”

PTC takes the top spot for two primary reasons: broad and deep AR capabilities backed by value-add offerings and impactful enabling technologies. Augmented reality has seen strong growth in industrial markets, where PTC is already strong and can leverage existing products like Creo and Windchill to strengthen related aspects of an AR implementation. Ubimax and RE’FLEKT take the second and third spots; Ubimax leads in platform breadth and device support, allowing great flexibility and room for growth, while RE’FLEKT offers a unique approach to content creation and distribution for customers.

This assessment is of course not inclusive of all AR competitors. There are numerous use-case focused players that purposefully limit platform breadth in order to maximize value around a specific use case or application. At the same time, there is increasing desire for AR platforms that can perform in multiple areas. This can include supporting numerous use cases with a single platform, such as remote expertise, training, and data logging, but can also include enabling technologies like machine vision and SLAM tracking, device management, and content creation. These technologies are often differentiators in this space, with companies expanding their portfolio reach and overall value through them.

There are nearly infinite variables involved in a proper augmented reality investment and implementation, all of which can prove to be a potential barrier. With the right enabling platform, these variables and barriers can be minimized, leading to faster and more complete ROI. “Not every customer needs a complete end-to-end solution, and certain aspects of a partner will be relevant based on a customer’s KPIs, examining target use cases, environment, and workflows. A proper understanding of competitive strengths and weaknesses across key areas will help make partnership opportunities clearer and more effective,” concludes Abbruzzese.

These findings are from ABI Research’s Enterprise Augmented Reality Platforms competitive assessment report. This report is part of the company’s Augmented and Virtual Reality research service, which includes research, data, and analyst insights. Competitive Assessment reports offer comprehensive analysis of implementation strategies and innovation to offer unparalleled insight into a company’s performance and standing in comparison to its competitors.

Read PTC’s AREA member profile.




Oil and gas companies evaluate wearable tech to improve safety and efficiency

Initial research focuses on enabling the real-time monitoring of field technicians to ensure their safety and to provide audiovisual assistance to perform asset maintenance, with the hope of adopting lightweight yet robust wearable devices, GlobalData noted.

These features of wearable technology encourage oil and gas companies to adopt helmets, smart glasses, wristbands and other devices that incorporate technologies such as wireless connectivity, artificial intelligence (AI) and augmented reality (AR).

Ravindra Puranik, oil and gas analyst at GlobalData, said, “Mobility is considered as the main driver and precursor to implementing any wearable technology in the oil and gas industry. Ever since the evolution of digital technologies, companies in the oil and gas sector are using industry-grade smartphones to capture field-level data and exchange information with onshore experts.”

“Instead of handheld smartphones, hands-free devices will increase work efficiency among the frontline workforce. Through different applications, wearable smart devices are expected to bring a paradigm shift in oil and gas field operations,” he added.

Wearable devices are also designed to keep field staff safe by monitoring the wearer’s health condition, alerting them to potential hazards, and giving the onshore support team access to workers’ live locations. This, in turn, gives workers a sense of security and increases productivity.

“The oil and gas industry is integrating wearable tech with inspection and maintenance technologies to improve data collection and minimise risk to its workforce. Wearable devices in the oil and gas industry are made to withstand extreme temperature variations and resist oil, chemical spills, heavy rain, and dust among other things, making the working environment more secure,” he stated.

GlobalData’s thematic research identifies oil and gas companies, such as BP and Shell, among the leading adopters of wearable technology. In addition to these, several other companies, including Saudi Aramco, Eni, Marathon Petroleum, Chevron, ExxonMobil (AREA member), Baker Hughes, Schlumberger and NOV, have also started to incorporate wearable technology into their operations.




Microsoft Azure – IoT Signals Report

A 20-minute online survey was conducted with over 3,000 decision makers at enterprise companies across the US, UK, Germany, France, China, and Japan who were currently involved in IoT. The research included business decision makers (BDMs), IT decision makers (ITDMs), and developers from a range of industries such as manufacturing, retail/wholesale, government, transportation, healthcare, and more.

In the commercial arena, the Internet of Things continues to grow in popularity. Business decision makers, IT decision makers, and developers at enterprise-size commercial organizations are incorporating IoT into their businesses at high rates, and the overwhelming majority is satisfied with the business results. As a result, companies are increasingly eager to adopt IoT.

The enthusiasm for IoT adoption is global, and it also crosses industries. Among the enterprise IoT decision makers we surveyed, 85% say they have at least one IoT project in either the learning, proof of concept, purchase, or use phase, with many reporting they have one or more projects currently in ‘use’.

Adoption rates are similar across surveyed countries (US, UK, Germany, France, China, and Japan) and core industries (manufacturing, retail/wholesale, transportation, government, and healthcare).

The report can be read in full here. 

 




Making VR and AR More than Buzzwords in Construction

At Digital Construction Week 2019, Jonathan Hooper from Laing O’Rourke talked about what has stopped many companies from realizing the potential of VR and AR, and what it will mean for those same companies to move past such challenges.

There’s no denying that the technology is getting better and cheaper. Some companies are introducing standalone VR solutions for AEC professionals while others are bringing point clouds to AR/VR devices. Cheaper and more powerful headsets have also helped to remove traditional barriers around adoption, but those barriers are about more than logistics and costs. For many potential users, the challenges around adoption are more about the applications of VR/AR technology. Or rather, the applications of this technology that could be happening, but aren’t.

Potential users need to know more because understanding the software isn’t enough. They need to know how to drive it and deploy it. If they don’t, then these technologies end up being outsourced, if used at all. When that happens, VR/AR solutions are just a small piece of a project, rather than a holistic solution that can tie everything together.

While he’s seen many companies utilize VR and AR solutions at the start of a project to help sell a client on a project, that’s also the point when many stop using them. Sometimes the work is wholly discarded, which is as much of a shame as it is a waste. Once the baseline of a project has been built in these environments, that same asset can be used elsewhere instead of being reinvented or replaced. The potential to bring the technology into construction workflows is right there, but many don’t realize it.

So how do we make these VR and AR applications do more in these construction environments?

Ultimately, all of the applicable content and information needs to be carried through to the end of the project. The same people who are working on the project also need to drive the development of the app. Content and embedded information need to be created and made available as 3D model metadata via interactive APIs so that everyone can access everything in the app.
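As a rough, hypothetical sketch of what “3D model metadata behind an interactive API” might look like in practice, the snippet below queries a project endpoint and prints the metadata attached to each model element. The URL, token, and field names are assumptions for illustration, not a reference to any particular BIM platform.

```python
# Hypothetical sketch: reading 3D model element metadata from a project API
# so anyone on the team (or another app) can access it without the authoring tool.
# The endpoint, token, and field names are illustrative assumptions.
import requests

API_URL = "https://example.com/api/projects/station-upgrade/elements"  # hypothetical endpoint
TOKEN = "YOUR_API_TOKEN"

response = requests.get(API_URL, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=10)
response.raise_for_status()

for element in response.json():
    # Each element carries the content and embedded info authored with the model.
    print(element.get("id"), element.get("category"), element.get("metadata", {}))
```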

Measuring the productivity of this kind of endeavor is key, though. Thankfully, there are plenty of ways to do it, and Hooper talked through what it has meant to measure the value of VR/AR solutions. He showcased a health and safety walkthrough in a VR environment that highlighted where specific health and safety issues can be documented. The models he was using had been created to pitch the product, and these same assets were then being leveraged in multiple ways and phases, positively impacting costs for the client.

Quality training presents another critical use case for a mixed reality headset.

Hooper mentioned that in the use case he was talking through, the stakeholders wanted their workforce to be able to understand if there are any areas of concern with an asset. By constructing a virtual environment, they could pull out those assets to visually look at and inspect them. In this way, the workforce can be educated to know what to look out for and document that information as desired, all of which can then be extracted into documents like Excel as needed.

That ability to export specific data into other formats and programs is essential, as VR and AR technology is really about creating apps, systems and programs that everyone can access. By getting all of this data into one application, users can collaborate more effectively and then take that data elsewhere.
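As a minimal sketch of that export step (assuming pandas and illustrative field names, not any specific VR/AR product), captured findings can be flattened straight into an Excel workbook:

```python
# Minimal sketch: exporting inspection findings captured in a VR/AR session
# to an Excel workbook. The field names and values are illustrative assumptions.
import pandas as pd

findings = [
    {"asset": "Column C-12", "issue": "Corrosion at base plate", "severity": "High"},
    {"asset": "Beam B-07", "issue": "Missing fire protection", "severity": "Medium"},
]

pd.DataFrame(findings).to_excel("inspection_findings.xlsx", index=False)  # needs openpyxl installed
```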

That concept tied into another use case Hooper showed, this one focused on AR and using QR codes. The idea in using these codes was to reuse assets that already existed and easily drop them into 2D documentation, opening them up to everyone. Paper with the QR code can be printed, and people can then easily call up and view that asset. These solutions are critical for potential users to understand, because they can be deployed with next to no IT skills or software licenses. Bigger workforces can use them with little issue, given the right support.
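To give a sense of how lightweight this route is, here is a minimal sketch that generates a printable QR code pointing at an existing asset’s viewer link; the URL is a placeholder and the Python `qrcode` package is just one of several ways to do it.

```python
# Minimal sketch: generate a printable QR code that links a piece of 2D
# documentation to an existing 3D asset. The viewer URL is a placeholder.
import qrcode

asset_url = "https://example.com/viewer/assets/1234"  # hypothetical asset viewer link
img = qrcode.make(asset_url)
img.save("asset_1234_qr.png")  # print this and drop it into the paper documentation
```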

What’s the future for VR and AR technologies in construction?

In the short term, we’ll see them used to train existing staff and help diversify the existing workforce so that companies can look beyond AEC backgrounds when they make new hires. The people who help adopt this technology don’t need that background to develop these apps, and those apps will prove essential in solving the construction productivity imperative. However, these applications are just the beginning, as we’re only scratching the surface of what will eventually be possible.




AR tools used to manufacture the next U.S. manned spacecraft

Among those adopting AR is the world’s largest military contractor, Lockheed Martin Corp., which is working with software developer Scope AR to develop how-to manuals that include animations for assembling spacecraft components. The partners said the collaboration has reduced the time required to interpret assembly instructions by 95 percent, along with an 85 percent reduction in overall training time and a more than 40-percent boost in productivity.

Lockheed Martin first implemented AR technology in 2017 within its space division, which is currently building NASA’s Orion spacecraft.

Shelley Peterson, Lockheed Martin’s augmented technology project leader, said AR tools are being used to assemble various Orion components, including the skeletal framework of the spacecraft’s titanium heat shield, which must withstand re-entry temperatures as high as 5,000 F.

San Francisco-based Scope AR’s tools also have been used for spacecraft components like cable assemblies and instrument panels, as well as the forward bay where the Orion crew seat module is situated. AR technology is used, for example, to develop the work instructions for drilling and torqueing steps, Peterson said.

Peterson also noted in an interview that technologies like Scope AR’s software and Microsoft’s HoloLens “mixed reality” tool have helped accelerate the interpretation and presentation of workflow data spanning assembly, manufacturing, test, and maintenance steps. That translates into time savings and reductions in touch labor for the narrow tolerances required for fasteners, transducers, accelerometers, and other spacecraft components.

In one example, Peterson said Lockheed Martin’s space unit has realized roughly $38 in savings per fastener, a significant figure for an aerospace manufacturer that buys more than 2 million fasteners a year.

The company said AR allows it to create workflows more rapidly than traditional methods, although Peterson said existing design data can be used to supplement AR-based work instructions. AR software also can be used to add part identifiers or color coding of parts. Assembly steps can then be animated.

Lockheed Martin is developing a reputation as an early adopter of disruptive technologies. Previously, it has invested in a quantum computing center focused on challenges such as using the added computational power to debug millions of lines of mission-critical code.

For its part, Scope AR has gradually developed industrial use cases for its software, starting with training assembly workers and eventually partnering with global manufacturers like Lockheed Martin, Boeing, Siemens, and Toyota. It claims to be the first AR vendor to develop an “enterprise-class” AR video platform for Microsoft’s HoloLens.

CEO Scott Montgomerie said surgical application of AR technology works best, with the Lockheed Martin use case illustrating how a specific project like Orion can benefit from what Montgomerie calls “real-time knowledge transfer.”

That augmented knowledge includes step-by-step instructions, animations in the form of digital overlays and live support from remote experts. “You don’t want to add another layer of process,” Montgomerie explained in a recent blog post. “You want to ensure workers can access knowledge from subject matter experts or resources….”

Read Scope AR’s AREA member profile
Read the original article on EE Times: Lockheed Martin Embraces AR on the Shop Floor

 




Sensing for augmented and virtual reality and for advanced manufacturing (MIT)

Q: What do you see as the next frontier for sensing as it relates to augmented and virtual reality?

A: Sensors are an enabling technology for AR/VR. When you slip on a VR headset and enter an immersive environment, sensors map your movements and gestures to create a convincing virtual experience.

But sensors have a role beyond the headset. When we’re interacting with the real world we’re constrained by our own senses—seeing, hearing, touching, and feeling. But imagine sensors providing data within AR/VR to enhance your understanding of the physical environment, such as allowing you to see air currents, thermal gradients, or the electricity flowing through wires superimposed on top of the real physical structure. That’s not something you could do any place else other than a virtual environment.

Another example: MIT.nano is a massive generator of data. Could AR/VR provide a more intuitive and powerful way to study information coming from the metrology instruments in the basement, or the fabrication tools in the clean room? Could it allow you to look at data on a massive scale, instead of always having to look under a microscope or on a flat screen that’s the size of your laptop? Sensors are also critical for haptics, which are interactions related to the sensation of touch. As I apply pressure to a device or pick up an object—real or virtual—can I receive physical feedback that conveys that state of interaction to me?

You can’t be an engineer or a scientist without being involved with sensing instrumentation in some way. Recognizing the widespread presence of sensing on campus, SENSE.nano and MIT.nano—with MIT.nano’s new Immersion Lab providing the tools and facility—are trying to bring together researchers on both the hardware and software sides to explore the future of these technologies.

Q: Why is SENSE.nano focusing on sensing for advanced manufacturing?

A: In this era of big data, we sometimes forget that data comes from someplace: sensors and instruments. As soon as the data industry as a whole has solved the big data challenges we have now with the data that’s coming from current sensors—wearable physiological monitors, or from factories, or from your automobiles—it is going to be starved for new sensors with improved functionality.

Coupled with that, there are a large number of manufacturing technologies—in the U.S. and worldwide—that are either coming to maturity or receiving a lot of investment. For example, researchers are looking at novel ways to make integrated photonics devices combining electronics and optics for on-chip sensors; exploring novel fiber manufacturing approaches to embed sensors into your clothing or composites; and developing flexible materials that mold to the body or to the shape of an automobile as the substrate for integrated circuits or as a sensor. These various manufacturing technologies enable us to think of new, innovative ways to create sensors that are lower in cost and more readily immersed into our environment.

Q: You’ve said that a factory is not just a place that produces products, but also a machine that produces information. What does that mean?

A: Today’s manufacturers have to approach a factory not just as a physical place, but also as a data center. Seeing physical operation and data as interconnected can improve quality, drive down costs, and increase the rate of production. And sensors and sensing systems are the tools to collect this data and improve the manufacturing process.

Communications technologies now make it easy to transmit data from a machine to a central location. For example, we can apply sensing techniques to individual machines and then collect data across an entire factory so that information on how to debug one computer-controlled machine can be used to improve another in the same facility. Or, suppose I’m the producer of those machines and I’ve deployed them to any number of manufacturers. If I can get a little bit of information from each of my customers to optimize the machine’s operating performance, I can turn around and share improvements with all the companies who purchase my equipment. When information is shared amongst manufacturers, it helps all of them drive down their costs and improve quality.
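As a simple, hedged sketch of that machine-to-central-location pattern (not any particular vendor’s system), a single machine might periodically report its sensor readings to a factory-wide collection service; the URL and payload fields below are assumptions.

```python
# Minimal sketch: one machine periodically reporting sensor readings to a
# central collection service so data from every machine can be aggregated.
# The collector URL and payload fields are illustrative assumptions.
import time
import requests

COLLECTOR_URL = "https://collector.factory.example/machines/cnc-mill-07/telemetry"  # hypothetical

while True:
    reading = {"spindle_temp_c": 41.2, "vibration_rms": 0.8, "timestamp": time.time()}
    requests.post(COLLECTOR_URL, json=reading, timeout=5)
    time.sleep(5)  # report every five seconds
```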




Tackling change in Automotive Sector with AR

The automotive industry has experienced more innovation in the last 20 years than in much of the previous 50 – and the pace of that innovation is only likely to accelerate. Since Toyota announced the Prius as the first mass-produced hybrid back in 1997, automakers across the board have embraced innovation at every level.

Powertrain innovation drives change

Hybrids have been joined by mass-produced rechargeable electric cars, as leading car manufacturers raced into a market popularized by Elon Musk’s Tesla range. The company sold close to 100,000 units in the third quarter of 2019, with the vast majority of the sales coming from the popular Model 3 sedan.

Industry giant Volkswagen, meanwhile, has set its sights on having 70 new electric vehicle models by 2028 – and building some 22 million electric cars in the next decade. The company is also partnering to build out a network of electric car charging stations around Europe.

The drive for an all-in approach to electric vehicles was highlighted in recent research from JP Morgan. “The growth in electric vehicles (EVs) and hybrid electric vehicles (HEVs) is climbing and by 2025, EVs and HEVs will account for an estimated 30% of all vehicle sales,” states the report. “Comparatively, in 2016 just under 1 million vehicles or 1% of global auto sales came from plug-in electric vehicles (PEVs).”

Full speed ahead for the connected car

Automotive industry innovation doesn’t stop with major changes in how cars are powered. It also extends to what you can do in your car. In the last decade, we’ve moved to a point where the majority of cars assume that drivers either are – or can easily be – connected. As a result, everything from GPS maps and turn-by-turn directions to entertainment systems supporting technologies such as Amazon Alexa, Apple CarPlay, and Android Auto is now an easy and affordable option for the modern car buyer.

In addition, the modern connected car is both generating and consuming large amounts of data – something that will only accelerate with the further development of semi-autonomous and autonomous vehicles. In fact, a Techcrunch report in 2019 suggested that within the next few years Americans could be “generating 1.8 TB of data every year in their vehicles”.

The road ahead is paved with skill shortages

Industries around the world are grappling with the impact of baby boomer workforce retirement – and the auto sector is by no means immune to this trend. While the US Bureau of Labor Statistics suggests that the total number of people employed as service technicians will change little in the next 10 years (and decline slightly in total numbers), the fact is that many already in the profession will age out during that time – creating a demand that will need to be met.

And given the increasing complexity and reliability of new vehicles, the skill level of the people hired as service technicians will need to grow. Ironically, the growing popularity of electric vehicles – which are known to be more reliable than traditional internal combustion engines – means that they may need service less often (or only for more difficult issues).

All of the above points to a need for not only a growth in the number of skilled technicians available in the next 10 years, but also a way to help safely train and support those technicians.

In the next chapter of this book, Atheer will explore how the auto industry can leverage the power of Augmented Reality to meet the many challenges and opportunities it faces. Reserve your copy online.

Read Atheer’s AREA member profile 

 




Is Augmented Reality the Next Frontier in Flight Training?

Red 6 Aerospace’s software simulates enemies that pilots can fight during live flights. Rather than hooking up users to a closed, indoor system, the simulation works outside and adjusts as the user moves, according to creator Dan Robinson. He argues the invention can stop the Air Force’s dependence on expensive, traditional simulators and adversary air contracts while freeing up its aggressor pilots for missions other than Red Air.

Whereas virtual reality creates an entirely new world around you, augmented reality adds images to your regular surroundings that aren’t there in real life—for instance, showing an aircraft against the actual sky instead of creating both the airplane and the sky.

“We can simulate any near-peer adversary, which we are absolutely unable to do right now,” said Robinson, a former United Kingdom Royal Air Force pilot. “My vision is taking this technology to a point where we should never have to physically put another Red Air adversary, i.e., a real aeroplane, in the sky to provide Red Air again.”

The Air Force is increasingly trying to integrate AR and VR into regimens from maintenance to training to mission planning to operations. It argues airmen learn quickly through digital methods that are more responsive and require fewer traditional resources like instructors and certain equipment.

Air Education and Training Command’s Pilot Training Next initiative is helping spearhead that effort, as is AFWERX, the Air Force’s organization that helps find and foster new technologies, largely from commercial industry. Red 6, which launched in January 2018, holds a Small Business Innovation Research contract with the Air Force and is partnering with Air Combat Command’s Training Support Squadron, the service said. The company also secured $2.4 million in its first round of seed funding earlier this year, according to the Los Angeles Business Journal.

Red 6 demonstrated its AR simulation on the ground in February for the Air Force Test Pilot School, Air Combat Command, Air Force Research Laboratory, AFWERX, AETC, and others, in an aircraft the company built, Robinson said. The event was successful, he said. A second demo is planned for next month.

Robinson said the company has already started vetting its AR in the air. The Air Force, Navy, and Royal Air Force, as well as aerospace companies and investors, are slated to attend the demo, he said.

As the service looks toward AR and VR, the military acknowledges there’s more to learn about the software. In March, an Air Force Institute of Technology study focused on using AR for maintenance pointed out that the technology may need a wireless network connection, that the technology can mildly disorient users, and that simple tasks can become more difficult in the virtual world.

Although AR can be beneficial overall, the study said, the Air Force’s infrastructure security “may hinder full integration.”

The service needs to understand the technology’s expected benefits and implications outside of the limited uses that have already been studied, the report stated. Others in the Air Force expect that the service, driven by younger airmen rising through its ranks, will embrace AR as “digital natives.”

“If the Air Force fully implements VR/AR into its flight training processes, the students could have virtual hands-on experience much earlier in their careers, which could bridge the training-to-experience gap challenge that the Air Force now faces,” the service said in a January release.

The full article in Air Force Magazine can be read here.




How Augmented Reality is Being Used in Industry

A number of uses of Augmented Reality in business are discussed in this article, with links to videos. Included is a section with examples of AR in worker training.

Augmented reality examples also increasingly abound in potentially dangerous sectors, where the cost of training engineers can hit overheads hard and regular deployment on site can put highly skilled staff at risk.

Jonathan Bridges, Chief Innovation Officer at networking service Exponential-e, notes that firms utilising AR can give their employees “a full immersive experience that can be used to represent a deep sea dive or a runway, for example. Enabling pilots, engineers, soldiers, and surgeons to get to grips with the key parts of their job in a safe and controlled environment saves companies money, improves skills and reduces the risks associated with training for these dangerous jobs.”

AREA member Lockheed Martin uses AR software in conjunction with the Microsoft HoloLens to accelerate workflows and the manufacturing of NASA’s Orion spacecraft.

San Francisco-based software firm Scope AR (another AREA member) develops a host of enterprise augmented reality examples from engine tuning to visualised heads-up displays of workflow charts.

In terms of AR in video communications, hardware and graphics specialists Nvidia have released a software solution for messy video streaming backgrounds. The RTX Greenscreen loads a virtualised image or environment into the background of a video stream. It uses AI to track and demarcate the outline of the user’s body so they are placed in front of the virtualised background.
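Nvidia has not published how RTX Greenscreen works internally, but the general technique (segmenting the person in each frame and compositing them over a replacement background) can be sketched with off-the-shelf tools. The example below uses MediaPipe’s selfie segmentation purely as an illustration of the approach, not Nvidia’s implementation.

```python
# Minimal sketch of the general virtual-background technique (not Nvidia's
# implementation): segment the person in each webcam frame and composite them
# over a replacement background image.
import cv2
import numpy as np
import mediapipe as mp

segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
background = cv2.imread("office_background.jpg")  # replacement background image (assumed to exist)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    bg = cv2.resize(background, (frame.shape[1], frame.shape[0]))
    result = segmenter.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    mask = result.segmentation_mask > 0.5            # True where the person is
    composite = np.where(mask[..., None], frame, bg)  # person in front, background behind
    cv2.imshow("virtual background", composite)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```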

Dmitry Ogievich, CEO of computer vision and augmented reality firm Banuba told us that: “Sometimes the perception of AR in video communications is one that is limited to fun self expression – of which face filters and 3D avatars are second to none – but the truth is how AR solutions are being utilised varies significantly based on the target audience and use case.”

“In enterprise, people tend to seek two main things; comfort and privacy. With an increasingly mobile workforce, it is often desirable to be able to remove any background video and exclusively show the people in focus, eliminating any privacy fears. AR makes this possible no matter your surroundings.”

With the coming onslaught of 5G-enabled applications, expect to see a host of augmented reality examples come out of the woodwork as both the commercial and industrial sectors find innovative uses.




4 Enterprise AR use cases

AR is beginning to have an impact in business contexts, as a wider range of enterprises pilot and adopt AR capabilities. The global market for enterprise AR applications is estimated to reach $14.2 billion by 2022, according to ARtillery Research. In a 2018 HBR-Analytics Services survey, 49 percent of respondents were piloting or had deployed some form of mixed reality in their company workflows, and 68 percent said that mixed reality would play an important role in achieving strategic goals.

AR is currently delivering significant value in areas such as training and simulation, work instructions, remote assistance, inspection and repairs, and knowledge capture.

According to PTC’s 2019 State of Industrial Augmented Reality report, however, pockets of AR innovation are taking place in verticals including consumer packaged goods, retail, architecture and construction, professional services, and education. Here, the tools are enabling new sales and marketing experiences, improving operational efficiency, increasing engineering quality, and creating new products and services.

Four use cases from companies that are deploying AR today:

  1. Unilever’s AR use case: Remote assistance and knowledge sharing

Global consumer goods manufacturer Unilever estimates that it will lose some 330 years of collective work and domain experience in just one of its European factories as its aging workforce retires. That loss of expertise in its plants – and lack of know-how among newer hires – can lead to costly downtime in its facilities.

The company began working with AR training and knowledge solutions provider ScopeAR, exploring ways to reduce that downtime with a live AR support application that allows technicians to collaborate with experts remotely. Users can share their view of a situation with a remote expert, and the AR maps work instructions and expert collaboration directly onto an object or area. Unilever says that it has seen a 50 percent reduction in downtime in facilities where the AR tools are in use, creating a direct ROI of 1,717 percent of the initial investment.

  2. Boeing’s AR use case: Wiring an airplane


The wiring of an aircraft has always been a big pain point, both in production and during inspection, according to Paul Davies, a Boeing research & technology engineer. What’s more, it’s a process with no room for error. Historically, engineers would have to consult their 20-foot-long paper diagrams of the complex and detailed wiring requirements as they did their work. Not only was the process inefficient, but it was also almost impossible to do correctly the first time, resulting in significant rewiring work for each plane.

Today, some of those engineers instead put on Microsoft HoloLens headsets that display digital 3D wiring diagrams directly on the KC-46 tankers and 767 freighters they are wiring. Initial studies indicate that the AR approach results in a 90 percent improvement in first-time quality compared to using two-dimensional information on the airplane, and cuts the time required to do the wiring work by 30 percent. That saves millions of dollars per aircraft, the company says.

  3. DHL Supply Chain’s AR use case: Better warehouse operations


DHL was one of the first companies to explore AR back in 2014 and has recently expanded its “vision picking program” worldwide. The third-party logistics provider gives warehouse workers smartglasses (currently the latest version of Google Glass Enterprise Edition) which help them locate, scan, sort, and move inventory without using handheld scanners or referencing paper forms.

The integrated heads-up display overlays key parcel information within the company’s logistics hubs, scans barcodes, and relays instructions in real time. Workers using the glasses are 15 percent more productive, according to DHL. DHL has been progressively rolling them out to more of its warehouses around the world during the last few years, most recently expanding use to its internal express hubs in Brussels and Los Angeles, with plans to roll them out at airports in New York, Cincinnati, and Chicago. Looking ahead, DHL Supply Chain COO and CIO Markus Voss said the glasses could eventually be upgraded with object recognition.

  4. Lowe’s AR use case: Making DIY less painful

Home improvement retailer Lowe’s has focused on one particular statistic that could be the key to its continued growth: 32 percent of home improvement projects are abandoned before they even start, amounting to some $70 billion. It’s one of the problems Lowe’s Innovation Labs has dedicated itself to solving – and AR has proven a particularly valuable tool in developing new solutions. Lowe’s Vision Navigation app overlays turn-by-turn digital directions, enabling customers to navigate its stores more efficiently. Customers shopping for two or more items were able to find products twice as fast as with self-navigation, and the AR app also helps associates (particularly new ones) do the same.