
What The Future Of Manufacturing Could Look Like With AR/VR (Forbes)

Our community of readers following AR in the enterprise will likely be interested in a recent Forbes Technology Council article written from the experience and perspective of Dan Gamota, who works at a high-tech lab co-located in a Silicon Valley innovation center.

“Going to work was an opportunity to be fully immersed in a continuous learning environment with cutting-edge technologies and some of the best minds in engineering, science and manufacturing. Until, of course, the day we shifted to a work-from-home model. Overnight, we were separated from each other as well as our vital lab hardware, software and tools. Yet we still are developing dozens of critical manufacturing processes, many of which have been transferred, deployed and audited in factories and facilities all over the world.”

Despite the pandemic, the team carried on with seamless collaboration and accelerated innovation by collectively reaching for their augmented reality and virtual reality (AR/VR) headsets.

These tools have already proven indispensable for training production-line operators while guiding them through complex manufacturing operations. In Singapore, for instance, a team of engineers working in our additive manufacturing center uses AR to reduce training time by 50% on complex 3D printers. Similarly, AR helps speed maintenance instruction training and facilitates remote support. Topics covered in the article include Building Cyber-Physical Bridges, Innovation Without Boundaries and Advancing Innovation With Avatars.

Read the full original article here.

 

 




A new 3D approach to remote design engineering

Trying to untangle complex problems remotely from thousands of miles away is fraught with difficulty – even when using products like Microsoft’s Remote Assist. The expert often has to resort to waving their hands around on a screen to communicate to the technician which part of a machine they should be fixing – and which parts should be left alone.

Real-time immersive 3D collaboration is now adding a new dimension to such problem solving – users can share live, complex 3D files such as CAD data, interact with them and reveal ‘hidden’ parts deep within a machine that may be causing an issue. The technology also transforms day-to-day collaboration between remote engineering team members. Design reviews, for example, can be brought to life, with participants ‘walking through’ a model, no matter where they are in the world.

 

The fundamental problem at the root of many of these issues has been that enterprise teams lack the ability to collaborate effectively in real time using live, complex 3D data. The solution lies in purpose-built framework technology for natively integrating real-time collaboration and immersive device support directly into legacy enterprise software packages.

The key to enabling true real-time collaboration is to start where the data ‘sits’ and ensure that this original data ‘truth’ is the same for everybody when working together, no matter where they are located or what device they wish to use. This way, everyone in the team has the correct and most up-to-date information available.

Whether it is a CAD package, PLM software, an MRI scanner, robotic simulation software or a laser scanning system, many industries are becoming increasingly dependent on spatial data types and digital twins. These complex data formats are usually incompatible or just too cumbersome to use ‘as is’ in existing collaboration platforms such as Webex, Skype, Google Docs or Slack – all built primarily for 2D data formats such as video, text or images.

Moreover, the legacy software generating the data itself is unlikely to have any in-built real-time collaboration functionality – forcing enterprise users to resort to one of two methods. One option is to manually export the data, carry out a painful and time-consuming reformatting process, then manually import the newly crunched data into some type of third-party standalone collaboration package. The alternative is to ignore the spatial nature of the data entirely and instead screen-grab or render out 2D ‘flat’ images of the original 3D data for use in a basic PowerPoint presentation or something similar.

Neither of these methods allows teams to efficiently collaborate using a live data truth – i.e. the original data itself instead of a reformatted, already out-of-date interpretation of it. So, both methods only compound the root collaboration problem instead of helping to solve it.

The latest generation of real-time immersive 3D collaboration technology is integrated directly into the host software, grabbing the original data at source before efficiently pushing it into a real-time environment which users can access using their choice of device (VR, AR, desktop, browser or mobile) for instant and intuitive collaboration. End-to-end encryption ensures that even the most sensitive data may be confidently shared across remote locations.
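
To make the ‘grab at source’ idea concrete, here is a minimal sketch assuming a hypothetical plugin interface (Part, CollaborationSession and on_model_changed are illustrative names, not any vendor's actual API): the host package notifies the plugin of changes, and the plugin publishes the updated geometry and metadata to a shared session that every connected device renders.

```python
# Hedged sketch only: all class and method names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Part:
    part_id: str
    mesh: bytes        # tessellated geometry streamed to clients
    metadata: dict     # PLM attributes, materials, tolerances, ...


@dataclass
class CollaborationSession:
    parts: dict = field(default_factory=dict)
    devices: list = field(default_factory=list)   # VR, AR, desktop, browser, mobile

    def publish(self, part: Part) -> None:
        """Push the latest version of a part to every connected device."""
        self.parts[part.part_id] = part
        for device in self.devices:
            device.render(part)


def on_model_changed(host_document, session: CollaborationSession) -> None:
    """Hook called by the host CAD/PLM package whenever the design changes."""
    for node in host_document.changed_nodes():
        session.publish(Part(node.id, node.tessellate(), node.attributes()))
```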

The integration into the host package provides not only a live connection to the data but also a bi-directional connection, meaning that users are still connected to the host software package running in the background. The advantage of this over standalone applications is that it still gives access to core features of the host package – enabling accurate measurement of a CAD model using vertex or spline snapping to the original B-Rep held in the CAD package, for example. All the underlying metadata from the host package is also available to the immersive experience – and annotations, snapshots, action lists or spatial co-ordinate changes can be saved back into the host package.
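
A similarly hedged sketch of the bi-directional side, with the HostConnector class and its methods invented purely for illustration: measurements are delegated to the host so they snap to the original B-Rep, and annotations created in the immersive session are written back into the host package.

```python
# Illustrative only: the host_session object and its methods are assumptions.
from dataclasses import dataclass


@dataclass
class Annotation:
    part_id: str
    text: str
    position: tuple    # (x, y, z) in the model's coordinate system
    author: str


class HostConnector:
    """Thin wrapper around the host CAD/PLM package running in the background."""

    def __init__(self, host_session):
        self.host = host_session

    def measure(self, vertex_a, vertex_b) -> float:
        # Delegate to the host so the measurement snaps to the original B-Rep
        return self.host.exact_distance(vertex_a, vertex_b)

    def save_annotation(self, note: Annotation) -> None:
        # Persist the markup against the original part record in the host package
        self.host.attach_note(note.part_id, note.text, note.position, note.author)
```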

The new post-pandemic requirement to have a distributed workforce – in conjunction with the rollout and adoption of key technology enablers such as server-side rendering and high-capacity, low-latency connectivity – is set to accelerate the adoption and integration of real-time immersive collaboration solutions. In the future, 5G technology will also open up the potential to stream to immersive AR and VR devices – untethering the experience and facilitating factory-wide adoption of immersive solutions. For example, as ‘Industrial Internet of Things’ (IIoT) data streams from smart devices in the factory, it will be overlaid via AR glasses in the physical space. And as cloud service providers build out features such as spatial anchoring to support ever-larger physical spaces, these new services will be used within collaborative environments rich with real-time data.
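
As an illustration of that IIoT-to-AR overlay, the following sketch assumes a hypothetical anchor store and renderer (neither is a real product API): a telemetry message arrives on a topic, is matched to the spatial anchor for that device, and is drawn as a label in the worker's field of view.

```python
# Sketch under stated assumptions; topic layout and render call are placeholders.
import json


class TelemetryOverlay:
    def __init__(self, anchor_store, ar_renderer):
        self.anchors = anchor_store    # maps device_id -> spatial anchor
        self.renderer = ar_renderer    # draws labels in the AR headset

    def on_message(self, topic: str, payload: bytes) -> None:
        # e.g. topic = "factory/line3/press17/telemetry"
        device_id = topic.split("/")[-2]
        reading = json.loads(payload)
        anchor = self.anchors.get(device_id)
        if anchor is not None:
            label = f"{reading['metric']}: {reading['value']} {reading['unit']}"
            self.renderer.draw_label(anchor, label)
```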

Factory workers, for example, will have the ability to ‘dial an expert’ directly from a virtual panel on a smart factory device. This offsite expert will appear as a holographic colleague and bring with them live 3D data for that individual machine. Both users will have real-time IIoT data overlaid intuitively on the fully interactive 3D model to facilitate a more effective diagnosis and maintenance process.

Empowering shop-floor workers with hands-free AR and detailed 3D data will dramatically improve assembly line efficiency, with an intuitive environment where product data is fully interactive. Users will be able to move, hide, isolate and cross-section through parts, while using mark-up and voice tools to create efficient instructions for the assembly or disassembly of complex products. These instructions will be recorded and delivered as holographic guides via AR directly on the assembly line.

The next generation of real-time immersive 3D collaboration technology is even set to enable you to have a scaled-down hologram of your latest engine design sitting right in front of you on your desk. As you work on the design and refine it using your CAD software, the changes will be dynamically loaded into the hologram so that you can see the effects immediately and make any further necessary adjustments.

Meanwhile, digital sleeving – with 3D images overlaid on physical designs – will enable you to check how two parts of the engine come together, even when they are being designed by different teams in different locations. Similarly, you will be able to see how, for example, cabling will fit inside your latest aircraft seat design or where best to put the maintenance pockets for easy access.

This kind of approach adds a new dimension to the handoff between design and manufacturing. If adjustments need to be made to a fan assembly design, for example, the relevant part can be isolated within an immersive design review – and speech-to-text notes can be added to the part number and automatically become part of the change request. It’s all a far cry from endless design iterations, spreadsheets and printouts – or CAD screen shares using a 2D representation of a 3D problem.

In the post-pandemic remote world, conferencing is bringing people, video and documents together. Collaboration is now adding the fourth dimension of 3D immersive experience to complete the picture.

 




5 Tips on How AR Smart Glasses Increase Employee Satisfaction – Ubimax

Here are Ubimax’s top 5 tips:

1. SMART GLASSES IMPROVE WORKING ERGONOMICS

Smart glasses let employees keep both hands free throughout a task. Tablets or notebooks no longer need to be carried or held, which takes the strain out of many tasks and prevents poor posture. One-sided strain and joint wear caused by constantly holding scanners and by repetitive movements are avoided. The workflows themselves are structured and often simplified, and unnecessary steps are eliminated.

2. INCREASED OCCUPATIONAL SAFETY

The ability to use both hands is a major safety advantage of smart glasses. For example, it is far safer for a logistics worker to climb a ladder using both hands than to operate a hand scanner in an elevated position. In addition, warnings about ergonomically questionable situations – for example, when unwieldy or heavy parts need to be handled, as is often the case in industrial environments – can be displayed directly on the glasses. Displaying safety instructions at exactly the right moment can also reduce the risk of injury in high-risk occupational fields, utilities, or production through the targeted use of AR devices. Step-by-step instructions increase safety by indicating temporarily prohibited zones, e.g. for tests in laboratories, or by showing notes on hygiene regulations.

3. EASIER WORK THROUGH EXPERT CALLS AND REDUCED TRAVEL

Smart glasses enable global collaboration across distances and time zones. In case of problems, experts can easily be consulted via video call on the smart glasses. The expert sees exactly what the employee sees while the employee continues to work hands-free. Collaboration through remote support enables remote training of employees and saves money by reducing the travel expenses of experts. They do not have to travel at short notice but can help their colleagues on site from their own desks. Global collaboration also increases the feeling of belonging and team spirit.

4. ALL AGE GROUPS BENEFIT FROM THE USE OF SMART GLASSES

The use of smart glasses does not only delight young, technology-oriented employees. Older employees also benefit from them. For example, heavy, unwieldy scanners are no longer needed for order picking which makes work easier on the one hand and helps workers achieve their targets faster on the other. Furthermore, smart glasses simplify the inclusion of handicapped workers. The possibility to show instructions step-by-step at the employee’s pace and to carry out a subsequent quality control allows them to participate fully in working life.

5. QUICK ONBOARDING DUE TO EASIER KNOWLEDGE TRANSFER

Onboarding training is simplified due to smart glasses. Step-by-step instructions, automatic quality checks, help and expert support enable employees to work independently and productively a lot faster. This not only makes their daily work easier, but also that of their colleagues who do not have to take on the task of training new colleagues in addition to their own work. There are fewer disappointments and negative experiences which reduces the drop-out rate, and quality and productivity reach the level of experienced colleagues much earlier.

 




Digital twins and predictive maintenance to increase efficiency at Repsol facilities

The alliance was sealed back in 2017 to integrate tools such as the cloud within the oil company and to store the huge volumes of data the company handles. “Digitization is the lever towards the energy transition,” explained the company’s CIO, Valero Marín, at a meeting with the media. The company expects a return in the form of €1 billion in cash flow through 2022, plus an additional €300 million by 2020.

The oil company, chaired by Antonio Brufau, is advancing in the implementation of data technology, analytical models and artificial intelligence. “All this allows predictive maintenance and improves the efficiency of operations,” said Marín. “We are also exploring the use of drones and blockchain for our operations.”

A practical example of blockchain use is the creation of a certification platform into which Repsol has integrated its partner companies, so that there is a record of the operations carried out with suppliers and distributors.

So that employees are not negatively affected by the company’s advancing digitization, Repsol has already trained 2,500 employees in artificial intelligence and another 500 in blockchain.

The oil company envisions a future of connected, intelligent service stations, an idea it plans to realize through the installation of Internet of Things technologies.

Digitization will also reach its refineries and plants, with facilities remotely transmitting information to feed predictive models.

Both companies are also working on digital twins – data models that allow a scenario to be analyzed and reproduced. Because a digital twin is a digital copy of a particular machine, it makes it possible to run simulations, understand the consequences of changes, generate scenarios and validate hypotheses.

The oil company also runs a Cloud Competence Center in which more than a hundred professionals specialized in cloud technologies work. Repsol expects 70% of its infrastructure to be in cloud environments by 2022, compared with 30% today, for which it aims to reach a total of 4,000 servers in the cloud.

 




Pandemic sees surge in companies using AR

The more practical cousin of virtual reality, AR is mainly used to provide remote training and technical support to production sites and R&D centers with the help of smart glasses and 3D imaging similar to Google Street View. It allows viewers to pause videos, draw circles and lines into the image, and even use their own projected hands to point and gesture.

After remote teams helped complete a new beverage factory in Thailand seven weeks ahead of schedule, test new KitKat confectionery molds in absentia and commission new pet-food production lines in the US, Nestle plans to expand the technology across the company.

“Today we understand the full potential of the positive impact of the crisis as well,” Thomas Hauser, Nestle’s head of product and technology development, said in an interview. “We enjoy a higher level of efficiency, speed and a reduced impact on the environment.”

Philips, Electrolux

Appliance makers Royal Philips NV and Electrolux AB are joining Nestle in betting on augmented reality due to the pandemic.

While Electrolux used it to deal with not being able to install equipment it shipped to North America and Latin America, Philips relied on the technology while urgently expanding ventilator capacity to cope with a surge in critically ill Covid-19 patients needing help with breathing.

In a race to set up additional production lines, the Dutch company remotely connected different sites to help train workers and exchange knowledge, bypassing the need for travel. Part of that drive is also focused on artificial intelligence, in an attempt to detect how patients are trending on the basis of data analytics. The technology helps forecast whether they are developing delirium or sepsis, and whether they need help.

“You see a rapid integration of virtual reality technologies,” said Philips chief executive officer Frans Van Houten. “The whole world will see an acceleration in the adoption of informatics.” – Bloomberg

Original article appears here.




Essential Steps For Any Business To Prepare For Augmented Reality

How can a business make itself ready to successfully apply AR? What will make implementation easier and more effective and ensure that the initial efforts provide a solid foundation for future transformation?

Knowing Where You Are And Where You Want To Go

There are two things you need to do at the very beginning: Identify a business goal, and assess what you are currently doing to achieve that goal.

A business goal can be retaining expertise by transferring skills from older or retiring workers to newer or unskilled workers. It can be providing product demos to prospects for products in a portfolio. It can be ensuring that engineers collaborate successfully on meeting permitting and safety requirements for new assembly lines across global locations.

Most importantly, what are the current processes and procedures for achieving that goal? Where are the bottlenecks or particular difficulties?

When considering where best to apply AR, further assessment is necessary — technology readiness. An impressive AR demo can be created for almost any business situation, but it’s important to choose a use case that can scale. AR can be used to improve routine, repetitive activities, but it won’t show its true value there, and investment in it will show less return.

AR really shines at helping with complex, varied and changing circumstances. The wider the range of product types, manufacturing procedures or workforce capabilities, the more clearly AR will show its value and the wider the organizational uptake will be.

Delivering AR To The User

AR content can be delivered to the end user in a variety of ways, and careful consideration of that user’s needs and the constraints of their work environment is necessary for a successful demonstration.

For example, a sales rep may need to present a broad portfolio of thousands of product configurations to prospects and customers. Currently, that may involve shipping samples to trade shows, providing spec sheets, and linking to diagrams and videos on webpages.

With AR, a customer can see all the details of a specific product, get a good idea of how it works and understand how it differs from the competition. Implementing AR on a phone or tablet can allow that sales rep to easily build a relationship with that customer, demonstrate a product of interest, communicate its details and use, and answer any technical questions while maintaining the touch essential to the sales process.

However, if the goal is to improve worker productivity on the production line, where various tools need to be picked up and used, AR content can be best delivered through a hands-free wearable, whether binocular eyewear such as Microsoft HoloLens or Magic Leap or monocular eyewear such as Google Glass Enterprise or the RealWear HMT-1. That information is overlaid on what the worker is seeing, whether it is instructions, fill levels or safety precautions, without interfering with the worker’s tasks.

It’s worth spending some time to really consider the various possible ways your AR could be used now and in the future so the chosen technology presents the information in an optimal way for the user.

Ensuring Access To The Necessary Content

An audit of the information necessary to build an AR experience that communicates effectively to the user can turn up gaps. This is fairly common because the range of information AR can communicate is much wider than is possible with existing channels. Ensuring the availability of this information as early in the process as possible can make for effective implementation.

If you want to provide procedural guidance to line workers, you must have — or be able to create or capture — digitized work instructions. If you want to provide 3D instructions on how to maintain and service a newly acquired machine, you must have the 3D CAD data. If you want workers to see diagnostic information about a machine’s performance such as vibration, temperatures and fill levels, that machine must have the necessary sensors and connectivity.
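
One way to run such a content audit is sketched below; the use cases and their required inputs are examples only, not a complete checklist.

```python
# Rough sketch of the content audit described above; entries are illustrative.
REQUIRED_CONTENT = {
    "procedural_guidance": ["digitized_work_instructions"],
    "3d_maintenance_instructions": ["3d_cad_data"],
    "machine_diagnostics_overlay": ["sensors", "connectivity"],
}


def audit_gaps(use_case: str, available: set) -> list:
    """Return the inputs still missing before the AR experience can be built."""
    return [item for item in REQUIRED_CONTENT[use_case] if item not in available]


# Example: a maintenance use case where the CAD data has not been located yet
print(audit_gaps("3d_maintenance_instructions", {"digitized_work_instructions"}))
# -> ['3d_cad_data']
```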

Identifying this information will require acquiring, storing, managing, distributing and analyzing new types of data and repurposing data you already have.

While not ideal, the lack of some information is not fatal. For example, if there is no 3D CAD data for your machine, using a head-mounted device to record an expert performing all the required maintenance procedures can fill the gap. However, identifying those gaps and planning methods for filling those gaps is essential.

Presenting That Content In A Usable Way

Technologies such as web and mobile apps, which were new not so long ago, are now established, and the methods for creating them and making them usable are defined. AR is much earlier in the process of becoming routine, so the specifics of AR usability still require attention.

Even an AR project that addresses a business goal, understands user needs and is supplied with the right content can fail if the user experience is inadequate. There are many ways to go wrong, from excessive or poorly organized information to inadequate visual contrast.

The need for usability is great, and tools to assist in AR content authoring are developing quickly. They’re already providing significant assistance to content developers, but understanding the capabilities and needs of the worker and rigorously establishing what information is most important in what context is key in this step.

You Are Ready For AR

Almost every business can improve efficiency, reduce costs, more quickly skill workers or ensure compliance through the information AR communicates. Choosing the right place to try AR first takes some thought and planning, which will enable an effective AR implementation that will provide a foundation for future growth.

 




AR, AI and IIoT empower Front Line teams

The technologies underpinning the Industrial Internet of Things (IIoT) are key to the success of Industry 4.0. I recently had the honor of hosting a panel during the IIoT World Days virtual conference looking at the role and power of analytics in IIoT.

Each of the five guests on the panel had insights into what they called the “Manufacturing Analytics Journey” – taking a detailed look at how analytics impacts profitability, powers prediction, informs intelligent optimization and leverages big data.

The insights they offered about the importance of data and analytics got me thinking about the important role that AR, AI and mobile devices can play in actually making use of that data on the front line.

As it happens, integration with IIoT infrastructures is something that our team has spent a great deal of time working on over the last several years. Since the first release of our “Transforming the Enterprise” white paper back in late 2018, we have been clear about the relationship between IIoT, AR and AI.

In the latest release of that white paper, we spelled out exactly how we saw the connections between AR, AI, IIoT and machine learning. We start with the context of a frontline team member in an industrial setting who is servicing a piece of equipment.

This context could leverage data about:

  • the work identity profile of the frontline team member
  • the skill set data of the frontline team member
  • historical data covering the work instructions they may have previously worked with in relation to a particular piece of equipment they are servicing
  • the remote experts or colleagues they typically work with
  • and what level of certification and training they may have in undertaking the job they’re about to do.

Once we have that foundational context, we can combine it with information about location, time and date (all drawn from the mobile device itself) – and then start using relevant industrial IoT data to provide the following (sketched in code after the list):

  • very specific assistance that is relevant to the task at hand,
  • insights into how the equipment that the frontline team member is working on may relate to other useful IoT data from similar equipment
  • live diagnostic data from the equipment itself.
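
The sketch below illustrates that flow in outline. It is not the actual Front Line OS implementation; the field names and the iiot helper methods are assumptions made for illustration.

```python
# Minimal sketch, assuming hypothetical worker/device records and an `iiot`
# service object; none of these names come from a real product API.
from datetime import datetime


def build_assistance(worker, device, iiot):
    context = {
        "worker_id": worker["id"],
        "skills": worker["skills"],
        "certifications": worker["certifications"],
        "history": worker["previous_work_instructions"],
        "location": device["location"],
        "timestamp": datetime.now().isoformat(),
    }
    equipment_id = device["scanned_equipment_id"]
    return {
        "context": context,
        "task_assistance": iiot.work_instructions(equipment_id, context["skills"]),
        "fleet_insights": iiot.similar_equipment_trends(equipment_id),
        "live_diagnostics": iiot.live_telemetry(equipment_id),
    }
```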

We believe that frontline teams need to be able to use their mobile devices (including smart glasses, tablets and smartphones) to get information from machines, sensors, and the IIoT infrastructure and see the data flow into their field of vision.

The IoT data can come from the frontline team member’s immediate work environment – with QR code or object recognition scans being used, for example, to pull up when a piece of equipment was last serviced and to provide immediate access to all relevant service records, work instructions and performance data for the equipment itself.
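
A toy version of that lookup might look like this, with an in-memory dictionary standing in for the real asset-management backend.

```python
# Illustrative registry only; a production system would query the asset backend.
EQUIPMENT_REGISTRY = {
    "PUMP-0042": {
        "last_serviced": "2020-03-14",
        "service_records": ["SR-118", "SR-131"],
        "work_instructions": ["WI-PUMP-OVERHAUL-v3"],
        "performance_data": {"vibration_mm_s": 2.1, "temperature_c": 64},
    }
}


def on_qr_scan(code: str) -> dict:
    """Resolve a scanned QR code to the equipment record shown in the display."""
    return EQUIPMENT_REGISTRY.get(code, {"error": f"unknown asset {code}"})


print(on_qr_scan("PUMP-0042")["last_serviced"])   # -> 2020-03-14
```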

And the utility of having these technologies linked doesn’t stop there. Context is also a vital component of helping systems become more intelligent (through ML and AI technologies) and predictive.

Leveraging both edge computing and AR technologies, enhanced by machine learning and artificial intelligence, creates a platform that can anticipate what members of the extended enterprise will need to do next – sometimes before they know it themselves.

It builds on the idea that an organization has the capability, with the simple introduction of something like our Front Line OS (powered by AR and AI), to hold up a mirror to itself – and its supply chain – to gain true predictive insight in both the specific and broad collaborations of the extended enterprise.

 




Daimler Transforms the Automotive Lifecycle

The Daimler Protics division uses Unity to create a mixed reality pipeline connected to its systems and Product Lifecycle Management (PLM) data, then deploys applications to multiple platforms, including Microsoft HoloLens, Oculus devices, and smartphones.

This blog details a few of the ways in which they create and deploy HoloLens applications at various stages of the automotive lifecycle.

Production

Daimler Protics uses Unity for a variety of use cases in the production phase, from planning factory layouts (e.g., previsualizing machinery and architecture) to assembly training (e.g., training workers on how to assemble the cars). Safety inspection is another.

Automakers often use robotic laser welding to precisely and efficiently fuse various parts of the vehicle together. When Daimler’s robot cell is in operation, however, the space is closed off to prevent anyone from looking inside and losing their sight, making safety inspections difficult.

The team developed an application that replays each robot’s logged movements on the HoloLens once a session is complete. This application displays predefined safety spaces, so it’s easy to verify whether the robot’s movements have adhered to safety protocols.
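
The core check behind such an application can be illustrated with a simple sketch; the axis-aligned box and the sample path below are invented for illustration, and the real system would use the robot cell's actual safety geometry.

```python
# Hypothetical sketch: flag logged robot positions that leave a safety space.
def inside_box(point, box_min, box_max):
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))


def find_violations(logged_path, box_min, box_max):
    """Return the indices of logged positions that leave the safety space."""
    return [i for i, p in enumerate(logged_path) if not inside_box(p, box_min, box_max)]


path = [(0.2, 0.5, 1.0), (0.9, 0.5, 1.1), (1.6, 0.5, 1.2)]   # meters, from the robot log
print(find_violations(path, box_min=(0, 0, 0), box_max=(1.5, 2.0, 2.0)))  # -> [2]
```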

Sales and marketing

Mercedes-Benz formed the EQ brand for its new fleet of electric vehicles. For the Mercedes-Benz EQC, the automaker’s first fully-electric compact luxury SUV, the Daimler Protics team created a HoloLens experience to help drivers better understand the inner workings of an electric vehicle compared to the gas-powered versions to which they’re accustomed.

Designed for auto shows and dealership showrooms, the self-serve application guides users – the vast majority of whom have never used a mixed reality headset – showing them where to look and identifying various points of interest as they walk around the vehicle. Daimler’s goal is to tell a rich, interactive story about the Mercedes-Benz EQC, including the location of the battery powering the vehicle, and how it works and charges.

After-sales: Maintenance and repair

Traditional training programs use cut-section models to instruct technicians on how to service an automotive transmission. While working on a full-scale physical model is helpful for understanding, the educational value of a cutaway version that’s disconnected from the car is limited.

Daimler Protics solved this dilemma using mixed reality. The application not only surfaces the transmission’s various hard-to-see components, it also makes it easy to replicate the experience of the running engine, and visualize how it changes when shifting gears or braking.

Read the original article here.




First knee replacement surgery successfully completed with Augmented Reality Technology

Pixee anticipates the number of total knee replacement cases using the Knee+ technology will increase quickly as they already have a sizable list of surgeons interested in trying this innovation using the Vuzix M400 Smart Glasses. The combination is compact, easy to use, wireless and does not require disposables.

Pixee Medical expects to sign their first distribution agreements with implant manufacturers over the next few weeks, allowing their solution built around the Vuzix M400 Smart Glasses to be promoted by them worldwide. Pixee Medical is pursuing and expecting FDA approval (510k) for Knee+ before the end of 2020.

“The team at Pixee Medical created an innovative path to bring the Vuzix M400 Smart Glasses into the operating room to perform knee replacement surgeries and we look forward to supporting the worldwide distribution of their innovative AR solution,” said Paul Travers, President and Chief Executive Officer at Vuzix.

Read more in the full press release




Virtual Reality and Augmented Reality in Learning and Training: overhyped or new industry standard?

How Can VR Change the Training Industry?

Let’s take a look at how VR will affect the 70:20:10 model of learning. We know that classroom and e-learning modules only account for 10 percent of learning. Seventy percent of learning comes from tackling real-world tasks and problems. The other 20 percent comes from social learning via observation of others and feedback.

But what if learners could receive on-the-job experiences without actually being on the job? VR promises to do just that via a simulated environment.

Within the simulated environment, the learner must make on-the-spot decisions and respond to real-time stimuli. For example, learners in law enforcement will feel their hearts pound and their palms sweat during simulated live shooter scenarios. Your employees will stress over making the best possible decisions for your business by de-escalating angry customers or having difficult employee conversations. Even though it looks like a video game, it isn’t. It’s not about saving the world anymore; it’s about saving you money with the best-trained talent. No other training medium can evoke authentic emotional responses like VR.

Is AR Just As Effective As VR?

Many of us have already experienced a primitive form of AR through Alexa or Google Home. Voice-activated tools augment our daily conversations by making the internet a conversation partner.

But AR is much more than voice commands. It can also superimpose virtual images onto the physical world. This augmented experience allows people to make different decisions. If we include chatbots, AR could provide a unique learning experience guided by a computer. It would be GPS navigation for learners.

Unlike VR, AR has already begun to change the daily practice of some professions. The FDA approved Opensight, an AR-enhanced medical imaging product built on Microsoft’s HoloLens, which allows clinicians to overlay scans onto the patient and interact with the data in 3D. Similarly, Tradiebot developed an AR app for car mechanics that overlays the repair steps onto the physical car, then guides the mechanic through the repair. These innovations represent game-changing performance supports for certain professions.

Is the L&D Industry Adopting AR/VR Technology Right Now?

The 2019 Training Industry Report surveyed 240 U.S.-based education and training organizations. Here’s what they discovered about American AR/VR adoption:

15 percent of all organizations plan to invest in AR/VR technology.

1.6 percent of training is delivered with AR.

1.9 percent of training is delivered with VR.

23 percent of large companies use VR, and 11 percent use AR.

Less than 5 percent of small or mid-sized companies use VR, AR or AI.

As a whole, the industry is not seeing a rapid adoption of VR or AR. One widely used technology adoption model by sociologist Everett M. Rogers suggests five phases of adoption: innovators (2.5 percent), early adopters (13.5 percent), early majority (34 percent), late majority (34 percent) and laggards (16 percent). Currently, only innovators are using VR/AR.
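
The arithmetic behind that classification is simple: converting Rogers’ category shares into cumulative thresholds shows that adoption rates of 1.6 and 1.9 percent still fall inside the innovator band.

```python
# Cumulative adoption thresholds derived from Rogers' category shares.
THRESHOLDS = [("innovators", 2.5), ("early adopters", 16.0), ("early majority", 50.0),
              ("late majority", 84.0), ("laggards", 100.0)]


def adoption_phase(adoption_pct: float) -> str:
    for name, ceiling in THRESHOLDS:
        if adoption_pct <= ceiling:
            return name
    return "laggards"


print(adoption_phase(1.6))   # AR-delivered training -> 'innovators'
print(adoption_phase(1.9))   # VR-delivered training -> 'innovators'
```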

However, if we only look at large companies, then the adoption picture changes. They appear to be entering the early adoption phase with 23 percent of them using the new technology. As the cost of AR/VR continues to fall, I predict more companies will adopt it.

Where does VR training give business the biggest boost?

Virtual reality training comes out of the educational method called “simulated training.” The aviation industry began using simulated training as early as 1929. They’ve continued to use simulated pilot training because the cost of fueling an airplane is still greater than the cost of an expensive simulation.

Like the aviation industry, educational institutions have been quick to adopt VR. Many schools and colleges cannot afford expensive laboratories. Virtual science labs provide a way for students to gain valuable laboratory experience without investing in high-tech lab equipment or materials.

For some industries, simulations allow employees to experience dangerous situations without actually endangering them. Construction workers can make dangerous errors in a virtual environment. Similarly, law enforcement officers can de-escalate life-threatening situations or react to emergencies virtually. Unlike a textbook, the simulated experience forces trainees to grapple with their own fears and emotional responses. Then, they won’t be panicking in a real-life emergency.

VR also represents an opportunity to quickly train medical professionals on new instruments or complex, new procedures. They can practice first using virtual instruments before performing the procedure on a live patient. Today, up to 30 percent of general surgeons are not yet ready to work independently at the end of their residency. VR training might help fill the gap for new surgeons.

Finally, large companies have begun to use VR for less dangerous, expensive or life-threatening skills. However, these skills still benefit from life experience. Walmart has created a VR Black Friday simulator to prepare their retail employees for the shopping holiday. Other companies have started to use VR to onboard employees by allowing them to experience their first day via VR before actually starting their job roles to reduce anxiety. Some of these skills, such as soft skills training, can be bought ready-made off the shelf.

Truly, the sky is probably the limit for the applicability of simulated training. That’s why I’m betting VR will eventually be a standard part of training like videos are today.

How do companies use AR now?

AR, unlike VR, requires the real world. AR simply enhances the real world experience.

The best examples of AR in action are performance supports, or job aids. Traditionally, when employees seldom used a process, they would consult a laminated job aid. Today, with search navigation, we simply look it up. AR would take our walkthrough videos one step further by providing voice instructions and a virtual overlay to help guide us through the process.

What about processes employees do constantly? Can AR also improve them?

Research from the WHO on safe surgeries suggests using checklists improves surgical safety. AR could help perform safety checks in a variety of industries, such as general maintenance checks for machinery or safety awareness in warehouses.

AR also promises to engage learners during traditional coursework. Like Alexa or Google Home, learners could access more information to support personalized learning. They could also receive instant feedback by turning AR on to check their work. This feature could provide automated, scalable feedback to hands-on professionals in construction or manufacturing, where assessing hands-on projects without expending significant resources presents a huge challenge. Voice-enabled AR could also lead learners through a process, even something as simple as onboarding.

Ultimately, this technology promises to improve the user-experience.

What do we need going forward for widespread AR/VR adoption?

Currently, AR/VR training tools need to be custom built by a firm or bought off the shelf. To be truly effective, companies need content authoring tools. Right now, instructional designers use tools like Articulate Storyline and Adobe Captivate. In the future, they’ll need tools for AR/VR.

Virtual and augmented reality have not seen wide public buy-in. VR headsets and AR glasses remain toys. Wider public adoption will facilitate wider adoption in the training industry, too.

From a design perspective, the headsets may also need to become more comfortable so workers like doctors or mechanics can use them for hours at a time. Prices for VR headsets and AR glasses also remain high.

Since only 1 percent of small and mid-sized businesses invest in VR training today, I expect we’ll see greater investment by these companies when prices drop. Given the return on investment for VR training, Josh Bersin suggests businesses focus first on the skills and competencies driving their core business. When trying to determine business-critical operations, I suggest small and mid-sized businesses think about where they stand to lose money. For example, manufacturing or construction companies lose money when employee errors create product defects. VR training on how to make those products could create substantial gains. Similarly, closing more sales would generate more revenue, so it makes sense to invest in VR training for your sales team.

Should you invest soon?

As with most learning technology, the answer is: “It depends.” Technology never offers a silver bullet. It’s a tool for your L&D team.