
Leading Water Utility in Wales Turns to AR to Reduce Errors and Improve Service

Dŵr Cymru Welsh Water (DCWW) supplies drinking water and wastewater services to most of Wales and parts of western England. A public utility with 3,000 employees serving 1.4 million homes and businesses, DCWW maintains 26,500 km of water mains and 838 sewage treatment works. DCWW recently launched a pilot project to develop a mobile solution with AR capabilities to replace thousands of pages of operations and maintenance manuals. The AREA spoke recently with DCWW’s Gary Smith and Glen Peek to learn more about the solution, which they call the Interactive Work Operations Manual (IWOM).

AREA: What problem were you trying to solve with this solution?

DCWW: We needed to provide our operational teams with comprehensive information on how to operate and maintain our assets. Traditionally, that information was delivered in an operations and maintenance manual, which could run to thousands of printed pages. We wanted a solution that could deliver that complex information on a robust device, utilising sensing technology and augmented and virtual reality. We wanted something that could tell users their location on an asset, allow items of equipment to be interrogated via a readable code or tag, and provide information including running state and maintenance requirements. We felt a solution like that could help users make better-informed, more accurate decisions, reducing errors and risk and improving customer service.

AREA: What is the IWOM?

DCWW: The Interactive Work Operations Manual is a smart, tablet-based electronic manual that can be worn or held by users, delivering complex technical information (schematics, process flows, electrical drawings, maintenance records and service requirements) in a simple-to-use, intuitive device. The IWOM uses near field communication (NFC) and QR codes to identify equipment and present users with information about it. The IWOM is intrinsically safe, so it can be used in hazardous, outdoor, wet and dusty environments. An AR layer overlays process information and instructions, such as directional information for valves and switches or ideal operating ranges for gauges and dials. The information hovers in front of users without obstructing their view, minimising risk while enhancing what they see, helping users make the right decisions and record operating information.

The IWOM delivers this information in an interactive and visually attractive way. Site inductions include videos highlighting areas of high hazard and daily dynamic risks. Users sign on the device, and these records interface with SAP so that they can be retrieved on a site-by-site or per-user basis to help prove compliance.
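The NFC/QR identification described above amounts to resolving a scanned tag ID to an asset record. A minimal sketch, assuming a hypothetical tag-ID scheme, record fields and function name (none of this reflects DCWW's actual data model):

```python
# Sketch of tag-based equipment lookup (hypothetical data model).
# A scanned NFC/QR payload is treated as an asset tag ID and resolved
# to a record holding running state and maintenance information.

ASSET_REGISTER = {
    "PUMP-0042": {
        "name": "Raw water intake pump 2",
        "running_state": "duty",
        "next_service_due": "2018-04-15",
    },
    "VALVE-0117": {
        "name": "Inlet isolation valve",
        "running_state": "open",
        "next_service_due": "2018-06-01",
    },
}

def lookup_asset(tag_id: str) -> dict:
    """Return the equipment record for a scanned tag, or raise KeyError."""
    record = ASSET_REGISTER.get(tag_id)
    if record is None:
        raise KeyError(f"Unknown asset tag: {tag_id}")
    return record

if __name__ == "__main__":
    info = lookup_asset("PUMP-0042")
    print(f"{info['name']}: state={info['running_state']}, "
          f"service due {info['next_service_due']}")
```

A production system would back this with the central asset database and the SAP integration the interview mentions, rather than an in-memory dictionary.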

AREA: How is the IWOM integrated with “lean” operating principles?

DCWW: The device automates the delivery of lean rounds and prompts for user action at the required timeframes. Users undertake rounds and record findings, which are automatically and seamlessly synced with the central lean records. This removes the need for paper records and the multiple handling of the same data, saving time, driving efficiency and reducing the likelihood of operator error in data transfer.

AREA: How were you able to ensure user acceptance?

DCWW: Several ways. First, we engaged with users from the start. Our development team worked side by side with the operational teams to ensure they developed what users want. User feedback on the initial proof of concept has been excellent, and there is a real hunger now to widen the pilot scope and show what we can achieve with the smart application of the technology, driving efficiencies and increasing safety and reliability.

We also engaged with industry and emerging technology developers. The development team realised they were leading the way in the water and sewage sector in the application and use of this technology. The team have engaged with industrial and commercial sectors to share their ideas with universities and other groups, including the AREA, the British Computer Society, and the UK Wearable Technology Development Group.

During the later stages of pilot device development, additional teams were drawn into the assessment of beta test versions, and the ideas were showcased at the Works Management Board, Directors’ Briefings, team meetings and innovation groups to gauge acceptability and usability. Without this involvement, the programme would not have been a success.

Our development team is now developing a similar version of the IWOM for the Waste Water Treatment process, and a second pilot version is under development and will be delivered before the end of March 2018.

AREA: How long did it take to develop the IWOM?

DCWW: The development progressed from an idea generated and showcased in the spring of 2017 to a live working application in nine months. It is now installed on a ruggedized handheld tablet or wearable device, allowing users to access the content safely from any location, with the choice of hands-free or handheld operation.

AREA: Does the IWOM include a Digital Twin?

DCWW: Yes, the IWOM’s digital twin is a replica of our physical assets, processes and systems. In the pilot programme, we digitally mapped the inside and outside of the pilot site with 3D point-cloud laser scanning equipment, and these data points were used to create a digital model. It is possible to virtually walk or fly through this model, allowing users to view any area of the pilot site from the tablet or remotely. One significant advantage is that a user can view the building, structure or an item of plant from any location and direct an operator on site without being physically present.

AREA: How does this tie into the Internet of Things?

DCWW: The IWOM is a driving force in Welsh Water’s IoT strategy. We are working to connect all the devices in our environment that are embedded with electronics, sensors, and actuators so that they can exchange data. The IWOM leads the way in how the IoT is being developed in the water and sewage sector.

AREA: What makes the IWOM unique?

DCWW: The development team have been able to take the best of the existing paper documentation and merge it with cutting-edge technology at a very low unit cost to deliver an intuitive product to operators in the field. For example, site inductions and dynamic risk assessments are delivered interactively and are crucial to helping reduce risk and ensure employees and visitors return home safely at the end of their working day. The IWOM is also one of a very few industrial AR applications that is entirely self-contained in a handheld device.

AREA: What benefits has the IWOM delivered?

DCWW: The IWOM eliminates the need to reproduce complex operating manuals in paper format. Updates to the operational manuals are now delivered by a team of CAD technicians who transcribe the design data into a rendered 3D digital format, rather than stripping out detail to produce a simple 2D printed image. It’s much more efficient. Also, by eliminating paper manual updates, we’ve saved operators the trouble and errors associated with manually removing and replacing pages. The IWOM is updated centrally and an electronic copy is pushed to users, so they always have the most up-to-date version of the manual at the point they need it.

AREA: How easy is it to replicate the IWOM?

DCWW: The IWOM has not been resource intensive. Much of the development has been done in-house by enthusiastic, skilled amateurs, in the team’s own time in the evenings and at weekends. Welsh Water purchased a tablet computer to display the output and secured a “two for one” deal on the wearable device, so we have kept the total hardware costs down. A development partner was secured by tender to help us design the bespoke software, and they were so committed to the project that they contributed to the development costs. Companywide implementation would require a bespoke package to be developed for each site. We believe we can develop these in-house as our skill set grows, and rather than place the software package on a bespoke device at each site, we believe we can host it on our own servers. That would mean that the information on any asset with an IWOM would be available to any user with the right credentials and access.

AREA: How can our readers learn more about your work?

DCWW: Welsh Water is presenting at the April 2018 Welsh Innovation Event, the STEM event in February, Smart Water 2018, the Wearable Technology Conference, and at a hosted seminar for the British Computer Society in the autumn of this year.

AREA: What would you like to say to AREA members and the broader AR ecosystem?

DCWW: The IWOM development team would like to share their ideas with others. We want to continue to explore what other industries are doing in this area and share best practices. In particular, we would like to hear from other developers in the AR, IoT and haptic fields of expertise.

Gary Smith is Head of IMS and Asset Information, Glen Peek is WOMS Manager, and Ben Cale is the Data Analyst at Dŵr Cymru Welsh Water.




How AREA Research Projects are Furthering the Adoption of Enterprise AR

Research

With the recent publication of our Measuring AR ROI Calculator and Best Practices Report, we at the AREA are demonstrating our commitment to addressing obstacles that enterprises face when introducing and expanding their AR initiatives.

The report, calculator and case study are the output of the second in an ongoing series of research projects aimed at addressing the critical questions faced by enterprises seeking to launch and expand AR initiatives.

“The ROI report and the detailed case study prepared by Strategy Analytics for the AREA offer the most detailed explanations of the factors that must be considered when preparing a complete ROI analysis on AR and help to pinpoint where impacts will be greatest,” said Christine Perey, PEREY Research & Consulting, and the chair of the AREA Research Committee. “The calculator with instructions is the first tool of its kind and can be used immediately by business planners and AR project managers.”

Selected and funded directly by members, AREA research projects offer tangible value not only to enterprises developing their AR strategies, but also to AR solution providers and non-commercial members. The research results are available exclusively to members.

For more information, please email info@thearea.org.




Recapping the AREA/DMDII 2nd Enterprise AR Workshop

The Augmented Reality Enterprise Alliance (AREA) and the Digital Manufacturing and Design Innovation Institute (DMDII), a UI LABS collaboration, recently hosted the 2nd Enterprise AR Workshop at UI LABS in Chicago. The event drew over 110 attendees, ranging from enterprises that have purchased and are deploying AR solutions, to providers offering leading-edge AR solutions, to non-commercial organisations such as universities and government agencies.

“The goal of the workshop is to bring together practitioners of Enterprise AR to enable open and wide conversation on the state of the ecosystem and to identify and solve barriers to adoption,” commented Mark Sage, the Executive Director of the AREA.

Hosted at the excellent UI LABS facility and supported by AREA members, the workshop gave attendees two days of discussions, networking, and interactive sessions.


Introduction from the Event Sponsors

Sponsored by Boeing and Upskill, the workshop was kicked off by Paul Davies, Associate Technical Fellow at Boeing and the AREA President. His introduction focused on the status of the Enterprise AR ecosystem, highlighting the benefits gained from AR and some of the challenges that need to be addressed.

Summary of AR Benefits

Mr Davies added, “We at Boeing are pleased to be Gold sponsors of this workshop. It was great to listen to and interact with other companies who are working on AR solutions. The ability to discuss in detail the issues and potential solutions allows Boeing and the ecosystem to learn quickly.”

Developing the Enterprise AR Requirements Schema

The rest of the day focused on brainstorming and developing a set of use cases that the AREA will build on to create the AREA requirements / needs database and ultimately be added to the AREA Marketplace. The session was led by Glen Oliver, Research Engineer from AREA member Lockheed Martin, and Dr. Michael Rygol, Managing Director of Chrysalisforge.

The attendees were organized into 17 teams and presented with an AR use case (based on the use cases documented by the AREA). The teams were asked to add more detail to the use case and define a scenario (a definition of how to solve the business problems often containing a number of use cases and technologies).

The following example was provided:

  • A field service technician arrives at the site of an industrial generator. They use their portable device to connect to a live stream of IoT data from the generator, viewing a set of diagnostics and the service history of the generator.
  • Using the AR device and app, they are able to pinpoint the spatial location of the reported error code on the generator. The AR service app suggests a number of procedures to perform. One of the procedures requires a minor disassembly.
  • The technician is presented with a set of step-by-step instructions, each of which provides an in-context 3D display of the step.
  • With a subsequent procedure, there is an anomaly which neither the technician nor the app is able to diagnose. The technician makes an interactive call to a remote subject matter expert, who connects into the live session. Following a discussion, the SME annotates visual locations over the shared display, resulting in a successful repair.
  • The job requires approximately one hour to perform, meaning the portable device should function without interruption throughout the task.
  • With the job complete, the technician completes the digital paperwork and marks the job complete (which is duly stored in the on-line service record of the generator).
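Purely as an illustration, the scenario above can be modelled as a work order whose steps are ticked off as the technician progresses (the class and field names here are invented, not part of any AREA schema):

```python
# Illustrative model of the field-service scenario: a work order composed
# of use-case steps, each marked complete as the technician progresses.
from dataclasses import dataclass, field

@dataclass
class Step:
    description: str
    done: bool = False

@dataclass
class WorkOrder:
    asset_id: str
    steps: list = field(default_factory=list)

    def complete_step(self, index: int) -> None:
        self.steps[index].done = True

    def is_complete(self) -> bool:
        return all(s.done for s in self.steps)

order = WorkOrder(
    asset_id="GEN-001",
    steps=[
        Step("Connect to live IoT diagnostics stream"),
        Step("Locate reported error code spatially on the generator"),
        Step("Follow step-by-step disassembly instructions"),
        Step("Escalate anomaly to remote SME and apply annotated fix"),
        Step("File digital paperwork to the on-line service record"),
    ],
)

for i in range(len(order.steps)):
    order.complete_step(i)
print(order.is_complete())  # True once every step is marked done
```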


The tables were buzzing with debate and discussion, producing a lot of excellent output. The use of a maturity model to highlight changes in scenarios proved a very useful tool. At the end of the session, the table leaders were asked to present feedback on how useful the conversations had been.

Technology Showcase and Networking Session

The day ended with a networking session where a number of companies provided demos of their solutions.

Day 2: Focus on Barriers to AR Adoption

The second day of the workshop started with an insightful talk from Jay Kim, Chief Strategy Officer at Upskill (the event’s Silver sponsor), who outlined the benefits of Enterprise AR and how to avoid “pilot purgatory” (i.e., the continual cycle of delivering pilots with limited industrialisation of the solution).

Next, Lars Bergstrom of Mozilla Research provided a look into how enterprises will soon be able to deliver AR experiences to any AR device via a web browser. Attendees found the session a very interesting introduction to the potential of WebAR and how it might benefit their organisations.

Barriers to Enterprise AR Adoption – Safety and Security

The next two sessions generated discussion and debate on two of the key barriers to adoption of Enterprise AR, each expertly moderated by the AREA committee chairs:

  • Security – Tony Hodgson, Bob Labelle and Frank Cohee of Brainwaive LLC
  • Safety – Dr. Brian Laughlin, Technical Fellow at Boeing

Both sessions provided an overview of the potential issues for enterprises deploying AR and providers building AR solutions. Again, many attendees offered contributions on the issues, possible solutions and best practices in these fields.

The AREA will document the feedback and share the content with the attendees, as well as using it to help inform the AREA committees dedicated to providing insight, research and solutions to these barriers.

Barriers to Enterprise AR Adoption – Change Management

Everyone was brought back together to participate in a panel session focusing on change management, both from an organisation and human perspective.

Chaired by Mark Sage, the panel included thought leaders and practitioners:

  • Paul Davies – Associate Technical Fellow at Boeing
  • Mimi Hsu – Corporate Digital Manufacturing lead at Lockheed Martin
  • Beth Scicchitano – Project Manager for the AR Team at Newport News Shipbuilding
  • Jay Kim – Chief Strategy Officer at Upskill
  • Carl Byers – Chief Strategy Officer at Contextere

After a short introduction, the questions focused on whether AR should be a topic discussed at the CEO level or by IT and innovation teams. After insightful comments from the panel, the audience was asked to provide their input.

Questions then focused on how to convince the workforce to embrace AR. Boeing, Newport News Shipbuilding and Lockheed Martin provided practical and useful examples.

There followed a range of questions from the audience with the panel members offering their experiences in how their organisations have been able to overcome some of the change management challenges when implementing AR solutions.

Final Thoughts

The general feedback on the two days was excellent. The ability to share, debate and discuss the potential and challenges of Enterprise AR was useful for all attendees.

The AREA is the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of Enterprise AR. It supports the growth of a comprehensive ecosystem, helps its members develop thought-leadership content, reduces barriers to adoption, and runs workshops to help enterprises effectively implement Augmented Reality technology for long-term benefit.

The AREA will continue to work with the Digital Manufacturing and Design Innovation Institute (DMDII), where innovative manufacturers go to forge their futures. In partnership with UI LABS and the Department of Defense, DMDII equips U.S. factories with the digital tools and expertise they need to build every part better than the last. As a result, more than 300 partners increase their productivity and win more business.

If you are interested in AREA membership, please contact Mark Sage, Executive Director.

To inquire about DMDII membership, please contact Liz Stuck, Director of Membership Engagement.




Take the AREA 2018 Enterprise AR Ecosystem Survey Now

The Augmented Reality (AR) marketplace is evolving so rapidly, it’s a challenge to gauge the current state of market education, enterprise adoption, provider investment, and more. What are the greatest barriers to growth? How quickly are companies taking pilots into production? Where should the industry be focusing its efforts?

To answer these and other questions and better measure trends and momentum, we at the AREA are pleased to launch our second annual ecosystem survey.

The survey takes only five minutes to complete. Submissions will be accepted through February 16, 2018. We’ll compile the responses and share the results as soon as they’re available.

Take the survey here.

Make sure your thoughts and observations are captured so our survey will be as comprehensive and meaningful as possible. Thank you!




Insights on Enterprise AR from CES 2018

2017 was the year that Augmented Reality emerged from the trough of disillusionment. Enterprise AR, with nuts-and-bolts use cases and revenue, became the fastest-growing category within the AR/VR universe. According to ARtillry Intelligence, hardware and software spending in 2017 was $3 billion, more than triple that of 2016.

Some of the most compelling Enterprise AR products and business strategies were on display at CES 2018. The world’s largest consumer electronics convention was an excellent opportunity for companies to exhibit in the public eye. At the forefront this year was the convergence of trends enabling the next stage of AR: hardware development and miniaturization, user-centric design, and business model innovation.

Enterprise AR’s primary focus is using visual data to increase the capabilities of workers. Of the five senses, sight is estimated to account for 83% of the information processed by the brain. The value proposition for augmenting visual information is real: workers are 30% more productive with AR information delivered in context, according to Jim Heppelmann and Mike Campbell of PTC. Hence most applications gaining traction at the moment revolve around the delivery or production of visual information.

CES 2018 also revealed some of the gaps that need to be filled for the AR movement to accelerate. First, “the world is seriously devoid of AR talent,” as Jim Heppelmann noted. Second, the nature of spatially based visuals requires complex, high-resolution objects to be delivered to the user. These are generally too large and dynamic to be contained within static apps on a local client, and thus need to be streamed live over the web. The developer community needs to establish protocols for real-time AR asset streams, as it has done for WebVR in the past.

Wearable displays present a different paradigm for interaction and control; a killer app may still be missing its killer interaction method. Currently the most prevalent input methods are voice, swiping, and RGB and IR camera-based gesture recognition. These fall short in adverse physical environments and when performing complex tasks such as web navigation and email. One possibility is leveraging micro-movements as input, in the same way game controllers respond to millimeter actions: small actions allow for high ergonomic efficiency and bandwidth. This type of work is being pursued by Pison and other human-computer interface firms. Other firms are experimenting with multimodal combinations of brain, eye, voice, and bioactivity signals to enable context awareness.

At CES 2018, the following companies demonstrated compelling lessons for how to find an edge in a rapidly ascendant industry:

Realwear – dominate with a differentiated product

Realwear’s industrial headset has high-quality voice recognition and works in environments with over 95 dB of noise. The headset performed well on the loud CES conference floor with no false positives, even with whispered commands. The basic list of voice commands is processed on-board for smoother operation compared to internet-dependent engines like Alexa. The HMT-1 launched in October 2017 and has seen such rapid uptake that it is on track to be one of the top three AR headsets in use in 2018. Over 200 customers and 75 solution partners are already using the HMT-1.
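On-board command matching of the kind described can be approximated with a fixed vocabulary and fuzzy string matching. This is a hedged sketch only, not Realwear's actual implementation, and the command list is invented:

```python
# Sketch: match recognized speech against a small, fixed command vocabulary
# entirely on-device, using fuzzy string matching to tolerate noisy input.
import difflib

COMMANDS = ["navigate home", "open document", "take photo",
            "start remote mentor", "scroll down"]

def match_command(heard: str, cutoff: float = 0.6):
    """Return the closest known command, or None if nothing is close enough."""
    hits = difflib.get_close_matches(heard.lower(), COMMANDS, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(match_command("take foto"))       # "take photo"
print(match_command("open documents"))  # "open document"
```

A closed vocabulary like this is one reason on-board recognition can avoid false positives: anything below the similarity cutoff is simply rejected rather than guessed at.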

In the crowded field of headset companies, Realwear has been able to achieve quick adoption and growth by catering to a specific user base. The company primarily serves rugged industries where using hands to do work is critically important. Matt Firlik, Head of Marketing & Business Operations, says, “The number one application is remote mentor, which gives field workers access to experts located on the other side of the site, or other side of the world.” With that use case, remote experts can annotate what users see on their micro-display and coach them through complex maintenance or assembly procedures. Other use cases enable users to complete work orders, view documents like complex schematics, and engage with IoT data. According to Firlik, “the HMT-1 gives workers in the field a voice that keeps them connected to their colleagues, the back office, and the work they have to do since they will never have to pick up a tablet or clipboard to do their job again.”

The company’s success is a testament to the power of product differentiation and strong focus. Realwear’s technology bets on on-board voice as a competitive advantage, with no dependence on technologies utilized by other headsets such as head tracking, swiping, hand grasping, and cloud-processed voice. Even so, voice recognition as a whole faces a difficult journey to becoming a robust, standalone modality. A significant portion of users find the experience frustrating, especially older workers or those with thick accents. The challenge for Realwear will be to expand rapidly and become deeply entrenched in enterprise workflows before competitors are able to catch up in voice recognition quality. (Realwear is an AREA member and participates in many of the committees seeking to reduce barriers to AR adoption.)

Augmen.tv – how AR can leverage existing huge markets

Augmen.tv is the first camera-vision and AR streaming app for TV augmentation. Content on the TV is detected and synchronized to the millisecond. Key to the user experience is the multitude of interactive AR content that extends and enriches viewing. The comprehensive demo included characters jumping out of the scene, sports players and statistics displayed around the TV, and placing the viewer in an immersive 360-degree scene.

The company stands to build upon a successful debut on European TV and test in the US on preprogrammed as well as live shows. The app was number one in the App Store in Germany with nearly a million downloads. Users were incredibly driven to experience the tech despite the app having a massive download size; the launch iOS version was 1.1 GB! The next generation will offload content to cloud and edge servers for lighter storage on user devices. The ability to call these assets in real-time will be a major technological innovation for the entire AR industry.

The business pathway for Augmen.tv could be akin to that of Amazon Web Services (AWS). Amazon built AWS out of necessity to scale internal computing capability up and down throughout the year. The excess capacity during down times presented an opportunity to sell processing power to enterprises as a standalone offering. The challenge for a young company like Augmen.tv is to manage content creation while building first-class camera vision and asset streaming capabilities. If the company achieves the balance of being both a media and tech company, then it stands to benefit from two huge markets.

Proglove – the simplest products can be highly lucrative

Proglove is a company that surprised in both its technological simplicity and its rate of success. The entire product is a simple barcode scanner worn on the back of a partial glove with an in-palm trigger. The scanner is used mainly in car manufacturing plants and package shipping warehouses. Some use cases pair it with smart glasses for assistive reality. After just two years, Proglove is already being used in every European BMW factory. The wearable, available as a complete system for $3,000, shaves three seconds off every task. At car plants where each worker performs one task thousands of times a day, 300 days a year, the ROI is highly significant.
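That ROI claim is easy to make concrete. Assuming, for illustration only, 1,000 tasks per worker per day and a $30/hour loaded labor rate (both are our assumptions; the three-second saving, 300 working days, and $3,000 system cost come from the figures above):

```python
# Back-of-envelope ROI for the wearable scanner, mixing the article's
# figures with two assumed inputs (tasks/day and labor rate).
SECONDS_SAVED_PER_TASK = 3      # from the article
TASKS_PER_DAY = 1_000           # assumption for illustration
WORKING_DAYS_PER_YEAR = 300     # from the article
LABOR_RATE_PER_HOUR = 30.0      # assumption for illustration
SYSTEM_COST = 3_000.0           # from the article

hours_saved_per_year = (SECONDS_SAVED_PER_TASK * TASKS_PER_DAY
                        * WORKING_DAYS_PER_YEAR) / 3600
annual_saving = hours_saved_per_year * LABOR_RATE_PER_HOUR
payback_months = SYSTEM_COST / annual_saving * 12

print(f"{hours_saved_per_year:.0f} hours saved per year")  # 250 hours
print(f"${annual_saving:,.0f} saved per year")             # $7,500
print(f"Payback in {payback_months:.1f} months")           # 4.8 months
```

Under these assumptions a single unit pays for itself in under half a year, which is consistent with the article's "highly significant" characterization.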

The minimalist functionality of the product was the result of paying attention to customer feedback. Proglove originally sought to develop a glove with a range of features including RFID, bending sensors, motion tracking, and displays. “We found out most of the customers would be happy with a bar code scanner. That was our MVP. Simple to use. In industry, they need time to adopt. If you have something really radical, then it might kill you as a startup until you see first revenue,” explains founder Paul Gunther. It is impressive how Proglove found a way to charge a high price for ubiquitous technology. The challenge for them will be to avoid competition leading to pricing pressures. One tactic to mitigate this is to become the official supplier for clients as the company has done with BMW.

Mira – holistic design is key

Mira has developed a technologically simple yet extremely well-designed iPhone-based headset. It is highly comfortable for the user and also very socially friendly due to its transparent and open display. Others can easily see the user’s face as well as the contents being displayed. Seeing the headset in action from the outside conveys a feeling of curiosity versus enclosed VR and AR headsets. Reinforcing this feeling is the lighthearted nature of the company’s demos, which are focused on entertainment and social collaboration games. The content is app-based and allows experiences to be shared simultaneously on both the headsets and phones.

The most critical aspect of Mira’s innovation may be the design of the holistic user experience. By presenting easy-to-use, full-experience tech in a non-geeky manner, Mira has created a beautiful product that could greatly accelerate the wide scale adoption of AR. It is easy to imagine the next social AR hit like Pokemon Go being played on a Mira headset. The challenge for a company with a technologically minimalist product like theirs is to build a competitive moat around the full stack ecosystem and software environment and find ways to enforce use of their headset versus the eventual knockoffs.

It is important for the Enterprise AR community to recognize the collective value in developing and validating solutions for AR’s current shortcomings. In an industry experiencing triple-digit growth each year, there is impetus to join the rising tide or risk being left behind. As Jay Samit, Independent Vice Chairman at Deloitte, said, “In 2018 you will see a bifurcation of businesses that embrace AR and those that cease to exist.” The companies that actively learn from shared resources and membership organizations stand to gain the most from the AR movement.

___

Dexter Ang is CEO of Pison, a company building the future of human computer interaction and bioelectric control. The company develops full-stack wearable technology for intuitive and powerful gesture control of augmented reality smart glasses. Pison has developed and patented electroneurography, or ENG, the first process for sensing peripheral nerve firings on the surface of the skin. Vertically-integrated solutions combine hardware, software, machine learning, and UI for AR industries. Investors and partners include Oculus, MIT, Draper, National Science Foundation, and HHS.




CES 2018 Recap: Atheer on the Flex AR Reference Design

One of the highlights of CES 2018 earlier this month was the introduction of an enterprise AR reference design from Flex. We spoke about it recently with Geof Wheelwright, director of marketing communications for Atheer, AREA member and a partner in the Flex announcement.

AREA: What is the purpose of the Flex enterprise AR reference design unveiled at CES?

WHEELWRIGHT: The purpose of the Flex AR reference design is to reduce time to market for companies making AR devices for enterprise and consumer applications. It includes a complete product specification, including a head-mounted display (HMD), an external processing unit (EPU), a gesture-based software platform (developed with Atheer) to manage interaction, and pre-installed Enterprise AR software. By customizing the rugged, stable and high-quality Flex AR reference design versus developing their own AR hardware, companies can significantly reduce product development costs and quickly scale manufacturing.

AREA: What is the significance of this announcement to the enterprise AR market?

WHEELWRIGHT: The significance of this announcement is that it provides a new standard for AR hardware and interaction, and a very real path to a much broader range of participants in the enterprise AR hardware market. It also goes beyond a mere hardware specification by including an interaction model that is multi-modal (i.e., it supports head motion, voice control and gestures) and a 30-day trial of Atheer AiR™ Enterprise. That means customers can immediately start using remote expert collaboration (“see what I see”) and can author and deliver workflows and step-by-step task guidance for their unique needs. In addition, Flex will provide a full software development kit (SDK) to customers who are building on Android Nougat. The sum of all those parts means that OEMs have access to an AR offering that can provide real value to enterprise customers right out of the box.

Flex-designed augmented reality headset and belt pack reference design (PRNewsfoto/Flex)

AREA: Can you give us an example of how the reference design reduces time to market?

WHEELWRIGHT: A typical hardware development cycle would involve bringing together a number of key standardized components (including operating system, processor, specialized hardware) around a particular design for a particular purpose. Hardware designers would then build and test prototypes, refine those prototypes (and then retest them as they add new components), field-test and debug the prototype. They would then have to figure out how they would manufacture the device. And all of that is before you run a single piece of third-party software on your new device.

Manufacturers using the Flex AR reference design get the advantage of a pre-designed system that is already tested and already works – cutting out a lot of the time typically involved in new hardware development. It includes cutting-edge technology from partners, including the Snapdragon 835 mobile platform from Qualcomm, designed to deliver full-color, 1080p augmented reality experiences. The Snapdragon 835 draws 25 percent less power than previous models, using an advanced 10-nanometer design.

AREA: What is Atheer’s role in the reference design?

WHEELWRIGHT: Atheer came to this project with unique experience in having designed our own smart glasses (the well-received Atheer AiR glasses) and was able to bring that to bear on helping Flex create the Flex AR reference design. Specifically, Atheer contributed our standardized multi-modal interaction model. “We know the challenge of designing a cutting-edge platform that can be mass produced,” said Soulaiman Itani, Chief Executive Officer and founder of Atheer, in his comments on the Flex announcement. “Through our work with Flex, we’ve seen their capabilities, and we’re pleased to help provide a UI system that supports gestures, voice, head motion and Bluetooth wearables for hands-free operation. We are looking forward to Flex enterprise customers being able to experience the out-of-the-box Augmented Reality tools in Atheer’s AiR Enterprise™ productivity solution for augmented reality.”

AREA: Why has Atheer partnered with Flex?

WHEELWRIGHT: Flex has the global reach, experience and respect in the electronics hardware manufacturing industry to help make our interaction model an industry standard – and bring enterprise users the real and immediate safety and productivity benefits of our flagship Atheer AiR™ Enterprise software.

AREA: Does this represent a change or an evolution in the Atheer business strategy?

WHEELWRIGHT: It represents an evolution. In 2012, Atheer was founded on a belief that AR technology could make a significant and measurable difference in how workers at industrial enterprises do their work. In the company’s initial stages, the Atheer team explored the ideal hardware needed to create impactful enterprise AR applications. It also affirmed the idea that, in order to be really useful, AR hardware would need to be based on popular, well-supported mobile operating system platforms (starting with Android).

That work led initially to the development of Atheer AiR Glasses, which later became the foundation for a reference design platform called AiR Experience that Atheer now sells (combined with a multi-modal interaction platform and access to Atheer’s partner engineering team) and is a key element of the work with Flex. The company now offers AiR Experience alongside its flagship Atheer AiR™ Enterprise software, which provides real and immediate benefit for customers such as Porsche Cars North America, Inc. (PCNA). PCNA announced late last year the introduction of “Tech Live Look,” an AR technology designed to improve technical services at Porsche dealerships in the United States. “Tech Live Look” uses AiR Enterprise™ in conjunction with lightweight smart glasses.

AREA: Can we expect other similar partnerships to be announced in the near future?

WHEELWRIGHT: We are continually evaluating other partnership opportunities to help grow the market for AR solutions in the enterprise that leverage our experience and help bolster the development of key interaction standards for the AR industry.

AREA: How will this and other partnerships accelerate the adoption of AR in the enterprise?

WHEELWRIGHT: Enterprises want measurable value, power, interaction standards that make sense – as well as proven enterprise-grade applications using hardware from manufacturers they trust on operating systems they know. Our platform delivers all of those elements and helps to significantly lower barriers to adoption in a way that should move customers from limited, line of business-driven “proof of concept” lab trials to serious IT-supported evaluations that can be rolled out broadly throughout an enterprise.




AR finds a home in the Enterprise – Mobile World Live with AREA’s Mark Sage

AREA’s Executive Director Mark Sage was asked to comment for a blog piece on Mobile World Live about augmented reality finding a home in the enterprise. The article is certainly worth reading in full. Mark’s comments on enterprise AR are summarised below, though he is quoted throughout the article, including his view that the technology is not just about wearables. The piece also covers the mixing of AR and VR, along with research from Deloitte and Forrester.

The enterprise opportunity
Technology research company ARtillry Insights (a division of the VR/AR Association) estimated in a report that the enterprise AR market will hit $47.7 billion in 2021, up from $829 million in 2016. Highlighting a stark contrast, the study estimated a return of $15.8 billion by 2021 in consumer AR, up from $975 million in 2016.

AREA, an organisation which claims to be the only global non-profit alliance dedicated to accelerating the adoption of AR in the enterprise, is equally confident the long-term business benefits will outstrip the consumer case.

“Enterprise AR has clear and long-term RoI benefits based on improvements and efficiency gains,” Mark Sage, executive director of AREA, told Mobile World Live. “While consumer AR will provide benefits and help educate people on its use, the potential scope and benefits available in the enterprise space will bring much greater returns.”

And, if Google’s early success is anything to go by, ARtillry Insights’ lofty projections and AREA’s own agenda could well prove on the money.

Google’s enterprise edition smart glasses, launched in July 2017, are now deployed on numerous factory floors across the world, with the company already boasting some big-name partners in DHL, GE and Volkswagen.

The device is designed to allow factory workers, for example, to “stay hands-on” by removing surrounding distractions, providing access to training videos which include images accompanied by instructions, as well as allowing fellow glasses wearers on the work floor to connect, collaborate and troubleshoot in real time. More importantly, the revamped Google Glass has a very real place in the enterprise space.

 




Behind the UK’s £33 Million Investment in AR/VR

When the British government published its Industrial Strategy White Paper last November, one of the report’s major announcements was a £33m investment in a challenge designed to “bring creative businesses, researchers and technologists together to create striking new experiences that are accessible to the general public” using immersive technologies, such as AR and VR. The goal is to “create the next generation of products, services and experiences that will capture the world’s attention and position the UK as the global leader in immersive technologies.”

One of the people guiding the effort is Tom Fiddian, Innovation Lead at Innovate UK, with whom the AREA spoke recently. Innovate UK is the UK’s innovation agency, a non-governmental public body that seeks to “drive productivity and growth by supporting businesses to realize the potential of new technologies, develop ideas and make them a commercial success.”

“The general fund is looking at challenges that can be solved by innovation,” explained Fiddian. “My job is to look after the creative industries: publishing, art, culture, film, and music. Not often do we have such an opportunity, where an emerging technology is going to disrupt the market across so many different creative sectors.”

The government investment is being allocated to several areas.

“The vast majority of the money will be available for businesses to apply for under different headings, running large-scale demonstrations in the creative industries,” said Fiddian. “We’re also looking at lowering the cost of creating content to help grow the market.”

While the £33 million is earmarked for the creative industries rather than enterprise AR, the fact that the UK government is investing so significantly in AR and VR is a testament to how important the nation’s leaders believe these technologies are to Britain’s future economic position in the world.

“This is all about expanding the general AR/VR market,” said Fiddian. “I have no doubt that, even though we are focused on the creative industries, the overspill of new technologies and new methodologies will benefit the enterprise AR market, as well.”

Proposals and business cases will be evaluated over the next several months. Fiddian expects specific announcements of projects being funded will be coming in April of this year.

Tom Fiddian noted that while Innovate UK has sponsored other projects that were more broadly focused on immersive technologies for the enterprise, this new challenge is capturing significant interest.

“With the size of the investment, it’s definitely putting AR and VR on the map,” he said.




NVIDIA to watch during CES

AREA member Nvidia featured multiple times in an article on the AFR.com Financial Review website, published during CES (the Consumer Electronics Show) in Las Vegas. Here’s a look at what analysts say investors will be watching for at CES from Nvidia and others:

  • Automotive: stocks most likely to respond to CES-related events should be Ambarella and Nvidia, Morgan Stanley analysts led by Joseph Moore and Craig Hettenbach said.
  • Nvidia is “at the centre” of many of the key innovations in consumer electronics.
  • While Nvidia will probably remain at the top of investors’ minds when thinking about AI, Intel will probably feature recent advances in the field at the event.
  • Look for updates from Nvidia regarding progress of the Drive PX Pegasus platform, clarity on how the company plans to cut operating temperature and power consumption, and new partnerships and potential end-customers, KeyBanc analysts said.
  • Updates on Intel’s long-term approach to the AR/VR ecosystem are of interest given the company’s recent move to wind down its headset reference design, MKM’s Roy said.
  • Nvidia is also expected to highlight its emerging AR/VR technologies.
  • Companies will probably showcase mobile-based augmented reality applications, Bloomberg Intelligence analysts Jitendra Waral and Sean Handrahan said.
  • AR hardware prototypes are bound to be shown by larger companies and start-ups, but the hardware may still be years away, as a supply chain and standards are still missing. Until then, mobile AR will be at the forefront, and CES may preview some of the ways companies leverage AR to differentiate their products.

 




Gartner’s Top 10 Strategic Technology Trends for 2018

Here’s what Gartner has to say about the immersive experience trend:

Augmented Reality (AR), virtual reality (VR) and mixed reality are changing the way that people perceive and interact with the digital world. Combined with conversational platforms, a fundamental shift in the user experience to an invisible and immersive experience will emerge. Application vendors, system software vendors and development platform vendors will all compete to deliver this model.

Over the next five years the focus will be on mixed reality, which is emerging as the immersive experience of choice, where the user interacts with digital and real-world objects while maintaining a presence in the physical world. Mixed reality exists along a spectrum and includes head-mounted displays (HMDs) for AR or VR, as well as smartphone- and tablet-based AR. Given the ubiquity of mobile devices, Apple’s release of ARKit and the iPhone X, Google’s Tango and ARCore, and the availability of cross-platform AR software development kits such as Wikitude, we expect the battles for smartphone-based AR and MR to heat up in 2018.

Other technologies predicted to be trending in 2018 are: AI foundation, intelligent apps and analytics, intelligent things, digital twins, cloud to the edge, conversational platforms, blockchain, event-driven and continuous adaptive risk and trust.

Their full report can be read here.