
The AREA Issues Call for Proposals for an AR Research Project

The AREA has issued a request for proposals for a funded research project that its members will use to better understand the data security risks associated with wearable enterprise AR and approaches to mitigating them.

Organizations with expertise in data security risks and mitigation and adjacent topics are invited to respond to the request by January 30, 2017.

The goals of the AREA-directed research project are:

  • To clarify questions about enterprise data security risks when introducing enterprise AR using wearables
  • To define and perform preliminary validation of protocols that companies can use to conduct tests and assess risks to data security when introducing wearable enterprise AR systems

The research project will produce:

  • An AREA-branded in-depth report that details the types of data security risks that may concern IT managers responsible for AR delivery devices and assets; classifies known and potential threats to data security by severity; and proposes risk mitigation measures
  • An AREA-branded protocol for testing wearable enterprise AR devices for their hackability or data exposure threat levels
  • An AREA-branded report documenting the use of the proposed protocol to test devices for their security exposure threat levels

All proposals will be evaluated by the AREA research committee co-chairs on the following criteria:

  • Demonstrated knowledge and use of industry best practices for research methodology
  • Clear qualifications of research organization and any partners in the domain of data security threats and mitigation, and AR, if possible
  • Review of prior research reports and testing protocol samples
  • Feedback from references

The AREA will provide detailed replies to submitters on or before February 13, 2017. The research project is expected to be completed, with finished deliverables, by May 1, 2017.

Full information on the request for proposals, including a submission form, can be found here.

 




GE’s Sam Murley Scopes Out the State of AR and What’s Next

General Electric (GE) has made a major commitment to Augmented Reality. The industrial giant recently announced that it plans to roll out AR in three business divisions in 2017 to help workers assemble complex machinery components. In his role leading Innovation and Digital Acceleration for Environmental Health & Safety at General Electric, Sam Murley is charged with “leading, generating and executing digital innovation projects to disrupt and streamline operations across all of GE’s business units.” To that end, Sam Murley evangelizes and deploys immersive technologies and digital tools, including Augmented Reality, Virtual Reality, Artificial Intelligence, Unmanned Aerial Vehicles, Natural Language Processing, and Machine Learning.

As the first in a series of interviews with AREA members and other ecosystem influencers, we recently spoke with Sam to get his thoughts on the state of AR, its adoption at GE, and his advice for AR novices.

AREA: How would you describe the opportunity for Augmented Reality in 2017?

SAM MURLEY: I think it’s huge — almost unprecedented — and I believe the tipping point will happen sometime this year. This tipping point has been primed over the past 12 to 18 months with large investments in new startups, successful pilots in the enterprise, and increasing business opportunities for providers and integrators of Augmented Reality.

During this time, we have witnessed proven implementations: small-scale pilots, larger-scale pilots, and companies rolling out AR in production. We should expect this to continue in 2017. You can also expect continued growth of assisted reality devices that scale to industrial use cases such as manufacturing and field services, as well as new adoption of spatially aware, consumer-focused mixed reality and augmented reality devices for automotive, retail, gaming, and education use cases. We’ll see new software providers emerge, existing companies take the lead, key improvements in smart eyewear optics and usability, and, probably, a few strategic partnerships.

AREA: Since it is going to be, in your estimation, a big year, a lot of things have to fall into place. What do you think are the greatest challenges for the Augmented Reality industry in 2017?

SAM MURLEY: While it’s getting better, one challenge is interoperability: moving from proprietary, closed systems to connected systems and open frameworks. This is really important. All players, big, medium, and small, need to work toward creating a connected AR ecosystem and democratizing the authoring and analytical tools around their technology. A tool I really like and promote is Unity3D, as it has quickly become the standard for AR/VR development and the environment for deploying AR applications to dozens of different operating systems and devices.

It’s also important that we find more efficient ways to connect to existing 3D assets that are readily available, but too heavy to use organically for AR experiences. CAD files that are in the millions of polygons need some finessing before they can be imported and deployed as an Augmented Reality object or hologram. Today, a lot of texturing and reconstruction has to be performed to keep the visual integrity intact without losing the engineering accuracy. Hopefully companies such as Vuforia (an AREA member) will continue to improve this pipeline.
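The “finessing” described here is essentially mesh simplification. As a purely illustrative sketch (not the pipeline GE or Vuforia actually uses), vertex-clustering decimation reduces polygon count by snapping nearby vertices to a coarse grid and dropping the triangles that collapse:

```python
from collections import defaultdict

def decimate_by_clustering(vertices, triangles, cell_size):
    """Collapse vertices that fall in the same grid cell, then drop
    degenerate triangles. A crude stand-in for the CAD-to-AR
    simplification step; real pipelines preserve far more detail."""
    # Map each vertex to the grid cell it falls in.
    cell_of = [tuple(int(c // cell_size) for c in v) for v in vertices]

    # Group vertex indices by occupied cell.
    cells = defaultdict(list)
    for i, cell in enumerate(cell_of):
        cells[cell].append(i)

    # One representative (averaged) vertex per cell.
    new_index, new_vertices = {}, []
    for cell, members in cells.items():
        avg = tuple(sum(vertices[i][k] for i in members) / len(members)
                    for k in range(3))
        new_index[cell] = len(new_vertices)
        new_vertices.append(avg)

    # Re-index triangles; discard those collapsed to a line or point.
    new_triangles = []
    for a, b, c in triangles:
        ia, ib, ic = (new_index[cell_of[a]], new_index[cell_of[b]],
                      new_index[cell_of[c]])
        if len({ia, ib, ic}) == 3:
            new_triangles.append((ia, ib, ic))
    return new_vertices, new_triangles
```

Production tools use error-aware methods (e.g., quadric decimation) plus retexturing to keep visual integrity, which is exactly the pipeline improvement discussed above.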

For practical and wide-scale deployment in an enterprise like GE, smart glasses need to be intrinsically safe, safety rated, and out-of-the box ready for outdoor use. Programmatically, IT admins and deployment teams need the ability to manage smart glasses as they would any other employee asset such as a computer or work phone.

AREA: GE seems to have been a more vocal, public proponent of Augmented Reality than a lot of other companies. With that level of commitment, what do you hope to have accomplished with Augmented Reality at GE within the next year? Are there certain goals that you’ve set or milestones you hope to achieve?

SAM MURLEY: Definitely. Within GE Corporate Environmental Health & Safety we have plans to scale AR pilots that have proven to be valuable to a broader user base and eventually into production.

Jeff Immelt, our Chairman and CEO, in a recent interview with Microsoft’s CEO Satya Nadella, talked specifically about the use of Microsoft HoloLens in the enterprise. He put it perfectly, “If we can increase productivity by one percent across the board, that’s a no brainer.” It’s all about scaling to increase productivity, scaling to reduce injuries, and scaling based on user feedback. In 2017, we will continue to transform our legacy processes and create new opportunities using AR to improve worker performance and increase safety.

AREA: Do you have visibility into all the different AR pilots or programs that are going on at GE?

SAM MURLEY: We’re actively investigating Augmented Reality and other sister technologies, in partnership with our ecosystem partners and the GE Businesses. Look, everyone knows GE has a huge global footprint and part of the reward is finding and working with other GE teams such as GE Digital, our Global Research Centers, and EHS Leaders in the business units where AR goals align with operational goals and GE’s Digital Industrial strategy.

At the 2016 GE Minds + Machines conference, our Vice President of GE Software Research, Colin Parris, showed how the Microsoft HoloLens could help the company “talk” to machines and service malfunctioning equipment. It was a perfect example of how Augmented Reality will change the future of work, giving our customers the ability to talk directly to a Digital Twin, a virtual model of a physical asset, ask it questions about recent performance, anomalies, and potential issues, and receive answers back in natural language. We will see Digital Twins of many assets, from jet engines to compressors. Digital Twins are powerful: they allow you to tweak and change aspects of your asset to see how it will perform before deploying it in the field. GE’s Predix, the operating system for the industrial Internet, makes this cutting-edge methodology possible. “What you saw was an example of the human mind working with the mind of a machine,” said Parris. With Augmented Reality, we are able to empower the workforce with tools that increase productivity, reduce downtime, and tap into the Digital Thread and Predix. With Artificial Intelligence and Machine Learning, Augmented Reality quickly makes language the next interface between the Connected Workforce and the Internet of Things (IoT). No keyboard or screen needed.

However, we aren’t completely removing mobile devices and tablets from the AR equation in the short term. Smart glasses still have some growing and maturing to do. From a hardware adoption perspective, smart glasses are very new – it’s a new interface, a new form factor and the workforce is accustomed to phones, tablets, and touch screen devices. Mobile and tablet devices are widely deployed in enterprise organizations already, so part of our strategy is to deploy smart eyewear only when absolutely needed or required and piggyback on existing hardware when we can for our AR projects.

So, there is a lot going on and a lot of interest in developing and deploying AR projects in 2017 and beyond.

AREA: A big part of your job is navigating that process of turning a cool idea into a viable business model. That’s been a challenge in the AR world because of the difficulty of measuring ROI in such a new field. How have you navigated that at GE?

SAM MURLEY: That’s a good question. To start, we always talk about and promote the hands-free aspects of using AR with smart glasses to access and create information. More generally, though, AR is a productivity driver. Consider a half-hour operation or maintenance task out in the field. Today a worker may have to stop what they’re doing, go back to their work vehicle, search for the right manual, find the schematic only to realize it’s out of date, and then make a phone call to try to solve the problem or get a question answered. An AR solution that saves just a few minutes of that can pay for itself quickly, because all of that friction is removed. We can digitize all of it with the Digital Twin and supply the workforce with a comfortable, hands-free format that keeps them safe from equipment and environmental hazards and engaged with the task at hand.
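The time-savings math behind that argument is simple to sketch. The figures below are hypothetical planning numbers, not GE data:

```python
def ar_payback_days(minutes_saved_per_task, tasks_per_day, workers,
                    hourly_rate, solution_cost):
    """Days until cumulative labor savings cover the AR solution cost.
    All inputs are hypothetical planning figures."""
    daily_saving = (minutes_saved_per_task / 60) * tasks_per_day \
                   * workers * hourly_rate
    return solution_cost / daily_saving

# e.g., 5 minutes saved per task, 8 tasks/day, 20 workers, $60/hour
# labor rate, $40,000 rollout cost: roughly 50 days to break even.
days = ar_payback_days(5, 8, 20, 60, 40_000)
```

Even modest per-task savings compound quickly across a large workforce, which is why “a few minutes” can justify the investment.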

Usability is key, though; it’s probably the last missing piece of all of this, of the tipping point. Our workforce is accustomed and trained to use traditional devices: phones, tablets, workstations, and so on. Introducing smart glasses needs to be handled with care and with an end-user focus. The best AR device will be one that requires little to no learning curve.

It is important to run a working session at the very start. Grab a few different glasses if you can and let your end users put them on and listen to their feedback. You need to baseline your project charter with pre-AR performance metrics and then create your key performance indicators.

AREA: At a company like GE, you’ve got the size and the resources to be able to explore these things. What about smaller companies?

SAM MURLEY: That’s definitely true. I hope we see some progress and maturation in the AR ecosystem so everyone can benefit – small companies, large organizations, and consumers. The cost of hardware has been a challenge for everyone. Microsoft came out with the HoloLens and then announced a couple of months later that their holographic engine in the system was going to be opened to OEMs. You could have an OEM come in and say, maybe I don’t need everything that’s packed in the HoloLens, but I still want to use the spatial sensing. That OEM can potentially build out something more focused on a specific application for a fraction of the cost. That’s going to be a game changer because, while bigger companies can absorb high-risk operations and high-risk trials, small to medium size companies cannot and may take a big hit if it doesn’t work or rollout is slow.

Hopefully we’ll see some of the prices drop in 2017 so that the level of risk is reduced.

AREA: Can you tell us about any of the more futuristic applications of AR that you’re exploring at GE?

SAM MURLEY: The HoloLens demo at Minds + Machines mentioned earlier is a futuristic but not-that-far-off view of how humans will interact with data and machines. You can take it beyond that, into your household. Whether it’s something you wear or something like the Amazon Echo sitting on your counter, you will have the ability to talk to things around you as if you were carrying on a conversation with another person. Beyond that, we can expect that things such as refrigerators, washing machines, and lights in our houses will be powered by artificial intelligence and have embedded holographic projection capabilities.

The whole concept around digital teleportation or Shared Reality is interesting. Meron Gribetz, Meta’s CEO, showcased this on stage during his 2016 TEDx – A Glimpse of the Future Through an Augmented Reality Headset. During the presentation, he made a 3D photorealistic call to his co-founder, Ray. Ray passed a digital 3D model of the human brain to Meron as if they were standing right next to each other even though they were physically located a thousand miles apart.

That’s pretty powerful. This type of digital teleportation has the potential to change the way people collaborate, communicate, and transfer knowledge. Imagine a worker out in the field who encounters a problem. What do they do today? They pick up their mobile device and call an expert or send an email. The digital communication stack of tomorrow won’t involve phones or 2D screens, but rather holographic calls in full spatial, photorealistic 3D.

This is really going to change a lot of, not only heavy industrial training or service applications, but also applications well beyond the enterprise over the next few decades.

AREA: One final question. People are turning to the AREA as a resource to learn about AR and to figure out what their next steps ought to be. Based on your experience at GE, do you have any advice for companies that are just embarking on this journey?

SAM MURLEY: Focus on controlled and small scale AR projects to start as pilot engagements. Really sharpen the pencil on your use case and pick one performance metric to measure and go after it. Tell the story, from the start to the end about how and what digital transformation can and will do when pitching to stakeholders and governing bodies.

My other recommendation is to leverage organizations like the AREA. The knowledge base within the AREA organization and the content that you push out on almost a daily basis is really good information. If I were just dipping my toe in the space, those are the types of things that I would be reading and would recommend other folks dig into as well. It’s a really great resource.

To sum up: stay focused with your first trial, determine what hardware is years away from real-world use and what is ready today, find early adopters willing to partner in your organization, measure effectiveness with insightful metrics and actionable analytics, reach out to industry experts for guidance, and don’t be afraid to fail.




The AR Market in 2017, Part 4: Enterprise Content is Not Ready for AR

Previous: Part 3: Augmented Reality Software is Here to Stay

 

As I discussed in a LinkedIn Pulse post about AR apps, we cannot expect users to run a different app for each real world target they want to use with AR or one monolithic AR application for everything in the physical world. It is unscalable (i.e., far too time-consuming and costly). It’s unclear precisely when, but I’m confident that we will, one day, rely on systems that make content ready for AR presentation as a natural result of digital design processes.

The procedures or tools for automatically converting documentation or any digital content into AR experiences for enterprise use cases are not available. Nor will they emerge in the next 12 to 18 months. To begin the journey, companies must develop a path that leads from current procedures that are completely separate from AR presentation to the ideal processes for continuous AR delivery.

Leaders need to collaborate with stakeholders to focus on areas where AR can make a difference quickly.

Boiling the Ocean

There are hundreds of AR use cases in every business. All AR project managers should maintain a catalog of possible use cases. Developing a catalog of use cases begins with identification of challenges that are facing a business. As simple as this sounds, revealing challenges increases exposure and reduces confidence in existing people and systems. Most of the data for this process is buried or burned before it escapes. Without data to support the size and type of challenges in a business unit, the AR advocate is shooting in the dark. The risk of not focusing on the best use case and challenges is too high.

There need to be initiatives to help AR project managers and engineers focus on the problems most likely to be addressed with AR. Organizational change managers would be a likely group to drive such initiatives once they are, themselves, trained to identify the challenges best suited to AR.

In 2017, I expect that some systems integration and IT consulting companies will begin to offer programs that take a methodical approach through the AR use case development process, as part of their services to clients.

Prioritization is Key

How do stakeholders in a company agree on the highest priority content to become AR experiences for their top use cases? It depends. On the one hand there must be consistent AR technology maturity monitoring and, in parallel, the use case requirements need to be carefully defined.

To choose the best use case, priorities need to be defined. If users perceive a strong need for AR, that should weigh heavily. If content for use in the AR experience is already available, then the costs and time required to get started will be lower.

A simple method of evaluating the requirements appears below. Each company needs to define their own priorities based on internal drivers and constraints.


A simple process for prioritizing AR use cases (Source: PEREY Research & Consulting).
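One way to implement such a prioritization pass is a weighted-scoring sketch. The criteria and weights below are hypothetical placeholders; as noted above, each company would substitute its own drivers and constraints:

```python
# Hypothetical criteria and weights; each company defines its own.
WEIGHTS = {
    "user_need": 0.4,      # how strongly users perceive a need for AR
    "content_ready": 0.3,  # whether usable content already exists
    "tech_maturity": 0.2,  # maturity of the required AR technology
    "safety_impact": 0.1,  # potential safety benefit
}

def score_use_case(ratings):
    """Weighted sum of 1-5 ratings for one candidate use case."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

def prioritize(use_cases):
    """Return (name, ratings) pairs sorted best-first by score."""
    return sorted(use_cases.items(),
                  key=lambda kv: score_use_case(kv[1]), reverse=True)
```

A catalog of use cases scored this way gives the AR advocate the supporting data the text above argues is usually missing.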

Markers Won’t Stick

One of the current trends in enterprise AR is to use markers as the targets for AR experiences. Computer vision with markers shows the user where to point their device and focus their attention, consumes less power, and can be more robust than 3D tracking technologies in real-world conditions.

However, for many enterprise objects that are subject to sun, wind and water, markers are not a strategy that will work outside the laboratory. Those companies that plan to use AR with real-world targets that can’t have markers attached need to begin developing a new content type: trackables using natural features.

In 2017 more enterprise AR project managers will be asking for SDKs and tools to recognize and track the physical world without markers. For most, the technologies they will test will not meet their requirements. If well managed, the results of testing in 2017 will improve the SDKs as suggested in our post about AR software.

The AR Ecosystem and Technology are Immature

While the title of this post says that enterprise content is not yet in the formats, or associated with the metadata, needed to make AR experiences commonplace, the reverse statement is also true: not all the required AR components are ready for enterprise introduction.

Projects I’ve been involved with in 2016 have shown that while there are a few very solid technologies (e.g., tracking with markers on print), most components of AR solutions with which we are working are still very immature. The hardware for hands-free AR presentation is one area that’s changing very rapidly. The software for enterprise AR experience authoring is another. As more investments are made, improvements in the technology components will come, but let’s be clear: 2017 will not be the year when enterprise AR goes mainstream.

For those who have seen the results of one or two good proofs of concept, there will be many people who will need your help to be educated about AR. One of the important steps in that education process is to participate in the activities of the AREA and to share with others in your company or industry how AR could improve workplace performance.

When your team is ready to introduce AR, call in your change management group. You will need all the support you can get to bring the parts of this puzzle together in a successful AR introduction project!

Do you have some predictions about what 2017 will bring enterprise AR? Please share those with us in the comments to this post. 




The AR Market in 2017, Part 3: Augmented Reality Software is Here to Stay

Previous: Part 2: Shiny Objects Attract Attention

 

There are some who advocate for integrating AR directly and deeply into enterprise content management and delivery systems in order to leverage the IT systems already in place. Integration of AR features into existing IT reduces the need for a separate technology silo for AR. I fully support this school of software architecture. But, we are far from having the tools for enterprise integration today. Before this will be possible, IT groups must learn to manage software with which they are currently unfamiliar.

An AR Software Framework

Generating and presenting AR to users requires combining hardware, software and content. Software for AR serves three purposes:

  1. To extract the features, recognize, track and “store” (manage and retrieve the data for) the unique attributes of people, places and things in the real world;
  2. To “author” interactions between the human, the digital world and real world targets found in the user’s proximity, and publish the runtime executable code that presents AR experiences; and
  3. To present the experience to, and manage the interactions with, the user while recognizing and tracking the real world.

We will see changes in all three of these segments of AR software in 2017.

Wait, applications are software, aren’t they? Why aren’t they on the list? Before reading further about the AR software trends I’m seeing, I recommend you read a post on LinkedIn Pulse in which I explain why the list above does not include thousands of AR applications.

Is it an AR SDK?

Unfortunately, there is very little consistency in how AR professionals refer to the three types of software in the framework above, so some definitions are in order. A lot of professionals just refer to everything having to do with AR as SDKs (Software Development Kits).

In my framework AR SDKs are tools with which developers create or improve required or optional components of AR experiences. They are used in all three of the purposes above. If the required and optional components of AR experiences are not familiar to you, I recommend reviewing the post mentioned above for a glimpse of (or watching this webinar for a full introduction to) the Mixed and Augmented Reality Reference Model.

Any software that extracts features of the physical world in a manner that captures the unique attributes of the target object or that recognizes and tracks those unique features in real time is an AR SDK. Examples include PTC Vuforia SDK, ARToolkit (Open Source SDK), Catchoom CraftAR SDK, Inglobe ARmedia, Wikitude SDK and SightPath’s EasyAR SDK. Some AR SDKs do significantly more, but that’s not the topic of this post.

Regardless of what it’s called, the technology to recognize and track real world targets is fundamental to Augmented Reality. We must have some breakthroughs in this area if we are to deliver the benefits AR has the potential to offer enterprises.

There are promising developments in the field and I am hopeful that these will be more evident in 2017. Each year the AR research community meets at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR) and there are always exciting papers focused on tracking. At ISMAR 2016, scientists at Zhejiang University presented their Robust Keyframe-based Monocular SLAM. It appears much more tolerant to fast motion and strong rotation which we can expect to see more frequently when people who are untrained in the challenges of visual tracking use wearable AR displays such as smart glasses.

In another ISMAR paper, a group at the German Research Center for Artificial Intelligence (DFKI) reported using advanced sensor fusion, employing a deep learning method, to improve visual-inertial pose tracking. While using acceleration and angular velocity measurements from inertial sensors to improve visual tracking has shown promising results for years, we have yet to see these benefits materialize in commercial SDKs.
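The DFKI work uses deep learning, but the basic idea of visual-inertial fusion can be illustrated with a textbook one-dimensional complementary filter. This is a deliberately simplified sketch, not the paper’s method, and the blend constant is a hypothetical tuning value:

```python
def fuse_heading(visual_heading, gyro_rate, dt, prev_estimate, alpha=0.98):
    """One step of a 1-D complementary filter.

    The gyroscope is trusted for short-term motion (smooth, but it
    drifts); the visual heading corrects that drift (drift-free, but
    noisy and lower-rate). alpha is a hypothetical tuning constant.
    """
    predicted = prev_estimate + gyro_rate * dt  # inertial propagation
    return alpha * predicted + (1 - alpha) * visual_heading
```

A real tracker runs this kind of update per frame over the full 6-DoF pose, usually with a Kalman-style filter rather than a fixed blend, which is what makes fast motion and strong rotation survivable.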

Like any software, the choice of AR SDK should be based on project requirements, but in practical terms the factors most important to developers today tend (or appear) to be a combination of time to market and support for Unity. I hope that, with technology transfer from projects like those presented at ISMAR 2016, improved sensor fusion can be implemented in commercial solutions (in the OS or at the hardware level) in 2017.

Unity Dominates Today

A growing number of developers are learning to author AR experiences. Many developers find the Unity 3D game development environment highly flexible and the rich ecosystem of developers valuable. But, there are other options worthy of careful consideration. In early 2016 I identified over 25 publishers of software for enterprise AR authoring, publishing and integration. For an overview of the options, I invite you to read the AREA blog post “When a Developer Needs to Author AR Experiences.”

Products in the AR authoring group are going to slowly mature and improve. With a few mergers and acquisitions (and some complete failures), the number of choices will decline and I believe that by the end of 2017, fewer than 10 will have virtually all the market share.

By 2020 there will be a few open source solutions for general-purpose AR authoring, similar to what is available now for authoring Web content. In parallel with the general purpose options, there will emerge excellent AR authoring platforms optimized for specific industries and use cases.

Keeping Options for Presenting AR Experiences Open

Today the authoring environment defines the syntax for the presentation, so the user has little alternative but to install and run the AR execution engine published by the authoring environment provider.

I hope that we will see a return of the browser model (or the emergence of new Web apps) so that it will be possible to separate the content for experiences from the AR presentation software. To achieve this separation and lower the overhead for developers to maintain dozens of native AR apps, there needs to be consensus on formats, metadata and workflows.
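To make that separation concrete, a shared experience format might look something like the sketch below. Every field name here is hypothetical; no such standard exists today, which is exactly the consensus gap being described:

```python
import json

# Hypothetical manifest that separates AR content from whatever
# presentation engine renders it. All names are illustrative only.
experience = {
    "target": {"type": "natural-feature", "dataset": "pump-housing-v2"},
    "assets": [
        {"id": "step1-arrow", "format": "glTF", "uri": "assets/arrow.glb"},
    ],
    "steps": [
        {"anchor": "valve-a", "show": ["step1-arrow"],
         "instruction": "Close valve A before removing the cover."},
    ],
}

manifest = json.dumps(experience, indent=2)
```

If authoring tools emitted something like this and any conforming browser or app could render it, developers would no longer need to maintain dozens of native AR apps.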

Although not in 2017, I believe some standards (it’s unclear which) will emerge to separate all presentation software from the authoring and content preparation activities. 

Which software are you using in your AR projects and what are the trends you see emerging?

 

Next: Navigating the way to continuous AR delivery




The AR Market in 2017, Part 2: Shiny Objects Attract Attention

Previous: Part 1, Connecting the Dots

 

There’s a great deal of attention being paid to the new, wearable displays for Augmented Reality. Hardware permits us to merge the digital and physical worlds in unprecedented ways. Wearable hardware delivers AR experiences while the user is also able to use one or both hands to perform tasks. The tendency to pay attention to physical objects is not unique to AR industry watchers. It is the result of natural selection: genes that gave early humans the ability to detect and respond quickly to fast moving or bright and unusual objects helped our ancestors survive while others lacking those genes did not.

Although this post focuses on the hardware for Augmented Reality, I don’t recommend focusing exclusively on advancements in AR hardware when planning for success in 2017. The hardware is only valuable when combined with the software, content and services for AR in specific use cases.

Now, considering primarily AR hardware, there are important trends that we can’t ignore. This post only serves to highlight those that, in my opinion, are the most important at an industry-wide level and will noticeably change in 2017.

Chips accelerate changes

Modern Augmented Reality hardware benefits hugely from the continued reduction in size and cost in hardware components for mass market mobile computing platforms. We need to thank all those using smart phones and watches for this trend.

As semiconductor manufacturers gain experience and hard-code more dedicated vision-related computation into their silicon, the performance of complete AR display devices is improving. Intel RealSense is one chip-driven technology to monitor; combined with the technology Intel recently acquired from Movidius, it should produce significant improvements in wearable display performance beyond 2017. Other offerings will likely follow from NVIDIA and Apple in 2017.

When available for production, the improvements in semiconductors for wearable AR devices will be measurable in terms of lower latency to recognize a user’s environment or a target object, less frequent loss of tracking, higher stability in the digital content that’s rendered, lower heat and longer battery life. All these are gradual improvements, difficult to quantify but noticeable to AR experts.

As a result of the optimization of key computationally intensive tasks (e.g., 3D capture, feature extraction, graphics rendering) in lower-cost hardware, the next 12 to 18 months will bring many new models of AR display devices, not just a few models or many models in small batches.

These next-generation wearable display models with dedicated silicon will deliver at least a basic level of AR experience (delivery of text and simple recognition) for an entire work shift. Customers will begin to place orders for dozens and even, in a few cases, hundreds of units.

Optics become sharper

In addition to semiconductors, other components will be changing rapidly within the integrated wearable AR display. The next most important developments will be in the display optics. Signs of this key trend were already evident in 2016 – for example, when Epson announced the OLED optics designed for the Moverio BT-300.

It’s no secret that over the next few years, optics will shrink in size, drop in weight and demand less power. In 2017, the size and weight of fully functional systems based on improved AR optics will decline; expect smart glasses to weigh less than 80 grams. Shrinking the optics will make longer, continuous and comfortable use more likely.

Developers raised issues about color quality and fidelity when testing devices introduced in 2015 and 2016. Color distortion (such as an oil-slick rainbow effect) varies with the type of optics and the real-world scene the user is looking at (the pattern is particularly noticeable on large white surfaces). The 2017 models will offer “true” black and higher-fidelity colors in a wider range of settings. Again, the experts will feel these improvements first and “translate” them for their customers.

Another key area of improvement will be the Field of View. Some manufacturers will announce optics with 50° diagonal (a few might even reach 80° diagonal) in 2017. When combined with advanced software and content, these changes in optics will be particularly important for making AR experiences appear more realistic.

Combined with new polychromatic materials in lenses, lower weight and stronger material in the supports, optics will be more tolerant of changes in environmental conditions, such as high illumination, and will fit in more ruggedized packages.

More options to choose from

Speaking of packaging, in 2016 there are three form factors for AR displays:

  • monocular “assisted reality” hardware that clips onto other supports (frames) or can be worn over a user’s ear,
  • smart glasses that sit on the user’s nose bridge and ears, and
  • head-worn displays that use straps and pads and a combination of ears, noses and the user’s skull for support.

The first form factor does not offer an immersive experience and isn’t appropriate for all use cases, but assisted reality systems have other significant advantages (e.g., lower cost, longer battery life, lighter weight, easy to store) so they will remain popular in 2017 and beyond.

At the opposite end of the spectrum, the highly immersive experiences offered by head-worn devices will also be highly appealing for different reasons (e.g., depth sensing, accuracy of registration, gesture-based interfaces).

We need to remember that the use cases for enterprise AR are very diverse, and so, too, can be the displays available to users. The new wearable AR display device manufacturers entering the fray in 2017 will stay with the same three general form factors but offer more models.

In addition to diversity within these three form factors there will be extensions and accessories for existing products – for example, charging cradles, corrective lenses, high fidelity audio and materials specifically designed to tolerate adverse conditions in the workplace environment.

The results of this trend are likely to include:

  • those selling wearable displays will be challenged to clearly explain new features to their potential customers and translate these features into user benefits,
  • those integrating AR displays will be more selective about the models they support, becoming partners with only a few systems providers (usually leaning towards the bigger companies with brand recognition), and
  • buyers will need to spend more time explaining their requirements and aligning their needs with the solutions available in their budget range.

Wearable display product strategists will realize that with so many use cases, a single user could need to have multiple display models at their disposal. One possible consequence of this trend could be reduced emphasis on display systems that are dedicated to one user. We could see emergence of new ways for multiple users in one company or group to reserve and share display systems in order to perform specific tasks on schedule.

Rapid personalization, calibration and security will offer new opportunities to differentiate wearable AR display offerings in 2017.

Enterprise first

All of these different form factors and options are going to be challenging to sort out. Outside enterprise settings, consumers will not be exposed to the hardware diversity in 2017. They simply will not invest the time or the money.

Instead, companies offering new hardware, even the brands that have traditionally marketed to mass market audiences, will target their efforts toward enterprise and industrial users. Enterprises will increase their AR hardware budgets and develop controlled environments in which to compare AR displays so they can make informed decisions at the corporate level. Third-party services that perform rigorous product feature evaluations will represent a new business opportunity.

While this post highlights the trends I feel are the most important when planning for success with AR hardware in 2017, there are certainly other trends on which companies could compete.

To learn more about other options and trends in wearable AR displays in 2016, download the EPRI Technology Innovation report about Smart Glasses for AR in which I offer more details.

What are the trends you think are most important in AR hardware and why do you think they will have a significant impact in 2017?

 

Next: AR software matures and moves toward standardization




The AR Market in 2017, Part 1: Connect the Dots

In your profession, you’re among those most aware of future technologies. The proof is that you have discovered Augmented Reality and decided it’s sufficiently important to dedicate at least a few minutes or hours to getting oriented and staying informed about the trends.

That’s the first step. But you know enough not to believe everything you read or see in a YouTube video.

The next step, if you haven’t done so already, is to train yourself to separate the biggest hype from the facts. This is not easy, but you should be able to hit this milestone by attending industry events where AR is being demonstrated and you can put your hands on the products in action, even under highly controlled conditions. Visiting one or more of the AREA members in their offices or inviting them to visit your facility will be even more valuable.

You’ll see some mock ups and, if you ask tough questions, you will also see some of the weaknesses and begin to glimpse the complexity of the problems facing adoption of these technologies. Keep a log of these experiences you have with Augmented Reality and the impressions they leave on you.

If you really want to understand the strengths and weaknesses “up close” and have budget, you can develop a project or participate in a group project that focuses on a well-defined use case.

Share what you learn

Once you’ve seen and captured notes about more than 10 live demonstrations in your journal and have personally touched AR, you can begin to “translate” for others what you’re seeing and doing.

But, wait! The insights you’ve acquired could offer a strategic advantage to your company so, why would you share them? Even if you are thinking that you should keep what you’ve gathered to yourself, I encourage you to share because AR is more than just another new technology offering you or your group a competitive advantage. This is going to be a major crowd-sourced, multi-year project. When more people are looking into AR technology, it will improve faster than when only a few are focusing on and investing in it in isolation.

Once AR is good enough to be used weekly (or daily) in more than one use case, it is going to push operational performance to new levels. Then you will be able to use it to full advantage.

AR may become as transformational to your company and industry as the Web and mobile devices during your professional career. But it requires more than one or two examples and adopters in an industry. Reaching a threshold level of adoption in your industry will be necessary. And, to begin meaningful adoption there need to be a few experts. We need people like you to translate the theory and potential of AR in your industry to practice and reality.

I’ve found that I can translate for others what I’m observing by breaking it down into four interrelated topics: hardware, software, content and services. For over a decade I’ve used these four legs of the AR platform to organize projects, to review the history of AR and to capture current status.

In a series of AREA blog posts I am sharing developments I believe will be important in AR in 2017 using this simple framework.

Connecting the dots around us

One observer can’t see all the details of the entire AR landscape, certainly not in all the industries where the technology will apply. Fortunately, the AREA is a network of very bright minds, all looking forward, and in other directions, at the same time.

Many AREA members in the trenches of Augmented Reality take on a forward-looking challenge when, at the end of each year, they begin preparing their forecasts for the following year.

I hope these posts will help you find your place and connect with others, and that in your comments you will compare and contrast your own observations with my experience.

If we each take a few minutes, hours or a day in this last quarter of 2016 to connect our dots together we will all be better equipped to concretely plan for an exciting year ahead!

Next: What’s new for AR hardware in 2017?




Welcome Lockheed Martin to the AREA

The newest member of the AREA is one of the largest companies in the aerospace, defense, security, and technologies industry – and an Augmented Reality pioneer.

It’s Lockheed Martin. The Bethesda, Maryland-based company, which employs 98,000 people worldwide, joined the AREA as a Sponsor member in October. Lockheed Martin will be represented on the AREA board by Christi Fiorentini, a senior manufacturing applications engineer in Lockheed’s Marietta, Georgia Aeronautics organization.

Fiorentini traces Lockheed’s involvement in AR back about 15 years, when the company’s research and development team began exploring opportunities for the technology. Each of Lockheed Martin’s business units — Aeronautics, Space Systems, Missiles and Fire Control, and Rotary and Mission Systems – has experimented with the technology. About five years ago, Fiorentini’s unit, Aeronautics, began looking into augmented reality for remote subject matter expert applications.

“The technology then wasn’t quite up to par for use in a production environment, so it got put on the back burner,” recalled Fiorentini. “Around October last year, Aeronautics gained a new interest in the technology when we observed many start-ups and smaller businesses bringing AR to fruition.”


Lockheed Martin is seeking to incorporate augmented reality throughout the product lifecycle, from the initial design phase all the way through sustainment, with a heavy interest in manufacturing.

“We’ve been investigating the technology, going to conferences, and developing proofs of concept to build business cases, because we need to prove that this technology can work within our own boundaries so that we can make the investment,” said Fiorentini. “If we’re going to shift into this realm of technology, it’s a big move, a big status quo change, and so while I do believe the ROI is there, we need to show that it works on our actual use cases to convince our leadership to invest in it.”

That’s why Lockheed Martin joined the AREA.

“I think more people across our business are starting to realize the potential of the technology and so we’re trying to formalize our approach across the entire enterprise,” Fiorentini noted. “We’re working to bring individual players from our different business areas together and define a more strategic approach to exploit this technology. We have some upcoming pilots that we’re working on with some of the leading AR vendors, and we’re members of DMDII, the Digital Manufacturing and Design Innovation Institute. As we engage more with these vendors and other enterprise members investigating this technology, we saw the AREA as being a good place to start pushing what we think should be best practice. We’re a big player in the aerospace and defense industry, so we’re looking at how we can use our influence to shape what the AR industry for enterprise is going to look like and the AREA is a great place to help convey that message.”




New Editor Joins the AREA

The AREA has a new editor. Jim Cassidy joined the organization in October and is tasked with supporting the research and preparation of the AREA’s content by developing content strategy, authoring original and thought leadership content, editing content from members and third parties, and producing newsletters. We sat down with Jim recently to learn more about him and his work with the AREA.

Welcome to the AREA, Jim. Tell us about your background.

Thank you, it’s a pleasure to be here. I’ve been a freelance marketing communications writer for more than 25 years, working with a wide variety of clients across many industries, from consumer packaged goods to healthcare to analytical laboratory instruments. Living and working in eastern Massachusetts, much of my work has been for information technology companies, such as NTT DATA and PTC, so I have a good foundation in both the enterprise technology environment surrounding AR and the vertical markets where it’s making an impact.

What interested you in joining the AREA?

This is a very exciting time for AR. When you look at the most promising applications – take field service, for example – many of the work processes still revolve around printed manuals and parts diagrams, even as products have become more complex and intelligent. AR will have massive impact on productivity in just that one area. In an interview earlier this year, General Electric CEO Jeffrey Immelt said that by helping field engineers fix machinery better the first time, AR could be worth billions of dollars to industrial companies like GE.

The potential is great, new innovations are coming to market on a daily basis, and there’s a lot of interest and anticipation in the market. At the same time, we’re still struggling with how to get from the pilot stage to widespread adoption. There’s a lot of AR information out there, but before people can really use it, it needs to be put into context and disseminated. The AREA can play a central role in that process, and as a professional communicator, I saw that as a great opportunity.

What sorts of content are you hoping to bring to the AREA?

From the enterprise perspective, we need to continue to deliver specific, practical information that helps accelerate adoption – technical dos and don’ts, but also business-focused content that identifies AR opportunities and supports organizations in arguing a business case for adoption. I’d also like to see more case studies of successful AR deployments, and more forward-looking, visionary content from the strategic thought leaders among our members. The more we can foster an open forum for sharing ideas, the more vibrant the AREA community will be – and that will benefit everyone.

Is there anything you would like to say to AREA members?

I’d like everyone – AR providers and enterprises – to know that I’m here to support them in bringing their ideas and experiences to the AREA. I’m available to explore story ideas, do interviews, and act as an editor to help them shape and deliver the content they’d like to share with other AREA members. I’d also like their feedback on what they value most about our content and what we can do better.




Augmented World Expo Europe 2016: A Review

The inaugural Augmented World Expo (AWE) Europe is now history. The big conference and exhibition that for seven years has served as a showcase for all the emerging realities in the AR community in the US is now an international affair, having held its second Asian edition in China last month, followed by AWE Europe October 18 – 19.


The event took place at the Berlin Congress Centre, in the heart of the city at Alexanderplatz. The beautiful venue was a fantastic match for the exhibition with its two floors and the large convention hall that hosted two full days of speakers. The main stage saw a number of inspiring talks by names that have made history in both AR and VR, such as Bruce Sterling. The speakers’ agenda also included an impressive developers track; many providers took advantage of the event to create tutorials and demonstrations of their authoring technologies. One couldn’t help noticing the growing impact that Unity3D is having as an authoring tool for AR experiences. In fact, many of the major software vendors showcased their Unity3D plugins to the developers attending.


The exhibition hall featured more than 45 companies showcasing software solutions, optics, devices and applications. Interestingly, a large percentage of the exhibiting companies were those that focus their business models around enterprise solutions and industry-related technologies. This strengthens the belief that enterprise AR is a major driver for the success of the technology. A side hall hosted a number of startup companies promoting their innovative ideas (one of which, PuttView, won the “Best in Show” award for its golf practice solution).

Several European AREA members were represented:

  • Bosch showcased a number of solutions for AR-enabled automotive maintenance and servicing at one of the largest booths in the show.
  • Catchoom brought their image recognition and AR platforms, demonstrating both enterprise and marketing use cases.
  • Joinpad centered their demos around industrial use cases, focusing especially on smart glasses solutions for MRO scenarios involving complex equipment, developed using their Arrakis SDK for authoring AR applications.


The audience attending the exhibition was a mix of tech enthusiasts and industrial customers interested in the benefits of AR and VR for their businesses. Although mostly European, many ticket holders travelled from Canada and the US to participate.


All in all, AWE Europe felt like a promising first edition for the AR-focused conference that has set trends for AR development in the States. Even compared to the US edition in June, many demos had evolved to a more mature stage, especially with the proliferation of innovative devices like the HoloLens, showing the rapid development of the market. While AWE Europe is somewhat smaller than its American counterpart, we at the AREA are convinced that it is here to stay and will become a “must go” event for those interested in the potential of this technology.

The AWE organizing committee will share many of the talks on the main stage and the developers track on the AWE YouTube Channel.




When a Developer Needs to Author AR Experiences, Part 1

There’s an established process for creating a new Web page. If it’s not already available, you begin by defining and developing the content. Then, there’s the formatting. Often there’s some scripting to provide interactivity. When the “authoring” is done, a page is published.

It’s not all that different for AR. Once an Augmented Reality project’s use case is clear, the experiences come about through an authoring process that resembles that of preparing and publishing content for the Web.


Figure 1. An AR authoring system combines trackables (created using features of the real world and a tracking library) with digital content that is encoded into presentation data and then assigned interactive functions (e.g., see more details, show relevant info, move and freeze in position, hide/close). The AR authoring system uses databases to store the scene elements – trackables, presentation data and interactions. (Source: PEREY Research & Consulting)
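The scene-element model described in Figure 1 can be sketched as a simple data structure. This is an illustrative sketch only, not any vendor's actual API; all class and field names here are assumptions made for clarity.

```python
# Minimal sketch of the Figure 1 data model: a trackable (real-world
# feature set), presentation data (encoded digital content), and the
# interactive functions assigned to it, grouped as one scene element.
from dataclasses import dataclass, field

@dataclass
class Trackable:
    """A real-world feature set the tracking library can recognize."""
    name: str
    kind: str            # e.g. "marker", "3d-model", "slam-anchor"

@dataclass
class Presentation:
    """Digital content encoded for display (text, 3D model, video, ...)."""
    content_id: str
    media_type: str

@dataclass
class Interaction:
    """An interactive function assigned to presented content."""
    trigger: str         # e.g. "tap", "gaze", "voice"
    action: str          # e.g. "show-details", "freeze", "hide"

@dataclass
class SceneElement:
    """One stored scene entry: trackable + presentation + interactions."""
    trackable: Trackable
    presentation: Presentation
    interactions: list = field(default_factory=list)

# Example: a marker paired with a maintenance instruction overlay
element = SceneElement(
    trackable=Trackable("pump-housing", "marker"),
    presentation=Presentation("instr-042", "text"),
    interactions=[Interaction("tap", "show-details")],
)
```

In a real authoring system these records would live in the databases the figure mentions; the sketch simply makes the relationships between the three element types concrete.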

Today, Content Management Systems for the Web support the steps for page development with grace. Systems like WordPress and Drupal are so easy to use and commonplace that we hardly notice their existence.

In contrast, there are many AR authoring systems from which a developer can choose and none are as mature as CMS for the Web. The choice of tool and approach depends on the project requirements, skills of the developer and the resources available.

Define the AR Project Requirements

Before choosing an AR authoring system, the requirements must be clear. An AR experience design process should generate a storyboard and, from the storyboard, the following factors are defined:

  • User settings (indoor, outdoor, noise levels, etc.)
  • Need for a user management system to provide experience personalization or tracking
  • Need for live communication with any remote experts during the experience
  • Type of recognition and tracking required (marker, 3D, SLAM, etc.)
  • Need to access device GPS and compass for geospatial context
  • Preferred display device (smartphone, tablet, smart glasses or another type of HMD)
  • Human interaction modalities (hands-free, touch, speech, gaze)

In addition to the above variables that can be deduced from the storyboard, there could be other factors to consider. For example, if the target device is connected by an IoT protocol or if there are any supplementary files (e.g., videos, PDFs, etc.), then these need to be provided to the developer as early as possible. The project manager should also specify the frequency and types of updates that may be required after the initial AR experience is introduced to users.
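The requirement factors above can be captured as a structured checklist and compared against a candidate tool's advertised capabilities. The sketch below is a hypothetical illustration; the field names and the `tool_satisfies` helper are not a standard schema.

```python
# Hedged sketch: storyboard-derived requirements as a checklist that can
# be matched against what a candidate authoring tool supports.
from dataclasses import dataclass, field

@dataclass
class ARProjectRequirements:
    setting: str                    # "indoor", "outdoor", ...
    tracking: str                   # "marker", "3d", "slam", ...
    display: str                    # "smartphone", "smart-glasses", ...
    modalities: list = field(default_factory=list)   # "hands-free", "touch", ...
    needs_user_management: bool = False
    needs_remote_expert: bool = False
    needs_geolocation: bool = False

def tool_satisfies(reqs: ARProjectRequirements, capabilities: set) -> bool:
    """True if the tool's capabilities cover the tracking, display,
    and interaction-modality requirements."""
    needed = {reqs.tracking, reqs.display, *reqs.modalities}
    return needed <= capabilities

reqs = ARProjectRequirements(
    setting="indoor", tracking="marker", display="smart-glasses",
    modalities=["hands-free"],
)
print(tool_satisfies(reqs, {"marker", "slam", "smart-glasses", "hands-free"}))  # True
```

A checklist like this makes the final step concrete: once the parameters are filled in from the storyboard, shortlisting tools becomes a mechanical comparison rather than guesswork.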

When these project requirements and parameters are defined, the developer can choose the tools best suited for the AR experience authoring.

Want to know more about your choices of authoring platforms? There’s more in the next post.