
APX Labs’ Milestones in Enterprise Smart Glasses

Portions of this article were published in the SAP Startup Focus newsletter on March 26.

Since we began in this field in 2011, countless smart glasses prototypes, working samples and production units have passed through the APX R&D lab. Before Google Glass appeared, we had already developed rapid prototyping capabilities to build smart glasses from available components. Having entered the smart glasses industry earlier than most, our early engineering efforts were broader than those of the enterprise software company we have become: the nascent market demanded technical coverage spanning hardware, software, user interface design, human-computer interaction methods and systems thinking. Dropping smart glasses device engineering and some of the low-level software from our core expertise subsequently opened a path for APX to do more with less.

Large enterprises across the globe have spent many billions of dollars over decades building out electronic knowledge bases of the information needed to get work done. This means that mission-critical data for the deskless and hands-on workforce already exists in the enterprise; the imperative now is to enable a seamless, bidirectional flow of information between the Enterprise Resource Planning (ERP) ecosystem and users, while refining user interactions in a contextually aware and intuitive manner. Our Skylight product, an enterprise software platform for smart glasses, helps bridge the gap between enterprise information systems and smart glasses users in need of contextually relevant data, accessible heads-up and hands-free on smart glasses.
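
To make the idea of a bidirectional ERP-to-glasses bridge concrete, here is a minimal Python sketch of the round trip. It is illustrative only: the endpoint URLs, field names and the notion of a "card" are assumptions for this example, not Skylight's actual API.

import requests

ERP_BASE = "https://erp.example.com/api"          # hypothetical ERP REST gateway
GLASSES_BASE = "https://glasses.example.com/api"  # hypothetical device-facing service

def fetch_work_order(order_id):
    """Downstream flow: read a work order from the ERP system."""
    resp = requests.get(f"{ERP_BASE}/workorders/{order_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()

def push_to_glasses(user_id, order):
    """Trim the record to a glanceable card and send it to the worker's display."""
    card = {
        "title": order.get("title"),
        "location": order.get("location"),
        "steps": order.get("steps", [])[:3],  # a heads-up view should stay short
    }
    resp = requests.post(f"{GLASSES_BASE}/users/{user_id}/cards", json=card, timeout=10)
    resp.raise_for_status()

def report_completion(order_id, status):
    """Upstream flow: write the worker's result back into the ERP system."""
    resp = requests.put(f"{ERP_BASE}/workorders/{order_id}/status",
                        json={"status": status}, timeout=10)
    resp.raise_for_status()

The point of the sketch is the shape of the flow: data moves down from the system of record to the worker in a trimmed, glanceable form, and results move back up.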

Our skill today is in keeping up with rapidly changing technology. To illustrate how challenging this can be, let's look back at the different hardware options available to the enterprise customer. I hope this visually guided tour of smart glasses marking milestone moments in APX's history demonstrates how quickly the technology has advanced in a short period of time, and builds excitement and anticipation for a diversifying ecosystem of emerging devices continuing the next industrial revolution driven by wearable technology.

US Army Smart Glasses, Multiple Generations (2011-2013)

Our company’s history goes back to when we were originally selected to build software for smart glasses used by the United States Army. The biometrics application, nicknamed Terminator Vision, used the onboard camera to capture faces within the soldier’s field of view, send the captured data to a server to determine the identity of the person(s) and display the information in a heads-up and hands-free manner to the user.

Advanced for its time as a fully embedded, single-device-does-all solution, these smart glasses featured an end-to-end exchange of field-collected data from the user's environment, which was analyzed by a back-end system and delivered back to the user in real time.
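
As an illustration of the capture-identify-display loop described above (a sketch, not APX's actual implementation), the flow could look like the following Python, assuming an OpenCV-accessible camera and a hypothetical identification service:

import cv2
import requests

IDENTIFY_URL = "https://biometrics.example.com/identify"  # assumed back-end endpoint

def capture_and_identify():
    cap = cv2.VideoCapture(0)  # on-board camera
    try:
        ok, frame = cap.read()
        if not ok:
            return
        # Encode the captured frame and hand it to the back end for matching.
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            return
        resp = requests.post(IDENTIFY_URL,
                             files={"image": ("frame.jpg", jpeg.tobytes())},
                             timeout=10)
        resp.raise_for_status()
        for person in resp.json().get("matches", []):
            # On the device this would be rendered heads-up; print stands in here.
            print(person.get("name"), person.get("confidence"))
    finally:
        cap.release()

if __name__ == "__main__":
    capture_and_identify()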

Augmented Reality Smart Glasses Prototype (Late 2012)


In 2012 we broadened our software capabilities to address the non-military market, targeting global companies with a deskless and hands-on workforce. We commissioned several prototypes to learn more about the nuances of the ideal hardware for enterprise smart glasses. These prototypes used two display modules, each containing a microdisplay, a rudimentary 50:50 beam splitter (light from the environment and from the microdisplay are mixed evenly to create content visible to the user) and an illumination source. A 3D-printed and painted frame for the headset was designed in-house, along with the control module enclosure.

This particular prototype allowed us to experiment with different content presentation options (2D, ultrawide 2D and stereoscopic 3D modes), sensor payloads (visible and infrared camera, motion tracker, microphone, etc.) and computing platforms. It demonstrated there is no single perfect design covering all industrial scenarios and confirmed that enterprise smart glasses follow the same paradigm as all other tools used in the workplace—the right tools or glasses for the right job. 

Epson Moverio BT-100EC Prototype (February 2013)


For APX's first prototype based on the Epson Moverio BT-100, we added a 9-axis inertial measurement unit (accelerometer, gyroscope and magnetometer) coupled to an Arduino platform, along with a 5MP camera and a microphone, enclosed in a 3D-printed module. This in turn was wired into a daughter board for Epson's control unit containing a battery, a video signal converter and a USB hub. Finally, we used an Android phone for additional control and management.
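
For a rough sense of the glue code such a prototype might involve, here is a minimal host-side sketch in Python that reads the Arduino-attached IMU over a serial link. The port name, baud rate and comma-separated line format are assumptions for illustration.

import serial  # pyserial

def read_imu(port="/dev/ttyUSB0", baud=115200):
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            # Assumed framing: ax,ay,az,gx,gy,gz,mx,my,mz
            fields = line.split(",")
            if len(fields) != 9:
                continue
            try:
                ax, ay, az, gx, gy, gz, mx, my, mz = map(float, fields)
            except ValueError:
                continue
            # In a real prototype these readings would feed head tracking and
            # display stabilization; here we simply print them.
            print(f"accel=({ax:.2f},{ay:.2f},{az:.2f}) gyro=({gx:.2f},{gy:.2f},{gz:.2f})")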

This prototype represented a milestone at APX: we now had the ability to produce devices inexpensively for our developers, partners and customers, albeit in limited quantities (inexpensive at the time meant $3,000-5,000).

Made famous by press coverage and by demos at the YouTube Sandbox at Google I/O 2013, this device cemented our presence in the industrial sector as one of the first Epson-derived prototypes. Essentially a functional equivalent of Epson's BT-200 smart glasses released a year later, it was the first device APX prototyped in our partnership with Epson.

Google Glass (April 2013)


The release of Google Glass was a milestone for the smart glasses industry for many reasons, not least because one of the largest technology companies in the world had introduced a fully integrated smart glasses device at the relatively modest price of $1,500. This sparked significant interest from startups, venture capital and large corporations. Overnight, smart glasses went from being exotic devices reserved for researchers and the military to publicly available goods.

The Glass product announcement in 2012 accelerated development across the nascent industry and spurred others to take a deeper look at it. Google's entry had ripple effects in the hardware industry as well, considerably increasing the pace at which companies have introduced new devices since.

APX's vision has always been that smart glasses will fundamentally transform the way the global workforce builds, fixes and moves goods, delivering enhancements in productivity, efficiency and safety. Glass' innovations and the market presence it created represented an important step in that direction.

Glass has of course seen its ups and downs, recently bringing the consumer- and app developer-facing Explorer program to an end, but the Glass at Work program, of which APX was a founding partner in April 2014, continues to thrive.

Vuzix M100 (December 2013)


Vuzix is a very well-known name in the smart glasses industry, having developed see-through displays since 2005 (not surprisingly, also for the military). Its M100 product was the first generally available industry-targeted device, complete with an ANSI-rated safety glasses attachment, and has since paved the way alongside Google Glass in setting the standard for heads-up, monocular smart glasses.

APX and Vuzix have an official partnership: the M100 integrates the fourth release of Skylight, increasing the selection of devices available for enterprises to deploy across a diverse set of use cases.

Epson Moverio BT-200 (March 2014)


Epson's second-generation Moverio product incorporates the sensors we had added to the BT-100EC prototype and is the first generally available pair of stereoscopic see-through smart glasses. The device also integrates Skylight for industrial AR use cases such as two-way video conferencing and workflow information presented in the worker's field of view.

These use cases were demonstrated live at SAP CEO Bill McDermott's SAPPHIRE 2014 keynote address, marking a decisive change for smart glasses in enterprise, with the technology being publicly demonstrated by a major ERP player at its largest conference. The device's low cost, with a suggested retail price of $699, provided additional incentive for enterprises to pilot and experiment with the technology for their workplace scenarios.

Sony SmartEyeglass (February 2015)


Since early 2014, Sony has showcased several iterations of the SmartEyeglass concept at multiple industry conferences. At the 2015 Consumer Electronics Show, Sony, APX and SAP Startup Focus partnered to demo an enterprise smart glasses solution. Sony provided the smart glasses hardware, SAP provided ERP data from Work Manager and HANA, and APX's Skylight furnished the user experience that extended the data to wearable devices. This combination enabled user-context awareness, mobile device management and information security rule enforcement, and brought advanced media to and from users equipped with smart glasses.

Recon Jet (March 2015)


Although the Jet smart glasses product from Recon Instruments is produced primarily for the sports industry, we believe its design balancing wearability, user comfort, function, robustness and price will have a positive influence on future smart glasses designs for enterprise. The Jet has attributes that are desirable for enterprise applications: a sleek and easily wearable design that can withstand harsh outdoor environments, consumer level pricing and availability, and an interchangeable lens design.

What’s Next?

In only four years, smart glasses technology has evolved from research prototypes of limited capability and availability and high cost to devices that are broadly accessible, wearable and enterprise ready. We have seen growing interest from the largest global companies in building the connected workplace for their deskless and hands-on workforce, and we believe the market for smart glasses is just getting started.

Going back to the beginning of our product timeline, we initially invested heavily in smart glasses because we recognized their potential suitability to enterprise use cases. The latest version of our Skylight product is scalable, connects to enterprise data sources and supports commercially available models of smart glasses. We are also preparing for emerging trends that will interconnect smart glasses with other mobile devices. With smartwatches taking center stage in 2015, we are extending Skylight support to a growing wearable ecosystem.

We couldn't be more excited, both by how far we've come and by where we'll go as the wearable market takes off in the enterprise. We're also very excited to work alongside an industry full of partners, customers and research institutions as a founding member of the Augmented Reality for Enterprise Alliance (AREA). The enterprise smart glasses market requires active participation from the full value chain of enterprise mobility. Device manufacturers, software developers, system integrators, consulting agencies, and academic and research institutions will all need to collaborate to deliver on the needs of customers. There is an elevated sense of personalization that smart glasses and other wearable devices bring to users, and defining the optimal user experience will be a critical task for everyone in the industry. While improving the user experience and capabilities of our own product line, we recognize that the evolution of the enterprise smart glasses value chain requires contributions from the entire industry.

The insights gathered from collaborating with other AREA members will help improve the quality of the experience for our customers, developers and partners.  APX is striving to work with fellow visionaries to accelerate the adoption of enterprise smart glasses technology, generating ripple effects much greater than the mere sum of the AREA members’ capabilities.




Getting Started with Industrial Augmented Reality

Necessity is the mother of invention, so the old saying goes. Nowhere is that more true than where the enterprise is concerned. Intense competition, shrinking margins and the rising costs of doing business make for a demanding operating environment where even small mistakes can end up being costly in the bigger picture. The commercial aerospace market knows this all too well—cancellations and delays cost the industry $45 million per day. Similarly, a typical oil refinery in the energy sector could lose $800K per day due to improperly maintained processing equipment.

It used to be that slow and steady improvements in operational performance would win you the race—not so anymore. Now, just meeting expectations depends on improving bottom-line performance 10x, 100x or even more. While there used to be room to optimize cost efficiency through process improvements alone, order-of-magnitude improvements demand investment in new disruptive technologies that will fundamentally transform how businesses operate.

Enter Augmented Reality

Here at NGRAIN, we see Augmented Reality (AR) as technology that ultimately maximizes the performance of people, machines, and the interactions between them. With companies across the Global 2000 in aerospace, defense, energy, manufacturing and healthcare using NGRAIN in the field today, we have seen firsthand the benefit that virtual and augmented capabilities can bring to the enterprise. Industrial applications that we have deployed are ensuring that specialists on the front line are getting more done than they ever could before. Our work with companies including Lockheed Martin, Raytheon and Microsoft is uncovering the incredible potential of AR, from providing on-the-job support on the manufacturing floor, to enabling the success of critical operations in the field, to keeping complex vehicles running and heavy equipment online.

It can be daunting to bring AR into your business—where do you start? Who do you talk to? Here’s what you can do today: 

  1. Get in touch with the AREA. NGRAIN is a charter member of the AREA because we are committed to reducing the barriers to AR adoption across the enterprise, and the AREA provides a natural and mutually supportive network that can help you and your organization get on the path to learning more about AR and the value it can bring to your business.
  2. Download some software and give AR a try. It's easy to get started with industrial AR. There is a growing number of off-the-shelf software applications you can use to quickly experience AR on a mobile device, tablet, or even wearable smart glasses such as the Epson Moverio. NGRAIN offers a commercial product called Vergence, which provides a straightforward but feature-rich introduction to creating your own AR experiences without having to write any code or invest in the development of custom software.
  3. Attend an AR conference. One of the best ways to evaluate the application of industrial AR within your organization is to learn from the experiences of others who are actively involved in the deployment of these technologies; most are eager to share their knowledge about this emerging space. Consequently, the growing interest in enterprise AR has resulted in a host of conferences, trade shows, and events that focus on the use of AR in industrial markets. Though not focusing exclusively on enterprise AR, the annual Augmented World Expo (AWE), now in its sixth year, draws thousands of attendees who share an interest in bringing AR to market and is a great way to get connected with the broader AR community. For those interested in an event focused entirely on enterprise and industrial AR, join the AREA members at the ARise '15 event at the Advanced Manufacturing Research Center in Sheffield, UK on July 1, 2015. A quick online search will also yield a significant number of results for events segmented by industry and interest.

Now is a great time to be exploring the potential of AR for your business. The technology is accessible, and a thriving community and active developer ecosystem are quickly emerging. Most importantly, the barrier to entry is declining; getting started is as easy as reaching out and having a conversation. We look forward to being in touch!




AR: A Natural Fit for Plant Floor Challenges

Much has been made recently of how Augmented Reality will soon merge our digital lives with our real ones, bestowing new powers on our social and working existence. Recent advances in technology have lulled us into believing that AR in the workplace is just around the corner. Many of us have looked forward to high-tech glasses, watches and other wearables finally delivering on that promise, inspired by viral YouTube videos showing workers wearing glasses with full field-of-view AR displays. However, this has yet to materialize.

The recent withdrawal of Google Glass and the general failure of wearables to meet expectations have influenced public perception of enterprise AR as falling rapidly from Gartner, Inc.’s Peak of Inflated Expectations into the Trough of Disillusionment. 

AR Is a Natural Fit for Solving Plant Floor Challenges

Gartner has pigeonholed AR technology into the digital marketing niche. This is possibly the result of highly visible and successful AR brand engagement campaigns, such as for sports teams, automobile companies and even business-to-business marketing. The Augmented Reality feature provided in the IKEA catalog companion application demonstrates how AR can be useful as well as drive consumer brand engagement. These campaigns and useful applications primarily address the outward-facing communication needs of the brands and are measured in terms of greater sales or customer loyalty.

Turning towards business operations, those of us involved in the manufacturing and automation field see AR as a way to address many plant floor challenges. Here are a few examples of common plant floor issues, which we believe are a natural fit for enhancement with mobile AR.

Plant Floor Problem / How AR Helps

1. Problem: When following a procedure, workers often spend time trying to identify the part of the machine or adjustment point that requires their attention.
   How AR helps: Visually identify and direct workers to the specific part or adjustment point that requires their attention.

2. Problem: Workers performing an unfamiliar or infrequent task spend time searching in manuals for procedures that match the task or asking for help from co-workers.
   How AR helps: Provide contextual visual instructions to show workers how to correctly perform unfamiliar tasks.

3. Problem: Workers spend time searching for data and resources that uniquely identify the equipment on which they are working.
   How AR helps: Identify equipment or processes and visually display relevant data and resources.

4. Problem: Technical resources required to evaluate and efficiently respond to unplanned downtime events are not available in real time.
   How AR helps: Provide visual communication tools that give users and remote resources a common, real-time or "snapshot" view of the equipment or process.

Table: Potential AR solutions to common plant floor problems

It's very tempting for an engineering team to develop an eye-catching AR application for a demonstration and to suggest that the technology also easily addresses more complex problems. These solutions are usually implemented by experts using software toolkits rather than commercial off-the-shelf software, and the final implementations delivered to the customer are usually highly customized. In these cases, ROI is difficult to define. iQagent's approach to solving plant floor problems with AR is to focus first on the problems to be solved, and then to define a good mobile AR solution to address the challenge.

Interventions are Collaborative Endeavors

One challenge we address is #4 from the table above: technical resources required to evaluate and efficiently respond to unplanned downtime events are not available in real time.

Production downtime costs are often measured in hundreds or thousands of dollars per minute. When a production line goes down, the operator must communicate with remote technical resources in order to get production running again quickly. One factor preventing effective communication is the education gap between the operator and engineer; operators aren’t engineers, and engineers aren’t used to operating the equipment through all phases of the process. Each has specialized technical and procedural knowledge that can contribute to a solution, but traditional channels such as phone, text or e-mail aren’t perfect tools for collaboration. The operator must decide which details are important to convey to the engineer, and the engineer must find the right questions to ask in order to get a clear picture of the problem. Due to the prohibitive cost of production downtime, this effort has a very small window in which to be effective. At some point, the decision must be made to get the human resource on-site in order to return the line to normal production.

We then considered why engineers and operators are more efficient in resolving production downtime issues when collaborating in person. The operator can directly show the problem to the engineer, referring to live process values and performance indicators relevant to the process from the local automation system. The engineer can analyze the problem in real time, asking the right questions of the operator in order to resolve the problem.

A successful mobile solution duplicates the benefits of in-person collaboration, allowing each participant to effectively contribute their specialized knowledge to a common view of the process, including live data and operational details from the automation systems that are relevant to the problem.

This particular solution is a great fit for AR-enhanced software on a mobile device.

Augmented Reality with iQagent

iQagent uses the device's video camera to identify a unique piece of equipment by scanning a QR code. The software overlays relevant process values and associated data on the camera's displayed video feed, which can also be recorded as a snapshot or video. This provides the common view of the process that collaboration requires. Operators can also annotate directly on the images or video, making notes and drawing attention to areas of interest for the engineer to analyze, in effect "showing" the problem. When finished, the user saves and e-mails the video to the remote technician, who now has a much more complete picture of the problem and, in many cases, can resolve the issues more efficiently.
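
A rough sketch of that scan-and-overlay flow (an illustration, not iQagent's implementation) could be written with OpenCV as follows; the process-value lookup is a stand-in for a query against the plant's automation system or data historian:

import cv2

def lookup_process_values(equipment_id):
    # Stand-in for a query against the automation system or data historian.
    return {"Line speed": "42 m/min", "Zone 1 temp": "181 C"}

def annotate_frame(frame):
    detector = cv2.QRCodeDetector()
    equipment_id, points, _ = detector.detectAndDecode(frame)
    if equipment_id:  # a QR code was found and decoded
        y = 30
        cv2.putText(frame, f"Equipment: {equipment_id}", (10, y),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
        for name, value in lookup_process_values(equipment_id).items():
            y += 30
            cv2.putText(frame, f"{name}: {value}", (10, y),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    return frame

The annotated frame can then be saved as a snapshot or recorded and shared, which is where the collaborative value described above comes from.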

We feel iQagent is a great solution to some common plant floor challenges. But having a great product isn’t an end but a beginning. To make any product a success, you have to get it in front of users who need it, and you must support and continually improve the product. This is why we joined the AR for Enterprise Alliance. The AREA enables us to collaborate with other like-minded AR solution providers, end users and customers. Through education, research and collaboration, we will help to move AR out of the Trough of Disillusionment, up the Slope of Enlightenment and onto the Plateau of Productivity.




Why Augmented Reality and Collaboration Make for a Safer and Better World

Augmented Reality (AR)-enabled systems can show a mechanic how to repair an engine, or perhaps in the future will guide an inexperienced surgeon through a delicate heart operation. In my opinion, it's when AR is combined with human collaboration that the magic begins. AR will soon work its way into a variety of applications that are bound to improve our lives, but more importantly, I am convinced it will become a catalyst for greater human understanding and world peace.

Augmented Reality Can Bring Us Closer

Everyone's heart raced when Jake Sully, the wheelchair-bound Marine in the movie Avatar, first connected his thoughts to those of his avatar, walked and then ran. His mission was to infiltrate the society of the natives, learn their customs and, having gathered that information, help destroy their world. Of course, we all know how the story ends… It's difficult to do harm to those we know. The first step in Hitler's campaign to eliminate those he considered unworthy was to convince his followers that the others were less than human. In fact, this is a universal technique in incitement to violence against another group. It is only when we finally get to know someone that, even if we don't agree, we can begin to understand and care about them.

Sharing Experiences

AR allows a user to see an enhanced view of reality, placing graphic images and 3D models over the real background. This will be great for building and repairing things by ourselves, but when we combine that capability with modern telecommunications, remote users will be able to participate in those processes with local users in real time, appearing to the wearer of the glasses as if standing alongside them. We won't just see our grandkids on a Skype screen; we will take them with us on new adventures around the world or in our backyard. An astronaut in space will literally see the hand of the equipment specialist on Earth pointing to the board to be replaced as they speak.

Gutenberg changed the world because the printed page could easily display the manuals that apprentices used for learning the trades that freed them from the fields. Radio and then television added sound, motion and recently 3D to the flood of information. Telecommunications has brought the cost of distributing it to practically zero. Now AR combines these capabilities and creates an infinite number of parallel worlds that you may create and visit, as well as acquire skills in from one-on-one instruction. It’s the closest thing to teleportation this side of Star Trek.

Non-verbal communication is said to account for between 55 and 97% of communication between people, depending on the study. AR will convey practically the same information because it enables "belly to belly" proximity. You will be able to virtually sit in a conference room and interact with other remote participants, watch a theater performance in your living room or tag along with a friend on an exotic trip to a foreign land. That friend will be able to see you, too.

New Ways of Displaying Information

Talk about disruptive. This is downright neutron bomb material. Why do you need a laptop or tablet when you can see the screen suspended in mid-air, with the glasses projecting a keyboard onto any surface? Gone are the days of large-screen TVs, when everyone sat stationary watching the game from the same angle; why wouldn't they prefer it in perfect 3D? Forget glass cockpits in airplanes; why not have all the instruments projected in your field of view? How about infrared images of deer or pedestrians in fog or at night shown on the windshield of your car, in time to avoid hitting them?

Augmented Reality and Collaboration

But, again, collaboration use cases will take the cake. The level of empathetic bonding that occurs when you're in the room with another person will make current social messaging seem like sending smoke signals. Professionals in other countries will get to know you virtually and work with you on projects, as I am proposing with the Talent Swarm platform. Along with such proximity-enabled work will come a better understanding of other countries and cultures.

Collaboration is key, but it can't happen at scale if everyone needs to buy and use exactly the same hardware and software. Collaboration across networks and companies as diverse as the places where humans live and work builds upon deep interoperability. Interoperability with existing and future systems will require a globally agreed-upon set of open standards. We will work within the AREA to strongly advocate for interoperable systems and push for global standards together with other AREA members. Once we have collaborative AR platforms, the benefits of this technology will rapidly serve all people of the world. Becoming an AREA founding sponsor member is, for Talent Swarm, not only common sense but also a stake in the ground, demonstrating our leadership for a more productive and peaceful world. We want to avoid embarking on another wasteful battle such as VHS vs. Beta, and to prevent a single company from reducing the opportunities or locking others out. Christine Perey, Executive Director of the AREA, refers to it as our mandate: to ensure that the ecosystem of AR component and solution providers is in harmony with customers' needs, and able to deliver the diversity and innovation upon which economic success is based.

Path to the Future

With a concerted group goal centered on the advancement of AR, and with many technological developments both in the works and being introduced at an increasingly fast pace, we will one day look back at 2015 and ask: how did we ever get along without Augmented Reality?




Christine Perey to Speak at the EPRI Information, Communication and Cyber Security (ICCS) European Engagement Summit

AREA member EPRI will be hosting its first European Engagement Summit on April 28 and 29 in Madrid, Spain. Executives from European utility companies will be in attendance at this invite-only conference to learn about a variety of topics in cyber security and information technology.

Presentations will cover the following utility industry topics:

  • Augmented Reality in utility operations
  • Risk management in the electric sector
  • Security posture assessment
  • Network and system management
  • Threat management
  • Advanced metering systems
  • Communications in demand response and distributed energy resources

AREA executive director Christine Perey will be presenting on the state of the art of Augmented Reality in 2015, and what is possible with the available technology. She will examine the macroscopic trends that propel advancements in Augmented Reality today, including how new components introduced in the past six months can contribute to the most powerful solutions ever developed. She will also address the internal and external barriers that remain to be overcome before AR can reach its potential impact in utilities and in all industries.

The conference will be held at the InterContinental Madrid and will be followed by a tour of the Gas Natural Fenosa Control Center.




Events are Beginning to Focus on Enterprise Augmented Reality

Look out! Your travel schedule is already overloaded but there are new events where the topic of enterprise Augmented Reality is front and center. There are also events that in previous years have not treated the topic at all but are now sitting up and paying attention. How do you choose where to put your resources?

Tough Choices

In the long run, Augmented Reality will be part of all industries. In the AREA, our mandate is to make sure that this prediction will come true and the projects of our member organizations, regardless of their industry, are successful and bear fruit.

Customers—the buyers and those who will deploy Augmented Reality in their workflows—are already “in” their industry. They know their suppliers and customers. They also have some use cases where they’re thinking AR will be valuable.

Providers of technologies to be deployed in industrial and enterprise environments, on the other hand, are scattered. Under the guise of trying to be responsive, many companies offering AR-enabling systems fail to become experts in the problems and opportunities of a specific industry.

Automotive. Aviation. Construction. Engineering. Aerospace. Defense. Energy. Utilities. AR introduction and adoption won’t happen at the same speed in all industries. Automotive is probably ahead of all the others, but will it hold that lead? For how long?

Which industries are going to embrace AR in a really serious way soonest?


If you're a provider of AR-enabling systems and have not already developed a sales force aligned with an industry or a few related industries, it's time to do the soul searching you've been putting off. This doesn't mean just looking at the total size of the industry or flipping through your contacts on LinkedIn to find someone who's prepared to introduce you to the five top CTOs in an industry.

Scanning   

Take the time in the first half of 2015 to invest your resources in better understanding industries and domains that you already know, as well as those outside your comfort zone. One way to do this is to attend conferences that have recently added an AR topic to the agenda. For example, the AREA's calendar of events includes the World Air Traffic Management Congress in Madrid, the Bosch ConnectedWorld conference in Berlin and the CeBIT exhibition in Hannover.

We’re going to meet people who are gathering in these places. They’ve probably got some major questions about what to expect, some fuzzy areas between AR and VR, and they’re looking for experts. If you are there as well, they may very well find you.

Some companies with a new product to showcase, like AREA member DAQRI, are organizing their own events and inviting their prospects to visit them. This strategy has real benefits because the host can control the message and choose those to whom the latest developments are revealed. On the other hand, it is unlikely to reach people who aren't already on your list of prospects. The vendor-hosted event also has the drawback that the conversation isn't focused on an industry's pressing needs and greatest opportunities. Finally, the prospects you've gathered also sense that they are not sampling the full range of options. By attending one vendor's event, like Metaio's InsideAR or the DAQRI 4D Expo, customers know that they are only getting what their host wants to put on the menu.


Balancing Act

So, before you rush out to the airport to yet another conference or tradeshow, consider the year as a whole and develop a balanced approach: a mixture of horizontal “AR-specific” events and those domain- or industry-specific events. Also consider mixing small events with very big ones. They will yield different types of relationships and your business is likely to benefit from having both.




Augmented Reality at CES 2015 is Better and Bigger Than Ever

There’s something for everyone at CES. Do you need a way to store your earbuds so the cables aren’t tangled? What about printing icing on a cake?

Roger Kay, a technology analyst who writes for Forbes, recommends breaking up the event into ten parts. For him it's not about the horrendous taxi lines or other logistical issues of dealing with so many people in a relatively small area (I walk everywhere I go, and leisurely covered twenty-four miles on the flat Las Vegas ground in four days; there are also buses to and from the airport). Rather, Kay wants his topics served up in concentrated exhibition floor zones.

Like Kay, I find that many of CES' themes lie outside my areas of interest, but despite the headaches caused by the crowds, having the option to see and sample developments in a variety of fields is one of the reasons I return each year.

Finding what I need to see isn’t a matter I treat lightly. A month before heading to Las Vegas I begin planning my assault because the CEA’s web site is horrendously inefficient and their new mobile app pathetic. Using brute force, I locate all the providers of head-mounted personal displays, the providers of hardware that is or could be AR enabling, and the “pure” AR firms with whom I already have relationships. I also plan a long, slow visit through the innovation zones, such as Eureka Park. I know another half day will be dedicated to Intel, Samsung, Sony, LG Electronics and Qualcomm. Then I search for outliers by name.

A few days prior to the event I begin following the news feeds on social media and technology trade blogs. While doing so, I also scan the headlines for surprises.

Highlights of my CES 2015

For reasons that don’t have to do with Google Glass, vendors are making good progress in the personal display space.  The first reason is that more companies are experimenting with new combinations of familiar technology components, particularly with hardware. Optinvent is recombining their optical technology with a set of headphones. Seebright is adding a remote control to your smartphone. Technical Illusions is combining reflector technology and projectors with new optics. It’s like gene mixing to produce new capabilities and life forms.

Vuzix demonstrated the new waveguide technology in their optical see-through personal displays for Augmented Reality.

That's not to say that designs for the "traditional" optical see-through display form factor are standing still. New investment, such as Vuzix received from Intel, is a major accelerator. ODG's sale of patents to Microsoft in 2014 produced sufficient revenue for the company to develop a new model of its device targeting consumers.

The second reason for the significant advances in the personal display product category is the evolution of components. I saw firsthand in many exhibits the "familiar" components these displays must include, such as motion and other sensors, eye tracking kits and optics. All are rapidly improving. For these components, "improving" means smaller packaging and lower power consumption.

It was good to focus—if only briefly—on the familiar faces of AREA members such as APX Labs and NGRAIN, who were participating in the Epson developer ecosystem booth, and to see the latest Epson products, which seem to be increasingly popular in enterprise. I found APX again in the Sony SmartEyewear zone, where I was able to try on the Sony prototype. I also caught up with executives and saw impressive new AR demonstrations by companies whom I don't often see attending my events. If you're interested, I encourage you to look up Meta, InfinityAR, Occipital, ScopeAR, Technical Illusions, LYTE, XOeye Technologies, FOVE, Jins Company, Elvision Technologies, Avegant and Augumenta. I'm sorry if I neglected to include others that I saw at CES.

Although they were around and showing AR or AR-enabling technologies, and we may have crossed paths unknowingly, I didn’t have a chance to meet with Metaio, Lumus, Lemoptix or Leap Motion.

I spent more time than expected visiting and observing the booths of Virtual Reality headset providers who were at CES. There were several exhibition zones dedicated to Oculus VR, with the new Crescent Bay device. The lines waiting to try on the new Razer OSVR (Open Source VR) system were stunningly long. It amazes me that a small company like Sulon could afford such a huge footprint in South Hall to set up private briefing rooms for its Cortex display for AR and VR, and yet still exhibit openly outside.

Elsewhere there were hordes swarming at the Samsung Gear VR and the Sony Project Morpheus zones. What good are all these headsets without content? I stopped in at JauntVR, which seems to be getting a lot of attention these days. I’m sure there were dozens more showing VR development software, but VR is peripheral to my focus.

I was impressed by the NVIDIA booth's focus on Advanced Driver Assistance Systems this year, demonstrating real-time processing of six video feeds simultaneously on the Tegra K1 Visual Computing Module. There were also excellent demonstrations of enterprise use of AR in the Hewlett Packard exhibit. Intel dedicated a very significant portion of its footprint to RealSense, and, similarly, the Vuforia zone in Qualcomm's booth had expanded compared to 2014. The IEEE Standards Association offered an AR demonstration to engage people with its work.

Automotive companies were also showing Augmented Reality. I saw examples in the BMW pavilion, in Daimler’s area, the Bosch booth, and Hyundai’s prototype cars.

At the other end of the spectrum there were many exciting new products in the pico projector category. MicroVision and Celluon were both showing HD pico projectors for use with smartphones; such technology will certainly be considered for projection AR in enterprise. ZTE and Texas Instruments also introduced their latest pico projector models at CES 2015.

Digging in Deeper

Although no longer in Las Vegas, and despite my careful advance planning, I continued with my CES homework for at least a week. For example, I watched the archive of the "New Realities" panel and played back other videos covering AR and VR at CES on CNET, Engadget, Tested and the Financial Times.

The IEEE published an analysis of AR at CES in Spectrum that reaches the same conclusion I drew: the "C" in CES is for Consumer, but a lot of consumer technology is going into corporate IT.

I hope I will have digested all that I gathered at CES 2015 before I begin preparations for 2016.




CES 2015 Attracts AREA Members and Their Customers

Few trade shows and exhibitions compete with the annual Consumer Electronics Show in sheer magnitude. Over 170,000 people, including representatives of six AREA member companies, gathered between January 6 and 9, 2015, to see and touch the latest real (as well as imaginary) high-technology products firsthand.

Main Attraction

While CES is large in size, the prepared visitor can find and focus on the latest in specific segments, for example wearable technology, fitness, optics, cameras, robotics and automotive gadgets. For AREA members, the primary reason to attend CES is to try on — and, for Augmented Reality providers, to help potential customers explore — options across the entire gamut of optical see-through hands-free displays for Augmented Reality before ordering prototypes and developer kits.

The optical see-through hands-free display product category grew very rapidly in 2014. For a customer, gathering information about products from web pages, brochures and media reports is insufficient. Research must include actually trying on the hardware and speaking with product team members. If you're not the largest automotive or airplane company in the world, these devices and teams do not come to your facilities.

After actually donning more than one optical see-through display it becomes clear that there’s a huge spectrum of experiences possible. Even one model of hardware can “feel” very different and support different use cases. Going from one station to the next in a pavilion such as those organized by Sony and Epson, and using the same device running different software offers a glimpse into the possible range.

Providing this hands-on experience to a wide variety of potential customers in a single place is what makes it important for software developers targeting these platforms and devices to participate and attend CES as well.

AREA members APX Labs and NGRAIN demonstrated in partner booths. Other companies manning stations nearby included Scope, XOEye Technologies and Metaio. Elsewhere at CES, ODG, Optinvent, Atheer Labs, Seebright, InfinityAR, FOVE, and Vuzix demonstrated optical and video see-through AR displays in their own dedicated stands. DAQRI and Technical Illusions, maker of CastAR, hosted guests and conducted demonstrations of their pre-release hardware and software in private suites.

Component Comparisons

For other CES visitors (including those who are considering or already embarking on integration of their own hardware-based solutions) meeting with providers of enabling technologies such as chip sets, power storage and transmission components, sensors and sensor hubs can help define the spectrum of choices before making recommendations or purchasing decisions.

For EPRI, an AREA member attending CES 2015, the challenge was to identify possible wearable computing platforms for field agents. Some field agents will be able to use wrist bands. Others may find efficiencies with 3D scanners and 3D printers.

There was no shortage of booths to visit and companies offering new technology to discover in domains adjacent to Augmented Reality at CES 2015.

AREA Mixer

Las Vegas is the city that never sleeps. After CES exhibition hours, business continues as people gather in the streets and in all the popular entertainment venues. The AREA hosted a reception for members and companies focusing on enterprise Augmented Reality at the Marriott Las Vegas, a short distance from the Las Vegas Convention Center.

All the AREA members at CES had representatives on hand to meet and greet guests, including executives and engineers from OPTIS, SAP, Bosch, SmartReality, ZigBee Alliance, Vuzix and Epson. The first AREA case studies were shared with visitors, as were AREA plans for growth.

As this organization grows it’s not difficult to imagine having a greater presence at CES and a large group of companies participating in a CES enterprise AR mixer in the future.




Terminology Monsters Alive and Well

Most enterprise IT managers use dozens of acronyms and a large specialized vocabulary for communicating about their projects. The mobile industry is particularly rich with layers of terminology. Last year mobile IT professionals were studying and defining policies for BYOD. Now wearable technology is at the top of every mobile technology portal.  

Confusion in Communication

Ironically, Augmented Reality promises to deliver improved communication to the user but is plagued with a potential for confusion in terminology. The glossaries—yes, there are several—have nearly 40 frequently misused terms (each) with only a few overlapping terms. An analysis of the differences between the AR Community Glossary v 2.2 and the glossary in the Mixed and Augmented Reality Reference Model has been performed by Greg Babb, the AREA editor. This analysis will be discussed with experts during the virtual meeting of the AR Community Glossary Task Force on November 24, 2014.

Who Needs a Glossary?

Simply refer to Milgram’s Continuum. There is a virtual world and a real world. The space between these two extremes is “Mixed Reality.”


It sounds and looks like a simple concept but debate about the segments within Mixed Reality can consume entire meetings.


Is the 1st & 10 line in football Augmented Reality? No, it isn't, according to the debate among experts of the AR Community. And when the details of Mixed Reality need to be spelled out and implemented in a distributed computing architecture by many different people, the definitions are insufficient and the concepts blend together. This is an impediment to enterprise AR introduction and adoption.

Diminished Potential

Speaking of blending together, in early November Hewlett Packard announced spectacular plans to bring "Blended Reality" to new personal computing products in 2015. The sprout PC replaces a keyboard and mouse with a touchscreen, a scanner and other features that let users take actual objects and easily "put" them into a PC.

Seeing a connection with Augmented Reality, the author of a Business Insider article tried to define Virtual and Augmented Reality. "That's what you get when you put on Google Glass and it projects Google-y facts or images on the world. Or you run an app like Star Chart on your smartphone, hold it up to the sky and it superimposes the constellations on your view of the sky," wrote Julie Bort to hundreds of thousands of readers.

Forget the fact that Google Glass does not really provide Augmented Reality and ask the executive who is running a multi-billion dollar business if they want an app to project constellations on their warehouse or factory ceiling. Augmented Reality’s potential is not only unclear; it actually gets diminished by comparisons of this nature (nevertheless, let’s not confuse this with “diminished reality,” OK?).

The fact that HP is beginning to pay attention to Blended Reality, Mixed Reality or Augmented Reality should not come as a surprise, given the integration of the Aurasma group into the company and the variety of services that could be provided on HP servers for managing and delivering AR experiences. But the new sprout PC looks awfully similar in some ways to demonstrations of Intel's RealSense. If these similarities are deep, then perhaps it is time for Intel and HP to invest in educating their target audiences about these new technologies. And a consistent vocabulary would come in handy.

To make sure that people do not jump to the conclusion that Blended Reality is something invented in 2014 by HP, the Business Insider article points out that Blended Reality was first introduced in 2009 by the esteemed Institute for the Future (IFTF). “The IFTF envisioned it as a sort of tech-enabled sixth sense, which will be worn or maybe even implanted into our bodies and interface with our computers,” concludes the Business Insider piece.

If that is how HP is using the term, there are even bigger problems than the definition of Augmented Reality terminology.

Mixed and Augmented Reality Reference Model

One of the solutions for this obstacle to Augmented Reality deployment is the Mixed and Augmented Reality Reference Model. The candidate specification is available for review and will be voted on within ISO/IEC JTC1 SC 29 WG 11 (MPEG) in 2015.

To learn more about the Mixed and Augmented Reality Reference Model, visit this AREA blog post.




Future for Eyewear is Bright (If Enterprise Requirements Can be Met)

The topic of hands-free displays or eyewear for Augmented Reality was popular at InsideAR 2014. It was discussed under many names (e.g., smart glasses, eyewear and HMD, to mention a few) and shown at many of the exhibition stands. On the stage, there were no fewer than six presentations focused entirely on hands-free displays. Even those speakers not focused on displays mentioned the opportunities such displays would offer once customer requirements were met.

New Report Forecasts Four Waves

During his InsideAR remarks, Ori Inbar of AugmentedReality.org described the latest addition to an already significant body of market research on the topic of hands-free AR displays. As Ori mentioned in his preface, the other reports to date do not agree on the size of the market, the terminology or how to seize the opportunities these devices will offer. It is not clear if or how this report compares with them or addresses that uncertainty.

Entitled simply “Smart Glasses Report,” Ori’s new report compiles findings from tests and interviews conducted with companies providing products and components. The scope does not include devices designed for use with Virtual Reality content.

The purpose of the report is to answer two burning questions: Will smart glasses ever come into their own? And when will this happen? To the first question, the answer is that those contributing to the report felt it was inevitable. As to the second question, it depends.


Ori predicts there will be four waves of technology:

  1. Technology enthusiasts: 10 models of glasses will sell 1 million units within the next year or two
  2. Initial winners will emerge and sell a total of 10 million units by 2016
  3. The early majority market will be composed of fewer competitors and will sell 50-100 million and reach critical mass by 2018
  4. Mainstream adoption will occur between 2019 and 2023 with the shipment of one billion units


Business opportunities will depend on the wave and the type of company. Ori predicts that by 2020 there will be one 800-pound gorilla and a few challengers. He also predicts that prior to, and even during, 2016, most sales of eyewear will be for enterprise use.

That depends on those devices meeting the requirements of the enterprise customers.

Enterprise Requirements for Head-mounted Displays

In his InsideAR 2014 speech, Dr. Werner Schreiber of Volkswagen provided a very detailed analysis of the requirements that head-mounted display vendors need to meet if they are to achieve traction in enterprise. To set the stage for his analysis, Schreiber began by saying that AR is not going to be popular in production processes until head-mounted displays meet certain requirements. In other words, the use of tablets and smartphones is just not convenient when the tasks people are doing require both hands.

One of the most important requirements described (in fact the first of at least 10) is power consumption: the devices will need to offer a battery life of at least 10 hours. Another requirement is field of view (FOV). In Schreiber's analysis, the FOV must be at least 130 degrees, or a moving FOV of at least 50 degrees.

Of course, support for corrective lenses is not negotiable, and neither is a design that minimizes wiring. If there must be wiring, it needs to include easy connectors both at the display and at any other power or computing devices that may be needed.
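
To make the checklist concrete, the hardware requirements reported above can be captured in a few lines of Python; the structure and field names are our own illustration, with the thresholds taken from Schreiber's remarks as summarized here.

from dataclasses import dataclass

@dataclass
class HmdSpec:
    battery_hours: float
    fov_degrees: float            # field of view, in degrees
    has_moving_fov: bool          # True if the FOV can be moved/steered
    corrective_lens_support: bool
    wired: bool
    easy_connectors: bool         # only relevant when wired

def meets_enterprise_requirements(spec):
    """Checklist based on the requirements described above."""
    fov_ok = spec.fov_degrees >= 130 or (spec.has_moving_fov and spec.fov_degrees >= 50)
    wiring_ok = (not spec.wired) or spec.easy_connectors
    return (spec.battery_hours >= 10
            and fov_ok
            and spec.corrective_lens_support
            and wiring_ok)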

If the hardware can be designed to meet the minimum requirements, there remain significant software challenges. Schreiber’s ideal platform would permit automatic:

  • Creation of computer data
  • Removal of unused details
  • Creation of workflow
  • Consideration of possible collisions
  • Selection of necessary application tools and materials
  • Publishing of user-generated annotations into the experience

That is a lot of requirements to meet before 2016.

Do you have a product or strategy that will address these needs? Let us know about your product or opinions on these requirements.