
When a Developer Needs to Author AR Experiences, Part 2

This post is a continuation of the topic introduced in another post on the AREA site.

Choose a Development Environment

Someday, the choice of an AR development environment will be as easy as choosing a CMS for the Web or an engineering software package for generating 3D models. Today, it’s a lot more complicated for AR developers.

Most apps that can present AR experiences are created with a game development environment such as Unity 3D. When the developer publishes an iOS, Windows 10 or Android app from Unity 3D, it is usually ready to load and runs using only local components (i.e., it contains the MAR Scene, all the media assets and the AR Execution Engine).
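To make “self-contained” concrete, here is a minimal TypeScript sketch of what such a bundle might hold: a scene description, the media assets it references, and a tiny “execution engine” step that resolves anchors to assets. The type names and fields are illustrative assumptions only, not Unity or MAR-standard APIs.

```typescript
// Hypothetical shape of a self-contained AR app bundle. Names are illustrative,
// not any vendor's API.

interface MediaAsset {
  id: string;
  kind: "model" | "image" | "audio" | "video";
  localPath: string;                      // packaged with the app, no network needed
}

interface SceneAnchor {
  trackableId: string;                    // e.g. an image target or object target
  assetId: string;                        // which asset to attach to this trackable
  offsetMetres: [number, number, number]; // position relative to the trackable
}

interface MarScene {
  anchors: SceneAnchor[];
  assets: MediaAsset[];
}

// A minimal "execution engine" step: resolve each anchor to the asset it should show.
function loadScene(scene: MarScene): Map<string, MediaAsset> {
  const assetsById = new Map<string, MediaAsset>();
  for (const asset of scene.assets) assetsById.set(asset.id, asset);

  const contentByTrackable = new Map<string, MediaAsset>();
  for (const anchor of scene.anchors) {
    const asset = assetsById.get(anchor.assetId);
    if (asset) contentByTrackable.set(anchor.trackableId, asset);
  }
  return contentByTrackable;
}
```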

Although there’s a substantial learning curve with Unity, the developer community and the systems that support it are very well developed. And once a developer has adopted Unity, they are not limited to creating only apps with AR features. The cost of the product for professional use is not insignificant, but many are able to justify the investment.

An alternative to using a game development environment and AR plugin is to choose a purpose-built AR authoring platform. This is appropriate if the project has requirements that can’t be met with Unity 3D.

Though they are not widely known, there are over 25 software engineering platforms designed specifically for authoring AR experiences.


Table 1. Commercially Available AR Authoring Software Publishers and Solutions (Source: PEREY Research & Consulting).

The table above lists the platforms I identified in early 2016 as part of a research project. Please contact me directly if you would like to obtain more information about the study and the most current list of solutions.

Many of the AR authoring systems are very intuitive (featuring drag-and-drop actions and widgets presented through a Web-based interface); however, most remain unproven and their respective developer communities are relatively small.

Some developers of AR experiences won’t have to learn an entirely new platform because a few engineering software publishers have extended their platforms designed for other purposes to include authoring AR experiences as part of their larger workflow.

Or Choose a Programming Language

Finally, developers can write an AR execution engine and the components of the AR experience into an app “from scratch” in the programming language of their choice.

To take advantage of a specific chip set or AR display and optimize AR experiences for the best possible performance, some developers write in lower-level compiled languages (e.g., C++) that produce native code the AR display device can run directly.

Many developers already using JavaScript can leverage their skills to access valuable resources such as WebGL, but an application written in this language alone can be slow and, depending on the platform, may fail to perform at the levels users expect.

To reduce some of the effort and build upon the work of other developers, there are open source JavaScript frameworks, such as Argon.js and AWE.js, for adding Augmented Reality content to Web applications.
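The APIs of those frameworks differ and aren’t reproduced here; the sketch below (TypeScript, with hypothetical Tracker and Renderer interfaces) only illustrates the per-frame pattern such frameworks wrap: draw the camera feed, query the current pose, and render the registered overlays with it.

```typescript
// A minimal AR render loop. Tracker and Renderer are assumed interfaces,
// not Argon.js or AWE.js APIs; requestAnimationFrame is the standard browser call.

type Mat4 = Float32Array;                // 4x4 column-major matrix

interface Tracker {
  // Latest world-to-camera transform, or null when tracking is lost.
  getViewMatrix(): Mat4 | null;
}

interface Renderer {
  drawVideoBackground(): void;                      // the live camera image
  drawOverlays(view: Mat4, projection: Mat4): void; // e.g. WebGL draw calls
}

function startArLoop(tracker: Tracker, renderer: Renderer, projection: Mat4): void {
  const frame = () => {
    renderer.drawVideoBackground();
    const view = tracker.getViewMatrix();
    if (view) {
      // Only draw augmentations while the pose is known, so content stays registered.
      renderer.drawOverlays(view, projection);
    }
    requestAnimationFrame(frame);
  };
  requestAnimationFrame(frame);
}
```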

Results Will Depend on the Developer’s Training and Experience

In my experience, it’s difficult to draw a straight line between the selection of an AR authoring tool or approach and the quality or richness of the final AR application. The sophistication and quality of the AR experience in an app is a function of both the tools chosen and the skills of the engineers. When those behind the scenes (a) ensure the digital content is well suited to delivery in AR mode; (b) choose the components that match requirements; and (c) design the interactions well, a project will have the best possible impact.

As with most things, the more experience the developer has with the components that the project requires, the better the outcomes will be. So, while the developer has the responsibility for choosing the best authoring tool, it is the AR project manager’s responsibility to choose the developer carefully.




Digitally Assisted Assembly at Factory 2050

In a previous article, we introduced the University of Sheffield’s Advanced Manufacturing Research Centre (AMRC), a member of the AREA that develops innovative techniques and processes for high-precision manufacturing. A subsidiary, the AMRC with Boeing, collaborates with a variety of research partners in areas such as informatics, automation, robotics and Augmented and Virtual Reality. Besides aerospace, the results of this research into manufacturing are used in the automotive, construction and other high-value industries.

Earlier this year, the AMRC opened the doors of its newest manufacturing facility, Factory 2050, a glass-walled reconfigurable factory in Sheffield Business Park. The facility investigates and showcases new technologies and processes relating to Industry 4.0, including projects to explore digitally assisted assembly technologies to fill a looming skills gap in the aerospace industry.

Augmented Reality in Digitally Assisted Assembly

The Digitally Assisted Assembly (DAA) project focuses on techniques for delivering work instructions to factory operators, including the use of optical projection AR and wearables. According to the AMRC’s digital manufacturing specialist, Chris Freeman, the project allows partner companies to experience visual work instructions through a number of delivery mediums. Research includes:

  • Optimizing AR tracking methods for reliably determining a part’s position and generating a frame of reference (a minimal sketch of this anchoring step follows the list).
  • Designing user experiences for work instructions that are projected or overlaid onto a part within the user’s field of view. These include instructions that guide users through tasks such as gluing sequences, fastener insertion, inspection, wiring looms, complex routines and more. The aim of this research is to reduce cognitive load and optimize the user experience across a variety of delivery modes (e.g., projection AR) and devices, from tablets to smart glasses.
  • Using location-based services to add contextualized task and environmental information in relation to the user’s position or progress within a task.
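As flagged in the first item above, here is a minimal sketch (TypeScript, with purely illustrative types and numbers) of what that frame of reference provides: once tracking reports the part’s pose, instruction content authored in the part’s own coordinates can be transformed into world coordinates for display.

```typescript
// Transform a point authored relative to a tracked part into world coordinates.
// The pose format and the example values are assumptions for illustration.

type Vec3 = [number, number, number];

interface PartPose {
  position: Vec3;                 // part origin in world coordinates, metres
  rotation: [Vec3, Vec3, Vec3];   // 3x3 rotation matrix, given as rows
}

function partPointToWorld(pose: PartPose, local: Vec3): Vec3 {
  const [r0, r1, r2] = pose.rotation;
  return [
    r0[0] * local[0] + r0[1] * local[1] + r0[2] * local[2] + pose.position[0],
    r1[0] * local[0] + r1[1] * local[1] + r1[2] * local[2] + pose.position[1],
    r2[0] * local[0] + r2[1] * local[1] + r2[2] * local[2] + pose.position[2],
  ];
}

// Example: a fastener location authored 5 cm along the part's x-axis.
const fastenerWorld = partPointToWorld(
  { position: [1.2, 0.0, 0.8], rotation: [[1, 0, 0], [0, 1, 0], [0, 0, 1]] },
  [0.05, 0, 0],
); // -> [1.25, 0.0, 0.8]
```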

With the technology still in its infancy, one of the aims of DAA is simply to demonstrate what can be achieved with it. Although smart glasses and wearables aren’t yet proven or certified for use in manufacturing, they are nevertheless being baselined for further research and possible future production use. The AMRC is currently following a strategy of first identifying the “low-hanging fruit” in the current state of hardware and software, which means that research associates want to find some of the most obvious and perhaps least expensive options up front.

Skype for HoloLens

Although the AMRC is studying a variety of smart glasses brands such as ODG and Vuzix, remote collaboration with Skype for HoloLens is an interesting application for meeting the needs of certification processes. This use case includes methods for lineside support and remote verification to complement or replace expensive quality management activities requiring the presence of a supervisor. It may even include assistance from remote colleagues when assembly or repair problems are encountered.

Freeman notes that though such use cases aren’t spectacularly advanced in terms of tracking in comparison with other scenarios such as overlaying geometric 3D models on objects being assembled, they are nevertheless disruptive of current manufacturing practices.

Projecting Work Instructions on Large-Volume Objects

Projected Augmented Reality, sometimes referred to as “spatial Augmented Reality,” features one or more optical projectors projecting a beam of light onto a specially designed work surface or even on the parts being assembled. Thus work instructions are displayed directly on surfaces to guide operators. The DAA is currently researching methods for effectively using projection AR in conjunction with both fixtures and robotic arms in work cells.

For example, an operator assembles aircraft parts with the assistance of robots that present a part at a better angle than if it were lying on a work surface. A robotic arm can swivel or position the part as needed by the operator, and projected AR guides the operator through a series of specific manufacturing procedures.
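Registering projected content means knowing where on the projector’s image a 3D point on the part should appear. The sketch below shows only a bare pinhole-projection step with made-up intrinsics; a real system also calibrates lens distortion and the projector’s pose relative to the fixture or robot, which is omitted here.

```typescript
// Map a 3D point (already expressed in the projector's coordinate frame)
// to a projector pixel with a pinhole model. Values are illustrative.

interface ProjectorIntrinsics {
  fx: number; fy: number;   // focal lengths in pixels
  cx: number; cy: number;   // principal point in pixels
}

function projectToPixel(
  k: ProjectorIntrinsics,
  point: [number, number, number],   // x, y, z in metres
): [number, number] | null {
  const [x, y, z] = point;
  if (z <= 0) return null;           // behind the projector, nothing to draw
  return [k.fx * (x / z) + k.cx, k.fy * (y / z) + k.cy];
}

// Example: a drill-hole marker 2 m in front of the projector, slightly off-axis.
const pixel = projectToPixel({ fx: 1400, fy: 1400, cx: 960, cy: 540 }, [0.1, -0.05, 2.0]);
// -> [1030, 505]
```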

Defining Success

As has been discussed in other industry contexts, return on investment can be challenging to define for any new technology, and AR is no exception. Typical ROI calculations seek to determine the amount of savings a project can bring and when that investment will pay off. In the case of AR, relevant questions include how to quantify the value of contextualized data and geometries for use in performance metrics.
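As a toy illustration of the payback question (all figures invented, not from the AMRC or any project), the calculation can be as simple as dividing the up-front investment by the net monthly savings:

```typescript
// How many months until cumulative net savings cover the up-front investment?
function paybackMonths(
  upFrontCost: number,        // e.g. hardware, software, integration
  monthlySavings: number,     // e.g. reduced rework and inspection time
  monthlyRunningCost: number, // e.g. licences, support
): number {
  const net = monthlySavings - monthlyRunningCost;
  if (net <= 0) return Infinity;      // the project never pays for itself
  return Math.ceil(upFrontCost / net);
}

// Example with invented numbers: $120k investment, $15k/month savings,
// $3k/month running costs -> pays back in 10 months.
const months = paybackMonths(120_000, 15_000, 3_000);
```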

Further research into AR will eventually uncover such answers, but in the near term, human factors and ergonomic studies can also quantify the technology’s effectiveness. For example, the AMRC is currently conducting AR-related training scenarios to determine a variety of metrics such as memory retention and AR’s overall effectiveness, as well as usability and operator response.

Beyond Aerospace

Although research being conducted at Factory 2050 aims to advance the state of the art in aerospace manufacturing, many of the techniques and procedures derived by DAA and other projects will eventually be used in other industries, such as energy and construction. For example, assembly techniques for large-volume aerospace parts can also be applied to assembling prefabricated homes at a factory as part of modular building manufacture. Having only recently opened its doors, Factory 2050 looks set to shape both present and future manufacturing in multiple domains for many years to come.




Calculating ROI for AR Investments: One Approach

In a field as young as AR, organizations seeking to justify investments have had little historical data available to help calculate ROI. The team at AREA member Catchoom has addressed this challenge by putting together a white paper that provides a step-by-step means of calculating ROI for its CraftAR image recognition software, based on an actual Catchoom customer in the healthcare industry.

Download the white paper from Catchoom to learn more.




New EPRI Report Offers Insights for Wearable AR Display Customers

Innovation in wearable technology continues to accelerate. Smart watch vendors are making so many announcements that there are portals dedicated to helping customers sort through the details. There is also a portal to help customers compare the features of wearable displays for AR.

And new wearable segments are being defined. For example, Snap recently introduced its $130 Spectacles.

Is this all good?

Thinly veiled behind the shiny new products is a vicious cycle.

The continual stream of announcements confirms for readers of this blog that the wearable AR display segment is still immature. This means that customers with limited budgets seeking to select the best hands-free AR display for their projects in 2016 are likely to be disappointed when an update or new model appears, making the model they just brought in-house out of date. Risk-averse organizations may put their resources into another product category.

On the other side of this conceptual coin, the companies developing components and building integrated solutions for wearable AR must continue to invest heavily in new platforms. These investments are producing results — but without clear customer requirements, the “sweet spot” for which the products should aim is elusive. And when customers lack clear requirements, differentiating the latest offerings while avoiding hype is a continual challenge.

Breaking the cycle with specific requirements

When customers are able to prioritize their needs and provide specific product requirements and budgets, there’s hope of breaking this cycle.

The Electric Power Research Institute (EPRI) and PEREY Research & Consulting, both AREA Founding Sponsor members, have collaborated on the preparation of a new report entitled Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays.

Targeting the buyers of wearable technology for use when performing AR-assisted tasks in utilities (and by extension, in other enterprise and industrial environments), the report seeks to demystify the key product features that can become differentiators for wearable AR solutions.

Based on these differentiators, the first multi-feature wearable AR display classification system emerges.


Source: Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays. EPRI, Palo Alto, CA: 2016. 3002009258.

The report also discusses challenges to widespread wearable AR display adoption in technology, user experience, financial, and regulatory/policy domains.

Descriptions of a few “lighthouse” projects in utilities companies, logistics, manufacturing, and field service provide readers valuable insight into how early adopters are making the best of what is currently available.

This report is available for download at no charge as part of the EPRI Program on Technology Innovation.

If you have comments or feedback on the report, please do not hesitate to address them to the authors, Christine Perey and John Simmins.




How Optical Character Recognition Makes Augmented Reality Work Better

Today, companies in many industries seek to develop AR and VR applications for their needs, with the range of existing Augmented Reality solutions extending from gimmicky marketing apps to B2B software. Helping production companies train their workers on the job by augmenting service steps onto broken machines is one such solution.

Augmented Reality could assist designers or architects to see a product while it is still in development. It could facilitate a marketing and sales process, because customers can already “try on” a product from a digital catalog. Or it could assist warehouse systems so that users get support in the picking and sorting process.

The list of opportunities is endless and new use cases are constantly arising. The whole point of using AR is to make processes easier and faster. While Augmented Reality and devices like smart glasses at first seemed far too futuristic, new use cases are making them increasingly suitable for everyday use in the workplace.

Recognizing Objects and Characters

Augmented Reality is based on a vital capability: object recognition. For a human being, recognizing a multitude of different objects is not a challenge; even objects that are partially obstructed from view can still be identified. For machines and devices, however, this remains a challenge, and for Augmented Reality it is crucial.

A smartphone or smart glasses can’t display augmented overlays without recognizing the object first. For correct augmentation, the device has to be aware of its surroundings and adapt its display in real time as the situation and the camera’s viewing angle change. Augmented Reality applications use object detection and recognition to determine the relevant information to add to the display. They also use object tracking technologies to follow an object’s movements continuously rather than re-detecting it in every frame. That way the object remains in the frame of reference even as the device moves around.
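A minimal sketch of that detect-then-track pattern is below (TypeScript; Detector and ObjectTracker are hypothetical interfaces, not a specific vendor’s API): detection is the expensive step, so it runs only until the object is found, after which a cheaper frame-to-frame tracker keeps the pose up to date and detection resumes only if tracking is lost.

```typescript
// Detect once, then track; fall back to detection when tracking is lost.

interface Pose { matrix: Float32Array; }                    // 4x4 object-to-camera transform
interface Frame { image: ImageData; timestampMs: number; }

interface Detector { detect(frame: Frame): Pose | null; }   // slow, full-image search
interface ObjectTracker {
  start(frame: Frame, initial: Pose): void;
  update(frame: Frame): Pose | null;                        // null when tracking is lost
}

function processFrame(
  frame: Frame,
  detector: Detector,
  tracker: ObjectTracker,
  state: { tracking: boolean },
): Pose | null {
  if (!state.tracking) {
    const pose = detector.detect(frame);
    if (pose) {
      tracker.start(frame, pose);
      state.tracking = true;
    }
    return pose;
  }
  const pose = tracker.update(frame);
  if (!pose) state.tracking = false;    // re-detect on the next frame
  return pose;
}
```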

Character recognition is also crucial for a device’s understanding of the environment, as it not only needs to recognize objects but, depending on the use case, may also have to “read” them. This provides an even better sense of which types of information are important to process.

Featured image: optically processed characters (Anyline).

Optical Character Recognition

Optical Character Recognition (OCR) deals with the problem of recognizing optically processed characters, such as those in the featured image above. Both handwritten and printed characters may be recognized and converted into computer readable text. Any kind of serial number or code consisting of numbers and letters can be transformed into digital output. Put in a very simplified way, the image taken will be preprocessed and the characters extracted and recognized. Many current applications, especially in the field of automation and manufacturing, use this technology.
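Expressed as code, that simplified flow looks roughly like the sketch below. Each stage is only a placeholder for the real image processing; the point is the order of operations, not a working recognizer.

```typescript
// Simplified OCR flow: preprocess the image, extract character regions, recognize them.

interface Image { width: number; height: number; pixels: Uint8ClampedArray; }

function preprocess(input: Image): Image {
  return input;                           // placeholder for grayscale, threshold, deskew, ...
}

function extractCharacterRegions(input: Image): Image[] {
  return [];                              // placeholder for segmentation of character regions
}

function recognizeCharacters(regions: Image[]): string {
  return regions.map(() => "?").join(""); // placeholder for the per-character classifier
}

function runOcr(photo: Image): string {
  return recognizeCharacters(extractCharacterRegions(preprocess(photo)));
}
```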

What OCR doesn’t take into account is the actual nature of the object being scanned. It simply “looks” at the text that should be converted. Putting together Augmented Reality and OCR therefore provides new opportunities; not only is the object itself recognized, but so is the text printed on that object. This boosts the amount of information about the environment gathered by the device, and increases the decision-support capabilities offered to users.

The Potential of OCR

Data import via OCR still requires considerable processing power and camera resolution, and can be expensive. Nevertheless, OCR offers a viable alternative to voice recognition or typed input.

Using OCR with smart glasses offers improvements for different kinds of business processes. Imagine a warehouse worker who needs both hands free to do his job efficiently. Using smart glasses to overlay virtual information on his environment can make him more efficient. But the ability to automatically scan codes printed on objects just by glancing at them frees his hands for other tasks.

Another example would be the automation of meter reading. When a device identifies the meter hanging on a wall, as well as its shape and size, and then automatically scans its values, a greater number of meters can be read per day. This use case could be useful to energy providers.
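A small, hypothetical post-processing step shows why this pairing is practical: the OCR output is just text, so a validation step can keep only plausible readings before they reach the back-end system. The accepted format here is an assumption, not a real meter standard.

```typescript
// Validate and parse a raw OCR result for the meter-reading use case.
function parseMeterReading(ocrText: string): number | null {
  // Accept e.g. "004521.7" or "4521": digits with an optional one- or two-digit decimal part.
  const match = ocrText.trim().match(/^(\d{1,7})(?:[.,](\d{1,2}))?$/);
  if (!match) return null;               // reject and flag for manual review
  return parseFloat(`${match[1]}.${match[2] ?? "0"}`);
}

console.log(parseMeterReading("004521.7")); // 4521.7
console.log(parseMeterReading("45Z1"));     // null
```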

When you look around, you will realize how many numbers, letters and codes need to be either written down or typed into a system every single day. Such processes, which can be very error prone, can become much less painful using OCR.




A Partnership Model for Augmented Reality Enterprise Deployments

Due to the potential to radically change user engagement, Augmented Reality has received considerable and growing attention in recent months. Pokémon Go certainly has helped and, in turn, generated many expectations for the advancement of AR-based solutions. In fact, the game has provided the industry with a long overdue injection of mass appeal and as a result, significant investment from (and among) tech giants around the world.

From corner shops to large utility providers, the spike in popularity of this technology has everyone buzzing about how it could improve their business. The flexibility of implementation, from improving processes to stand-out marketing solutions, has also altered the expectations of these prospective clients as they seek personalized enterprise-level AR-based solutions. Consequently, the time has come for vendors and suppliers to consider a new model when it comes to managing customer expectations.

When deploying Augmented Reality solutions in an enterprise context, it is essential to build strong partnerships with your customers, and in many cases to take on the role of a trusted advisor. This becomes more important through the stages of delivering a project: from defining a proof of concept (POC), to implementing bleeding-edge solutions with operational teams, to ultimately reaching the end users of the technology.

While the primary value of Augmented Reality systems is the contextual overlay of information to enable better decision making, the visual data overlay and the various data sources, devices and location sensors that drive it all come into play, converging in the form of a complex mesh. Vendors must note that partnerships are key to solving the pieces of this puzzle.

Service Delivery—Creating Value from the Complex Mesh

This complex mesh is what ultimately generates value, as the assimilation of these technologies creates new and innovative social and business ecosystems and associated processes. When addressing enterprise adoption, one must be aware of the following questions:

  • How best can value be driven into workable solutions in an enterprise?
  • How well does it integrate with existing legacy systems?
  • Would new skills be required to introduce and manage the change?
  • Does the solution deliver increased productivity or efficiency, i.e., better utilization of resources or better decision making through information?
  • Does the solution enable new revenue models for the organization that are consistent with the existing product and service offerings?
  • In turn, how does this solution affect the profitability of the organization?
  • Last, but not least, is the business rationale clear for the implementation of such a solution?

The move towards customer-centric systems means that your customer (or your customer’s customer) is at the center of all decision making. This may be a shift from their existing system practices, meaning it’s even more critical that the chosen change management process be well aligned to the client’s corporate culture.

The Client’s Point of View—Questions to Ask When Going Beyond the POC

Some of the questions that vendors need to consider when it comes to implementing the solution beyond the POC are:

  • What is changing?
  • Why are we making the change?
  • Who will be impacted by the change?
  • How will they react to the change?
  • What can we do to proactively identify and mitigate their resistance to the change?
  • Will the solution introduce new business or revenue models?

Working as one with your customers, from innovation through to operations, is a key factor for success. The complex mesh of AR, VR, IoT and Big Data technologies makes this even more critical as enterprises see an integration of their digital content, systems and processes.

It is essential to take a partnership mindset—where the Augmented Reality innovation solution is built both for and with the customer, and through a customer-implemented change management process—to quickly and easily create ROI as well as tangible, actionable outcomes.




What Pokémon Go Means for Enterprise Augmented Reality

Since its release on July 6th, Pokémon Go has become a global phenomenon—with downloads of the mobile app exceeding 75 million within the first three weeks. Many reviewers credit the game’s meteoric success to its innovative use of “Augmented Reality.”

To those of us in the enterprise AR community, of course, Pokémon Go is no more augmented reality than Atari’s 1972 “Pong” arcade game was table tennis. In Pokémon Go, the merging of the virtual and real worlds is confined to the projection of 2D monsters on real-life backgrounds. While it’s a novel effect for a mobile game, it barely scratches the surface of what Augmented Reality can do—or its tremendous potential for enterprises to achieve greater operational efficiencies.

Still, we at the AREA have to view the success of Pokémon Go as an important milestone in the development and adoption of AR for the following reasons:

  • Pokémon Go is familiarizing the world with the basic concept of Augmented Reality. Hopefully our members can spend less time having to explain what Augmented Reality is—or how it differs from Virtual Reality. Potential customers will already understand the basic concepts and be ready to learn more.
  • Pokémon Go is proving that AR is no longer a futuristic concept. If an AR game is already a commercial success, can widespread enterprise AR solutions be far behind? Companies that had previously taken a wait-and-see approach to AR may now be more motivated to explore the possibilities for their businesses.
  • Pokémon Go is proving that people are engaged and excited by the technology. The game makes it vividly clear that AR is a powerful and compelling tool people enjoy using. That enthusiasm can only help fuel the growth and development of the AR market.

Many media outlets and bloggers agree and are driving the conversation in our direction. Just look at some of these recent headlines:

  • “Is Pokémon Go Really Augmented Reality?”
  • “How Pokémon Go Took Augmented Reality Mainstream”
  • “Why Pokémon Go is a Game Changer for Augmented Reality and Marketers”
  • “Pokémon Go is Nice, But Here’s What *Real* Augmented Reality Will Look Like”

Our challenge now is to leverage the Pokémon Go phenomenon to accelerate the adoption of AR in the enterprise. That means taking the opportunity—as the AREA members Gaia Dempsey of DAQRI and Scott Montgomerie of Scope AR have done recently—to make sure inquiring media outlets understand that the impact of enterprise AR will be even more significant and lasting than the current Pokémon Go craze.

To find out more about the AREA, contact Mark Sage, Executive Director.




Two Months In: An Update From the Executive Director

Further to my last post about the AWE ’16 conference, I want to share some thoughts and areas for future focus from my first two months as the Executive Director of the AREA.

It’s exciting to be involved in such a vibrant and dynamic ecosystem of AR providers, customers and research institutions. I’m amazed at the sheer breadth of member organizations and their offerings, skills, achievements and desire to work with the AREA to help achieve our shared mission of enabling greater operational efficiencies through the smooth introduction and widespread adoption of interoperable AR-enabled enterprise systems.

Success and Challenges

Through my initial conversations with the members, I’ve learned of many success stories and also the challenges of working in a relatively young and rapidly changing industry.

For example, AREA members talk about the prototypes they’re delivering with the support of software, hardware and service providers. However, I would like to see more examples of wider rollouts, beyond the prototype stage, which will encourage more buying organizations to investigate AR and understand its massive potential.

The AWE conference in Santa Clara in June and the subsequent AREA Members Meeting reinforced my initial thoughts. The AR in Enterprise track of AWE, sponsored by the AREA, highlighted a number of organizations that are already using AR to create real benefits, ranging from enabling real-time compliance and better use of resources to applying the most relevant data and reducing time, errors and costs. It was great to see that many member companies understand the benefit of working together to enable the whole AR ecosystem to become successful.

Carrying on the Momentum

My continued focus over the coming weeks will be to carry on the great momentum that has been started. I’m briefing more organizations from all over the world about the benefits of becoming an AREA member. I’ll continue to focus on developing and curating thought leadership content, including case studies, frameworks and use cases, and on delivering it via the AREA website, webinars and social media. We’re enhancing our value proposition through our development of research committees that increase the capabilities of the industry.

This is an exciting time for the enterprise AR industry and the AREA; I’m very interested in any feedback or comments you may have so please contact me at mark@thearea.org. I look forward to hearing from you and working with our growing membership to meet our goals of realizing the potential of Augmented Reality in the workplace.




Interview with JoinPad

AREA member JoinPad provides cloud-based and contextually aware software that simplifies processes in a number of industries. The company’s BrainPad product integrates enterprise resource systems and sensor networks to add Augmented Reality visualization and contextual computing to existing business processes.

This month we interview Nicolas Pezzarossa, Global Sales and Business Development Director of JoinPad, about the enterprise use cases his company is encountering for its products and services.

In which industries are you finding the greatest interest for your products and services?

We see strong interest from providers of energy supplies and infrastructure. Oil and gas has the largest proportion of such companies. Besides this, we’re finding companies in other industries getting involved with Augmented Reality:

  • Energy
  • Automotive
  • Manufacturing
  • IT hardware, infrastructure and services
  • Retail
  • Tourism

We’ve also provided solutions for use cases in these industries.

What are the reasons for AR’s popularity in these companies?

We believe it’s due to growing awareness of the value that Augmented Reality brings in conjunction with digital transformation. The ROI of individual AR use cases is becoming evident, and there’s an increasing maturity of hardware platforms for this environment.


We’ve also received much interest in our smart glasses SDK, as well as in our Smart Assistance solution, which offers guided assistance as an “augmented operator’s manual” and expert collaboration in real time.

With whom do you partner most often?

We partner with well-established players that consult to large industrial companies on IT infrastructure for manufacturing processes, where we can supply the AR-related components in an OEM-style integration.

Has employee performance in the workplace prior to AR introduction been studied by your customers?

Most of our customers have detailed statistics about performance or the time taken to complete specific tasks, against which we can correlate our solution. In other cases we’ve performed a detailed analysis of their work processes. Our product also contains a module for work order management that enables the generation of KPIs specifically measuring this type of work performance for comparison purposes.

What are common metrics, and do you recommend customers choose their own?

We find that in most cases the most important factor is time to complete a task (for increasing efficiency). But others include the ratio of possible to actual mistakes and the value of avoided damage, as well as the level of fatigue or satisfaction of operators.

As we are discussing the consequences of a disruptive technology, another important factor is the possibility of enabling new work processes. Although this is more difficult to measure, it offers large potential for increasing efficiency.

We always emphasize the importance of evidence for a return on investment in all phases of a project. This is also essential for advocating internally to stakeholders and management for the adoption of AR.

What is your company’s recommended approach to introducing AR in an organization? Are there steps or a model or method you follow?

We take a phased approach and in a preparatory phase offer a workshop for defining possible use cases and analyzing current work processes. We then propose a proof-of-concept phase in which we offer a basic solution with limited functionality. This allows the customer to experience the new solution and see its potential. We subsequently initiate a pilot phase with actual data exchange, followed by a roll-out phase where the application is introduced into actual work processes.


How is data prepared for your customer projects?

All data must be processed to efficiently support the use case. In particular, when connecting to an ERP system it’s important to choose the data sets specifically supporting the use cases.

Do you get involved in the design of the content that will be used in pilot projects?

Normally the customer asks us to provide the content as well as the design of the user interface. In the case of smart glasses this can involve an innovative interaction design. Key to project success is proposing visualizations that help solve the specific problem at hand and improve visual perception.

Our experiences working with customers have allowed us to develop specific templates for smart glasses applications that ensure efficient intake of the relevant information.

What is the profile of the typical person who performs the selected tasks prior to AR, and what are their attitudes?

Augmented Reality, particularly when used with smart glasses, has the major benefit that even untrained operators can perform complex tasks. But highly trained operators also benefit from the availability of real-time data where it matters.

In most cases operators are happy to work with innovative tools that they see as supporting their work tasks. But the impact of new technologies on human resources and work safety must nonetheless be carefully monitored.

Do you study project risks, and do customers perform user studies?

Risk analysis is always part of our use case analysis, as are recommended fallback scenarios.

Although most customers don’t plan user studies themselves, we offer a questionnaire process both before and after a pilot for evaluating improvements for purposes of the roll-out phase.

What are the system components the customer must provide for a successful project?

This is highly dependent on the use case, but there is in fact no requirement that customers provide us with system components. However, at various times they do provide us with components ranging from full packages of 3D files to databases and API access.

What type of recognition and tracking technologies do you support, and what are the effects of lighting?

We work with all recognition and tracking principles (e.g., image, bar code, natural features, SLAM, depth sensing, etc.), but based on our proprietary core algorithms.

Lighting represents a challenge that in many cases can be overcome, yet it influences tracking stability. It’s always possible to correct this influence using other types of sensors, or to reduce its impact with fallback scenarios.


Do you use IoT, and is AR content locally archived or accessed over a network?

We have specifically developed and deployed an IoT module in our AR platform BrainPad that is used today by one of our customers in the energy industry to retrieve data from sensors on industrial equipment in the field in real time. We thus fully support IoT data integration.

For AR content, there are different scenarios involving both kinds of access and integration, depending on the workflow.

What are the greatest challenges you face in current projects?

One of the largest challenges is in the need to prove ROI on every single use case, which is often complex as many industrial and manufacturing processes are highly intertwined with other processes.

What are the future plans or next steps for JoinPad?

The next steps are to further grow our activity and supply more publishable customer use cases to further support the adoption of the technology in industry. In particular, JoinPad will intensify its education effort to spread knowledge about the value and design of AR applications by conducting workshops offered to technology experts and managers, as well as through academic initiatives.





3D Studio Blomberg at Augmented World Expo 2016

Our team at 3D Studio Blomberg, along with key partners, travelled to Santa Clara, California, to attend the Augmented World Expo. The event is the largest annual conference and exhibition about Augmented Reality worldwide, with over 4000 attendees and 250 exhibitor booths. During the two days, I had the opportunity to make interesting new contacts, meet other AREA members, see and try a variety of innovative AR and VR solutions and attend the enterprise AR tracks hosted by the AREA.

Larger Players Entering the Market

Judging by the offerings on display at AWE, the ecosystem for enterprise AR products and services is expanding. Players like PTC (Vuforia), Osterhout Design Group (ODG) and Microsoft (with HoloLens) had noticeably increased their footprint at the event, and even the presence of VR products at an AR show confirmed the overall trend of a growing ecosystem. Microsoft presented its HoloLens product, hosted by Vuforia, and its technical capabilities are impressive. We view all this as a positive development, as it will bring increased competition and more innovative market offerings.

AR in Enterprise Sessions

The AREA-hosted AR in Enterprise track featured speakers and AREA members on a diverse range of topics from IoT to security. The sessions were interesting, but they highlighted the array of challenges still facing companies seeking to implement Augmented Reality in the workplace. One fundamental takeaway was that widespread adoption of AR in industry isn’t solely a matter of AR technology itself, but rather depends on steady improvements in the surrounding mix of technologies such as IoT, Big Data, etc. As these enabling features and technologies improve, they make the value proposition of AR even more compelling.

Another insight from the sessions was the idea of mental models: how we imagine innovations should work often turns out to be quite different from how they work in reality. We need to avoid this pitfall when thinking about AR and the problems it solves.

Lastly, partnerships are essential for expanding the ecosystem and assuring its success. For example, ODG makes great smart glasses but they need partners that create virtual content in order to get the most out of their products. All of these key ingredients will produce the necessary lifting power to make AR a killer app.

Conclusion

AWE was a rich, rewarding experience that we and our partners in attendance enjoyed immensely. As content providers for AR-enabled enterprise systems, we appreciated the opportunity to meet a variety of potential partners to whom we can add value. We’re looking forward to turning the ideas gained from the conference into reality, and to contributing to the exciting and growing marketplace for Augmented Reality.