
Augmented Reality Boosts Efficiency in Logistics

Fulfilling customer orders at a warehouse, or order picking, can be costly. A well-known study on warehouse management cited the typical cost of order picking as nearly 20% of all logistics costs and up to 55% of the total cost of warehousing. The use of technology to streamline order picking offers an important opportunity to reduce cost.

While great strides have been made in automating warehouse processes, customer expectations also continue to rise. For example, Amazon offers same-day delivery in many US metropolitan areas, and this is becoming standard elsewhere. Increasing fulfillment and delivery speeds may result in increased errors that are not caught prior to shipment.


Augmented Reality can significantly increase order picking efficiency. An AR-enabled device can display task information in the warehouse employee’s field of view. Logistics companies such as DHL, TNT Innight and others have been collaborating with providers of software and hardware systems to test the use of Augmented Reality in their warehouses.

A recent study by Maastricht University, conducted in partnership with Realtime Solutions, Evolar and Flos, brings to light the impact smart glasses can have on order fulfillment. The research sought to:

  • Confirm prior research that smart glasses improve efficiency compared with paper-based approaches
  • Study usability, required physical and mental effort, and potential empowerment effects of the technology in a real-world environment
  • Assess the impact of an individual’s technology readiness on previously introduced performance and well-being measures

Design of the Study

Sixty-five business students at Maastricht University participated in a three-day study conducted in a controlled environment. Study participants were given instructions to pick individual items from stock bins and place them into the appropriate customer bins:

  • One group picked items from 28 bins using item IDs printed on paper and then matched those to IDs on customer bins. The study assessed order picking efficiency by measuring how quickly and accurately participants placed items in the correct customer bins.
  • The other group used AR-enabled smart glasses to scan barcodes in item bins and follow the displayed instructions to place them in the customer bins.

The researchers evaluated metrics such as:

  • Performance measures of error rates and picking times per bin
  • Health and psychological measures such as heart rate variability, cognitive load and psychological empowerment
  • Usability measures such as perceived ease of use
  • “Technology readiness” on a scale measuring personal characteristics such as optimism for, and insecurity with, new technologies

View through smartglasses

Faster with Smart Glasses

The researchers found that smart glasses using code scanners permitted users to work 45% faster than those using paper-based checklists, while reducing error rates to 1% (smart glasses users made one-tenth as many picking errors as the control group).

The smart glasses group also expended significantly less mental effort to find the items, while showing the same heart rate variability as the group using paper.

Overall, the use of smart glasses empowered users and engendered positive attitudes toward their work and the technology: in comparison with the group following checklists, smart glasses users felt that the successful completion of tasks was more attributable to their own behavior. This corroborates the efficiency gains reported in other studies, such as this one, and demonstrates the level of impact Augmented Reality can have in the workplace.

You can read about more Augmented Reality research from Maastricht University and other university partners at this portal.





Factories of the Future

In a blog post last month, Giuseppe Scavo explored the Industrial Internet of Things (IIoT) and the growing trend of connected devices in factories. Smart devices and sensors can bring down production and maintenance costs while providing data for visualization in Augmented Reality devices.

Connecting AR and IIoT requires applied research. In this article we’ll look at the EU-sponsored SatisFactory project, which focuses on employee satisfaction in factories through the introduction of new technologies.

Innovation in Industrial Production

In 2014, the European Union launched Horizon 2020, a seven-year research and innovation program (ending in 2020) dedicated to enhancing European competitiveness. Horizon 2020 is a partnership between public and private entities and receives nearly $90 billion in public funds. As the program’s website describes, Horizon 2020 aims to drive smart, sustainable and inclusive growth and jobs.


Within this push is the Factories of the Future initiative, a roadmap providing a vision and plan for adding new manufacturing technologies to the European production infrastructure. Objectives of the Factories of the Future initiative include:

  • Increasing manufacturing competitiveness, sustainability and automation
  • Promoting energy-efficient processes, attractive workplaces, best practices and entrepreneurship
  • Supporting EU industrial policies and goals

To meet these objectives, ten partner companies and institutions from five European countries founded the SatisFactory consortium in 2015. SatisFactory is a three-year project that aims to develop and deploy technologies such as Augmented Reality, wearables and ubiquitous computing (e.g., AR-enabled smart glasses), as well as customized social communication and gamification platforms, for context-aware control and adaptation of manufacturing processes and facilities.

SatisFactory-developed solutions seek higher productivity and flexibility, on-the-job education of workers, incident management, proactive maintenance and, above all, a balance between workers’ performance and satisfaction. The solutions are currently being validated at three pilot sites (one small- and two large-scale industrial facilities) pending release for use at industrial facilities throughout Europe.


Industry 4.0

SatisFactory’s vision of Industry 4.0 includes a framework with four sets of technologies:

  • Smart sensors and data analytics for collecting and processing multi-modal data (see the sketch after this list). The results of this real time data aggregation will include diagnosing and predicting production issues, understanding the evolution of the workplace occupancy model (e.g., balancing the number of workers per shift) and enhancing context-aware control of production facilities (e.g., semantically enhanced knowledge about intra-factory information and the re-adaptation of production facilities).
  • Decision support systems for production line efficiency and worker safety and well-being. These systems can take many forms, ranging from Augmented Reality for human visualization of data to systems for incident detection and radio frequency localization.
  • Tools for collaboration and knowledge sharing, including knowledge bases and social collaboration platforms. Augmented Reality for training by remote instructors will provide flexibility and increase engagement. Collaborative tools also allow employees to exchange information and experiences, and these tools are combined with learning systems.
  • Augmented Reality and gamification tools to increase engagement. SatisFactory will use tools previously developed by consortium partners and, in pilot projects, explore the use of smart glasses and human-machine interfaces. Interaction techniques and ubiquitous interfaces are also under investigation.
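
To make the first of these more concrete, here is a minimal sketch (our illustration, not SatisFactory code) of real time aggregation of multi-modal sensor readings with simple range checks for diagnosing production issues. All machine names, sensor types and thresholds are hypothetical.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical operating ranges per sensor type (illustrative only).
    EXPECTED_RANGES = {
        "temperature_c": (20.0, 85.0),
        "vibration_mm_s": (0.0, 7.1),
        "line_speed_m_min": (30.0, 60.0),
    }

    def aggregate_readings(readings):
        """Group raw readings by (machine, sensor type) and average them."""
        grouped = defaultdict(list)
        for machine_id, sensor_type, value in readings:
            grouped[(machine_id, sensor_type)].append(value)
        return {key: mean(values) for key, values in grouped.items()}

    def diagnose(aggregated):
        """Return alerts for averages outside the expected operating range."""
        alerts = []
        for (machine_id, sensor_type), avg in aggregated.items():
            low, high = EXPECTED_RANGES.get(sensor_type, (float("-inf"), float("inf")))
            if not low <= avg <= high:
                alerts.append(f"{machine_id}: {sensor_type} average {avg:.1f} outside [{low}, {high}]")
        return alerts

    # Simulated multi-modal readings: (machine, sensor type, value).
    sample = [
        ("press_7", "temperature_c", 82.0),
        ("press_7", "temperature_c", 91.5),
        ("press_7", "vibration_mm_s", 3.2),
        ("lathe_2", "line_speed_m_min", 44.0),
    ]
    for alert in diagnose(aggregate_readings(sample)):
        print(alert)

A decision support system of the kind described in the second bullet would consume alerts like these, while the Augmented Reality layer would present them to workers in context.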


Pilot Sites

SatisFactory solutions are being tested at the pilot sites of three European companies:

  • The Chemical Process Engineering Research Institute (CPERI) is a non-profit research and technological development organization based in Thessaloniki, Greece. The institute provides a test site for continuous manufacturing processes.
  • Comau S.p.A is a global supplier of industrial automation systems and services and is based in Turin, Italy. The company provides manufacturing systems for the automotive, aerospace, steel and petrochemical industries.
  • Systems Sunlight S.A. is headquartered in Athens, Greece, and produces energy storage and power systems for industrial, advanced technology and consumer applications.

In the next post, we’ll look at activities at the sites and how the project is applying Augmented Reality at the different production facilities.




Data Visualization with 3D Studio Blomberg

AREA member 3D Studio Blomberg (3DS) excels at visualization of data and especially at enterprise solutions for Augmented Reality. The AREA asked Pontus Blomberg, founder and CEO of 3DS, about his company’s history and projects in the space.

Q. Where do you have the greatest number of projects or customers?

Our customers are mainly in heavy industry, and include both large and mid-sized companies. We are also targeting the educational and consumer sectors for our AR solutions.

Q. How did 3DS become popular as a supplier to the industries you just identified?

Since the company’s founding we’ve led the way to digital transformation through advanced content delivery systems to promote process efficiency, expert knowledge and overall quality.

In 2006 we recognized the potential of AR to boost productivity in industrial workplaces, and in 2008 we introduced the technology to Wartsila, a major Finnish power equipment supplier. At that time we evaluated ALVAR, Vuforia, and Metaio to survey their functionality from a visualization standpoint and assess their capabilities in handling 3D scenes and animations. In 2012 we delivered a proof of concept to Wartsila, and in 2013 we joined a Finnish national R&D program to study the potential of AR in knowledge sharing solutions for field service personnel.


This study showed that research and practical industry applications were not in sync, and many players were concerned with achieving efficiency through dynamic AR content and data integration. We entered an AR solutions provider partnership with Metaio in 2014 but realized the platform focused on technology functionality rather than on system utilization and process implementation, which is our focus today. We are currently studying the potential of Osterhout Design Group’s R-7 smart glasses and continue to perform proof of concept projects with emphasis on process analysis, system development and AR in production use.

Q. What are the most common metrics used to assess task performance or project success?

We recommend that customer metrics be in line with their quality management system for effective reference and comparison. Broadly speaking, examples of common metrics include:

  • Improvements in product and service quality
  • Effectiveness
  • Safety and risk reduction

Taking simple definitions of effectiveness (“doing the right thing”) and efficiency (“doing the thing right”), we believe it’s possible to work efficiently, but that doesn’t contribute to productivity until we’re able to efficiently do the right things at the right time.

Q. What is your approach to AR introduction at customer sites?

As AR is new to most organizations, we recommend detailed analysis of the customer’s business strategy. In order to achieve digital transformation in line with the AR solution, the project needs to be aligned with the business strategy all the way to the board room. We also recommend demos and proof of concept projects to help organizations gain knowledge and understanding.

Q. How is data prepared for your customer projects?

It’s all a question of knowledge and experience gained through project implementation. Initially data has to be prepared manually, but at later stages of the project we’re better able to develop ways of handling new types of content in existing enterprise content systems.

Q. Do you get involved in the design of content that goes into pilot projects?

Yes, this is where our long experience and advantage really shine. Our expertise in visualization, combined with the customer’s industrial product and process expertise, plays a significant role in achieving digital transformation through AR solutions. But no large-scale transformations can occur before new knowledge and tools are in place that allow for productivity and dynamic content.

Q. Do you study project risks with the customer or project leader?

There have been no major studies until now but naturally new technologies bring risks with them. Imagine driving your car with GPS assistance in heavy traffic and suddenly you can’t get a signal.

Q. Do you know if your customers perform user studies prior to and following use of the proposed system?

Yes. The significant achievements we are starting to see in implementing AR solutions drive these kinds of studies. We’ve also had the chance to work together with partners in bigger collaborative research projects.

Q. What are the attitudes of those in the workplace where AR projects are successfully introduced?

Employees at the customer site are very positive and even surprised. We often encounter statements similar to, “Wow! I’ve seen this on YouTube and the Internet. It’s incredible to see that it really works.”

Q. Describe the technologies at play. What types of components do you offer?

Through our key partner network we offer the entire pipeline of smart glasses, mobile solutions, UIs, server-client databases and content development.

We use world-class tracking technologies today but expect that Simultaneous Localization and Mapping (SLAM) technologies will gain ground. We realize this type of technology isn’t applicable in unique or dynamic situations at larger scales, although we’ve performed several demos and proof of concept projects with SLAM and the results are promising.

At the moment we see marker-based (or with code/ID) and geo-tracking as the most stable and flexible ways to acquire user context. We’ve built upon these technologies in our products and platforms.

At the same time we realize significant investment is needed in the modification of existing customer processes and new competences. To be successful, we aim to help our customers drive this change through systematic long-term cooperation.

Q. What must customers provide in terms of system components?

For rapid familiarization with the technology we recommend providing data to achieve a real look and feel. We recommend not overdoing it with complex UIs and information flows. Developing proof of concept projects with small, incremental steps for easy evaluation and quick changes is important to identify precisely the drivers of an AR introduction.

Q. With whom do you partner most often?

We partner with technology providers (hardware, software and tracking technologies), and we also see content providers as strategic because of their long-term customer relationships. To get all these complex systems to work together with business process changes is a team effort. It will take a few years. We aim to use what’s already been applied in an enterprise because we want to leverage the significant investments that have already been made in IT and visualization.

Q. What are the environmental conditions where customer projects are being conducted?

We’ve experienced both laboratory and real environmental conditions, especially in terms of lighting, vibrations and sound. Many of our customers use ruggedized solutions for their projects, which means unique and custom solutions for harsh, dynamic environments.

Q. What are your other offerings?

In terms of training, 3DS also provides competence development in combination with process development. For data, we use the customer’s cloud and offer commercial cloud solutions.

Q. What are the greatest challenges you currently face in AR introduction projects?

Customers often don’t have sufficient insight into the possibilities that emerging visualization technologies and content can provide. Therefore a clear understanding of customer expectations, goals and their business is needed. Customers also need a certain amount of trust that their expectations will be met.

Many times the only way forward is to agree on a proof of concept or demo that shows the technology, content, functionality, added value and supplier capabilities.

From the customer point of view, there are also uncertainties about the new types of content that will be needed to enrich the current PLM process to allow for visualization on a large scale. How will this information be connected and utilized together with the new visual content? We offer expertise in these questions, and they need to be addressed in very close cooperation with the customer as they touch the very core of their business.

Q. What are the future plans or next steps for your company?

We’ll continue to systematically monitor and build our international client base and partner network and develop state-of-the-art products and services.




Unity Gives Augmented Reality the Nod during Vision Summit 2016

If you saw the headlines coming out of Unity’s Vision Summit, you probably noticed a trend: Virtual Reality was the star of Vision Summit 2016. Valve’s Gabe Newell gave everyone an HTC Vive Pre. The Oculus Rift will come with a four-month Unity license. Unity is getting native support for Google Cardboard. At the summit, the expo floor had long lines for the “big three” head-mounted displays (HMDs): Sony’s PlayStation VR, Oculus Rift and HTC Vive.

It’s not that Augmented Reality was absent from what was billed as “The Definitive Event for Innovators in VR/AR,” but rather that the technology represented a minority of the tools on show. This is the year of Virtual Reality, with the big three VR providers launching major products in March (Oculus), April (HTC) and sometime in the fall (Sony). The event was hosted by Unity, which caters almost exclusively to game developers needing comprehensive cross-platform development tools, and gaming in VR is expected to be huge. Virtual Reality was even the focus of the keynote, but astute observers might have noticed something.

Best Days of Augmented Reality Are Ahead

Unity’s own keynote referenced a report by Digi-Capital which predicts that the AR industry will have negligible revenue in 2016 but will surpass VR in 2019. In 2020, the AR industry is predicted to be worth three times as much as VR. Take this with a grain of salt; Unity is in the business of selling licenses for its cross-platform game development toolset, so it’s incentivized to predict massive growth, but even heavily discounted, these numbers show massive promise for a new field.

Most of this growth may be in gaming, but the AR presence on the expo floor leaned toward enterprise use. Epson was demonstrating their Moverio line of smart glasses, which has been around since 2012. Vuzix had their M-100 available to try, and they were eager to tout their upcoming AR3000 smartglasses.  In its booth, Vuforia demonstrated a Mixed Reality application on Gear VR that allowed the viewer to disassemble a motorcycle and view each part individually, which could be handy for vehicle technicians.

Of course, you can learn the most from hands-on experience with enterprise AR, which is exactly what NASA presented. They showed how they replaced complicated written procedures with contextual, relevant, clear instructions with AR using HoloLens. They also had a suite of visualization tools for collaborating on equipment design.

I presented the results of a year-long collaboration between Float and the CTTSO to develop an AR application designed to assist users in operational environments. We discussed the ins and outs of developing a “true AR” experience from the ground up, in addition to all of the lessons we learned doing image processing, using Project Tango, and more. At the end, I demonstrated the finished app, with integrated face recognition, text recognition, and navigation assistance supported either on an Epson Moverio or the Osterhout R-6.

An Increasing Focus

Vision Summit 2016 may have been largely focused on VR, but that’s not a reflection of a lack of interest in AR. In our own research, we estimated that AR was lagging behind VR in terms of technology readiness level by a few years. This was confirmed at the Vision Summit, but there’s still plenty of AR to get excited about. Valve even stated that they’d let developers access the external camera on the HTC Vive “in the long run” for Augmented and Mixed Reality applications. Expect next year’s Vision Summit to have a much larger focus on AR as this industry begins to truly take shape.

Did you attend Vision Summit 2016? What did you observe? Do you plan to attend the Unity event in 2017?




Augmented Reality in the Aerospace Industry

There are many use cases for Augmented Reality in the aerospace industry and the leaders in this industry have a long history with the technology. In this post, we review some of the milestones and provide highlights of the recent AREA webinar.

In 1969, while working in the Human Engineering Division of the Armstrong Aerospace Medical Research Laboratory (USAF), Wright-Patterson AFB, Thomas Furness presented a paper entitled “Helmet-Mounted Displays and their Aerospace Applications” to attendees of the National Aerospace Electronics Conference.

Over 20 years later the paper was one of eight references cited by two Boeing engineers, Thomas Caudell and David Mizell. In their 1992 paper published in the Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Caudell and Mizell coined the term “Augmented Reality.” The degree to which the team drew from the work of Furness, who had started the Human Interface Technology Lab at the University of Washington in 1989, is unclear, but the focus of the Boeing team was on reducing errors when building wire harnesses for use in aircraft and other manual manufacturing tasks in aerospace.


While the technology was not sufficiently mature to leave the lab or to deliver on its potential at the time, they suggested that with an AR-assisted system an engineer would in the future be able to perform tasks more quickly and with fewer errors. 

Proofs of Concept

Approximately fifteen years later, in 2008, Paul Davies, a research and development engineer at AREA member Boeing, began working with Boeing Technical Fellow Anthony Majoros. Together, Davies and Majoros picked up where the Caudell and Mizell paper left off. They used commercially available technologies such as Total Immersion’s D’Fusion platform to show how technicians building satellites could perform complex tasks with Augmented Reality running on tablets.

Airbus has also been experimenting with Augmented Reality for over a decade. In this paper published in the ISMAR 2006 proceedings, Dominik Willers explains how Augmented Reality was being studied for assembly and service tasks but judged too immature for introduction into production environments. The paper, authored in collaboration with the Technical University of Munich, focused on the need for advances in tracking. 

Since those proof of concept projects, AR technology has advanced to the point that it is being explored for an increasing number of use cases in the aerospace industry. In parallel with the expansion of use cases, the pace of applied research into AR-enabling technology components has not abated.

Augmented Reality in Aerospace in 2016

While today AR may not be found in many aerospace production environments, the promise of the technology to increase efficiency is widely acknowledged.

On February 18, David Doral of AERTEC Solutions, Jim Novack of Talent Swarm, and Raul Alarcon of the European Space Agency joined Paul Davies and me to discuss the status of Augmented Reality in their companies and client projects.

Each participant described the use cases and drivers for Augmented Reality adoption. For Boeing, the key metrics are reduction of errors and time to task completion. Use cases include training and work assistance. AERTEC Solutions, which works closely with Airbus, and Talent Swarm are both focusing on use cases where live video from a head-mounted camera can bring greater understanding of a technician’s context and questions, and permit more rapid analysis and resolution of issues.

The European Space Agency sees a variety of use cases on Earth and in space. Inspection and quality assurance, for example, could benefit from the use of Augmented Reality-assisted systems.

Turbulence Ahead 

During the discussion, webinar panelists explored the obstacles that continue to prevent full-scale adoption. In general, most barriers to adoption can be considered technological in nature, but there are also significant obstacles stemming from human factors and business considerations. We also discussed the degree to which other industries may be able to apply lessons learned from aerospace.

To learn more about the state of AR in the aerospace industry, please watch the webinar archive.

Do you have use cases and projects that you would like to share with the AREA and our audiences? Please let us know in the comments of this post.

 




Augmented Reality: the Human Interface with the Industrial Internet of Things

Are you noticing an emerging trend in manufacturing? After years of hype about Industry 4.0 and digital manufacturing, companies with industrial facilities are beginning to install Internet-connected sensors organized in networks of connected devices, also known as the Industrial Internet of Things (IIoT), in growing numbers.

Industrial IoT Is Not a Fad

According to a recent report published by Verizon, the number of IoT connections in the manufacturing sector rose 204% from 2013 to 2014. These connections feed data from industrial machines to services that provide alerts and instructions on consoles in control rooms to reduce plant downtime. The same Verizon study provides many examples of IIoT benefits in other industries as well: companies that move merchandise are reducing fuel consumption using data captured, transmitted and analyzed in near real time. Connected “smart” streetlights require less planned maintenance when their sensors send an alert for needed repairs. Other examples include smart meters in homes, which reduce the cost of operations for utilities. An analysis from the World Economic Forum describes other near-term advantages of introducing IIoT globally, such as operational cost reduction, increased worker efficiency and data monetization. These are only the tip of the iceberg of benefits.

Many predict that as a result of IIoT adoption, the global industrial landscape is shifting towards a more resource efficient, sustainable production economy. Part of the equation includes combining IIoT with other technologies. Companies that deploy IIoT must also build and maintain advanced systems to manage and mine Big Data.

Big Data

To act upon and even predict factory-related events in the future, companies need to mine Big Data and continually detect patterns in large-scale data sets with Deep Learning technologies. Combined with vast processing power “for hire” in the cloud, these technologies are putting cost-saving processes like predictive maintenance and dynamic fault correction within reach of many more companies. With predictive technologies, managers can optimize responses better and adapt their organizations more quickly to address incidents. A study from General Electric in collaboration with Accenture highlights that for this reason, two managers out of three are already planning to implement Big Data Mining as a follow-up to IIoT implementation.
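
The article points to Deep Learning over large-scale data sets; as a much simpler stand-in to make the underlying idea concrete, the sketch below flags sensor readings that drift well beyond their recent rolling average, which is the kind of pattern detection a predictive maintenance pipeline might start from. The trace and thresholds are purely illustrative.

    from collections import deque
    from statistics import mean, stdev

    def drift_alerts(readings, window=10, threshold=3.0):
        """Flag readings deviating from the rolling mean by more than
        `threshold` standard deviations -- a crude precursor to predictive maintenance."""
        history = deque(maxlen=window)
        alerts = []
        for index, value in enumerate(readings):
            if len(history) >= 2:
                mu, sigma = mean(history), stdev(history)
                if sigma > 0 and abs(value - mu) > threshold * sigma:
                    alerts.append((index, value))
            history.append(value)
        return alerts

    # Illustrative bearing-temperature trace with a late upward drift.
    trace = [70.1, 70.3, 69.8, 70.0, 70.2] * 4 + [73.5, 76.2, 79.8]
    print(drift_alerts(trace))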

Data and Objects Also Need Human Interfaces

Having post-processing analytics and predictive technologies is valuable to those who are in control centers, but what happens when a technician is dispatched to the field or the factory floor to service a connected machine? Augmented Reality provides the human workforce with an interface between the data from these sensors and the real world.

The real time visualization (or “consumption”) of sensor data is an important component of the larger equation. Sensor tracking protocols are not new; in fact, SCADA can be traced back to the 1970s, but when combined with Augmented Reality, new options become available. As industrial equipment becomes more and more complex, workers constantly face long procedures that often involve monitoring and decision-making. When assisted by Augmented Reality during this process, a worker with contextual guidance as well as all the up-to-date information required for successful decision-making can perform tasks more quickly and with fewer errors.

How It Works

Let’s examine a compelling use case for AR and IIoT: maintenance of Internet-connected machines. A worker servicing a machine with a fault needs access to the real time readings of the internal variables of all the machine components in order to diagnose the problem and choose the right procedure to apply. In current scenarios the worker needs to phone the central control room in order to access the data or, in some cases, retrieve the data readings from a nearby terminal, then return to the machine. With an AR-enabled device, the worker can simply point the device at the machine, visualize the real time internal readings overlaid on top of the respective components, and decide the best procedure (as shown in the ARise event presentation about data integration). The same device can then provide guidance for the procedure, informing the worker with the contextual data needed at every step.
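
A minimal sketch of the data flow behind this scenario follows: the device recognizes which components are in view, looks up their latest readings, and produces the label text an AR view would overlay on each component. The data source is stubbed and all identifiers are assumptions for illustration; a real deployment would query the plant’s own interfaces, such as a SCADA gateway or historian.

    def fetch_latest_readings(machine_id):
        """Stand-in for a call to the plant's real time data interface."""
        return {
            "hydraulic_pump": {"pressure_bar": 182.0, "oil_temp_c": 67.4},
            "main_bearing": {"vibration_mm_s": 9.3, "temp_c": 88.1},
        }

    def overlay_labels(machine_id, recognized_components):
        """Build the text an AR view would anchor to each recognized component."""
        readings = fetch_latest_readings(machine_id)
        labels = {}
        for component in recognized_components:
            values = readings.get(component, {})
            labels[component] = ", ".join(f"{k}: {v}" for k, v in values.items()) or "no data"
        return labels

    # The device's tracker reports the components in view; we overlay their live data.
    print(overlay_labels("press_7", ["main_bearing", "hydraulic_pump"]))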

Another use case that can benefit from the combination of AR and IIoT is job documentation. Through interaction with real time sensor data, workers can document the status of machines during each step, feeding data directly into ERP systems, without having to fill out long paper-based forms as part of their service documentation. Procedures can be documented with greater precision, eliminating the possibility of human error during data gathering.
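
In the same spirit, a completed procedure step and the sensor values observed at that moment could be packaged as a structured record for the ERP or maintenance system instead of a paper form. The sketch below shows one possible shape for such a record; the fields and identifiers are our assumptions, not any particular vendor’s API.

    import json
    from datetime import datetime, timezone

    def document_step(work_order, step_number, machine_id, observed_readings):
        """Build a structured service-documentation record for one procedure step."""
        return {
            "work_order": work_order,
            "step": step_number,
            "machine_id": machine_id,
            "completed_at": datetime.now(timezone.utc).isoformat(),
            "readings": observed_readings,  # captured automatically from the sensors
        }

    record = document_step("WO-4711", 3, "press_7", {"pressure_bar": 175.2, "oil_temp_c": 66.0})
    # In production this payload would be posted to the ERP; here we just print it.
    print(json.dumps(record, indent=2))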

Big Data and Augmented Reality

When deploying IoT in industrial contexts, entrepreneurs should take into account the two faces of the value of the data produced by this technology. The offline processing capabilities of Big Data Mining algorithms provide a powerful prediction and analysis tool. In parallel, implementing Augmented Reality allows those who are in the field to reap the benefits of having real time onsite contextual data. 

Some AREA members are already able to demonstrate the potential of combining sensors, Big Data and Augmented Reality. Have you heard of projects that tap IIoT in new and interesting ways with Augmented Reality? Share with us in the comments of this post.




Connecting Experts and the Field with XMReality

AREA members have a great deal of experience with implementing enterprise AR projects. We sat down with Niklas Rengfors, VP of Sales at XMReality, to learn how his company’s solutions and approach to AR introduction are helping to improve field service organizations with advanced remote assistance technologies.

What types of companies are using your solutions today?

We have the privilege of working with companies like Tetra Pak, Wärtsilä, Bombardier and Bosch Rexroth, which have large, geographically dispersed field service organizations. Service professionals are called upon to perform routine service but sometimes they encounter situations that they don’t expect. Our systems can also be used to help those in two factories or two service centers visualize conditions and support one another using live video enhanced with Augmented Reality.


What are the reasons these customers have chosen to work with XMReality as a supplier?

One important factor is that we focus on industrial users, mainly asset-heavy companies with a worldwide support commitment, and provide all the hardware, software and services they need to deploy remote assistance. Since our standard solution is truly “plug and play,” they can quickly begin to get experience and results. Then we collaborate with our customers in order to provide additional Augmented Reality functionality.

How has employee performance been impacted in the workplaces where you’ve introduced Augmented Reality?

We always work with a customer to put a business case together before we know the size of the deployment and the investment required. The service organizations we work with monitor a lot of metrics. For example, they know precisely how much time they spend travelling, how much of the service they provide is under warranty, etc.
The most popular KPIs are:

  • First time fix ratio
  • Travel costs
  • Manhours to complete a task
  • Uptime on the asset/machine

What is your company’s recommended approach to AR introduction? Are there steps or a model/method you follow?  

It is very important to have a plan and to follow the plan when new technology is being introduced. We have developed our own methodology: XMWork is a project planning framework we provide for both proof of concept projects and roll-outs, and we collaborate with the customer on it.

Do you get involved in the design of the content that will be used in the introduction project/pilots?

Yes, that’s part of our full turnkey service. It is important to align the customer expectation with the technical possibilities and sometimes the customer does not have the skills or tools in-house to make the changes that are required.

How is data prepared for your customer projects?

Once the customer identifies the data they want to use, in meetings and sometimes in workshops, they provide it to us. Our engineers will then modify and enhance it for use in remote assistance using our technology. Sometimes this involves breaking the information down into smaller parts. Sometimes we need to prepare an animation or illustration. It depends on the project and the data we are provided.

What is the profile of a typical user who performs the selected tasks using your product? Are they highly trained professionals?

The users of our systems are technicians and field engineers, so-called “blue collar workers.” There’s little training required for our solution so users don’t need special certification for that.

Do you study project risks with the customer or project leader?

Yes, it is important that customers share and decide the risk level that is acceptable. We see some cases where wearing smart glasses might require extra precautions. For example, the person using the glasses needs to detect potential danger such as forklifts in the vicinity. Also, some technicians need to climb into machines so they must see where they put their feet. These are questions that typically emerge when we are evaluating project risks.

Do your customers perform user studies prior to and following the use of the XMReality system?

Absolutely! Customers prepare a business case to get funding prior to the project but then they must update these calculations once they have more experience with the technology and use cases. It is very important for us and the customer to study acceptance rates and we frequently help the customer in this study or in creating the business case.

What are the attitudes of those in the workplace where AR projects are successfully introduced?

It depends a lot on the personality and age of the user. Younger people tend to adopt new technology more quickly. Others are a bit more conservative when asked to use new technology. When the user sees the efficiency increase, though, even the more skeptical ones are eager to adopt this type of technology.

LikeBeingThere

Considering the three ingredients of enterprise AR (hardware, software and content), what are the components of the system(s) you offer?

The core of our offering is the software. Customers are able to use their own devices, but we also offer our own hardware: hands-free displays that we call “video goggles” and also tablets. For some, hands-free operation is very important; for others it is not. We can also provide accessories such as tool belts in order to improve accessibility of all the tools and technologies technicians require.

What are the greatest challenges you face in current introduction projects?

At this time, it’s quite a challenge to find and secure the right project sponsors. Then we have to support them in obtaining project funding and a qualified project manager. We collaborate and consult a great deal to make sure everyone is comfortable with the project scope and that the solutions we offer will meet or exceed the expectations of the project.

What are the future plans/next steps for your company?

We are continually developing our Remote Guidance solution and also expanding the type of Augmented Reality projects we can do. Part of this requires our establishing partnerships with manufacturers of smart glasses so that the customer’s requirements are satisfied. We are always interested in meeting new potential partners and working with them to bring more complete solutions to our customers.




Efficiency Climbs Where Augmented Reality Meets Building Information Management

At Talent Swarm we envisage that by using pre-existing platforms and standards for technical communication, our customers will reach new and higher levels of efficiency. Our vision relies on video calling to make highly qualified remote experts available on demand, and the data from Building Information Management (BIM) systems will enhance those live video communications using Augmented Reality.

Converging Worlds

There have been significant improvements in video calling and data sharing platforms and protocols since their introduction two decades ago. The technologies have expanded in terms of features and ability to support large groups simultaneously. Using H.264 and custom extensions, a platform or “communal space” permits people to interact seamlessly with remote presence tools.  The technology for these real time, parallel digital and physical worlds is already commonplace in online video gaming. 

But there are many differences between what gamers do at their consoles and enterprise employees do on job sites. As our professional workforce increasingly uses high-performance mobile devices and networks, these differences will decline. Protocols and platforms will connect a global, professionally certified talent pool to collaborate with their peers on-site. 

Enterprises also have the ability to log communications and activities in the physical world in a completely accurate, parallel digital world.

Growth with Lower Risk

We believe that introducing next generation Collaborative Work Environments (CWE) will empower managers in many large industries, such as engineering, construction, aviation and defense. They will begin tapping the significant infrastructure now available to address the needs of technical personnel, as well as scientific research and e-commerce challenges. When companies in these industries put the latest technologies to work for their projects, risks will decline.

Most IT groups in large-scale engineering and construction companies now have an exhaustive register of 3D models that describe every part of a project. These are developed individually and used from initial design through construction. But these have yet to be put to their full use. One reason is that they are costly to produce, and companies are not able to re-use models created by third parties. There are no codes or systems that help the companies’ IT departments determine the origins of models or whether a proposed model is accurate. The risks of relying on uncertified models, then learning that there is a shortcoming or the model is not available when needed, are too great.

Another barrier to our vision is that risk-averse industries and enterprises are slow in evaluating and adopting new hardware. Meanwhile, hardware evolves rapidly. In recent years, video conferencing has matured in parallel with faster processors and runs on many mobile platforms. Specialized glasses (such as ODG’s R-7s, Atheer Air and, soon, Microsoft’s HoloLens), helmets (DAQRI’s Smart Helmet), real time point-cloud scanners (such as those provided by Leica or Dot Products) or even tablets and cell phones can capture the physical world to generate “virtual environments.”

With enterprise-ready versions of these tools coupled with existing standards adopted for use in specific industries, the digital and physical worlds can be linked, with data flowing bi-directionally in real time. For example, a control room operator can see a local operator as an avatar in the digital world. By viewing the video streaming from a camera mounted on the local operator’s glasses, the remote operator can provide remote guidance in real time. 

Standards are Important Building Blocks

At Talent Swarm, we have undertaken a detailed analysis of the standards in the construction industry and explored how to leverage and extend these standards to build a large-scale, cloud-based repository for building design, construction and operation.

We’ve concluded that Building Information Management (BIM) standards are reaching a level of maturity that makes them well suited for developing a parallel digital world as we suggest. Such a repository of 3D models of standard parts and components will permit an industry, and eventually many disparate industries, to reduce significant barriers to efficiency. Engineers will not need to spend days or weeks developing the models they need to describe a buttress or other standard components.
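
To make the idea of a shared, certified component repository more concrete, here is a minimal sketch of what a single repository entry might record, including the provenance and certification information whose absence is identified above as a barrier to re-use. The field names are illustrative assumptions, not part of any BIM or IFC standard.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ComponentModel:
        """One entry in a hypothetical shared repository of certified 3D components."""
        component_id: str                    # globally unique identifier for the part
        category: str                        # e.g., "buttress", "valve", "pipe spool"
        model_uri: str                       # where the 3D geometry is stored
        author: str                          # organization that produced the model
        certified_by: Optional[str] = None   # body that verified the model's accuracy
        revisions: List[str] = field(default_factory=list)  # as-built updates over time

        def is_certified(self) -> bool:
            return self.certified_by is not None

    buttress = ComponentModel(
        component_id="c-0001",
        category="buttress",
        model_uri="repo://models/buttress/standard-a.ifc",
        author="Example Engineering Ltd.",
    )
    print(buttress.is_certified())  # False until a certifying body signs off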

Partnerships are Essential

The project we have in mind is large and we are looking for qualified partners in the engineering, construction and oil and gas industries, and with government agencies, to begin developing initial repositories of 3D models of the physical world.

By structuring these repositories during the design phase, and maintaining and adding to this information in real time from on-site cameras, we will be able to refine and prove CWE concepts and get closer to delivering on the promise.

Gradually, throughout the assembly and construction phases we will build a database that tracks the real world from cradle to grave. Analyzing these databases of objects and traces of physical world changes with Big Data tools will render improvement and maintenance insights previously impossible to extract from disjointed, incomplete records. We believe that such a collaborative project will pave the way towards self-repairing, sentient systems.

We look forward to hearing from those who are interested in testing the concepts in this post and collaborating towards the development of unprecedented Collaborative Work Environments.  




New Augmented Reality Case Studies Suggest Productivity Improvement

In the future, Augmented Reality could play a role in a variety of production or assembly processes. At one extreme, it can provide support for those working on individual, custom products made in mom-and-pop shops or by specialized welders on location. At the other extreme, Augmented Reality can also play a role in high-volume, low-mix manufacturing in factories full of automated and specialized machines.

In highly automated production facilities, workers are few and far between. Their role is to anticipate and respond to the needs of machines. These machines usually have dozens or even hundreds of sensors continually capturing information about the machine’s activities in the real world.

In today’s factories, most sensor data is sent directly to a control room. Human operators receive alerts or make decisions based on raw readings or on algorithms that analyze the sensor observations, and then go to the machine to perform planned and unplanned procedures on the equipment. The operator travels between the control room and the production machinery to determine the status as procedures are implemented. There may be changes in the data while the operator is in transit. The operator may make mental errors, forget or invert data when transcribing observations or once at the machine.

New case studies recently released by AREA member DAQRI provide a glimpse into the future.

Kazakhstan Seamless Pipe Steel Operators See More

A team of DAQRI solution architects visited the Kazakhstan Seamless Pipe Steel (KSP Steel) factory in Pavlodar, Kazakhstan and studied the problems facing machinery operators up close. They then developed and demonstrated an application for Hot Rolling Mill Line optimization using the DAQRI Smart Helmet (DSH).

Live machine performance data could be seen in real time by those using the DSH when on the shop floor. The factory supervisor remarked that this technology has the potential to “decentralize” the control room and reduce the time for workers to respond to machinery performance data.

The results of the demonstration suggest that using Augmented Reality in the manner implemented by this project could reduce downtime by 50% and increase machine operator productivity by 40%.

More information about this project and a video of the DSH in use are available on the DAQRI web site.

HyperLoop Welders Receive Support on the Spot

A project involving the DSH on the HyperLoop, a transportation system conceived by Elon Musk and being prototyped in 2016, demonstrates another use case with a great deal of potential to offer productivity gains.

In a proof of concept with HyperLoop engineers and the DSH Remote Expert application, experts in a central “command” center view live video coming from remote robotic welders. The supervising engineer in the Los Angeles office sees construction progress and provides audio and telestration guidance while a welder performs a very specific spot weld. The description of the project and a video of the DSH in use are also available on DAQRI’s web site.

Tip of the Iceberg

These case studies reveal the potential for dramatic productivity improvements when workers are equipped with Augmented Reality-assisted systems such as the DSH.

Other enterprise customers are testing the use of Augmented Reality for manufacturing and production of a wide range of products. Stay tuned! New case studies with details about the potential for significant customer benefit will soon be coming to light.

If you have a case study that you would like to share, provide a link to it in the comments of this post or contact the AREA’s editorial team. We will be happy to support the preparation and publication of your case studies and testimonials.


 




Customers Are in Focus at Augmented World Expo

By Christine Perey and Ketan Joshi

Every enterprise AR project is a tremendous learning experience. While every enterprise AR project requires a team, there’s always that shining hero without whose commitment the project would not have come into existence. These heroes of enterprise AR will be the focus of attention during a full day of sessions of the Augmented World Expo 2016 Enterprise AR track.

The in-house managers of the first enterprise AR projects at customer organizations are a special breed. They are special by virtue of their vision, their passion, their persistence and their ability to span many disciplines and stakeholders.

On the one hand they must master dialects of an emerging “Augmented Reality” language that vendors speak, from the nitty gritty details of tracking technology to the subtleties of interactions like hand gestures and voice commands. On the other, they must know when and how to manage their company’s internal IT department priorities and constraints.

And they are rarely recognized for their role in bringing Augmented Reality from science project to enterprise-ready solution.

Bringing the Best and Brightest to the AWE Stage

The AREA is hosting the AWE Enterprise AR track. June 2 will be dedicated to presentations by, and discussions with, extraordinary enterprise project managers as they share their important AR project achievements.


While AREA members will bring these pioneering enterprise project managers to the AWE stage, we are sure there are many others who have gone unnoticed.

  • Are you a leader in a company that has been testing enterprise AR?
  • Did you sacrifice nights, weekends and holidays to make sure that your project stayed on course and could continue?
  • Do you feel you’ve had to reset every goal and yet have never forgotten the ultimate benefits that your company could gain from enterprise AR introduction?

We hope you will let us know if you are one of this special breed, or if you know a manager at a customer company who has such experiences to share.

A Simple Framework

During these AREA-hosted Enterprise AR track sessions, AWE delegates will learn about a variety of unique enterprise Augmented Reality pilot projects and deployments. The presentations will follow a framework that will provide practical guidance to those who will follow in their footsteps.

The case studies will cover:

  • Use cases
    • Tasks or processes prior to AR implementation and selection criteria
  • Custom or off-the-shelf tools and services used in the project
    • Selection process of project partners
  • Project time and resource requirements
  • Demonstration or a video of the solution in action
  • Project outcomes and their measurement
  • Future plans

With your support, we are looking forward to identifying and bringing together the heroes of enterprise AR projects and celebrating their achievements on June 2.