
Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while they work with both hands, or better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR provides.

Users can even suspend metadata at a point in 3D space, such as the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects, and instruments tethered to the space. Notifications, such as gowning requirements or biohazard warnings, automatically pop up as the operator walks in, enriching the environment with information that’s useful to them.

“It’s all about enhancing the user experience,” says Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for users to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”
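The dispatch idea behind a finger menu can be sketched in a few lines. The sketch below is purely illustrative (the `FingerMenu` class, finger names, and action labels are invented here, not Apprentice's API): each finger on one hand is bound to a workflow action, and a thumb-to-finger "snap" reported by the gesture recognizer triggers the bound action.

```python
# Hypothetical sketch of a "finger menu" dispatcher. The class and action
# names are illustrative; a real system would receive snap events from the
# headset's hand-tracking gesture recognizer.

class FingerMenu:
    FINGERS = ("index", "middle", "ring", "pinky")

    def __init__(self):
        self._actions = {}  # finger name -> callable

    def bind(self, finger, action):
        """Attach an action button to one finger of the hand."""
        if finger not in self.FINGERS:
            raise ValueError(f"unknown finger: {finger}")
        self._actions[finger] = action

    def on_snap(self, finger):
        """Called when the thumb 'snaps' against `finger`; runs its action."""
        action = self._actions.get(finger)
        return action() if action else None

# Example bindings for one workflow step
menu = FingerMenu()
menu.bind("index", lambda: "next step")
menu.bind("middle", lambda: "repeat instruction")
menu.bind("ring", lambda: "capture data")

print(menu.on_snap("index"))  # -> next step
```

Because each finger carries its own action, the operator can advance a workflow without reaching for a floating menu that their arm might occlude.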

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”




Rokid displayed their AR glasses at AWE 2022

Liang Guan, General Manager at Rokid, enthusiastically stated:
“Numerous top tech companies are currently exploring AR, XR, or the metaverse. Rokid has been proactively expanding our AR product pipeline since as early as 2016, across the leading technological areas of optics, chips, smart voice, and visual imaging. Today, X-Craft is deployed in over 70 regions and Air Pro is in use at more than 60 museums around the world. Moving forward, Rokid will keep delivering real value to enterprises through its line of AR products.”

Rokid products empower the frontline workforce, providing real-time analysis, views, and documents to the control center. Many members of the media and attendees who tried Rokid products came away impressed, saying that the various control modes of the Rokid AR glasses are convenient to operate and can effectively improve work efficiency.

Rokid X-Craft, demonstrated live at AWE 2022, has officially received ATEX Zone 1 certification from TÜV Rheinland Group, making it the world’s first explosion-proof, waterproof, dustproof, 5G- and GPS-supported XR device. This is not only a great advance in AR and 5G technology but also a breakthrough for explosion-proof AR applications in the industrial field. Many users who tried the safety headset at the event said it is comfortable to wear and highly competitive in the market. It not only effectively ensures the safety of frontline staff but also helps oil and gas fields increase production capacity.

Rokid Air Pro, a powerful pair of binocular AR glasses, features voice control to help users enjoy a wide variety of media, including games, movies, and augmented reality experiences. Rokid Glass 2 provides real-time analysis, views, and documents to the control center, and has successfully improved traffic management and prevention to ensure the long-term stability of the city.

 

 




Contextere launches Madison, an insight engine for the frontline industrial workforce

Each day, millions of men and women in industrial organizations throughout the world spend over 30% of their workday on non-productive time (NPT) activities[i]. They are not idly wasting time; rather, they are actively trying to find the right information, waiting for guidance, or attempting to coordinate with other work teams. While this condition exists throughout companies and across industries, it is particularly endemic within technical maintenance and operations activities. Compounding the situation, once these workers have the information they need, they are still likely to do the job incorrectly 25% of the time[ii]. This elevated human error rate (HER) results in costly rework as well as increased potential for catastrophic equipment failure and human injury.

The causes of high NPT and HER on the industrial frontline can be as varied as the companies that experience the problem. In most organizations, data is trapped in silos and contextual relevance across functional domains and activities is often lost. Information technology systems remain disconnected from operational technology systems which keeps critical insights from reaching workers on the last tactical mile. And despite massive corporate investment in data capture and analytics, the application of that information remains constrained to headquarters operations – enterprise efficiency and production optimization, and capital equipment investment planning. Frontline workers rarely have access to information that may be relevant to their own decision-making and activities.

Exacerbating the obstacles outlined above is a fundamental structural issue that continues to impact companies – a workforce skills gap. The adoption of new technologies and shifts in demographics have been radically transforming the way that organizations conduct business and the type of skills needed in their workforces. Accelerating workforce retirement and an overly long time to proficiency when onboarding new staff are resulting in the loss of tacit expert knowledge and a lack of skilled personnel. This skills gap worsens NPT and HER as smaller teams of more inexperienced workers must maintain, repair, and operate increasingly complex equipment with less knowledge and fewer resources available to them.

The Madison Insight Engine, recently launched by AREA member Contextere, is the first solution to combine data extraction, machine learning, and natural language understanding to provide insights and decision support to frontline technical workers maintaining, repairing, operating, and manufacturing complex equipment. This capability enables industrial workers to get the job done right the first time, develop their knowledge and skills on the job, and improve their productivity and safety.

In recognizing Contextere in its 2020 Cool Vendors for the Digital Workplace report[iii], Gartner noted that analytics and insights engines “typically focus on the needs of desk-based workers in large organizations,” whereas the Madison Insight Engine is unique in that it “uses context alone to proactively deliver all of the relevant information needed to complete a task” regardless of user location or domain.

Madison applies machine learning together with conversational natural language processing to deliver curated guidance proactively and predictively to a technician or analyst in an industrial setting based on their evolving local real-time context. The focus of Madison algorithms is to determine and deliver just the right piece of information – a reductionist approach to curating the vast amount of available enterprise data.
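The "reductionist" curation idea can be illustrated with a toy example. This sketch is not Contextere's actual algorithm (a real engine would use ML and NLP over enterprise data, not tag overlap); it only shows the shape of the problem: score many candidate snippets against the worker's evolving context and surface just the single most relevant one.

```python
# Illustrative sketch of context-driven curation: pick the one snippet whose
# tags best match the worker's current context. Snippet texts and tags are
# invented for the example; real scoring would use learned representations.

def score(snippet_tags, context_tags):
    """Simple relevance score: size of the tag overlap."""
    return len(set(snippet_tags) & set(context_tags))

def most_relevant(snippets, context_tags):
    """Reductionist step: of all available data, return only the top item."""
    return max(snippets, key=lambda s: score(s["tags"], context_tags))

snippets = [
    {"text": "Torque spec for pump flange bolts",
     "tags": ["pump", "torque", "maintenance"]},
    {"text": "Lockout/tagout procedure",
     "tags": ["safety", "electrical"]},
]
context = ["pump", "maintenance", "repair"]  # derived from the worker's task
print(most_relevant(snippets, context)["text"])  # -> Torque spec for pump flange bolts
```

The point of the reduction is the return type: not a ranked search page, but one piece of guidance delivered at the moment of need.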

Industrial organizations across the globe are seeking to address productivity issues and a widening skills gap in their frontline workforces. By providing critical information proactively, when and where it is needed, the Madison Insight Engine enables each industrial worker to continuously grow their knowledge and competency on the job, perform their tasks safely, and be their productive best. In turn, companies receive the benefit of effective workforce development, maximum equipment uptime, and optimal human-machine performance. To learn more and see a demonstration of Madison, go here.

[i] Slaughter, A., Bean, G., & Mittal, A. (2015, August 14). Connected barrels: Transforming oil and gas strategies with the Internet of Things. Retrieved from http://dupress.com/articles/internet-of-things-iot-in-oil-and-gas-industry/

[ii] Lyden, S. (2015). First-Time Fix Rate: Top 5 Field Service Power Metrics. Retrieved from https://www.servicemax.com/uk/fsd/2015/04/13/first-time-fix-rate-field-service-metrics-that-matter/

[iii] https://www.gartner.com/en/documents/3985043




AREA ED Explores Immersive Technologies on Mouser Podcast

What does the term “Immersive Technologies” encompass? And how are these technologies evolving to solve more and more business needs? Mouser Electronics’ The Tech Between Us podcast took up these questions – and more – recently when host Raymond Yin spoke with AREA Executive Director Mark Sage.

 

Mark and Raymond take a closer look at everything from remote assistance and guidance to digital twins and remote collaboration. Immerse yourself in this lively discussion. Listen here.




XR at Work Podcast is Here to Talk Shop with AR Practitioners


We got together with Scott and Dane recently to learn more about the podcast and what they hope to accomplish with it.

AREA: Before we get into XR@Work, could you tell us what you do for a living?

Scott: I’m a Principal XR Product Manager for WestRock, a global consumer packaging manufacturing company. I’m responsible for all things XR-related for our 300 factories and our customer interactions.

Dane: I’m on the business transformation team for INVISTA, a polymer manufacturing company and subsidiary of Koch Industries. I lead XR and digital twin within INVISTA and I also lead the Republic of Science, a community of practice across Koch for XR technologies.

AREA: How did you two meet up?

Dane: We were both on a panel at AWE on real-life practitioners and Scott and I hit it off really well. There’s a fair number of people looking to get into the XR space that don’t have anybody else to reach out to, other than a vendor. Scott and I had conversations about how hard it is getting started and that’s what led to the podcast.

AREA: And when did the podcast start?

Scott: I think it was November of last year.

AREA: What’s the mission of XR at Work?

Scott: What Dane said is absolutely true. New folks starting off in Extended Reality in the workplace are being asked to do something that is still emerging, that can be confusing, and that has a lot of misinformation around it. So our goal is to do two things with XR at Work. Number one, we want to provide insight and guidance to XR practitioners in enterprise. And second, we want to foster and build a community of Extended Reality professionals that work in industrial environments – everything from oil and gas to manufacturing to automotive to logistics. The idea is to get us together to share ideas and best practices.

AREA: So your focus is really complementary to what the AREA focuses on. We’re both serving the enterprise, but XR at Work is more exclusively targeting industrial companies.

Scott: Yeah, I think that’s a fair assessment.

AREA: Where do interested people go to check out XR at Work?

Scott: We have two main places where people can connect with us. Number one is LinkedIn. We have an XR at Work company page where we invite folks to follow us. On that LinkedIn page, we will post when we have a new podcast up or we speak somewhere or we see new opportunities. The second place is YouTube.

AREA: For people who haven’t seen the podcast, what can viewers expect? What’s the range of topics discussed?

Dane: We’ve started with pragmatic discussions around core AR/VR applications and topics, such as remote assistance, guided workflows, and how to scale. More recently, we’ve started doing interviews with people who work in the industry. No offense to vendors, but our goal is to keep it community-focused around the practitioner side of the house. We want to hear from people who are already working with XR – what’s working for them, what’s not, where the field is heading, the whole metaverse concept. We’re also thinking about adding things like hardware reviews, although we want to be careful to keep it community-focused and not be beholden to somebody because they sent us a headset. That’s the key to us – to be authentic.

AREA: It sounds like the range of content really goes from helping people get started in XR to sharing tips and techniques for people who already have some proficiency. What are your long-term goals for the podcast?

Scott: In addition to the stuff Dane talked about, we’re looking at taking part in some larger events, doing a live broadcast from an event this year. We want to be seen as everyman’s XR thought leaders. We live and breathe in the factory and rugged environments, putting devices on the heads and in the hands of industrial workers. Our goal is to be seen as the go-to friendly voice in the wilderness for a community that’s trying to find real answers – not the answers they get from sizzle reels or market videos or salespeople.

AREA: I would presume you’re also hoping to learn from this – so that you can apply new ideas to your “day jobs.”

Dane: XR at Work does give us access to other people who are doing things. A lot of the stuff in the XR space is really hard. How do you manage headsets at 300 facilities like Scott’s doing? How do we go ahead as a business if our favored headset is being discontinued? There are a lot of challenges you run into as you’re managing this across a business. This gives us a chance to talk to other people who have maybe thought differently about it and we can learn from. We also like to understand what’s coming in the hardware space, so my hope is that we can be a partner to people building products to offer them insights to support product development.

Scott: We look forward to building a community and interacting more with the members of the AREA.




Masters of Pie Wants to Hear About Your XR Collaboration Experiences and Plans


The Masters of Pie team is especially interested in hearing from IT managers and C-level executives knowledgeable about the broad application of XR collaboration use cases across their businesses. They’re seeking input from leading companies in a broad range of industries, including manufacturing/engineering, construction, healthcare, defense, and energy. Even organizations that are just beginning to adopt immersive technologies are invited to participate.

 

To take part, please visit the survey site and submit your information by April 20. Thank you for helping further the AR ecosystem’s understanding of how XR collaboration is gaining traction.




AREA Member Apprentice.io Raises $100M for Pharma AR Platform


Tempo brings the transformative power of technology to an industry that is still largely paper-based. It accelerates the entire drug production lifecycle by orchestrating manufacturing across global teams and sites with one shared platform.

 

Tempo also expands Apprentice’s footprint in the AR space. It enables manufacturing operators to use AR to:

  • Reduce human error as operators follow audio or text instructions enhanced with added photo, video, or AR overlay directions that are specific to their work environment or equipment, making each workflow step clear.
  • Increase efficiency and overcome production delays by supporting cross-team collaboration and remote support through video conferencing that utilizes AR directional tools such as live drawing, arrows, and laser pointers.

 

Apprentice leverages AR headsets to empower operators and scientists in the lab and manufacturing to work with greater efficiency and speed, without having to reference cumbersome paper-based procedural manuals or record handwritten documentation. Using voice commands and intelligent data capture, operators can easily access their procedures using their headsets. They can intelligently collect, store or reference critical data as they go, without any interruption to their workflow. With 1,500+ devices deployed, Apprentice believes it has the largest wearables deployment in enterprise manufacturing.

 

“This recent funding is a testament to the power of Augmented Reality,” says Angelo Stracquatanio, CEO of Apprentice. “AR and wearables have long held the promise to change the way we work. With pharma manufacturing, we’ve found a meaningful application of this technology that truly helps the operator execute better – for the benefit of patients everywhere.”

 

Apprentice is also expanding into Europe and Asia and continues to grow the company to further fuel its 12-fold revenue growth and sixfold growth in employees. Learn more here.




Jon Kies Explores the Potential of the AREA Human Factors Committee

AR and Human Factors

AREA: What does Human Factors in Augmented Reality encompass?

Kies: Human Factors is the study of humans, from both cognitive and physical perspectives. We investigate how humans interact with devices, applications, and services, and incorporate those insights into the design of systems. In the case of AR, it’s especially important because you may be wearing a device on your head, and interacting via an interface overlaid on the real world.  This is arguably one of the most challenging design problems.

 

AREA: Do we still have a lot to learn about the Human Factors implications of AR?

Kies: That’s absolutely the case. The technology is still evolving. Many current devices can’t be used for a significant amount of time. It’s going to get there, but there are some technical hurdles that need to be resolved. That’s why it’s super-important that human characteristics become part of the requirements and are factored into the device design process.

 

AREA: How much of our past user experience knowledge is relatable to AR, and how much is starting from scratch?

Kies: We’re not entirely starting from scratch. A lot of people in the field have experience designing for 2D interfaces like smartphones. But you then have to translate that to a spatial computing paradigm where everything is not only in 3D, but also superimposed on the real world. That’s unlike a smartphone or a PC, where the interface is primarily contained in a rectangle. That’s what makes AR enormously challenging compared to working with other computing platforms. But there has been a lot of research in AR and VR in the military and universities, so there’s a lot to glean from those areas, and established human-centered design processes are still relevant.

 

AREA: What’s your top priority for the AREA Human Factors Committee this year?

Kies: Our overriding goal is to identify and develop best practices to help ensure the best possible AR user experience. In pursuit of that goal, our number-one priority is to engage more with academic research labs – to invite them to share their findings with the AREA membership. They are often experimenting with or building the latest technologies and they’re learning a great deal from their studies. Another thing we’re discussing is compiling a set of unique human-centered design practices that are pertinent to AR systems. And of course, we always want to get more AREA members involved in the Committee.

 

AREA: What’s your pitch for why AREA members should get involved in the Human Factors Committee?

Kies: My bias is toward conversation. Having meetings that act as a forum where people can talk about the challenges they’re facing, the successes they’ve had, and just connect – that’s a compelling reason to participate. By participating in Human Factors Committee meetings, end-user members have an opportunity to hear about other members’ experiences and lessons learned and apply that knowledge to their own efforts. For AR solutions providers, it’s an opportunity to get direct feedback from the AR user community.  We also hope that concrete deliverables, like guidance on design, will enable AREA members to optimize their enterprise AR solutions for their target users.

 

It’s all about making connections and enabling dialogue – between users and providers, between the AR ecosystem and academic institutions – to everyone’s benefit. We’d like to build out a vibrant AR Human Factors community where people are learning from each other, contributing ideas, highlighting new discoveries, and finding solutions.

 

If you’re an AREA member and would like more information about joining the AREA Human Factors Committee, contact Jonathan Kies or AREA Executive Director Mark Sage. If you’re not yet an AREA member but interested in AR human factors and design, please consider joining; you can find member information here.

 




AREA Safety Playbook Offers Step-by-Step Guide to Protect Workers

The Augmented Reality Best Practice Safety Playbook discusses:

  • Risk factors to consider when using AR systems in work environments
  • Risk assessment tools and methods
  • Usability considerations
  • User medical evaluation criteria
  • Cleanliness and disinfection procedures
  • Safety awareness training, and more

 

“Enterprise AR often brings new devices, new working methods, and new modes of user interaction into the workplace. With that in mind, organizations adopting AR need a thorough understanding of health and safety risks and how best to mitigate them,” said Mark Sage, Executive Director, the AREA. “The playbook helps organizations avoid safety issues before they occur and helps ensure AR solutions meet an organization’s expectations for productivity and cost savings.”

 

The AREA Safety Committee provided expert input and insight to produce the playbook.

 

Download the Augmented Reality Best Practice Safety Playbook for more information and a list of contributors. To learn more about AREA membership and the work of the AREA Safety Committee, please get in touch with AREA Executive Director Mark Sage at mark@thearea.org.

 

About AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization dedicated to the widespread adoption of interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to as-yet unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. AREA is a managed program of Object Management Group® (OMG®). Visit https://thearea.org for more information.

Note to editors: Object Management Group and the OMG acronym are registered trademarks of the Object Management Group. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 

Media Contact:

Karen Quatromoni

Karen@omg.org




Microsoft Power Apps Make AR Part of Core Business Solutions

Moreover, Uitz and Pile expect enterprise users to add those Augmented Reality / Mixed Reality capabilities themselves – without help from AR solutions developers.

 

The key is Microsoft Power Apps.

 

“Power Apps is a low-code, no-code application platform,” explained Uitz. “It enables anyone to quickly and easily build and deploy sophisticated applications by using drag-and-drop controls to pull in data from any data source.” Introduced in 2018, Power Apps got its first Mixed Reality capabilities in 2020.

 

“We added straightforward Mixed Reality capabilities that enable you to build sophisticated, device-centric applications that leverage a phone’s built-in sensors to use MR to see images and models in a space as well as measure things,” said Uitz. “We’ve seen many customers leveraging their mission-critical Power Apps business applications for huge improvements in their workflows.”

 

According to Uitz and Pile, the AR-enhanced Power Apps applications tend to focus on three areas. The first is sales team enablement. For example, salespeople are using Power Apps’ MR capabilities to help their customers visualize their products in their environment before they buy. A consumer packaged goods company salesperson could use AR to show a retailer how their product would look when installed in their stores and what it would mean in terms of incremental sales. That visualization can help close deals.

 

The AR visualization capabilities can be useful post-sales, as well. For example, a company can provide the installation team with images from various angles showing exactly where a product – visualized at real-world scale in their customer’s site through AR – needs to be installed.

 

Microsoft is also seeing its customers embrace the new AR capabilities for measuring applications. Armed with just a mobile phone, a flooring contractor can quickly measure an area to provide an accurate estimate of the amount of flooring needed for the space. Because it’s integrated with Power Apps, Dynamics 365, and Dataverse, it can be set up to support the entire business workflow.

 

“The user can open the Power Apps application, press ‘measure,’ take the measurements, press ‘submit,’ and the pricing calculations are done automatically, captured in the system, and an estimate email automatically sent to the customer,” said Uitz.
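The pricing step in that workflow is ordinary arithmetic once the AR measurement is captured. The sketch below is a hypothetical back end for the flooring example (the rate, waste allowance, and function name are invented for illustration, not part of Power Apps): measured area in, a material quantity and priced estimate out.

```python
# Hypothetical sketch of the estimate calculation behind the flooring example.
# Inputs would come from the Power Apps measure control; the rate and waste
# factor here are invented values for illustration.

def build_estimate(area_m2, rate_per_m2, waste_factor=0.10):
    """Return (material_m2, price): measured area plus a waste allowance,
    priced at the given rate per square meter."""
    material = area_m2 * (1 + waste_factor)
    return material, round(material * rate_per_m2, 2)

material, price = build_estimate(area_m2=42.0, rate_per_m2=25.0)
print(f"Order {material:.1f} m² of flooring; estimate: ${price}")
```

In the scenario Uitz describes, this calculation would run automatically on submit, with the result recorded in the system and emailed to the customer.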

 

Another popular measuring use case is auditing. For example, users can use the measuring capability to confirm that a building is compliant with their local building code, including enough space for egress and sufficient lines of sight. This can save hours of time and effort doing physical measurements and recording data by hand.

 

“We’re all about democratizing Mixed Reality – making it another tool in the worker’s toolbox,” said Uitz. On top of that, Power Apps provides enterprise security and enterprise scalability, so a user-developed AR-enabled application can easily ramp up from a small, local trial to an enterprise-wide deployment without difficulty.

 

The democratizing of Mixed Reality extends to the hardware requirements, as well. Power Apps’ MR capabilities do not require a HoloLens; they work with any iPhone, iPad, or Android phone that supports ARKit / ARCore.

 

“The goal is to get companies to integrate MR capabilities into their mission-critical workflows to make their lives better,” said Uitz.

 

As these capabilities become better known, Uitz and Pile are expecting more Power Apps users to take advantage of the ability to: view and manipulate 3D content; overlay 3D content and 2D images onto the feed from the camera; measure distance, area, and volume using a device with AR; and identify spaces in the real world through an AR overlay.

 

Meanwhile, Microsoft is continuing to enhance the software to add additional industrial-strength features, and the Power Apps team is open to working with customers to add capabilities for their particular use cases.

 

“More often than not, it’s not a new thing that they want to do,” explained Pile. “It’s something that they’ve always done, but they want to do it faster, or at lower cost, or integrate into existing workflows. That’s where our primary focus is.”

 

Another key focus is getting the word out. Uitz, Pile, and the rest of the Power Apps team have been offering a variety of resources to increase awareness among customers and get them thinking about what AR capabilities can do for their operations. Readers interested in learning more can go here and here.

 

If the Power Apps team is successful, more enterprises will get their first AR experience, not from super-sophisticated “gee-wizardry” AR pilots, but from AR enhancements that deliver immediate value to their everyday solutions.