
AREA Requirements Committee Advances Work at F2F Meeting

The future success of enterprise AR depends on vendors and enterprises having a shared understanding of the hardware, software, and use case requirements for each type of AR solution. Establishing those requirements is the work of the AREA Requirements Committee – and on August 11th, the group convened in Boston for two days of face-to-face meetings to advance their work.

Requirements are essential because they enable enterprises to evaluate what they need to implement an AR solution. At the same time, requirements provide AR hardware and software developers with the input they need to build products that fulfil enterprise needs.

Over the past three months, the Requirements Working Group has been meeting on a regular basis to develop and agree on a set of Global Enterprise AR Requirements. The face-to-face meeting in Boston was tasked with finalizing the first phase of the Global Enterprise AR Requirements.

The Working Group included the following AREA members:

 


The two-day workshop was a great success – and highly productive! Bringing together AREA members from all parts of the AR ecosystem (end users, hardware providers, software providers, standards organizations and academics) created a rich, diverse, focused and expert view of the Requirements needed to successfully deploy an enterprise AR solution.

The team focused on three key areas:

  • Hardware Requirements
  • Generic Software Requirements
  • AR Use Case Requirements (based on the defined AREA Use Cases)

The first order of business was to conduct a detailed review and update of the Hardware and Generic Software Requirements that the Working Group had previously drafted. The Working Group then turned to defining the individual Use Case Requirements. Over the two days, the team succeeded in prioritizing the Use Cases and identifying a common set of requirements.

There was also an opportunity to review the updated AREA Statement of Needs (ASoN) tool, a purpose-built online tool for capturing, storing, updating, and publishing AR Requirements. The group reviewed its functionality and reporting and captured suggested improvements.

At the end of the event, all the participants agreed it was a very useful and informative workshop that should be run on a regular basis. My thanks to the attendees and to the team at PTC, who provided the space and excellent facilities for the workshop.

Watch this space for more information about next steps and the upcoming launch of the AREA Global Enterprise AR Requirements.




RealWear Launches Cloud Offering

RealWear Cloud is a new multi-purpose software offering for IT and business operations. Through the new dashboard, IT and business operations teams can remotely and securely manage their RealWear device fleet. As companies grow their fleets of RealWear devices, RealWear Cloud allows for convenient, low-touch, over-the-air firmware updates, keeping devices secure and company data protected. Working alongside organizations’ existing EMM or MDM software, such as Microsoft Endpoint Manager (Intune), the offering also provides teams with more real-time data and metrics to optimize operational efficiency. RealWear Cloud complements existing EMM/MDM solutions and enables device-specific control and configuration capabilities. It is also the only way to gain trusted and secure access to certified third-party apps designed for RealWear’s product portfolio.

In addition, RealWear is introducing RealWear Cloud Assistance as part of the offering. RealWear Cloud Assistance provides real-time remote technical support and troubleshooting to frontline workers to quickly identify, diagnose, and fix device issues. Reducing device downtime through remote troubleshooting will have a growing impact on company bottom lines: according to VDC Research, each incident of device failure results in 72 minutes of lost or disrupted productivity for frontline workers. Remote support, firmware updates, and data analytics will not only increase productivity but will become necessary as businesses face ongoing talent shortages, which Gartner notes were exacerbated in 2021.

“As a deployment of RealWear devices grows across sites and countries, it’s critical that we provide great IT tools and real-time metrics for those ultimately responsible for the successful deployment of the devices in the field,” said Andrew Chrostowski, Chairman and CEO of RealWear. “We’re capturing data that will drive better decisions. It’s exciting to see RealWear transitioning from a device-centric company to a platform solution company with the introduction of our first software-as-a-service (SaaS) offering.”

RealWear’s previous lightweight device management tool will transition to RealWear Cloud, and current customers will automatically be enrolled in the Basic plan.

“Wearable technologies are becoming more and more mainstream in the enterprise, and making deployments simple and frictionless is one of our key goals,” continued Chrostowski. “Wearables are no longer viewed as a novelty but are now trusted by enterprises to bring value and solve real-world problems.”

About RealWear

As the pioneer of assisted reality wearable solutions, RealWear® works to engage, empower, and elevate the modern frontline industrial worker to perform work tasks more safely, efficiently, and precisely. Supporting over 65,000 devices, RealWear gives workers real-time access to information and expertise while keeping their hands and field of view free for work. Headquartered in Vancouver, Washington and used by 41 of the Fortune 100 companies, RealWear is field-proven in a wide range of industries with thousands of world-class customers, including Shell, Goodyear, Mars, Colgate-Palmolive, and BMW.




Taqtile Completes AR Programme for IAG Airports


The AR technology firm provided direct onboarding during a trial with British Airways, which introduced Taqtile’s products to onsite workers.

Dirck Schou, the CEO of Taqtile, added,

“This unique accelerator program has been a great way to introduce airlines to cutting-edge technologies like Manifest which can help them improve the performance of technicians and engineers immediately”

Additionally, the 10-week programme taught staff how to utilize the Manifest platform. Taqtile’s service runs on spatial computing devices such as Magic Leap and Microsoft’s HoloLens 2; the product also works across multiple devices, including tablets and smartphones.

An onsite worker can access guidance resources such as digital manuals, video guides, holograms, and 3D models, all presented as detailed AR visualizations. Manifest displays digital resources in the field of view (FoV) of a worker’s headset, and the wearer can navigate a spatial interface hands-free.

Schou continued, stating,

“Through demonstrations of our AR-enabled work instruction platform over the 10-week program, airline industry leaders have gained a better understanding of the tangible benefits Manifest is capable of delivering”

For frontline airport staff, Taqtile’s solution helps workers absorb invaluable company-specific knowledge and work more efficiently when performing maintenance tasks.

Taqtile explained how airports could leverage its Manifest solution, with dispersed teams providing live guidance from an operations centre to frontline employees.

Manifest supports several file types, including photos, videos, real-time 3D (RT3D) content, computer-aided designs, and PDFs. Taqtile also teamed up with Microsoft this month to integrate the Azure Remote Rendering platform into Manifest.

The move enables firms to perform large-scale onboarding, training, and operational duties with increased efficiency and engagement. Taqtile and Microsoft achieve this by integrating Azure-powered streaming to enhance RT3D content distribution across Manifest-ready devices.




Magic Leap 2 – Pricing Released


Magic Leap 2 Base

$3,299 (US only)

Magic Leap 2 Base targets professionals and developers who want access to one of the most advanced augmented reality devices available. Use in full commercial deployments and production environments is permitted. The device starts at an MSRP of $3,299 USD (US only) and includes a 1-year limited warranty.


Magic Leap 2 Developer Pro

$4,099 (US only)

Magic Leap 2 Developer Pro provides access to developer tools, sample projects, enterprise-grade features, and monthly early releases for development and test purposes. It is recommended only for internal use in the development and testing of applications; use in full commercial deployments and production environments is not permitted. Magic Leap 2 Developer Pro starts at an MSRP of $4,099 USD (US only) and includes a 1-year limited warranty.


Magic Leap 2 Enterprise

$4,999 (US only)

Magic Leap 2 Enterprise is targeted at environments that require flexible, large-scale IT deployments and robust enterprise features. This tier includes quarterly software releases that are fully manageable via enterprise UEM/MDM solutions. Use in full commercial deployments and production environments is permitted. Magic Leap 2 Enterprise comes with 2 years of access to enterprise features and updates, starts at an MSRP of $4,999 USD (US only), and includes an extended 2-year limited warranty.

Most Immersive

Magic Leap 2 is the most immersive AR device on the market. It features industry-leading optics with up to a 70° diagonal FOV, the world’s first dynamic dimming capability, and powerful computing in a lightweight, ergonomic design to elevate enterprise AR solutions.

Built for Enterprise

Magic Leap 2 delivers a full array of capabilities and features that enable rapid and secure enterprise deployment. With platform-level support for complete cloud autonomy, data privacy, and device management through leading MDM providers, Magic Leap 2 offers the security and flexibility that businesses demand.

Empowering Developers

Magic Leap 2’s open platform provides choice and ease of use with its AOSP-based OS and support for leading open software standards, including OpenGL and Vulkan, with OpenXR and WebXR coming in 2H 2022. The platform also supports your choice of engines and tools and is cloud agnostic. Magic Leap 2’s robust developer portal provides the resources and tools needed to learn, build, and publish innovative solutions.




AREA Human Factors Group Developing an AR & MR Usability Heuristic Checklist

Usability is an essential prerequisite for any successful AR application. If any aspect of the application – from the cognitive impact on the user to the comfort of the AR device – has a significant negative impact on usability, it could discourage user acceptance and limit projected productivity gains and return-on-investment.

But how can organizations pursuing an AR application evaluate a solution’s usability? To answer that question, the AREA Human Factors Committee has undertaken the development of an AR and MR Usability Heuristic Checklist. Driven by Jessyca Derby and Barbara S. Chaparro of Embry-Riddle Aeronautical University and Jon Kies of Qualcomm, the Checklist is intended to be used as a tool for practitioners to evaluate the usability and experience of an AR or MR application.

The AR & MR Usability Heuristic Checklist currently includes the following heuristics:

  • Unboxing & Set-Up
  • Help & Documentation
  • Cognitive Overload
  • Integration of Physical and Virtual Worlds
  • Consistency & Standards
  • Collaboration
  • Comfort
  • Feedback
  • User Interaction
  • Recognition Rather than Recall
  • Device Maintainability

The team is in the process of validating these heuristics across a range of devices and applications. So far, they have conducted evaluations with head-mounted display devices (such as Magic Leap and HoloLens), mobile phones, educational applications, and AR/MR games; see their recent journal article for more information.

To further ensure that the breadth of the AR and MR Usability Heuristic Checklist remains valuable across domains and devices, they are in the process of conducting further validation that will consider:

  • Privacy
  • Safety
  • Inclusion, Diversity, and Accessibility
  • Technical aspects of designing for AR and MR (e.g., standards for 3D rendering)
  • Standards for sensory output (e.g., tactile feedback, spatial audio, etc.)
  • Applications that allow multiple users to collaborate in a shared space
  • A range of devices (e.g., AR and MR glasses such as Lenovo’s ThinkReality A3)

In the coming months, the team will move on to identifying and obtaining applications and/or hardware that touch on the areas outlined above. They will then conduct heuristic evaluations and usability testing with the applications and/or hardware to further refine and validate the Checklist. The final step will be to establish an Excel-based toolkit that will house the Checklist. This will enable practitioners to easily complete an evaluation and immediately obtain results.
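
As a purely hypothetical illustration of what "completing an evaluation and immediately obtaining results" could look like (the actual AREA toolkit will be Excel-based, and its heuristics, checklist items, and rating scale may differ), an evaluation might record a severity rating for each checklist item and average the ratings per heuristic to highlight problem areas. A minimal Kotlin sketch under those assumptions:

    // Hypothetical sketch of a heuristic evaluation tally; the actual AREA
    // toolkit will be Excel-based, and its heuristics and scale may differ.

    // One evaluator's severity rating (0 = no issue .. 4 = severe problem)
    // for a single checklist item under a named heuristic.
    data class Rating(val heuristic: String, val item: String, val severity: Int)

    // Average severity per heuristic, highlighting the areas needing attention.
    fun summarize(ratings: List<Rating>): Map<String, Double> =
        ratings.groupBy { it.heuristic }
            .mapValues { (_, items) -> items.map { it.severity }.average() }

    fun main() {
        val ratings = listOf(
            Rating("Comfort", "Headset remains comfortable after 30 minutes of use", 1),
            Rating("Comfort", "Virtual content stays within a comfortable field of view", 2),
            Rating("Feedback", "System status is visible after each interaction", 0),
            Rating("Cognitive Overload", "Instructions avoid overwhelming the user", 3),
        )
        summarize(ratings).forEach { (heuristic, average) ->
            println("%-20s %.1f".format(heuristic, average))
        }
    }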

Upon completion of the project, the AR and MR Usability Heuristic Checklist will become a vital resource for any organization considering the adoption of AR. If you would like to learn more or have an idea for an application that could be included in this validation process, please contact Dr. Barbara Chaparro or Jessyca Derby.




Vuforia Engine 10.8

Key updates in this release:

  • Advanced Model Target Improvements: Training times for Advanced Model Targets have been optimized and now depend on the number and size of views. Recognition performance for advanced, close-up views has also been improved.
  • Area Target Improvements:
    • The target’s occlusion mesh is now exposed in the C API, which allows native apps to render occluded virtual content in combination with Area Targets as the user moves through the space.
    • Textured authoring models are now created by the Area Target Creator app and the Area Target Capture API, providing an improved authoring experience in Unity. These scans can be loaded into the Area Target Generator for clean-up and post-processing.
    • Area Target tracking data is now compressed and takes up to 60% less space.
  • Unity Area Target Clipping: Area Target previews in the Unity Editor can be clipped based on height, for faster previewing and better visibility of the scanned space during app development.
  • Engine Developer Portal (EDP) Self-Service OAuth UI: OAuth Engine credentials can now be managed from the EDP, eliminating the need for the command line.
  • Notices
    • High Sensor-Rate Permission: Due to new Android permission requirements, developers should add the “high sensor rate” permission to all native Vuforia Engine apps running on Android 12+ for all releases, otherwise VISLAM may not work. Read more about VISLAM tracking here.
    • Permission Handling: The Vuforia Engine behavior of triggering OS-specific user permission requests at runtime is deprecated in 10.8 and will be removed in an upcoming release. All native apps should be updated to manage permissions themselves; the 10.8 sample apps share best practices for this, and an illustrative sketch follows this list.
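
As a rough illustration of both notices (a sketch only, not Vuforia’s sample code – the activity and startArSession() helper are hypothetical placeholders, and the manifest entry assumes the "high sensor rate" permission referred to is Android 12’s standard HIGH_SAMPLING_RATE_SENSORS permission), a native Android app might declare the sensor permission in its manifest and request the camera permission itself at runtime before starting its AR session:

    // Illustrative sketch only – not Vuforia sample code. ArActivity and
    // startArSession() are hypothetical placeholders.
    //
    // AndroidManifest.xml would also declare (assuming the "high sensor rate"
    // permission is Android 12's HIGH_SAMPLING_RATE_SENSORS):
    //   <uses-permission android:name="android.permission.HIGH_SAMPLING_RATE_SENSORS" />
    //   <uses-permission android:name="android.permission.CAMERA" />

    import android.Manifest
    import android.content.pm.PackageManager
    import android.os.Bundle
    import androidx.activity.result.contract.ActivityResultContracts
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.content.ContextCompat

    class ArActivity : AppCompatActivity() {

        // Request the camera permission ourselves, since the Engine will no
        // longer trigger OS permission prompts on the app's behalf.
        private val cameraPermission =
            registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
                if (granted) startArSession() else finish()
            }

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            val hasCamera = ContextCompat.checkSelfPermission(
                this, Manifest.permission.CAMERA
            ) == PackageManager.PERMISSION_GRANTED
            if (hasCamera) startArSession() else cameraPermission.launch(Manifest.permission.CAMERA)
        }

        private fun startArSession() {
            // Hypothetical placeholder: initialize and start the AR session here,
            // once the required permissions have been granted.
        }
    }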



Magic Leap and NavVis Announce Strategic Partnership to Enable 3D Mapping and Digital Twin Solutions in the Enterprise

Combining Magic Leap’s advanced spatial computing platform with NavVis’s mobile mapping systems and spatial data platform, the two companies aim to enhance the use of AR applications across key industries, including automotive, manufacturing, retail and the public sector.

As part of this strategic partnership, NavVis will bring its NavVis VLX mobile mapping system and NavVis IVION Enterprise spatial data platform to Magic Leap’s new and existing enterprise customers with an initial focus on manufacturing. Magic Leap customers will be able to leverage NavVis’s expansive visualization capabilities to generate photorealistic, accurate digital twins of their facilities at unprecedented speed and scale.

The market opportunity for digital twins and other forms of advanced visualization is significant – with demonstrated potential to transform the world of work as we know it. While attention around the potential of the metaverse has put a greater focus on all types of mixed reality technology, AR represents an immediate opportunity for businesses to enhance productivity and improve operational efficiency. Magic Leap’s open, interoperable platform will also enable the metaverse to scale for enterprise applications.

While the Magic Leap 2 platform offers cutting-edge scanning and localization capabilities in real-time on the device itself, NavVis’s technology will allow Magic Leap customers to pre-map and deploy digital twins in large, complex settings that can cover up to millions of square feet – including but not limited to warehouses, retail stores, offices and factories – for a variety of use cases, such as remote training, assistance and collaboration. Such applications will enable companies to reduce operational costs, enhance overall efficiency and democratize the manufacturing workforce of tomorrow.

“We are seeing significant demand for digital twin solutions from our enterprise customer base and are thrilled to partner with NavVis to make our shared vision for large-scale AR applications a reality,” said Peggy Johnson, CEO of Magic Leap. “Coupled with our Magic Leap 2 platform, NavVis’s advanced visualization capabilities will enable high-quality, large-scale and novel AR experiences that business users demand.”

The NavVis partnership is an essential component of Magic Leap’s strategy to cultivate an ecosystem of best-in-class technology partners that will deliver on the promise of enterprise AR, leveraging Magic Leap 2’s powerful, open platform. With a global customer base of more than 400 companies, including the likes of BMW, Volkswagen, Siemens and Audi, NavVis has a proven track record of delivering immediate and long-term value to enterprises looking to modernize their operations.

“Enterprise AR solutions for larger-scale activations will open the door for greater innovation in the workplace,” said Dr. Felix Reinshagen, CEO and co-founder of NavVis. “Our own experience shows that 3D mapping and digital twins are a fundamental foundation for large-scale persistent AR applications. We’re experiencing strong demand across many verticals with industrial manufacturing as a clear front runner. Magic Leap is a world leader in delivering impactful, innovative experiences in these verticals, and we are excited to collaborate with the company to advance this mission and further enable the future of work.”

About Magic Leap

Magic Leap, Inc.’s technology is designed to amplify human potential by delivering the most immersive Augmented Reality (AR) platform, so people can intuitively see, hear, and touch digital content in the physical world. Through the use of our advanced, enterprise-grade AR technologies, products, platforms, and services, we provide innovative businesses with a powerful tool for transformation.

Magic Leap, Inc. was founded in 2010 and is proudly headquartered in South Florida, with eight additional offices across the globe.

About NavVis

Bridging the gap between the physical and digital world, NavVis enables service providers and enterprises to capture and share the built environment as photorealistic digital twins. Their SLAM-based mobile mapping systems generate high-quality data with survey-grade accuracy at speed and scale. And with their digital factory solutions, users are equipped to make better operational decisions, boost productivity, streamline business processes, and improve profitability. Based in Munich, Germany, with offices in the United States and China, NavVis has customers worldwide in the surveying, AEC, and manufacturing industries.




Blippar brings AR content creation and collaboration to Microsoft Teams

LONDON, UK – 14 June 2022 – Blippar, one of the leading technology and content platforms specializing in augmented reality (AR), has announced the integration of Blippbuilder, its no-code AR creation tool, into Microsoft Teams.

Blippbuilder, the company’s no-code AR platform, is the first of its type to combine drag-and-drop functionality with SLAM, allowing creators at any level to build realistic, immersive AR experiences. Absolute beginners can drop objects into a project and, once it is published, those objects will stay firmly in place thanks to Blippar’s proprietary surface detection. These experiences will serve as the foundation of the interactive content that will make up the metaverse.

Blippbuilder includes access to tutorials and best-practice guides to familiarise users with AR creation, taking them from concept to content. Experiences are built to be engaged with via the browser – known as WebAR – removing the friction of, and reliance on, dedicated apps or hardware. WebAR experiences can be accessed through a wide range of platforms, including Facebook, Snapchat, TikTok, WeChat, and WhatsApp, alongside conventional web and mobile browsers.

Teams users can integrate Blippbuilder directly into their existing workflow. Designed with creators and collaborators in mind – whether they are product managers, designers, creative agencies, clients, or colleagues – Blippbuilder lets organisations be united in their approach and implementation, all within Teams. The functionality of adaptive cards, single sign-on, and notifications, alongside real-time feedback and approvals, provides immediate transparency and seamless integration from inception to distribution. The addition of tooltips, support features, and starter projects also allows teams to begin creating straightaway.

“The existing process for creating and publishing AR for businesses, agencies, and brands is splintered. Companies are forced to use multiple tools and services to support collaboration, feedback, reviews, updates, approvals, and finalization of projects,” said Faisal Galaria, CEO at Blippar. “By introducing Blippbuilder to Microsoft Teams, workstreams including team channels and group chats, we’re making it easier than ever before for people to collaborate, create and share amazing AR experiences with our partners at Teams”.

Utilizing the powerful storytelling and immersive capabilities of AR, everyday topics, objects, and content, from packaging, virtual products, adverts, and e-commerce, to clothing and artworks, can be ‘digitally activated’ and transformed into creative, engaging, and interactive three-dimensional opportunities.

Real-life examples include:

  • Bringing educational content to life, enabling collaborative, immersive learning
  • Visualising and discussing architectural models and plans with clients
  • Allowing product try-ons and 3D visualization in e-commerce stores
  • Creating immersive onboarding and training content
  • Presenting and discussing interior design and event ideas
  • Bringing print media and product packaging to life
  • Enabling artists and illustrators to redefine the meaning of three-dimensional artworks

In today’s environment of increasingly sophisticated user experiences, customers are looking to move their technologies forward efficiently and collaboratively. Having access to a comprehensive AR creation platform is a feature that will keep Microsoft Teams users at the forefront of their industries. Blippbuilder in Teams is the type of solution that will help customers improve the quality and efficiency of their AR building process.

Blippar also offers a developer creation tool, its WebAR SDK. While Blippbuilder for Teams is designed to be an accessible and time-efficient entry point for millions of new users, following this validation of AR, organisations can progress to building experiences with Blippar’s SDK. The enterprise platform boasts the most advanced implementation of SLAM and marker tracking, alongside integrations with the key 3D frameworks, including A-Frame, PlayCanvas, and Babylon.js.




Factory layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer – working independently or with a group of colleagues, locally or in remote locations – to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale and in context instantly reveals the clashes, access issues, and missing items that a CAD screen cannot show.

On the shop floor there are literally thousands of pieces of equipment, much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be put together to get a full picture of where a new line, robot cell or workstation will fit.

A catalogue of 3D resources can be snapped to existing 2D factory layouts to quickly realize a rich 3D layout, and advanced positioning makes it very easy to move, snap and align 3D data. Widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools that can be used by non-CAD experts enable you to position, align and snap layout objects quickly, allowing all stakeholders to be involved in the process and improving communication.

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access, and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The experience also helps companies avoid costly layout redesign by enabling all parties involved to review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




AREA Issues RFP for Research on AR-Delivered Instructions for High-Dexterity Work

To date, AREA members have funded 10 AR research projects on a wide range of timely topics critical to the adoption of enterprise AR. Now the AREA is pleased to announce a call for proposals for its 11th research project, which will evaluate the effectiveness of AR for delivery of work instructions for tasks requiring high dexterity. Building on prior research, the project will expand the current state of knowledge and shed light on AR support for tasks that involve high levels of variability and flexibility in completion of a set of manual operations, such as that found in composite manufacturing.

This project will answer questions such as:

  • How does AR for high dexterity tasks differ from other instruction delivery methods?
  • How are users impacted by the delivery of instructions using AR in high dexterity tasks?
  • What are the key factors informing decision-making and driving return-on-investment in delivering work instructions for particularly dexterous, manual tasks?
  • Can AR-assisted work instructions help improve quality, productivity, or waste reduction and/or rework of manufactured parts?

This AREA research project will produce: a report on the efficiency and effectiveness of AR work instruction for tasks requiring high levels of dexterity; a research data set; a video summary highlighting the key findings; and an interactive members-only webinar presenting the research findings to the AREA.

The AREA Research Committee budget for this project is $15,000. Organizations interested in conducting this research for the fixed fee are invited to submit proposals. All proposals must be submitted by 12 noon Eastern Daylight Time on July 1, 2022.

Full information on the project needs, desired outcomes and required components of a winning proposal, including a submission form, can be found here.

If you have any questions concerning this project or the AREA Research Committee, please email the Research Committee.