
Standalone AR/VR headsets are finally ready to make a big leap forward

Key points in the article include:

  • In 2019, Qualcomm signalled that the XR2 was on the cusp of being adopted by AR/VR headset makers.
  • The Snapdragon XR2 and Niantic’s XR2-powered AR glasses were both announced without a finished product, timeline, or imagery to go with them.
  • However, Geekbench 5 results appeared last week for an HTC Vive Focus model with the XR2, and its likely configuration matched the expected specs.
  • Completed XR2-powered headsets will vary depending on the company despite Qualcomm providing the reference platforms and chipset.
  • Horwitz, the writer of the article, believes that the general trend will favour higher-resolution VR displays.
  • Pico’s high-resolution Neo 2 is powered by a Snapdragon 845, so Horwitz expects XR2 headsets to surpass Pico’s quality; the new headsets will likely use 90Hz refresh rates, matching PC-standard display speeds and thereby reducing nausea.
  • Facebook’s Quest achieved complex visuals from the Snapdragon 835, and Qualcomm’s claim that the XR2 has twice the GPU and CPU power of the 835 suggests that XR2-powered titles will rival previous-generation visuals.
  • Since “current-generation” and “entry-level” are moving targets, mobile-class XR2 headsets will not entirely eliminate demand for high-spec hardware; however, the visual gap between tethered and untethered headsets will become less of a concern.
  • Snapdragon XR2’s Artificial Intelligence processing capabilities could also be a key factor in enhancing MR headset performance.
  • Raw quantity matters for AI performance, but so do quality, system-level engineering, and software designed to make full use of these capabilities.
  • AI can further improve MR headset performance by generating solutions to partly novel problems, powering computer opponents, enabling richer voice controls, and segmenting live visuals to blend with digital content.

The article concludes by acknowledging that nothing is currently certain regarding the release of XR headsets due to COVID-19; however, it appears that Snapdragon XR2 headsets will be in stores relatively soon.




AREA member RealWear’s Firmware Release 11.2

Release Highlights:

  • Cloud Sync
    • A new application that enables customers to easily authenticate to cloud storage drives including Microsoft OneDrive, upload tagged photos / videos captured in My Camera and browse cloud drives in My Files
  • Ease of Use
    • Tetrominos, a fun Tetris-like game which helps users get familiar with RealWear’s user interface
    • Wi-Fi Band Control which allows end users or IT Admins to lock their RealWear devices to either the 2.4 or 5 GHz band
    • My Controls Grid View to easily navigate the growing functionality in My Controls
  • Security
    • Android Security Patches from March – July 2020 integrated into the RealWear device firmware
    • Updated Lock Screen which leverages a secure keyboard instead of head tracking
  • Equity and Inclusion
    • Changes in software and documentation terminology in support of equity and inclusion
  • Full Language Support for Traditional Chinese
  • Bug Fixes – As with any release, bug fixes and minor enhancements are incorporated.

See RealWear’s AREA member profile here

Visit RealWear’s website here




Enterprise AR software provider Taqtile graduates from 5G Open Innovation Lab

Founded by Intel, T-Mobile, and NASA, the 5G OI Lab is an international ecosystem of enterprises, startups, developers, government institutions, and academia. Startups aiming to deploy 5G networks are given access to technology, industry resources, and advanced engineering.

General Partner of the 5G OI Lab, Jim Brisimitzis, is quoted as saying that engagement from Taqtile and other members was “phenomenal” throughout. He believes that working with innovative organisations such as Taqtile will increase global adoption of 5G, as well as drive general technological advances.

Taqtile creates AR software for enterprise, equipping frontline workers with the ability to easily perform complex tasks, capture knowledge, and remotely collaborate with experts. The company has claimed that participating in the 5G OI Lab will allow it to greater leverage 5G power, further increasing throughput, improving customer security, and helping to resolve latency issues.

Dirck Schou, CEO of Taqtile, is quoted as saying that August has been a significant month for the company, beginning with the Mixed Reality 2020 Microsoft Partner of the Year Award. Now, after partaking in the 5G OI Lab, the company can progress even further. Taqtile is a current leader in an up-and-coming segment of software that will ensure continuous operation via the cloud, spatial computing, and networking, and fundamentally change how industrial organisations enable workers.




Porsche Triples Down on AR

Mike Boland of AR Insider claims that there are key lessons to be learned from Porsche’s investment, as it indicates that AR is working. Porsche also escaped ‘pilot purgatory’, the state in which enterprises integrate AR but fail to achieve mass adoption, often because of poor communication with frontline workers uncertain about using the technology. By following Atheer’s advice to “think like a marketer”, Porsche has avoided pilot purgatory.

Another way in which Atheer has guided Porsche is by deploying AR in the most impactful areas. For example, Atheer has reported that AR is more useful for guidance than for training, despite VR’s ability to increase knowledge retention in training; AR therefore has a greater impact in non-repetitive jobs.

Amar Dhaliwal, CEO of Atheer, is quoted as saying that Atheer starts by assessing what Porsche is trying to do; if the goal is training-related, Atheer will advise against deploying AR, as it will not deliver a suitable ROI.




AirV Labs Wins Prestigious Manufacturing Innovation Award for Novel Virtual Reality Technology

AirV Labs is a Silicon Valley-based technology company specialising in analytics and VR-based solutions. Its award-winning tool, the AirVu™ Platform, allows a 360° video of a given scenario to be merged with three-dimensional data such as medical charts, automation, and CAD, as well as enterprise-level information such as inventory, bill of materials, and circuit diagrams. The company is able to efficiently prototype advanced cyber learning platforms, increasing the strengths of VR, AR, and MR tech, and producing a new immersive interface.

Faisal Yazadi, CEO of AirV Labs USA, is quoted as saying that winning the prestigious award validates the company’s cost-effective technology, which meets the increasing demands of the industry. He further stated that AirV Labs’s global team is primed to deliver innovative MR projects via partnerships and collaboration with industry leaders. The CEO of AirV Labs India, Chinmay Sengupta, is also quoted as saying that their technology will address the current paradigm shift requiring businesses to re-strategise, enabling companies to sustainably manage global operations while providing their teams with a virtual learning environment.

The founder and global CTO of AirV Labs, Dr. Kesh Kesavadas, stated that the company is honoured to be recognised by NASSCOM as an AR/VR industry leader, as their technology addressing the demand for affordable yet rapid content creation is the result of six years of research at the University of Illinois. He claims that the AirVu™ Platform will deliver immersive learning to meet the requirements of the tech industry, education, and healthcare, especially in the current COVID-19 climate.




K & A partners with vGIS to offer Augmented Reality expertise

AR provides users with an interactive experience of a real world environment, enabling them to visualize digitally-created objects within the real world view. AR can also offer the ability to interact with these objects via smartphones and headset devices.

“By incorporating AR into enterprise GIS data that can be viewed as a natural extension of the real world, users will be able to see their geospatial data around them in the most understandable way possible.”

Ultraleap and Qualcomm announce a multi-year agreement

The leading standalone VR headset, Oculus Quest, has been increasingly focusing on controllerless hand-tracking as a means of input for the device. Other major headset makers, like Microsoft and its HoloLens 2, have also honed in on hand-tracking as a key input method. As industry leaders coalesce around hand-tracking, it becomes increasingly important for competing devices to offer similar functionality.

But hand-tracking isn’t a ‘solved’ problem, making it a challenge for organizations that don’t have the resources of Facebook and Microsoft to work out their own hand-tracking solution.

Ultraleap’s fifth generation hand tracking platform, known as Gemini, will be pre-integrated and optimised on the standalone, untethered Snapdragon XR2 5G reference design, signalling a significant step change for the XR space. The Gemini platform delivers the fastest, most accurate and most robust hand tracking and will provide the most open and accessible platform for developers.

The Snapdragon XR2 5G Platform is the world’s first 5G-supported platform designed specifically for untethered VR, MR and AR (collectively, extended reality or XR). Gemini has been optimised for the Snapdragon XR2 5G platform to allow for an ‘always on’ experience and the most natural interaction in untethered XR.

Steve Cliffe, CEO of Ultraleap, said: “Qualcomm Technologies recognises the importance of high-precision hand tracking in order to revolutionise interaction in XR. The compatibility of our technology with the Snapdragon XR2 5G Platform will make the process of designing hand tracking within a very wide variety of products as simple as pick and place. Qualcomm Technologies is in the position to bring transformation to XR by making state-of-the-art technologies – including 5G and spatial computing – available to a broad market. We are proud to be at the forefront of this fast-growing ecosystem alongside them.”

Hiren Bhinde, Director of Product Management, Qualcomm Technologies, Inc., said: “Hand tracking is becoming a table stakes feature in next-gen XR devices. True immersive XR experiences require seamless, natural and intuitive usage and interaction of the users’ hand when interacting in the digital world as they do in the physical world. Ultraleap’s hand tracking technology enables this seamless interaction through a natural connection between people and technology, which is incredibly important for the next generation of XR devices. We are excited to work with Ultraleap to help deliver more immersive experiences on the Snapdragon XR2 5G reference design.”

Read the original Ultraleap news press release here 

 




What The Future Of Manufacturing Could Look Like With AR/VR (Forbes)

Our community of readers interested in AR in the enterprise is likely to be interested in a recent Forbes Technology Council article written from the experience and perspective of Dan Gamota, who works at a high-tech lab co-located in a Silicon Valley innovation center.

“Going to work was an opportunity to be fully immersed in a continuous learning environment with cutting-edge technologies and some of the best minds in engineering, science and manufacturing. Until, of course, the day we shifted to a work-from-home model. Overnight, we were separated from each other as well as our vital lab hardware, software and tools. Yet we still are developing dozens of critical manufacturing processes, many of which have been transferred, deployed and audited in factories and facilities all over the world.”

The team carried on despite the pandemic, collaborating seamlessly and accelerating innovation by collectively reaching for their augmented reality and virtual reality (AR/VR) headsets.

These tools have already proven indispensable for training production-line operators while guiding them through complex manufacturing operations. In Singapore, for instance, a team of engineers working in our additive manufacturing center uses AR to reduce training time by 50% on complex 3D printers. Similarly, AR helps speed maintenance instruction training and facilitates remote support. Topics covered in the article include Building Cyber-Physical Bridges, Innovation Without Boundaries and Advancing Innovation With Avatars.

Read the full original article here.

 

 




A new 3D approach to remote design engineering

Trying to untangle complex problems remotely from thousands of miles away is fraught with difficulty, even when using products like Microsoft’s Remote Assist. The expert often has to resort to waving their hands around on a screen to show the technician which part of a machine they should be fixing, and which parts should be left alone.

Real-time immersive 3D collaboration is now adding a new dimension to such problem solving – users can share live, complex 3D files such as CAD data, interact with them and reveal ‘hidden’ parts deep within a machine that may be causing an issue. The technology also transforms day-to-day collaboration between remote engineering team members. Design reviews, for example, can be brought to life, with participants ‘walking through’ a model, no matter where they are in the world.

 

The fundamental problem at the root of many of these issues until now has been that enterprise teams have lacked the ability to effectively collaborate in real time using live, complex 3D data. The solution lies in purpose-built framework technology for integrating natively real-time collaboration and immersive device support directly into legacy enterprise software packages.

The key to enabling true real-time collaboration is to start where the data ‘sits’ and ensure that this original data ‘truth’ is the same for everybody when working together, no matter where they are located or what device they wish to use. This way, everyone in the team has the correct and most up-to-date information available.

Whether it is a CAD package, PLM software, an MRI scanner, robotic simulation software or a laser scanning system, many industries are becoming increasingly dependent on spatial data types and digital twins. These complex data formats are usually incompatible or just too cumbersome to use ‘as is’ in existing collaboration platforms such as Webex, Skype, Google docs or Slack – all built primarily for 2D data formats such as video, text or images.

Moreover, the legacy software generating the data itself is unlikely to have any in-built real-time collaboration functionality – forcing enterprise users to resort to one of two methods. One option is to manually export the data, carry out a painful and time-consuming reformatting process, then manually import the newly crunched data into some type of third-party standalone collaboration package. The alternative is to ignore the spatial nature of the data entirely and instead screen-grab or render out 2D ‘flat’ images of the original 3D data for use in a basic PowerPoint presentation or something similar.

Neither of these methods allows teams to efficiently collaborate using a live data truth – i.e. the original data itself instead of a reformatted, already out-of-date interpretation of it. So, both methods only compound the root collaboration problem instead of helping to solve it.

The latest generation of real-time immersive 3D collaboration technology is integrated directly into the host software, grabbing the original data at source before efficiently pushing it into a real-time environment which users can access using their choice of device (VR, AR, desktop, browser or mobile) for instant and intuitive collaboration. End-to-end encryption ensures that even the most sensitive data may be confidently shared across remote locations.

The integration into the host package provides not only a live connection to the data but also a bi-directional connection, meaning that users are still connected to the host software package running in the background. The advantage of this over standalone applications is that it still gives access to core features of the host package – enabling accurate measurement of a CAD model using vertex or spline snapping to the original B-Rep held in the CAD package, for example. All the underlying metadata from the host package is also available to the immersive experience – and annotations, snapshots, action lists or spatial co-ordinate changes can be saved back into the host package.
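The bi-directional pattern described above can be illustrated in miniature. The following Python sketch is purely illustrative, assuming a toy HostPackage standing in for the CAD/PLM application and a CollaborationSession standing in for the immersive review layer; none of the names correspond to a real product API.

```python
# Illustrative sketch of a bi-directional host connection: the session reads
# live model data from the host package and writes annotations back into it.
# All class and method names are hypothetical, not from any real product.

class HostPackage:
    """Stands in for a CAD/PLM application holding the original data 'truth'."""
    def __init__(self):
        self.models = {"fan_assembly": {"vertices": 1204, "revision": 7}}
        self.annotations = {}  # review notes saved back from sessions

    def get_model(self, name):
        return self.models[name]

    def save_annotation(self, model_name, note):
        self.annotations.setdefault(model_name, []).append(note)


class CollaborationSession:
    """An immersive review session with a live, bi-directional host link."""
    def __init__(self, host, model_name):
        self.host = host
        self.model_name = model_name

    def live_model(self):
        # Live read: always the host's current data, never an exported copy.
        return self.host.get_model(self.model_name)

    def annotate(self, note):
        # Write-back: review output lands in the host package, not a silo.
        self.host.save_annotation(self.model_name, note)


host = HostPackage()
session = CollaborationSession(host, "fan_assembly")
session.annotate("check blade clearance at hub")
print(host.annotations["fan_assembly"])
```

The point of the design is visible in the last lines: because the session holds a reference to the host rather than an exported snapshot, the annotation is immediately part of the host package’s data, ready to feed a change request.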

The new post-pandemic requirement to have a distributed workforce – in conjunction with the rollout and adoption of key technology enablers such as server-side rendering and high-capacity, low-latency connectivity – is set to accelerate the adoption and integration of real-time immersive collaboration solutions. In the future, 5G technology will also open up the potential to stream to immersive AR and VR devices – untethering the experience and facilitating factory-wide adoption of immersive solutions. For example, as ‘Industrial Internet of Things’ (IIoT) data streams from smart devices in the factory, it will be overlaid via AR glasses in the physical space. And as cloud service providers build out features such as spatial anchoring to support ever-larger physical spaces, these new services will be used within collaborative environments rich with real-time data.

Factory workers, for example, will have the ability to ‘dial an expert’ directly from a virtual panel on a smart factory device. This offsite expert will appear as a holographic colleague and bring with them live 3D data for that individual machine. Both users will have real-time IIoT data overlaid intuitively on the fully interactive 3D model to facilitate a more effective diagnosis and maintenance process.

Empowering shop-floor workers with hands-free AR and detailed 3D data will dramatically improve assembly line efficiency, with an intuitive environment where product data is fully interactive. Users will be able to move, hide, isolate and cross-section through parts, while using mark-up and voice tools to create efficient instructions for the assembly or disassembly of complex products. These instructions will be recorded and delivered as holographic guides via AR directly on the assembly line.

The next generation of real-time immersive 3D collaboration technology is even set to enable you to have a scaled-down hologram of your latest engine design sitting right in front of you on your desk. As you work on the design and refine it using your CAD software, the changes will be dynamically loaded into the hologram so that you can see the effects immediately and make any further necessary adjustments.

Meanwhile, digital sleeving – with 3D images overlaid on physical designs – will enable you to check how two parts of the engine come together, even when they are being designed by different teams in different locations. Similarly, you will be able to see how, for example, cabling will fit inside your latest aircraft seat design or where best to put the maintenance pockets for easy access.

This kind of approach adds a new dimension to the handoff between design and manufacturing. If adjustments need to be made to a fan assembly design, for example, the relevant part can be isolated within an immersive design review – and speech-to-text notes can be added to the part number and automatically become part of the change request. It’s all a far cry from endless design iterations, spreadsheets and printouts – or CAD screen shares using a 2D representation of a 3D problem.

In the post-pandemic remote world, conferencing is bringing people, video and documents together. Collaboration is now adding the fourth dimension of 3D immersive experience to complete the picture.

 




5 Tips on How AR Smart Glasses Increase Employee Satisfaction – Ubimax

Here are Ubimax’s top 5 tips:

1. SMART GLASSES IMPROVE WORKING ERGONOMICS

Smart glasses allow working with both hands throughout; tablets or notebooks no longer need to be held. This takes the strain off many tasks and prevents poor posture. One-sided strain and joint wear from constantly holding scanners and from repetitive movements are avoided. The workflows themselves are structured and often simplified, and unnecessary steps are eliminated.

2. INCREASED OCCUPATIONAL SAFETY

The ability to use both hands is a major safety advantage of smart glasses. For example, it is a lot safer for a logistics worker to climb a ladder using both hands instead of operating a hand scanner in elevated positions. In addition, warnings about ergonomically questionable situations, for example when unwieldy or heavy parts need to be handled, as is often the case in industrial environments, can be displayed directly on the glasses. The timely display of safety instructions can also reduce the potential for injury in high-risk occupational fields, utilities, or production through the targeted use of AR devices. Step-by-step instructions increase safety by indicating temporary prohibited zones, e.g. for tests in laboratories, or notes on hygiene regulations.

3. WORK FACILITATION THROUGH EXPERT CALLS AND REDUCED TRAVEL ACTIVITIES

Smart glasses enable global collaboration across distances and time zones. In case of problems, experts can easily be consulted via video call on the smart glasses. The expert sees exactly what the employee sees while the employee continues to work hands-free. Collaboration through remote support enables remote training of employees and saves money by reducing the travel expenses of experts. They do not have to travel at short notice but can help their colleagues on site from their own desks. Global collaboration also increases the feeling of belonging and team spirit.

4. ALL AGE GROUPS BENEFIT FROM THE USE OF SMART GLASSES

The use of smart glasses does not only delight young, technology-oriented employees; older employees also benefit from them. For example, heavy, unwieldy scanners are no longer needed for order picking, which makes work easier on the one hand and helps workers reach their targets faster on the other. Furthermore, smart glasses simplify the inclusion of workers with disabilities. The ability to show instructions step-by-step at the employee’s pace and to carry out a subsequent quality check allows them to participate fully in working life.

5. QUICK ONBOARDING DUE TO EASIER KNOWLEDGE TRANSFER

Onboarding training is simplified due to smart glasses. Step-by-step instructions, automatic quality checks, help and expert support enable employees to work independently and productively a lot faster. This not only makes their daily work easier, but also that of their colleagues who do not have to take on the task of training new colleagues in addition to their own work. There are fewer disappointments and negative experiences which reduces the drop-out rate, and quality and productivity reach the level of experienced colleagues much earlier.