
Standalone AR/VR headsets are finally ready to make a big leap forward

Key points in the article include:

  • In 2019, Qualcomm signalled that the XR2 was on the cusp of being adopted by AR/VR headset makers.
  • Both the Snapdragon XR2 and Niantic’s XR2-powered AR glasses were announced without a shipping product, timeline, or imagery to go with them.
  • However, Geekbench 5 results surfaced last week for an XR2-equipped HTC Vive Focus model, and its likely configuration matched the chipset’s specs.
  • Finished XR2-powered headsets will vary from company to company, even though Qualcomm provides the reference platforms and chipset.
  • Horwitz, the article’s author, believes the general trend will favour higher-resolution VR displays.
  • Pico’s high-resolution Neo 2 is powered by the older Snapdragon 845, so Horwitz expects XR2 headsets to surpass its quality. The new headsets will likely use 90Hz refresh rates, bringing display speeds up to PC standard and thereby reducing nausea.
  • Facebook’s Quest achieved complex visuals from the Snapdragon 835; given Qualcomm’s claim that the XR2 has twice the GPU and CPU power of the 835, XR2-powered titles should exceed those visuals.
  • Because “current-generation” and “entry-level” are moving targets, mobile-class XR2 headsets will not entirely eliminate demand for higher-spec hardware; however, the visual gap between tethered and untethered headsets will matter less.
  • Snapdragon XR2’s artificial intelligence processing capabilities could also be a key factor in enhancing MR headset performance.
  • Raw processing quantity matters for AI performance, but so do quality, system-level engineering, and software that actually makes use of those capabilities.
  • AI can further improve MR headsets by generating solutions to partly novel problems, empowering computer opponents, enabling richer voice controls, and segmenting live visuals to blend with digital content (a rough sketch of that last idea follows below).
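To make that last point concrete, here is a minimal sketch of live-visual segmentation for blending camera footage with digital content. It uses Google’s MediaPipe selfie-segmentation model on a desktop webcam purely as an illustrative stand-in – it is not Qualcomm’s XR2 pipeline, and the 0.5 mask threshold is an arbitrary choice.

```python
# Illustrative only: segmenting a live camera frame so digital content can be
# blended with it, as the article describes. MediaPipe's selfie-segmentation
# model stands in for the XR2's on-device AI -- NOT Qualcomm's actual pipeline.
import cv2
import numpy as np
import mediapipe as mp

segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
cap = cv2.VideoCapture(0)

# Placeholder "digital content" to composite behind the user.
virtual_background = np.full((480, 640, 3), (80, 40, 120), dtype=np.uint8)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))
    result = segmenter.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Per-pixel probability that the pixel belongs to the person.
    mask = result.segmentation_mask > 0.5
    # Keep the live person; replace everything else with digital content.
    composited = np.where(mask[..., None], frame, virtual_background)
    cv2.imshow("blended", composited)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```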

The article concludes by acknowledging that, due to COVID-19, nothing about XR headset release dates is currently certain; even so, it appears that Snapdragon XR2 headsets will be in stores relatively soon.




AREA member RealWear’s Firmware Release 11.2

Release Highlights:

  • Cloud Sync
    • A new application that enables customers to easily authenticate to cloud storage drives, including Microsoft OneDrive, upload tagged photos/videos captured in My Camera, and browse cloud drives in My Files (a minimal upload sketch follows this list)
  • Ease of Use
    • Tetrominos, a fun Tetris-like game which helps users get familiar with RealWear’s user interface
    • Wi-Fi Band Control which allows end users or IT Admins to lock their RealWear devices to either the 2.4 or 5 GHz band
    • My Controls Grid View to easily navigate the growing functionality in My Controls
  • Security
    • Android Security Patches from March – July 2020 integrated into the RealWear device firmware
    • Updated Lock Screen which leverages a secure keyboard instead of head tracking
  • Equity and Inclusion
    • Changes in software and documentation terminology in support of equity and inclusion
  • Full Language Support for Traditional Chinese
  • Bug Fixes – As with any release, bug fixes and minor enhancements are incorporated.
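For readers curious what the Cloud Sync upload step involves, below is a minimal sketch of pushing a captured photo to OneDrive through the Microsoft Graph API. This is a generic illustration, not RealWear’s implementation: it assumes you already hold an OAuth access token with Files.ReadWrite scope, and it uses Graph’s simple upload endpoint, which is limited to files under 4 MB.

```python
# Minimal sketch of a OneDrive upload via Microsoft Graph -- illustrative of
# what a Cloud Sync-style feature does, not RealWear's actual implementation.
# Assumes ACCESS_TOKEN is a valid OAuth 2.0 bearer token (e.g. via MSAL) with
# Files.ReadWrite scope; the simple upload endpoint caps files at 4 MB.
import requests

ACCESS_TOKEN = "..."                       # obtained via an OAuth flow
local_path = "tagged_photo.jpg"            # photo captured in My Camera
remote_path = "RealWear/tagged_photo.jpg"  # destination path in OneDrive

with open(local_path, "rb") as f:
    resp = requests.put(
        f"https://graph.microsoft.com/v1.0/me/drive/root:/{remote_path}:/content",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "image/jpeg",
        },
        data=f,
    )
resp.raise_for_status()
print("Uploaded:", resp.json()["webUrl"])  # link to the file in OneDrive
```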

See RealWear’s AREA member profile here

Visit RealWear’s website here




Enterprise AR software provider Taqtile graduates from 5G Open Innovation Lab

Founded by Intel, T-Mobile, and NASA, the 5G OI Lab is an international ecosystem of enterprises, startups, developers, government institutions, and academia. Startups aiming to deploy 5G networks are given access to technology, industry resources, and advanced engineering.

General Partner of the 5G OI Lab, Jim Brisimitzis, said that engagement from Taqtile and the other members was “phenomenal” throughout. He believes that working with innovative organisations such as Taqtile will increase global adoption of 5G, as well as drive general technological advances.

Taqtile creates AR software for enterprise, equipping frontline workers with the ability to easily perform complex tasks, capture knowledge, and remotely collaborate with experts. The company has claimed that participating in the 5G OI Lab will allow it to better leverage the power of 5G, further increasing throughput, improving customer security, and helping to resolve latency issues.

Dirck Schou, CEO of Taqtile, said that August has been a significant month for the company, beginning with the Mixed Reality 2020 Microsoft Partner of the Year Award. Now, after taking part in the 5G OI Lab, it can progress even further. Taqtile is a current leader in an emerging category of software that will ensure continuous operation via the cloud, spatial computing, and networking, and fundamentally change how industrial organisations enable workers.




K & A partners with vGIS to offer Augmented Reality expertise

AR provides users with an interactive experience of a real-world environment, enabling them to visualize digitally created objects within the real-world view. AR can also offer the ability to interact with these objects via smartphones and headset devices.

“By incorporating AR into enterprise GIS data that can be viewed as a natural extension of the real world, users will be able to see their geospatial data around them in the most understandable way possible.”
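To illustrate the idea in that quote: showing geospatial data “around” a user means converting each feature’s latitude/longitude into metres relative to the viewer so an AR engine can anchor it in the scene. The sketch below uses a standard small-area approximation; it is a generic illustration, not vGIS’s actual method.

```python
# Generic illustration (not vGIS's actual method): convert a GIS feature's
# latitude/longitude into local east/north offsets in metres from the viewer,
# which is what an AR engine needs to anchor the feature in the user's view.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def geo_to_local_enu(viewer_lat, viewer_lon, feat_lat, feat_lon):
    """Small-area equirectangular approximation, fine for AR ranges (< a few km)."""
    lat0 = math.radians(viewer_lat)
    d_lat = math.radians(feat_lat - viewer_lat)
    d_lon = math.radians(feat_lon - viewer_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(lat0)  # metres east of the viewer
    north = EARTH_RADIUS_M * d_lat                  # metres north of the viewer
    return east, north

# Example: a buried utility-line vertex roughly 100 m north of the surveyor.
east, north = geo_to_local_enu(47.6062, -122.3321, 47.6071, -122.3321)
print(f"Place AR marker {east:.1f} m east, {north:.1f} m north of the viewer")
```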

Ultraleap and Qualcomm announce a multi-year agreement

The leading standalone VR headset, Oculus Quest, has been increasingly focusing on controllerless hand-tracking as a means of input for the device. Other major headset makers, like Microsoft and its HoloLens 2, have also honed in on hand-tracking as a key input method. As industry leaders coalesce around hand-tracking, it becomes increasingly important for competing devices to offer similar functionality.

But hand-tracking isn’t a ‘solved’ problem, making it a challenge for organizations that don’t have the resources of Facebook and Microsoft to work out their own hand-tracking solution.

Ultraleap’s fifth generation hand tracking platform, known as Gemini, will be pre-integrated and optimised on the standalone, untethered Snapdragon XR2 5G reference design, signalling a significant step change for the XR space. The Gemini platform delivers the fastest, most accurate and most robust hand tracking and will provide the most open and accessible platform for developers.

The Snapdragon XR2 5G Platform is the world’s first 5G-supported platform designed specifically for untethered VR, MR and AR (collectively, extended reality or XR). Gemini has been optimised for the Snapdragon XR2 5G platform to allow for an ‘always on’ experience and the most natural interaction in untethered XR.

Steve Cliffe, CEO of Ultraleap, said: “Qualcomm Technologies recognises the importance of high-precision hand tracking in order to revolutionise interaction in XR. The compatibility of our technology with the Snapdragon XR2 5G Platform will make the process of designing hand tracking within a very wide variety of products as simple as pick and place. Qualcomm Technologies is in the position to bring transformation to XR by making state-of-the-art technologies – including 5G and spatial computing – available to a broad market. We are proud to be at the forefront of this fast-growing ecosystem alongside them.”

Hiren Bhinde, Director of Product Management, Qualcomm Technologies, Inc., said: “Hand tracking is becoming a table stakes feature in next-gen XR devices. True immersive XR experiences require seamless, natural and intuitive usage and interaction of the users’ hand when interacting in the digital world as they do in the physical world. Ultraleap’s hand tracking technology enables this seamless interaction through a natural connection between people and technology, which is incredibly important for the next generation of XR devices. We are excited to work with Ultraleap to help deliver more immersive experiences on the Snapdragon XR2 5G reference design.”

Read the original Ultraleap news press release here 

 




What The Future Of Manufacturing Could Look Like With AR/VR (Forbes)

Our community of readers focused on AR in the enterprise will likely be interested in a recent Forbes Technology Council article written from the experience and perspective of Dan Gamota, who works at a high-tech lab co-located in a Silicon Valley innovation center.

“Going to work was an opportunity to be fully immersed in a continuous learning environment with cutting-edge technologies and some of the best minds in engineering, science and manufacturing. Until, of course, the day we shifted to a work-from-home model. Overnight, we were separated from each other as well as our vital lab hardware, software and tools. Yet we still are developing dozens of critical manufacturing processes, many of which have been transferred, deployed and audited in factories and facilities all over the world.”

The team pressed on despite the pandemic, maintaining seamless collaboration and accelerating innovation by collectively reaching for their augmented reality and virtual reality (AR/VR) headsets.

These tools already have proven indispensable for training production-line operators while guiding them through complex manufacturing operations. In Singapore, for instance, a team of engineers working in our additive manufacturing center uses AR to reduce training time by 50% on complex 3D printers. Similarly, AR helps speed maintenance instruction training and facilitates remote support. Topics covered in the article include Building Cyber-Physical Bridges, Innovation Without Boundaries and Advancing Innovation With Avatars.

Read the full original article here.

 

 




A new 3D approach to remote design engineering

Trying to untangle complex problems remotely from thousands of miles away is fraught with difficulties – even when using products like Microsoft’s Remote Assist. The expert often has to resort to waving their hands around on a screen to communicate to the technician which part of a machine they should be fixing – and which parts should be left alone.

Real-time immersive 3D collaboration is now adding a new dimension to such problem solving – users can share live, complex 3D files such as CAD data, interact with them and reveal ‘hidden’ parts deep within a machine that may be causing an issue. The technology also transforms day-to-day collaboration between remote engineering team members. Design reviews, for example, can be brought to life, with participants ‘walking through’ a model, no matter where they are in the world.

 

The fundamental problem at the root of many of these issues until now has been that enterprise teams have lacked the ability to effectively collaborate in real time using live, complex 3D data. The solution lies in purpose-built framework technology for natively integrating real-time collaboration and immersive device support directly into legacy enterprise software packages.

The key to enabling true real-time collaboration is to start where the data ‘sits’ and ensure that this original data ‘truth’ is the same for everybody when working together, no matter where they are located or what device they wish to use. This way, everyone in the team has the correct and most up-to-date information available.
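A minimal sketch of that “single data truth” principle, under the assumption of a server-authoritative design: one canonical store holds the model state, every edit is applied there first, and all connected clients are notified with a new version number so nobody works from a stale copy. The names below are invented for illustration; this is not any vendor’s actual protocol.

```python
# Schematic sketch of a server-authoritative "single source of truth" for
# collaborative 3D state -- an illustration of the principle, not any vendor's
# actual protocol. Every edit goes through apply(), which bumps the version
# number and notifies all connected clients, so nobody works from stale data.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SharedModelState:
    version: int = 0
    # e.g. part_id -> {"visible": bool, "transform": [...], ...}
    parts: dict = field(default_factory=dict)
    subscribers: list[Callable] = field(default_factory=list)

    def subscribe(self, callback: Callable) -> None:
        self.subscribers.append(callback)

    def apply(self, part_id: str, change: dict) -> int:
        """Apply a client's edit to the canonical state and broadcast it."""
        self.parts.setdefault(part_id, {}).update(change)
        self.version += 1
        for notify in self.subscribers:
            notify(self.version, part_id, change)
        return self.version

# Two remote participants attached to the same truth:
truth = SharedModelState()
truth.subscribe(lambda v, pid, ch: print(f"VR headset  sees v{v}: {pid} {ch}"))
truth.subscribe(lambda v, pid, ch: print(f"desktop CAD sees v{v}: {pid} {ch}"))
truth.apply("fan_blade_3", {"visible": False})  # one user hides a part; all see it
```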

Whether it is a CAD package, PLM software, an MRI scanner, robotic simulation software or a laser scanning system, many industries are becoming increasingly dependent on spatial data types and digital twins. These complex data formats are usually incompatible or just too cumbersome to use ‘as is’ in existing collaboration platforms such as Webex, Skype, Google docs or Slack – all built primarily for 2D data formats such as video, text or images.

Moreover, the legacy software generating the data itself is unlikely to have any in-built real-time collaboration functionality – forcing enterprise users to resort to one of two methods. One option is to manually export the data, carry out a painful and time-consuming reformatting process, then manually import the newly crunched data into some type of third-party standalone collaboration package. The alternative is to ignore the spatial nature of the data entirely and instead screen-grab or render out 2D ‘flat’ images of the original 3D data for use in a basic PowerPoint presentation or something similar.

Neither of these methods allows teams to efficiently collaborate using a live data truth – i.e. the original data itself instead of a reformatted, already out-of-date interpretation of it. So, both methods only compound the root collaboration problem instead of helping to solve it.

The latest generation of real-time immersive 3D collaboration technology is integrated directly into the host software, grabbing the original data at source before efficiently pushing it into a real-time environment which users can access using their choice of device (VR, AR, desktop, browser or mobile) for instant and intuitive collaboration. End-to-end encryption ensures that even the most sensitive data may be confidently shared across remote locations.

The integration into the host package provides not only a live connection to the data but also a bi-directional connection, meaning that users are still connected to the host software package running in the background. The advantage of this over standalone applications is that it still gives access to core features of the host package – enabling accurate measurement of a CAD model using vertex or spline snapping to the original B-Rep held in the CAD package, for example. All the underlying metadata from the host package is also available to the immersive experience – and annotations, snapshots, action lists or spatial co-ordinate changes can be saved back into the host package.
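As a rough picture of what such a bi-directional connection could look like at the API level, here is a hypothetical plugin interface: inbound calls let the immersive session query the live host package (for example, measurements snapped to the original B-Rep), while outbound calls persist annotations back into it. All names are invented for illustration; a real integration would use the host package’s own plugin SDK.

```python
# Hypothetical plugin interface illustrating the bi-directional connection the
# article describes. All class and method names are invented for illustration;
# a real integration would be built on the host CAD package's own plugin SDK.
from abc import ABC, abstractmethod

class HostCadBridge(ABC):
    # Inbound: the immersive session queries the live host package.
    @abstractmethod
    def measure(self, vertex_a: str, vertex_b: str) -> float:
        """Distance in mm, snapped to the original B-Rep in the host package."""

    @abstractmethod
    def get_metadata(self, part_id: str) -> dict:
        """Underlying metadata (material, revision, ...) straight from the host."""

    # Outbound: results of the immersive session are saved back to the host.
    @abstractmethod
    def save_annotation(self, part_id: str, text: str, position: tuple) -> None:
        """Persist an annotation into the host package's own document model."""

def run_design_review(bridge: HostCadBridge) -> None:
    gap = bridge.measure("housing:v17", "rotor:v2")  # exact, not mesh-approximate
    if gap < 1.5:
        bridge.save_annotation(
            "rotor", f"Clearance only {gap:.2f} mm -- review tolerance",
            position=(0.12, 0.40, 0.07),
        )
```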

The new post-pandemic requirement to have a distributed workforce – in conjunction with the rollout and adoption of key technology enablers such as server-side rendering and high-capacity, low-latency connectivity – is set to accelerate the adoption and integration of real-time immersive collaboration solutions. In the future, 5G technology will also open up the potential to stream to immersive AR and VR devices – untethering the experience and facilitating factory-wide adoption of immersive solutions. For example, as ‘Industrial Internet of Things’ (IIoT) data streams from smart devices in the factory, it will be overlaid via AR glasses in the physical space. And as cloud service providers build out features such as spatial anchoring to support ever-larger physical spaces, these new services will be used within collaborative environments rich with real-time data.

Factory workers, for example, will have the ability to ‘dial an expert’ directly from a virtual panel on a smart factory device. This offsite expert will appear as a holographic colleague and bring with them live 3D data for that individual machine. Both users will have real-time IIoT data overlaid intuitively on the fully interactive 3D model to facilitate a more effective diagnosis and maintenance process.

Empowering shop-floor workers with hands-free AR and detailed 3D data will dramatically improve assembly line efficiency, with an intuitive environment where product data is fully interactive. Users will be able to move, hide, isolate and cross-section through parts, while using mark-up and voice tools to create efficient instructions for the assembly or disassembly of complex products. These instructions will be recorded and delivered as holographic guides via AR directly on the assembly line.

The next generation of real-time immersive 3D collaboration technology is even set to enable you to have a scaled-down hologram of your latest engine design sitting right in front of you on your desk. As you work on the design and refine it using your CAD software, the changes will be dynamically loaded into the hologram so that you can see the effects immediately and make any further necessary adjustments.

Meanwhile, digital sleeving – with 3D images overlaid on physical designs – will enable you to check how two parts of the engine come together, even when they are being designed by different teams in different locations. Similarly, you will be able to see how, for example, cabling will fit inside your latest aircraft seat design or where best to put the maintenance pockets for easy access.

This kind of approach adds a new dimension to the handoff between design and manufacturing. If adjustments need to be made to a fan assembly design, for example, the relevant part can be isolated within an immersive design review – and speech-to-text notes can be added to the part number and automatically become part of the change request. It’s all a far cry from endless design iterations, spreadsheets and printouts – or CAD screen shares using a 2D representation of a 3D problem.

In the post-pandemic remote world, conferencing is bringing people, video and documents together. Collaboration is now adding the fourth dimension of 3D immersive experience to complete the picture.

 




5 Tips on How AR Smart Glasses Increase Employee Satisfaction – Ubimax

Here are Ubimax’s top 5 tips:

1. SMART GLASSES IMPROVE WORKING ERGONOMICS

Smart glasses allow working with both hands throughout. Tablets and notebooks no longer need to be carried in one arm, which takes the strain out of many tasks and prevents poor posture. Unbalanced strain and joint wear caused by permanently holding scanners on one side and by repetitive movements are avoided. The workflows themselves are structured and often simplified, and unnecessary steps are eliminated.

2. INCREASED OCCUPATIONAL SAFETY

The ability to use both hands is a major safety advantage of smart glasses. For example, it is far safer for a logistics worker to climb a ladder using both hands than to operate a hand scanner in an elevated position. In addition, warnings about ergonomically questionable situations – for example, when unwieldy or heavy parts need to be handled, as is often the case in industrial environments – can be displayed directly on the glasses. The targeted display of safety instructions can also reduce the potential for injury in high-risk occupational fields, utilities, or production. Step-by-step instructions increase safety by indicating temporarily prohibited zones, e.g. for tests in laboratories, or by showing notes on hygiene regulations.

3. WORK FACILITATION THROUGH EXPERT CALLS AND REDUCED TRAVEL ACTIVITIES

Smart glasses enable global collaboration across distances and time zones. In case of problems, experts can easily be consulted via video call on the smart glasses. The expert sees exactly what the employee sees while the employee continues to work hands-free. Collaboration through remote support enables remote training of employees and saves money by reducing the travel expenses of experts. They do not have to travel at short notice but can help their colleagues on site from their own desks. Global collaboration also increases the feeling of belonging and team spirit.

4. ALL AGE GROUPS BENEFIT FROM THE USE OF SMART GLASSES

The use of smart glasses does not only delight young, technology-oriented employees; older employees benefit from them too. For example, heavy, unwieldy scanners are no longer needed for order picking, which makes work easier on the one hand and helps workers achieve their targets faster on the other. Furthermore, smart glasses simplify the inclusion of workers with disabilities. The possibility to show instructions step by step at the employee’s own pace, with a subsequent quality control, allows them to participate fully in working life.

5. QUICK ONBOARDING DUE TO EASIER KNOWLEDGE TRANSFER

Onboarding is simplified with smart glasses. Step-by-step instructions, automatic quality checks, and on-demand expert support enable new employees to work independently and productively much faster. This not only makes their daily work easier, but also that of colleagues who no longer have to take on training duties in addition to their own work. There are fewer disappointments and negative experiences, which reduces the drop-out rate, and quality and productivity reach the level of experienced colleagues much earlier.

 




Smart Glasses In Surgery: Expert Analysis Outside The Operating Room

Surgical teams around the world consist of doctors with diverse levels of training, experience and expertise. Sometimes, members of those teams need to consult with a specialist about a surgery they’re performing while the patient is on the operating table, to decide the best steps to take in their care.

Historically, an on-call consultant at a hospital where a surgery is being performed would have to don the necessary personal protective equipment (PPE), head into the theatre and give their verdict. Now, thanks to smart glasses technology, there is a much more efficient route forward.

Iristick, a company that makes smart glasses for industrial purposes, has partnered with Rods&Cones, which focuses on remote assistance in the operating theatre, to create a specialist solution. The two organisations have developed a specially designed pair of smart specs customised for use during surgeries to enhance communication and interaction within an operating theatre.

The smart glasses enable a surgeon to share what they are seeing with a remote specialist. Through the glasses’ microphone, and its two cameras with optical zoom lenses, a consultant outside of the operating room can have an unrestricted, close-up view of a surgery as it progresses. Watching the operation unfold, they have the ability to speak to the surgeon and provide real-time feedback and advice.

As the smart glasses are technically classed as a telecommunications device, rather than a medical one, they haven’t had to seek the European CE approval to start being used in hospitals. Currently, they’re being used in the Netherlands, Belgium, Spain and Italy, with plans for further international expansion.

JUST A QR CODE AWAY

“We keep the surgeon in full control over the communication, while all the handling of the cameras is done by the remote expert,” says Rods&Cones founding partner and CEO Bruno Dheedene.

Let’s say a surgeon is implanting a patient with a device that hasn’t been on the market for long, and which as a result they aren’t overly familiar with. The smart glasses feature a QR code scanner that enables a surgeon to dial in an on-call expert, perhaps even somebody from the team that developed the new device, simply by looking at the code.

“YOU JUST HAVE TO ASK A CIRCULATORY NURSE FOR THE QR CODE OF THE PERSON YOU WANT TO CALL.”

“You wash your hands, you start the surgery, and half an hour later you want to get some expert advice from a colleague,” says Dheedene. “You just have to ask a circulatory nurse for the QR code of the person you want to call.”
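Mechanically, the QR code only needs to carry an identifier for the expert to be called; the glasses decode it from the camera feed and hand it to the calling layer. The sketch below uses OpenCV’s built-in QR detector as a generic illustration – start_video_call() is a hypothetical placeholder, not Rods&Cones’ actual signalling.

```python
# Rough sketch of QR-based dial-in: decode an expert's session ID from the
# glasses' camera feed, then hand it to the calling layer. Uses OpenCV's
# built-in QR detector; start_video_call() is a hypothetical placeholder for
# whatever signalling the real platform uses.
import cv2

def start_video_call(session_id: str) -> None:
    print(f"Dialling remote expert session {session_id} ...")  # placeholder

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)  # glasses' forward-facing camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    payload, points, _ = detector.detectAndDecode(frame)
    if payload:  # e.g. an expert/session identifier encoded in the QR code
        start_video_call(payload)
        break

cap.release()
```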

The remote expert will then be able to see everything the surgeon can see through the cameras of the glasses. They’re in full control of and can make enhancements to the footage streamed to them by zooming in, taking pictures and even adjusting the exposure and contrast of the images.

Rods&Cones have also made specific enhancements to the glasses so they can handle X-ray video feeds, the high-contrast screens of in-theatre devices and red balance issues.
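For a sense of what software-side control of contrast and exposure can look like, the snippet below applies a simple linear gain/offset to a received frame with OpenCV. It is a generic illustration; the real product presumably also drives the camera hardware itself, which this sketch does not.

```python
# Illustrative only: a remote expert adjusting contrast/brightness of a
# received frame in software via a linear transform (out = alpha*in + beta).
# The real system presumably also controls the camera hardware directly.
import cv2

def enhance_frame(frame, contrast: float = 1.0, brightness: int = 0):
    """contrast (alpha) > 1.0 stretches contrast; brightness (beta) lifts exposure."""
    return cv2.convertScaleAbs(frame, alpha=contrast, beta=brightness)

frame = cv2.imread("surgical_feed_frame.png")              # one frame from the stream
tuned = enhance_frame(frame, contrast=1.4, brightness=25)  # expert's adjustments
cv2.imwrite("surgical_feed_frame_tuned.png", tuned)
```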

IMPROVED ACCESS TO EXPERTS, PPE SAVINGS AND A BETTER VIEW

There are a number of advantages to allowing operating surgeons to consult remotely with experts outside of the hospital they’re working in. They have access to a much wider field of specialists than they would otherwise have, and could even speak to multiple people about the same issue if it proves to be particularly complex.

Additionally, the hospital saves on PPE. No one has to gown up to look over a surgery for a few minutes when they can dial in from outside the room. In the age of Covid-19, when PPE supplies are running low, this is particularly significant. The smart glasses can help to enforce social distancing too, by keeping the number of people inside each operating room that’s currently up and running to a minimum.

The glasses also provide an arguably better view of the surgical field than could be gained from actually being stood in the room. Remote assistants now have what are effectively the best seats in the house.

“Surgery is mostly happening in a very small cavity. If you go into surgery and stand next to the doctor, you won’t be able to see everything he’s doing, because he’s working in between his hands,” says Dheedene.

THE FUTURE OF SMART GLASSES IN SURGERY

Rods&Cones chose to partner with Iristick for the development of the device due to the quality of the glasses the company was already manufacturing.

Alongside the video quality of the intuitively positioned cameras, the glasses are incredibly light at only 70g, meaning they’re unlikely to prove bothersome to wear for long stretches of time. Instead of having hardware weigh the device down, Iristick’s glasses are fibre-optic, and all streaming and processing is carried out via a module worn in the surgeon’s pocket.

That said, the Rods&Cones software can integrate with other smart glasses too.

“It’s not a mutually exclusive partnership, so in the future we might go in with other partners,” says Dheedene. “We want to adapt existing technology, as far as possible, to the use-case of surgery. We have made our software such that we can integrate with any glass. You just need to put a module in-between, to connect the parameters of our platform and the glass platform.”

Introducing video conferencing to an operating room in such a sophisticated fashion could well be a gamechanger. When it’s possible for operations to be carried out from miles away by utilising a 5G mobile network connection, using a pair of smart glasses to dial in a consultant when needed seems only logical. With the world being as interconnected as it is, having on-demand access to specialist feedback and advice during an operation is more than just a futuristic luxury – it may, instead, become a daily essential.




Digital twins and predictive maintenance to increase efficiency at Repsol facilities

This alliance was sealed back in 2017 to integrate tools such as the cloud into the oil company’s operations and to store the huge amount of data the company handles. “Digitization is the lever towards the energy transition,” explained the company’s CIO, Valero Marín, in a meeting with the media. The company expects a return in the form of cash flow of €1 billion through 2022, with an additional €300 million by 2020.

The oil company, chaired by Antonio Brufau, is advancing in its implementation of data technology, analytical models and artificial intelligence. “All this enables predictive maintenance and improves the efficiency of operations,” said Marín. “We are also exploring the use of drones and blockchain for our operations.”

A practical example of the use of blockchain is the creation of a certification platform into which Repsol has integrated its collaborating companies, so that there is a record of the operations carried out with suppliers and distributors.

So that employees are not negatively impacted by the company’s advancing digitization, Repsol has already trained 2,500 employees in artificial intelligence and another 500 in blockchain.

The oil company envisions its future with connected, intelligent service stations – an idea it plans to realise through the installation of Internet of Things technologies.

The digitization drive will also reach its refineries and plants, with facilities remotely transmitting information to generate predictive models.

Beyond that, both companies are working on the creation of digital twins – data models that allow a scenario to be analysed and reproduced. Because a digital twin is a digital copy of a specific machine, it enables running simulations, understanding the consequences of changes, generating scenarios and validating hypotheses.
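In concrete terms, a predictive-maintenance model of this kind often begins as an anomaly detector trained on historical sensor readings from a machine (or on runs simulated by its digital twin). The sketch below uses scikit-learn’s IsolationForest with made-up sensor values as a generic example – it is not Repsol’s actual model.

```python
# Generic predictive-maintenance sketch (not Repsol's actual model): train an
# anomaly detector on historical sensor readings from a machine or its digital
# twin, then flag live readings that look like developing faults.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Historical healthy operation: columns = [temperature C, vibration mm/s, pressure bar]
healthy = rng.normal(loc=[70.0, 2.0, 12.0], scale=[3.0, 0.3, 0.5], size=(5000, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# Live readings streamed from the plant (last row shows abnormal heat/vibration).
live = np.array([
    [71.2, 2.1, 11.9],
    [69.5, 1.8, 12.2],
    [88.0, 4.9, 11.1],
])
for reading, label in zip(live, model.predict(live)):
    status = "ALERT: schedule maintenance" if label == -1 else "normal"
    print(reading, "->", status)
```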

The oil company also runs a Cloud Competence Center in which more than a hundred professionals specialised in cloud technologies work. Repsol expects 70% of its infrastructure to be in cloud environments by 2022, compared with 30% today, and aims to reach a total of 4,000 servers in the cloud.