4 Key Challenges Facing UX Design for XR and How to Solve Them by ThreeSixty

XR promises more intuitive and natural ways of interacting with information. In both VR and AR, we can use our hands, gaze, gestures and voice to directly interact with content and manipulate virtual objects, but perhaps we’ve been led to assume that immersive tech is already usable out of the box.

Today, we consistently find evidence that UX in the immersive technology space is still in its infancy and that UX design is not receiving enough focus. AIXR has recently covered accessibility issues in UX design; check that article out here.

At Threesixty Reality we have an immersive tech usability lab, where we test a wide range of AR and VR applications with target users. We repeatedly see most users struggle to use these devices effectively, and even mastering the basic controls can take some practice. Let’s not forget that humans have been trained for the last 20 years or so to interact with flat 2D menus on screens, with their finger or with a mouse. The transition to immersive tech isn’t automatic. The interactions are mostly all new and UX design conventions for virtual and augmented reality are just starting to emerge.

Ryan Gerber, who works on systems-level UX and product design at Vive VR, points out: “Even though we’ve begun to see enterprise readily adopt XR solutions, largely due to a quantifiable return on investment, the much more massive market of normal humans is still largely skeptical around this technology’s ease of adoptability.”

In fact, a survey of 140 industry professionals in 2018 by global law firm Perkins Coie found that the top-rated barrier to both VR and AR adoption was poor UX, for the second year in a row. Although in many cases these UX barriers relate to setting up the hardware, we also need to pay attention to the many challenges of designing immersive software and new UI paradigms that are easier to adopt and provide a sense of familiarity as users move from application to application. Vik Parthiban, XR graduate researcher at MIT Media Lab, highlights the need to prioritise interaction design for XR: “People underestimate the importance of interaction in AR and VR.”

In the XR industry, we tend to focus a lot on the immersiveness: the compelling 3D world, the sense of presence, low-latency tracking that makes you think those are your real hands, the spatial audio, the 3D holograms that seem to obey the laws of physics. In other words, there is a fascination with the potential of the technology and what it can do, and far less with the step-by-step journey a human will go through to actually get things done and interact with the system effectively.

Ever-improving presence and graphical realism are qualities that make users say “wow!” the first time they experience modern immersive tech, but what we find in user research is that the vast majority of issues occur when the user tries to interact with objects and get the same level of return for their efforts as they would from a usable mobile application. These issues are often severe to the point where the user is frustrated and confused and quickly loses the motivation to continue. We hear comments like “I couldn’t get it to do what I was trying to do” or “I could have done this faster on my phone” all the time.

Read the full article on UX Design and Immersive Tech by ThreeSixty here.

 




Farmington’s Polarity raises millions connecting AR to the workplace, cybersecurity

“It enables what we call a collective memory across the team,” said CEO and co-founder Paul Battista, a Connecticut native and former U.S. intelligence officer who leads the mostly virtual company from his Farmington home.

With 20 employees, the five-year-old startup has raised $11.6 million in venture capital so far and is growing, with plans to double the size of its team in the next 18 months.

“The biggest challenge right now is scaling and hiring folks,” said Battista. “We have lots of potential use cases but limited resources.”

Battista markets Polarity as AR minus the goggles. It uses computer-vision algorithms and overlays to add data to text that appears on a user’s screen. It works in two ways: first, users can highlight text on their screen, such as a person’s name, and add a notation, like a brief biography.

Polarity’s algorithms will then recognize those characters and display the note anytime that text appears on their screen, regardless of the application. The notes will also become part of the team’s “collective memory,” making it easier for co-workers to collaborate, Battista said.
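The mechanism Battista describes — recognize a previously annotated string wherever it reappears on screen and surface the team’s shared note next to it — can be sketched in a few lines of Python. This is an illustrative sketch only, assuming the visible text has already been extracted by the computer-vision layer; none of these names are Polarity’s actual API.

```python
# Hypothetical sketch of a shared-annotation overlay ("collective memory").
# Assumes on-screen text has already been extracted (e.g. via OCR); all
# class and method names here are illustrative, not Polarity's real API.

class CollectiveMemory:
    """Team-wide store mapping highlighted text to shared annotations."""

    def __init__(self):
        self.notes = {}  # e.g. {"185.220.101.4": "Flagged by analyst-1"}

    def annotate(self, text, note, author):
        # A highlight-and-annotate action adds the note to the shared store.
        self.notes[text] = f"{note} ({author})"

    def overlay(self, screen_text):
        # Return every stored note whose key appears in the visible text,
        # regardless of which application produced that text.
        return {key: note for key, note in self.notes.items()
                if key in screen_text}


memory = CollectiveMemory()
memory.annotate("185.220.101.4", "Suspicious IP, seen in phishing campaign",
                "analyst-1")

# Later, a teammate's screen shows the same IP in a different application:
hits = memory.overlay("Connection attempt from 185.220.101.4 at 09:14")
print(hits)  # the first analyst's note is displayed in line
```

The key design point is that matching happens against the rendered text itself rather than inside any one application, which is why the notes follow the text across tools.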

“You don’t have to interrupt workflows or send emails asking about updates on things. You’re seeing everybody’s notes in line with the tools you’re already using,” Battista said. In the cybersecurity world — a target market — it can alert analysts to suspicious IP addresses flagged by other team members.

Another feature allows users to pull data from outside sources, bypassing the need for a Google search. For instance, Google Maps could be displayed whenever a street address is recognized, or Standard & Poor’s data could be linked to company names.

For the short term, Battista is concentrating on building his team and growing his customer base, which currently leans heavily toward Fortune 500 companies.

One priority is to make the software more accessible for smaller firms without a dedicated IT department. The company is developing a hosted version of its software that can be used without an on-site server, Battista said.  And while the startup currently has office space through an investor in Washington, D.C., Battista hasn’t ruled out expanding in Connecticut if he finds the right talent here.

“Our mentality is typically to hire the best people, wherever they might be,” he said.

Read the full article on Hartford Business.com

 




The University of Michigan advances work in Extended Reality

“Our commitment to academic excellence is longstanding,” Philbert said. “The XR initiative will provide significant opportunities to explore how these new technologies can bolster excellence—in student learning, in new research possibilities and in serving the world more effectively.”

XR encompasses augmented reality, virtual reality, mixed reality and other variations of these forms of computer-generated real and virtual combined environments and human-machine interactions.

Philbert charged the Center for Academic Innovation with establishing and facilitating the new priorities, to seed new projects and experiments that integrate XR into residential and online curricula, and to create innovative public/private partnerships to develop new XR related educational technology.

“XR applied thoughtfully in an educational context has the potential to fundamentally change the way we teach and learn,” said James DeVaney, associate vice provost for academic innovation and founding executive director of the Center for Academic Innovation. “We are eager to explore possible breakthrough innovations that enhance teaching and learning across disciplines, foster equity and inclusivity, and increase access and affordability.”

A new XR Innovation Fund will provide the U-M community access to financial and in-kind support for new innovative projects.

The center will work closely with units across campus and across disciplines to fully understand the potential for these new technologies to enhance learning. Many faculty and academic units are already thinking deeply about these technologies, DeVaney said.

In fact, U-M faculty are using the technology across various disciplines to treat and diagnose illnesses, test cars of the future, teach students in the sciences why architectural structures fail, help those in education practice teaching before stepping into a classroom full of youngsters, and allow students in screen arts and cultures to take a look at the work of Orson Welles through a different lens.


Image courtesy: U-M Center for Academic Innovation, via news.umich.edu

“An important part of this project, which will set it apart from experiments with XR on many other campuses, is our interest in humanities-centered perspectives to shape innovations in teaching and learning at a great liberal arts institution,” said Sara Blair, vice provost for academic and faculty affairs, and the Patricia S. Yaeger Collegiate Professor of English Language and Literature.

“How can we use XR tools and platforms to help our students develop historical imagination or to help students consider the value and limits of empathy, and the way we produce knowledge of other lives than our own?

“We hope that arts and humanities colleagues won’t just participate in this [initiative], but lead in developing deeper understandings of what we can do with XR technologies, as we think critically about our engagements with them.”

Joanna Millunchick, associate dean for undergraduate education at Michigan Engineering and professor of materials science and engineering, is working with augmented reality in her courses to help students better understand crystal structures at the molecular scale. She believes the technology has the potential to impact STEM (science, technology, engineering and mathematics) retention.

“The language of the STEM fields is math. But for many students, math is too abstract and not linked to the physical world,” Millunchick said. “Using XR in the classroom could bridge that gap in ways that are not currently possible.”

At present, an interdisciplinary team of faculty from several U-M departments, led by the School of Information, is working on an augmented, virtual and mixed reality graduate certification that provides advanced training and research in computer-generated technologies.

Through the XR Initiative announced today, U-M will explore additional curricular and co-curricular offerings, research opportunities, and multi-institutional and industry collaborations, said James Hilton, vice provost for academic innovation.

“XR is exciting because it has the potential to touch all of the disciplines at Michigan,” he said. “While it will initially be physically located in the Duderstadt Center, in order to take advantage of the VR technology and expertise that is already there, the scope of the initiative is campuswide and builds on Michigan’s long-standing commitment to continually ask, ‘What’s next?’—to experiment with leading edge technology to discover how it may change the ways we learn, create and educate in our third century.”

The center has named Jeremy Nelson as director of the XR Initiative. Nelson, a graduate of the College of Engineering, returns to U-M from the health care and public sectors where he worked to leverage innovative technology to solve customer problems. Most recently, Nelson was a managing partner at Afia, a health care consulting firm based in Ann Arbor that he co-founded in 2007. Prior to Afia, Nelson was the chief information architect at the Washtenaw Community Health Organization.

A first objective for Nelson will be to engage a wide range of stakeholders across and beyond campus, DeVaney said.

“We are embarking on the next great shift in how human beings interact with technology and use it to alter the future,” Nelson said. “The XR Initiative will be an inflection point for the University of Michigan to continue to lead and engage the world to solve the problems that matter most.”

 




Industry Reborn – how tech is changing the way we make things – Dassault Systèmes

As information technology remakes the modern factory, forward-looking companies are creating virtual worlds to optimize real-world manufacturing. The rewards include improvements in business value and sustainability that would have been almost unimaginable just a few years ago.

Among the most important domains in which data-driven approaches are helping manufacturers boost innovation and performance are:

  1. Digital twin technology and the next-generation factory
  2. From supply chains to value networks
  3. Cultivating the industry workforce

The article proceeds through each technology in turn and explains how it works.

Digital twins can also guide sustainable manufacturing, letting companies test out different approaches in a virtual environment. That lets them see how they can best eliminate potential waste, whether in inventory, energy use, equipment efficiency or anywhere else.

A digital twin’s most powerful application, however, may be in the design and planning of manufacturing processes and even entire factories. Eric Green, vice president at Dassault Systèmes, cites the case of a company that Dassault Systèmes helped to create a digital model as a starting point for a new plant.

The company realized that it could improve quality and reduce costs by self-manufacturing parts that it had long outsourced. Working with the digital simulacrum, the company simulated different production volumes and flow rates for the parts it wanted to make in-house.

The state-of-the-art plant worked efficiently from day one—the digital twin eliminated the need for a shakedown period. As a bonus, the company now has nearly identical virtual and real environments. This allows managers to more efficiently shift production around various lines.

“They can simulate and optimize for production rates as they grow their business and understand what they need to do before they actually make changes on the factory floor,” says Green. “They’ve now saved a lot of money and become very efficient.”  Read the article in full here.




Microsoft HoloLens AR glasses could boost productivity on Crossrail

The example used is the building of Crossrail. Crossrail is the new high-frequency, high-capacity railway for London and the South East of England, UK. When the service opens, Crossrail trains will travel from Maidenhead and Heathrow in the west to Shenfield and Abbey Wood in the east via new twin tunnels under central London. It will link Heathrow Airport, the West End, the City of London and Canary Wharf.

Crossrail is set to open by 2021.

Senior engineer Ravi Kugananthan, who works for construction giant Laing O’Rourke, has traded in his tablet computer to test a pair of augmented reality (AR) glasses.

The full article is available via The Times newspaper.

Image credit – Petr MacDiarmid

 




Iristick announces the first smart glasses in the world compatible with iOS phones

From the start, Iristick made the choice to tether its smart glasses to a smartphone. This combines the best of both worlds: powerful yet comfortable-to-wear smart glasses, complemented by the fast-evolving processing power of a smartphone.

Linking the Iristick.Z1 to an iPhone gives companies with a strict iOS company policy the benefits of working with smart glasses for remote assistance, work instruction guidance and pick-by-vision. Iristick smart glasses are now fully compatible with both iOS and Android smartphones.

Why are Iristick the first smart glasses to do so?

Until today, no smart glasses on the market have been compatible with iOS, making the use of smart glasses in some industries impossible. The Iristick.Z1 smart glasses are tethered to a smartphone for processing power, weight distribution and battery capacity, opening up a range of possibilities other smart glasses do not have. These advantages were previously only available with Android phones. Moving forward, the entire range of iOS devices can be tethered to smart glasses.

Why is it so hard to make smart glasses compatible with iOS?

“Traditionally, iOS was a closed system that could only work with external hardware by using very intrusive modifications to the iPhone. Iristick does not require any modification to iOS. The new Iristick framework hides all hardware-level complexity and offers software partners a highly transparent software layer. Creating applications for Iristick becomes standard iPhone application development.” Riemer Grootjans, CTO

Is the market waiting for iOS compatible smart glasses?

In some industries, e.g. pharma, life sciences, space and aero, iPhones and the iOS operating system are IT standards. In those industries, IT departments are reluctant to accept non-Apple devices for multiple reasons (security, deployment cost, …).

This makes working with smart glasses impossible for them, since there are no smart glasses on the market that support Apple phones. At the same time, these are industries where remote assistance and work instructions can significantly improve quality and solve compliance issues. Now, these companies no longer have to compromise by introducing non-standard Android devices. The iOS-Iristick combination solves this dilemma.

What applications are ready for this combination of Iristick smart glasses and iOS?

“Proceedix is a digital platform for enterprise work instruction and inspection execution with mobile and wearable technology. Technicians, operators and inspectors execute their workflow with Android, iOS or Windows tablets and phones. The Proceedix app also runs on assisted reality smart glasses like the Iristick.Z1. For the past 2 years we have leveraged the power of the Iristick.Z1 with an Android phone for hands-free guidance of complex workflow execution. We absolutely welcome the new combination with an iOS device for various customers in Aerospace, Pharma and other process industries.” Peter Verstraeten, CEO Proceedix.

ABOUT IRISTICK  See Iristick AREA member profile 

Founded in 2015, Iristick creates industrial smart safety glasses to support enterprises in their digital transformation. Iristick empowers the deskless operators of the Industry 4.0 future in three domains: remote assistance, digital work instructions and pick-by-vision logistics. Iristick smart eyewear is currently being used and tested by customers in maintenance, after-sales support, logistics, shop floor activities, quality control, tele-medicine and education. Based in Antwerp, Belgium, Iristick supports customers globally; it is the winner of a Red Dot Award and an H2020 European Commission Innovation Grant (N°811820) in 2018, and holds multiple patents. More info: www.iristick.com

ABOUT PROCEEDIX  See Proceedix AREA member profile

Proceedix is a platform to manage enterprise procedures, work instructions and inspections in an easy, digital way. We change the way your deskless workers execute work: anytime, anywhere, on smartphones, tablets and smart glasses. With offices in New York, San Francisco and Ghent, Proceedix helps Fortune 500 companies empower deskless workers around the globe. More info: www.proceedix.com

You can read the full Press Release from Iristick here.




How Augmented Reality Is Transforming the Construction Industry

In the UK, a consortium allocated £1 million—or about $1.31 million—for the development of AR in construction. Meanwhile, researchers from Virginia Tech are designing an augmented reality-based interface for wearable, powered exoskeletons that will aid workers’ performance in the industrial sector.

Yes, AR-assisted exoskeletons are coming soon, but the technology is already disrupting the architecture, engineering, and construction (AEC) industry with several use cases.

Making It Easier to Visualize Construction Projects

More and more companies are using Building Information Modeling (BIM) with augmented reality to make 3D blueprints come alive. Uploading a BIM model into their AR software and using a tablet or a pair of AR glasses would allow workers to have full-scale walkthroughs of the plan.

Companies like Daqri and Intellectsoft, for instance, are using wearable technology to render BIM models life-like. Daqri offers its flagship Smart Glasses paired with their Worksense suite of AR applications. Intellectsoft has collaborated with Microsoft to use their tech with the HoloLens.

But AEC teams don’t necessarily have to spend on special hardware to take full advantage of AR. For example, MLM Group, an engineering, environmental, and building control design consultancy, is using WakingApp to showcase their projects beyond the blueprint. The company uses the app to create 3D models of their sites in as little as 30 minutes. Clients and team members can then view the models from MLM’s mobile app and project them on top of the original blueprint.

 “AR technology is providing clients with more control and understanding before the first nail is ever hammered into their construction projects,” said Matan Libis, CEO of WakingApp. “Today, AR capabilities allow consumers to fully visualize their projects so they can make sure the couch fits perfectly in the living room or the right size boiler will be included in the new apartment building. We believe that in the very near future, augmented reality will enable users to create custom features—like beds or light fixtures—that can be custom-made and included in your project.”

Allowing Construction Teams to Plan Adequately

Indeed, designers and architects can use the technology in selecting materials and organizing the layout of an area. In fact, they can even use it to guide builders through the execution of complex designs, saving time and effort in the process.

Meanwhile, when the digital model is overlaid on the actual site, workers get to see the parts of the structure as they are intended to be installed. They can see the ductwork and pipes prior to assembly to get an overview and assess units that need reinforcements or modifications. They can also take measurements with high precision, preventing costly errors.

Augmented reality also allows them to tag objects and real-world equipment with valuable information that everyone else on the team will be able to access. Workers can scan their surroundings and create 3D digital models of equipment for better collaboration.

Improving Workplace Safety

In the US, one in five worker deaths in 2017 occurred in construction. AR can potentially lower this statistic by giving contractors room for error before they actually start building.

One prototype demonstrates this by allowing workers to see the machinery and expected environment overlaid on an empty site. Such application of augmented reality will help teams prepare and check for safety hazards before work officially starts.

Later on, inspectors can survey the jobsite and compare the real-life structure against the full-scale digital model in real time. They can note any disparities that may be hazardous to worker safety.

The technology also facilitates employee training. New employees can train to use potentially dangerous equipment without the threat of an accident. Blurring the line between theory and practice, augmented reality lets workers learn how to operate machinery faster and more safely.

The Future of Augmented Reality in Construction

By 2029, the construction industry expects to have autonomous machines and Iron Man-like suits to aid workers. Augmented reality is set to be a part of this future. The technology will help builders visualize blueprints, maximize efficiency, and improve workplace safety. We’re already seeing AR in action. And with all the possibilities it brings to such a crucial industry, this tech can only keep growing.




Augmented and Virtual Reality in the Healthcare Market

The AREA is not affiliated with the producers of market reports; however, many headline findings coming out of these reports will be useful for enterprise buyers, investors, researchers and providers alike, whether to find a suitable supplier or provider or to monitor trends in the industry.

The Augmented and Virtual Reality in Healthcare Market Report helps industry leaders and business decision makers to make assured investment decisions, develop tactical strategies and improve their businesses.

This report presents the worldwide Augmented and Virtual Reality in Healthcare Market size (value, production and consumption), splits the breakdown (data status 2014-2019 and forecast to 2025), by manufacturers, regions, types and applications.

Manufacturers included in this report include

Google, Microsoft, DAQRI, Psious, Mindmaze, Firsthand Technology, Medical Realities, Atheer, Augmedix, Oculus, CAE Healthcare, Philips, 3D Systems, VirtaMed, HTC, Siemens and Virtually Better.

Organisation Types

On the basis of the end users/applications, this report focuses on the status and outlook for major applications/end users, consumption (sales), Augmented and Virtual Reality in Healthcare industry share and growth rate for each application, including:

  • Hospitals
  • Clinics and Surgical Centers
  • Research Organizations and Pharma Companies
  • Research and Diagnostics Laboratories
  • Government and Defense Institutions

 




Magic Leap Teams With Brainlab, SyncThink, And XRHealth For Medical AR

Magic Leap’s $2,300 spatial computing platform Magic Leap One may be too expensive for most consumers, but as with other early augmented reality devices, enterprise users with bigger pocketbooks are embracing its potential as a business tool.

One particularly promising category is health care, where Magic Leap says it’s now collaborating with at least five different companies to bring its hardware into labs, clinics, and even hospital operating rooms.

On the surgical side, German medical technology company Brainlab is working with Magic Leap on a collaborative 3D spatial viewer for Digital Imaging and Communications in Medicine (DICOM) content, enabling clinicians to work together when viewing medical images. Brainlab’s software could, for example, let a doctor and radiologist talk through multiple brain scans before a surgical procedure, or enable a surgeon to rely on a heads-up display of scanned images while performing a surgical procedure.

Another brain-focused initiative involves SyncThink, a company that uses eye tracking analytics to help diagnose patients’ concussions and balance disorders. Having worked with Magic Leap One for the last year, SyncThink hopes to make it “the gold standard in brain health assessment” by letting doctors use the platform’s collection of sensors to easily determine what wearers are seeing and experiencing.

On the patient side, XRHealth (formerly VRHealth) is working to bring a therapeutic platform called ARHealth to Magic Leap, offering users rehabilitation, pain distraction, psychological assessment, and cognitive training tools. Unlike the prior solutions, which one would use at a doctor’s office, the ARHealth tools will let patients analyze and quantify their own results, then pass the information back to their doctors.

Magic Leap also says that it’s working with the Dan Marino Foundation on a tool to help young adults with autism spectrum disorder practice for in-person job interviews, and creating a virtual person-based medical training application for the Lucile Packard Children’s Hospital Stanford. The company expects to leverage its partnership with AT&T to incorporate 5G, AI, and edge computing into future Magic Leap-based medical solutions, enabling low-latency collaboration and co-presence, among other benefits.

 

 




Augmented Reality Market Report – Military AR Headgear

The report can be requested at the following link: Military Augmented Reality Headgear Market

Time period covered

Five years from 2019 to 2024

Key players in this market

Applied Research Associates (ARA), BAE Systems, Elbit Systems, Rockwell Collins, Thales Group, Facebook, Google, Microsoft, Osterhout Design Group, VUZIX

Products are split into:

  • Head-Mounted Displays
  • Monitor-Based
  • Video See-Through HMD

By Application, the report also covers

  • Military Simulation
  • Trauma Treatment

The Global Military Augmented Reality (AR) Headgear market research report provides an in-depth study of the market segments. The market is assessed on the basis of revenue (USD million) and presents the significant players and providers affecting the market.