
Magic Leap partners with Geopogo on Augmented Reality solution for architecture and design

Geopogo is a California-based 3D design software company that is working to transform the design and construction process. The company’s software allows architects and designers to create renderings and a virtual reality (VR) or augmented reality experience in minutes by importing existing CAD models or building directly with the Geopogo 3D creator tool.

Now, with Geopogo’s software on Magic Leap’s AR headset platform, the interaction of digital content with the physical world will help to bring architectural designs to life, according to the companies. “This is a phenomenal opportunity to make architectural design understandable and accessible to project clients, city officials, and the general public,” said Geopogo’s Creative Director, Michael Hoppe.

According to Magic Leap, the American Institute of Architects, San Francisco (AIASF) utilized the partnership’s technology as part of its ‘Shape Your City’ campaign, an ongoing fundraising effort to build its new headquarters in the Bay Area’s new Center for Architecture + Design. The organization also sought to fund expanded architecture-focused tours, exhibitions, educational programs, and events for people of all ages.

As a result, AIASF hosted on-site building tours to build excitement and engagement for the project from the architectural community and the public, and offered tour participants a 3D virtual model of the future Center. The integration of AR technology during the building tours allowed for a more interactive, transparent, immersive, and exciting way to visualize what the space will look like, even before construction has started.

“The power of the AR experience succeeded in inspiring donors to contribute much-needed construction funding for the project, as hoped for by the non-profit organizations. We were especially happy to see how the AR experience brought so much delight to the faces of the non-profit Board, the organization members, and members of the larger community,” said Dave Alpert, Geopogo CEO and Cofounder. 

“The AR model has allowed our project partners, Board members, potential donors, and community to experience the future Center first-hand and visualize the positive impact it will have on future generations,” agreed AIASF Executive Director, Stacy Williams.

For more information on Geopogo and its augmented reality solutions for the architecture and design industry, click here. For more information on Magic Leap and its AR hardware solutions, click here.

 




Qualcomm is trying to simplify app creation for AR glasses

The ultimate aim is to make AR more accessible. Ideally, developers will make apps directly available to you through mobile app stores, using glasses tethered to smartphones. You might not see Snapdragon Spaces used for stand-alone glasses, at least not at first.

The manufacturer support will be there. Spaces won’t be widely available until spring 2022, but Qualcomm has lined up partners like Lenovo (including Motorola), Oppo and Xiaomi. Carriers like T-Mobile and NTT DoCoMo will help build “5G experiences” using Spaces. Lenovo will be the first to make use of the technology, pairing its ThinkReality A3 glasses with an unnamed Motorola phone.

It’s too soon to know if Snapdragon Spaces will have a meaningful effect on AR. While this should streamline app work, that will only matter if there are both compelling projects and AR glasses people want to buy. This also won’t be much help for iPhone owners waiting on possible Apple AR devices. Efforts like this might lower some of the barriers, though, and it’s easy to see a flurry of AR software in the near future.

 




Extended Reality – Mixed Reality Versus Augmented Reality

Augmented Reality Defined

Augmented Reality is quickly making its way into a variety of settings. Retailers use it to help customers visualize a product before they buy it. Engineers turn to augmented reality as a way of accessing valuable information about a product without fumbling with physical manuals. With AR, users can embed or overlay elements of the digital world into the physical world.

Tools like Apple's ARKit and Google's ARCore even allow users to build their own immersive smartphone experiences. However, it is possible to further enhance AR experiences through devices like smart glasses. These overlay the digital content you need to see onto the real world in a much more immersive way, without requiring you to hold a phone in front of your face.

Mixed Reality Defined

Mixed Reality is a hybrid of AR and VR (virtual reality), though it goes further than AR when it comes to immersion. Through MR, virtual or digital content isn't just overlaid onto the real world; it's embedded in a way that lets users interact with it.

This form of MR is an advanced kind of AR, which makes the digital elements you bring into your environment feel more authentic and realistic. MR can have elements of both virtual and augmented reality within it. However, the major difference is that the focus is on blending everything together. You’re not entirely replacing an environment, or simply augmenting it with new content. Instead, you’re creating an entirely new reality by combining both the physical and digital environment.

Exploring AR and MR

There are numerous differences between AR and MR, but the biggest noticeable aspects are:

  • Device requirements – AR is usable on most smartphones or tablets, with the added option of specialist headsets. However, providing an MR experience requires more power and sensors.
  • Realistic interaction – AR offers limited interactivity with the virtualized elements. The computer-generated content can’t interact with the real-world elements users see.

It's up to you whether to use AR, MR, or VR for your project. Each of them is made for particular tasks. For many companies, augmented reality will be one of the easiest ways to enter the world of extended reality. The environment is accessible because you can create applications and tools that work on smartphones, as well as through smart glasses and headsets. However, as the technology available to us continues to evolve, Mixed Reality may also become more accessible.

Many leading companies are experimenting with MR already, though it’s still technically the youngest technology in the XR space.

In manufacturing, an important hurdle to overcome when trying to bring together several emerging technologies in one place is data connectivity. At the Manufacturing Technology Center (MTC) in the UK, they understand this issue all too well and are working to combat it using ATS Bus.

ATS Bus is a platform for their VIVAR (Virtual Instruction, Inspection and Verification using Augmented and/or Virtual Reality) project which investigates “how augmented and virtual reality could be used to enhance the operator experience when viewing work instructions and increase efficiency and accuracy for both instruction delivery and data capture.”

ATS Bus translates incoming work orders into a standard data format and sends them down to the shop floor, where it translates them again into the format required by the Adv (Advanced Display Device) server.
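The translate-in, translate-out flow described above is a classic canonical-data-model pattern: each system only needs an adapter to and from one shared format. A minimal sketch in Python (all field names here are hypothetical illustrations, not the actual ATS Bus or VIVAR schemas):

```python
# Bus-style translation pipeline: adapters map between system-specific
# messages and one shared canonical format, so any two systems only need
# adapters to the canonical model rather than to each other.

def erp_to_canonical(msg):
    """Translate an ERP-style work order into the canonical format."""
    return {"order_id": msg["OrderNo"], "operation": msg["Op"], "qty": msg["Quantity"]}

def canonical_to_display(order):
    """Translate a canonical work order into the display device's format."""
    return {"id": order["order_id"], "step": order["operation"], "count": order["qty"]}

erp_msg = {"OrderNo": "WO-1001", "Op": "inspect-weld", "Quantity": 4}
shop_floor_msg = canonical_to_display(erp_to_canonical(erp_msg))
print(shop_floor_msg)  # {'id': 'WO-1001', 'step': 'inspect-weld', 'count': 4}
```

Adding a new device or upstream system then means writing one new adapter pair, not reworking every existing integration.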

You can read the original article on INFRASI’s website.




As the Metaverse & AR Mature, Will They Fall Into Tech's Common Silos?

As the world of AR and the Metaverse matures, the ability for software and hardware products to integrate with one another becomes a huge factor in the adoption and use of these technologies.

Dan chats with Christine Perey, the founder and principal analyst of Perey Research & Consulting and founder of The AREA, on how history reflects tech’s tendency to embrace operational and hardware silos, and why siloed products cause significant inefficiencies and increase cost.

Abridged Thoughts:

“[Interoperability in the AR world] is the ability for components, software, hardware, services from any vendor, to be able to exchange data without the user needing to concern themselves with who made that part, and so it’s the ability for multiple vendors to combine parts and their customers also to be able to combine parts into new and unique ways and come up with new, innovative solutions that solve a specific problem.

And so the interoperability also allows the market to go to scale because you’re no longer going to be focusing only on one use case or only on one component of the whole system. You can take your component into many, many different pieces of hardware, for example, something I know a lot about, or software; you could take your content and deliver it on any browser, any player.”

– Christine Perey 

 




AR enables efficient remote support – XMReality

One of the greatest examples of AR technology is the popular mobile app Pokémon Go, which allows players to locate and capture Pokémon characters that appear in the real world. In addition to entertainment, augmented reality is also used in other areas, such as marketing, fashion, tourism, and retail.

Overall, the use of AR is growing as mobile devices that are powerful enough to handle AR software become more accessible around the world. However, AR is not a new invention. In fact, the first AR technology was developed back in 1968, when the Harvard computer scientist Ivan Sutherland created an AR head-mounted display system.

Following in Sutherland's footsteps, university labs, companies, and national agencies developed AR for wearables and digital displays. But it was not until 2008 that the first commercial AR application was created by German agencies in Munich. They designed a printed magazine ad for a BMW Mini car. When the ad was held in front of a computer's camera, the user could control the car on the screen simply by manipulating the printed page.

Since then, one of the most successful uses of AR for commercial purposes has been the ability to try on products, such as clothes, jewelry, and even make-up, without having to leave your house. In addition, many tourism apps use AR technology to bring the past to life at historical sites. For example, at Pompeii in Italy, AR can project views of ancient civilizations over today’s ruins. Other examples include neurosurgeons using an AR projection of a 3D brain to aid them in surgeries and airport ground crews wearing AR glasses to see information about cargo containers. Needless to say, the potential of augmented reality is endless.

 

AR enables efficient remote support 

At XMReality, we have embraced augmented reality from the beginning. Founded in 2007 by researchers from the Swedish Defense Research Agency, our first project was to help bomb disposal experts defuse landmines in the field. For six years, we performed advanced contract research in AR for the Swedish Defense Materiel Administration and BAE Systems.

Though we continue to work and innovate in the defense sector, we expanded to help other industries with our remote support solution XMReality Remote Guidance. In remote support calls, you can use the AR feature Hands Overlay to guide your counterpart by overlaying your hand gestures on top of real time video.

This is especially useful when you need to show someone how to turn a screw, explain what cord goes where, or provide other instructions where technical support is needed. And it comes in handy when you need both your hands to give instructions or guide someone through complex tasks.

The user-friendly software and AR technology enables you to improve operational efficiency and quality for processes like audits, maintenance, service, repair, training, and support at production sites, packaging, energy grids or properties. Find more information about how to use remote support in different industries here.

Don’t tell it, show it with AR

In a rapidly growing AR marketplace, we continue to develop our use of AR technology. To enhance the Hands Overlay experience, we have introduced additional hardware: the Pointpad.

Together with the Hands Overlay, the Pointpad is useful for experts in a helpdesk setup who are using XMReality Remote Guidance from a desktop computer or support station. It allows you to enhance hand gestures for clear instructions during everyday calls.

Imagine that you are a technician dealing with electricity sub-stations, which include extremely complex industrial installations with myriad switch-gear, screens, and interfaces. When you are restricted to voice only support, you have to rely on the customer to explain what they see in front of them, and you must give them support while acting blind.

By using XMReality and its AR technology, you can not only see exactly what the customer sees but also guide their hands with your own. This way you don't have to trust the customer to explain everything just right, and you don't need to keep in mind every detail the customer has said, since you can continuously see it while you and the customer troubleshoot together. You also don't need to worry about language barriers or having to phrase every instruction in the most easily understood way, since you use your hands to show the customer what to do with their own. The reduced risk of misunderstanding, combined with faster trouble resolution, is a great way to achieve happier customers and more efficient processes.

You can read the original blog post by XMReality here.




Case Study of AR Technology: Hirschmann Automotive and RealWear

The Challenge

With seven factories worldwide, Hirschmann Automotive needed a more cost-effective and time-efficient knowledge-transfer approach to maintaining and repairing equipment than flying experts around the world.

"If something isn't working properly at one of our plants, technicians have to call our headquarters in Austria. And even then, they might not be able to solve the problem. Then it becomes an issue of flying someone around the world to assess the problem in person."

That’s when Fliri and his team looked at virtual and augmented reality solutions. Unfortunately, most devices were too delicate for the production plant environment — until Fliri discovered the RealWear HMT-1.

The Solution

Deploying RealWear running Cisco Webex Expert on Demand allowed Hirschmann Automotive to streamline collaboration and reduce equipment downtime.

The Results

  • Reduced travel needs and costs
  • Improved maintenance and repair response
  • Streamlined information accessibility and collaboration
  • Increased first-time fix rates
  • Shortened first-time resolution time

Hands-Free Use Case

  • Remote mentoring

Readers can download the case study for free on RealWear's website.




Can we say goodbye to Geospatial now?

AR’s baby steps in geospatial applications

It wasn't long before the geospatial world saw the potential of AR. Unfortunately, the technology wasn't ready. Using software like Unity 3D and Unreal, it was possible to bring 3D models into the physical environment, but it was difficult to scale them accurately or achieve a good pixel-to-real-world ratio. Further, mobile phone GPS, even in the best phones of 2013, couldn't fix a position to better than ±10 m. So, creating overlays of the real world was tricky without using third-party software like Vuforia to create anchors (or georeference points in GIS terminology, or ground control points in survey terminology).

It wasn't until 2017, a whole four years after the AR boom, that Mapbox, Esri and some others brought SDKs (Software Development Kits) into Unity and Unreal, which allowed map units to be referenced and used alongside maps, real-world coordinates and navigation. To really make the most of the AR overlay, a Real-Time Kinematic (RTK) correction to the GPS was needed, though RTK hardware was big and cumbersome for the average mobile user.

Over the space of one year, geospatial AR moved by leaps and bounds. Vuforia and Google improved their detection systems so that surfaces and objects could be detected far better than ever before. Ways of triggering events improved with augmented geofences, which allowed things to be triggered when the mobile device entered an area in the real world. This one capability (along with some help from Esri) enabled one of the biggest and most popular AR games, with over 600 million downloads. The fact that users could interact with game characters in the real world through their mobile devices would have sounded like science fiction before the turn of the century. This one mobile application should have turned the tide on how AR was used in business.
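At its core, the geofence trigger described above is a point-in-radius test against the device's GPS fix. A minimal sketch in Python, using the haversine great-circle distance (the coordinates and radius are illustrative, not taken from any particular game or SDK):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 lat/lon points."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def in_geofence(device, center, radius_m):
    """True when the device position falls inside a circular geofence."""
    return haversine_m(device[0], device[1], center[0], center[1]) <= radius_m

fence = (51.5007, -0.1246)  # hypothetical point of interest
print(in_geofence((51.5008, -0.1245), fence, 50))   # True: roughly 13 m away
print(in_geofence((51.5100, -0.1246), fence, 50))   # False: roughly 1 km away
```

A real AR app would run this check on each location update and fire the in-game event on the transition from outside to inside the fence.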


Uses of AR

So, why did Google Glass fail? Why aren’t there more AR games and business applications today? After all, the potential applications for AR are immense; even the military saw the potential and used HoloLens headsets in the field. AR could be used to overlay building layouts for emergency services, it could revolutionize navigation by providing an overlay on a vehicle windscreen so that the driver wouldn’t need to look away from the road, and it could also improve the housing industry by providing information overlays for the building being viewed. The applications are endless.

There were a few innovations in the geospatial industry using AR that are still amazing and underrated. One of them is Trimble Sitevision, which is essentially a Real-Time Kinematic (RTK) receiver with a mobile-phone mount, though once you have tried it, you realize how much more there is to it. Trimble have done their homework; you are able to integrate your BIM, site drawings and other information into the AR application on your mobile device and then view that information as an augmented environment to fantastic precision. Wires behind walls, new pipes about to be installed, and even new objects can be shown in AR against what has actually been built.

Another great AR innovation came from vGIS. One of the greatest challenges for the construction industry is being able to identify the precise position of underground cables. Here, vGIS, much like the Trimble Sitevision, is able to overlay BIM and as-built data, but also uses Esri data types to create a simple geospatial solution. Furthermore, vGIS has worked with Microsoft to make it work with HoloLens.

One application that I had a vested interest in, and which is no longer on Google Play, was made by the Carto Group. It was a commercial real estate application that would overlay potential augmented office layouts in an empty office space, meaning you could view office spaces to rent or buy and use your mobile phone to see what they would look like with different furniture layouts. This was supported by a "model mode" that let you put a doll's-house-sized version on the ground or a table and look at different layouts, along with some other great functionality. It was a great way to present existing information in a form that made it more useful.

Is AR worth it?

With only a few examples of the great things AR can do when mixed with geospatial information, should we give up? I don't think we should. We are in the midst of a geospatial revolution. Computing hardware is rapidly shrinking, GPS chips are becoming more accurate, and companies like Trimble are starting to provide GNSS correction services that allow centimeter accuracy on your phone. How long before this is commercialized and made commonplace by Apple or Google?

Over the last year, we have seen a steady increase in Virtual Reality (VR) and excitement about the metaverse: Ray-Ban has worked with Facebook to release Smart Glasses; Amazon has released the Echo Frames 2nd Gen; Lenovo has released the ThinkReality A3 AR glasses, which allow business users to view five virtual displays; and there are even AR glasses for cyclists who want more real-time Strava feedback, called Solos. AR is by no means dead but, in my view, is waiting for the right technology to become available. At present, AR glasses look too much like one has strapped half of a computer to one's head. Further, battery life can be short, and mobile AR doesn't quite have the graphics capability to overcome the uncanny valley effect.

A case for the future

Although the Ray-Ban Smart Glasses aren’t really AR, they prove that the technology can be cool and consumable. Now imagine them being capable of smart assistance like Jarvis in the Iron Man movies, overlaying information to questions and giving real-time feedback on performance, or based on what they are hearing, providing the information that Alexa, Siri and Google provide both audibly and visually. Our assistants on our mobile phones have become almost natural, so it isn’t hard to see how, given some small technological improvements, this could become our future.

Can we say goodbye to geospatial AR? Although the hype is over and the technology isn't ideal, it is too early to say goodbye to something that could change how we communicate and interact with the world around us. The technology from vGIS, Trimble, and even the current Google Glass 2 shows how it can be used to improve current working practices.

 




Boeing’s Dr. Greg Garrett on the Work of the AREA Safety Committee

AREA: Are you an AR guy who got into safety, or a safety guy who got into AR?

Dr. Garrett: It’s the latter. In 2017, I was supporting the Boeing 767 tanker program when a couple of colleagues approached us in the Safety organization looking for safety and ergonomics guidelines on an Augmented Reality project using HoloLens for wiring work. We looked at each other and said, “What’s a HoloLens?” (laughs) I did some looking around and I couldn’t find any research on the safety ramifications of AR. I finally landed on some ergonomic recommendations for helicopter pilots using night vision goggles. That was the closest thing I could find, but at least it was a starting point. I put some recommendations together and very quickly became the subject matter expert for AR safety.

AREA: It sounds like everybody involved in studying safety requirements in enterprise AR has had to learn as they go along.

Dr. Garrett: It has been a very hands-on learning experience, but the technology is still a hands-on learning experience in a lot of ways. And as we’ve gone along, my interest has been pushed more into fully immersive technologies, not just the AR space. Once I became known as the AR guy, people started coming to me and asking me to help them with their VR projects. So that’s become part of my work now.

AREA: What is the AREA Safety Committee focused on right now?

Dr. Garrett: The past few years have been largely project-focused. There was the AREA Safety and Human Factors Assessment Framework and Best Practice Report. Things have changed a lot since that was published, so we’ll be doing a refresh of it. And then we put together the AREA Safety Infographic. We’ve now moved into the development of a playbook of sorts, a general guide to things to be aware of when you’re implementing AR solutions from a safety perspective. What kind of infrastructure do you need? What kind of issues should you be aware of? How should you assess the environment? We’ve also brought in outside experts from academia and industry to provide their viewpoints and lessons learned. For example, at our next meeting in November, the CEO of Design Interactive will present some of the things they’ve been working on from a product design perspective, but also some of the research they’ve been involved in with their customers on usage requirements. We’ll be learning about the impact they’re beginning to see on the individuals who use AR.

AREA: What are the top AR safety issues that people are concerned about?

Dr. Garrett: Situational awareness is a big one. The restricted field of view. These are of particular concern in environments that have potential hazards. If you’re interacting with the system, you may not hear emergency or other messaging going on in your area. And with a restricted field of view, you might trip over something or bump into someone. Those are probably the top two. Cyber sickness is not generally a concern with AR, but we are starting to see some research that there are some impacts among those who are exposed for two hours or more. There is a correlation between the amount of usage and how much downtime you should have. As that research continues, we’ll be able to develop some requirements to address that issue.

AREA: What can we look forward to from the AREA Safety Committee in the near future?

Dr. Garrett: Last year, we entered into a partnership with the National Safety Council. We’re going to be working with them on the further refinement of the framework tool. It will give new AR adopters a checklist whereby they answer a series of yes/no questions to evaluate the job or their work environment from a safety perspective. In addition to the AREA sharing that framework tool with the AR ecosystem, the National Safety Council will be able to share it with their membership. We’re currently waiting for the NSC to arrange the resourcing of that work, but I expect we’ll see that completed next year.

AREA: Why should AREA members consider joining the Safety Committee?

Dr. Garrett: It’s really about having a voice and a say as to what content is being delivered to protect all employees. International standards are another area where we need a lot of support. There are standards development efforts underway right now at Underwriters Laboratories, IEEE, and ISO, and we need AR users to be represented in the room. There’s a lot of manufacturers and academics involved, but not enough AR customers, and their voices need to be heard.

 

If you’re an AREA member and would like more information about joining the AREA Safety Committee, contact Dr. Greg Garrett or AREA Executive Director Mark Sage. If you’re not yet an AREA member but care about ensuring safety in enterprise AR, please consider joining; you can find member information here.




How Assisted Reality differs from Augmented Reality

In Industry 4.0, Augmented Reality (AR) and Virtual Reality (VR) often get the spotlight as the next great leap in boosting worker productivity. But these X-Reality (XR) technologies aren’t always practical when used as manufacturing or frontline tools.

Enter another aR: assisted Reality.

What is assisted reality? How does it differ from augmented reality?

Assisted Reality gives you access to the right information right when you need it, allowing you to maintain full situational awareness. Unlike AR, it's a reality-first, digital-second experience. Assisted Reality allows a person to view a screen within their immediate field of vision, hands-free. Information is not overlaid on the real-world view.

Let’s explore this by looking at heads-up displays (HUDs). HUDs in vehicles give an extra layer of relevant information without hampering vision or distracting the driver. The driver doesn’t have to shift their gaze to the dashboard. They can keep their eyes on what’s most important (the road) and have both hands free to control their vehicle.

Assisted reality devices can also be wearable to be more practical in certain situations.

  • Headsets with micro-displays: A small but high-resolution screen that’s positioned in front of the user’s eye. With the appropriate focal depth, a half-inch display can look like a 7-inch tablet held at arm’s length.
  • Smart glasses: Worn like ordinary glasses, purpose-built smart glasses project images directly onto the lenses (note: most assisted reality use cases do not depend on SLAM (simultaneous localization and mapping) computer vision).
  • RealWear devices with assisted reality technology are leading the industrial field’s digital transformation with hands-free, Android-based headsets, designed specifically with safety in mind.
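The "half-inch display looks like a 7-inch tablet at arm's length" claim in the list above comes down to matching angular size: the optics place the virtual image at an effective distance where the small panel subtends the same angle as the larger screen would. A quick check in Python (the 24-inch arm's length and the derived image distance are illustrative assumptions, not a specific product's specification):

```python
import math

def angular_size_deg(size, distance):
    """Apparent angular size in degrees of an object of `size` at `distance`
    (same units for both), using the standard 2*atan(s/2d) formula."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

# A 7-inch display held at arm's length, taken here as ~24 inches.
tablet = angular_size_deg(7.0, 24.0)

# A 0.5-inch micro-display matches that angle when its virtual image sits at
# an effective distance scaled by the size ratio: 24 * 0.5 / 7 ≈ 1.7 inches.
micro = angular_size_deg(0.5, 24.0 * 0.5 / 7.0)

print(round(tablet, 1), round(micro, 1))  # both ≈ 16.6 degrees
```

Because angular size is what the eye perceives, the two displays look the same size even though their physical dimensions differ by a factor of fourteen.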

How is assisted reality different from augmented reality?

Assisted reality differs from augmented reality in a key way: assisted reality gives users access to relevant information in their immediate field of view (FoV), while augmented reality uses computer-generated digital content to create an interactive experience within real-world environments.

Read the full article on the RealWear blog here.




After consumer dismay, Magic Leap’s new AR headset targets enterprises instead

  • Magic Leap raised US$500 million at a roughly US$2 billion valuation and unveiled its Magic Leap 2 AR headset, set for release in 2022

 

  • Chief executive Peggy Johnson said the headset would be the industry’s “smallest and lightest device built for enterprise adoption”

 

  • Much like Microsoft’s HoloLens, the goal for this headset is to help remote workers connect and train away from the physical office

 

When Magic Leap was founded 11 years ago, the company set out to be a pioneer in augmented reality and mixed reality technologies. It even received almost US$3 billion to fund its first consumer-friendly AR headset, the Magic Leap One, which was launched in 2018 after a long delay. The US$2,300-priced headset eventually flopped, having sold only 6,000 units — a figure far removed from the one-million sales goal set initially.

The startup eventually narrowed its focus to professional applications, tried unsuccessfully to sell the company, and fired more than half of its workforce during the challenging economic climate of 2020. Plans to make mixed reality glasses mainstream were pushed back.

Amidst this whirlwind of shifting expectations, co-founder and CEO Rony Abovitz decided to leave the company in July 2020. He was replaced by Peggy Johnson, under whom the company unveiled the Magic Leap 2, dubbed the industry’s smallest and lightest device built for enterprise, “designed to increase business adoption of AR.”

Johnson, formerly with Microsoft, revealed the new headset during a CNBC interview and in a blog post this past week. In a sign of investor confidence in the burgeoning enterprise AR space, Magic Leap further announced that it has raised US$500 million in funding at a post-money valuation of roughly US$2 billion. “The new capital will further Magic Leap’s focus on delivering best-in-class AR solutions including the roll-out of its second-generation product, Magic Leap 2, in 2022.”

Additionally, as claimed by Johnson, “this more advanced headset boasts critical updates that make it more immersive and even more comfortable, with leading optics, the largest field of view in the industry, and dimming – a first-to-market innovation that enables the headset to be used in brightly lit settings, in addition to a significantly smaller and lighter form factor.”