
Surgical Smart Glasses Market Significant Growth

The market is expected to reach US$ 303,934.14 thousand by 2028 from US$ 145,287.76 thousand in 2020

It is estimated to grow at a CAGR of 9.9% from 2020 to 2028
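The relationship between the two market figures and the growth rate can be sanity-checked with the standard CAGR formula. Note that a straight 2020–2028 (eight-year) calculation yields roughly 9.7%; the reported 9.9% presumably reflects the report's exact forecast period, which may start after 2020.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# Figures from the report (US$ thousand)
growth = cagr(145_287.76, 303_934.14, years=2028 - 2020)
print(f"{growth:.1%}")  # ≈ 9.7% over the full 2020–2028 span
```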

The report highlights the trends prevailing in the market, along with market drivers and deterrents.

 

Surgical smart glasses provide a smarter way to perform surgeries across the globe. Surgeons wear these glasses during procedures, and expert professionals elsewhere can view the surgery through the glasses in real time and offer advice.

A surgeon in the operating theater can communicate with other surgeons through the glasses' built-in microphone and share the surgical view via the wide-angle camera on the smart glasses.

A few prominent players operating in the surgical smart glasses market are:

  • MEDITHINQ CO., LTD
  • Taiwan Main Orthopaedic Biotechnology
  • IRISTICK (AREA member)
  • DTU Srl
  • Vuzix Corporation (AREA member)
  • Pixee Medical
  • RODS AND CONES
  • AMA XpertEye
  • ImmersiveTouch, Inc
  • Microsoft Corporation (AREA member)

Market players are launching new and innovative products and services to sustain their position in the surgical smart glasses market. For instance, in June 2020, MediThinQ commercially launched its wireless smart glasses, the GV-200C. These glasses present the surgical-site view directly in front of the surgeon's eyes, regardless of the surgeon's position and orientation.

The COVID-19 pandemic has had a significant positive impact on the surgical smart glasses market. Several pharmaceutical companies have developed COVID-19 vaccines, and vaccination campaigns are underway. The market is growing due to increasing awareness of remote clinical support as an option for surgeons, which is also anticipated to benefit other segments of the market in the coming months. On the other hand, supply chain disruptions caused by the halt in global operations are hindering market growth.

Increasing Number of Surgical Procedures Contributes Significantly to Market Growth

The rising prevalence of chronic diseases such as cancer, diabetes, cardiovascular diseases (CVDs), stroke, and kidney diseases is expected to have a positive impact on the number of surgical procedures across the globe.

As per the WHO, CVDs are the leading cause of mortality, and they result in the deaths of ~17.9 million people every year across the world. Moreover, as per the International Diabetes Federation’s Diabetes Atlas Ninth edition 2019, about 463 million adults (aged 20–79 years) had diabetes in 2019, and the number is expected to reach 700 million by 2045.

Such a high prevalence of chronic conditions is expected to drive the number of surgical procedures, which will ultimately boost the growth of the surgical smart glasses market. For instance, according to data published by the Obesity and Metabolic Surgery Society of India in 2020, the number of bariatric surgeries in India increased by around 86.7% during 2014–2018.

 




One Billion Points Streamed In Augmented Reality

Ships regularly undergo large-scale retrofits, but shipowners rarely have design data in digital form at hand. Designing ship modifications requires engineering-grade accuracy of the as-built ship geometry, which means each vessel must be 3D laser-scanned. ShipReality, a company specialised in AR/VR ship design automation and remote operations, combines these large ship laser scans with its CAD software to design directly in 3D, producing merged models of CAD designs within the as-built point clouds.

“We want to speed up and optimise retrofit designs for 60,000 ships that require greenhouse gas emissions reduction, energy conversions & ballast water treatment system (BWTS) retrofits in the coming years”, said Georgios Bourtzos, CEO and co-founder of ShipReality. “A major challenge we faced designing directly in large point clouds was visualising entire vessels layered with resulting 3D designs for immersive design reviews on mobile XR devices like Oculus Quest 2 and HoloLens 2.”

Exploring the use of point clouds in AR

Point clouds are precise models of real environments based on 3D laser scanning or photogrammetry. Objects and space are represented as "points"; millions of such points together form a point cloud scan. The scan is then imported into a 3D modelling platform to create an as-built model. Common CAD software used for ship design, although capable of incorporating 3D laser scans, still relies on 2D projections and screens to visualise and design in 3D. This often results in incompatibilities with the existing ship geometry that are only discovered during installation, creating substantial delays and high additional costs.

Native 3D visualisation is key to addressing these problems. However, visualising large point clouds requires substantial CPU and graphics power; the performance requirements are simply too high to render them locally on a mobile XR device, causing extremely low frame rates and even software crashes. Dealing with large datasets, ShipReality had to find a solution that could surpass the limited memory, CPU, and GPU resources of mobile devices.
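A back-of-envelope estimate shows why a billion-point cloud is out of reach for on-device rendering. Assuming a typical (hypothetical) per-point layout of three float32 coordinates plus 8-bit RGB colour:

```python
POINTS = 1_000_000_000
BYTES_PER_POINT = 3 * 4 + 3  # xyz as float32 (12 bytes) + RGB (3 bytes)

total_gb = POINTS * BYTES_PER_POINT / 1e9
print(f"{total_gb:.0f} GB of raw vertex data")
```

Roughly 15 GB of vertex data alone, before any spatial index or render buffers, against the few gigabytes of shared memory available on a mobile XR headset.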

XR streaming solution that supports point cloud

XR streaming technology outsources the rendering process to a local server or the cloud, but not every solution on the market supports ultra-large data or point clouds. “We integrated the ISAR SDK into our solution to stream large 3D laser scans merged with CAD retrofit designs to a HoloLens 2,” said Mr Bourtzos. ISAR (Interactive Streaming for Augmented Reality) is a remote rendering software component that streams entire augmented and virtual reality applications in real time. “The simplicity and ease of integration of the software development kit worked seamlessly with our large models.”

By integrating ISAR into its engine and software, ShipReality was able to visualise a massive model containing more than one billion points. Layered on top was BWTS CAD design data created by the company's ShipMR-design software, adding a further five million polygons. By comparison, a mobile XR device can only render about one and a half million polygons locally. As the remote rendering server, ShipReality used a moderate gaming laptop on local Wi-Fi, broadcasting on the 2.4 GHz band; more performance and bandwidth enable even greater visualisations.

Next level immersive experiences

“ISAR has amazing potential for AR/VR visualisation of massive digital twins and real-time monitoring of projected complex 3D designs merged with as-built environments in shipping and other industries,” said Mr Bourtzos. ShipReality is now able to:

  • visualise ship models that would otherwise require high-performance processing
  • capture large assets 1:1 and integrate complex 3D CAD designs/data
  • visualise detailed models for spatial analytics in augmented reality

“ISAR can save us a lot of time and resources because we can directly use point clouds in mobile AR: some pre-processing steps can be avoided.”
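The pre-processing Mr Bourtzos refers to typically includes decimating the cloud so it fits on-device. As a minimal sketch (not ShipReality's actual pipeline), voxel-grid downsampling replaces every point falling in the same grid cell with the cell's centroid:

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace all points in the same voxel cell with their centroid."""
    cells = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(
        cells, axis=0, return_inverse=True, return_counts=True
    )
    inverse = inverse.ravel()  # inverse shape differs across NumPy versions
    centroids = np.zeros((len(counts), 3))
    np.add.at(centroids, inverse, points)  # sum the points in each cell
    return centroids / counts[:, None]     # average -> one point per cell

# 10,000 random points in a 1 m cube, reduced to one point per 10 cm voxel
cloud = np.random.default_rng(0).random((10_000, 3))
reduced = voxel_downsample(cloud, voxel_size=0.1)
print(len(cloud), "->", len(reduced))  # at most 1,000 occupied voxels remain
```

Downsampling like this trades geometric detail for renderability; streaming the full-resolution cloud from a server avoids that trade-off.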

The availability, accuracy, density, and size of 3D point clouds are also forecast to increase vastly in the coming years. “To realise the full potential of immersive point cloud experiences, streaming will play a major role,” concluded Georgios Bourtzos.

 




Augmented Reality in Education – Maryville University Guide

The contents of this comprehensive guide are as follows:

  1. What is augmented reality?
  2. Benefits of augmented reality in education
  3. Augmented reality in education examples
  4. Augmented reality in higher education
  5. Augmented reality apps for education

Benefits of AR

AR can have a significant impact on learning environments:

  • Student engagement and interest: Student interest skyrockets with the opportunity to engage in creating educational content. AR technologies can allow them to add to curriculum content, create virtual worlds, and explore new interests.
  • Learning environment: Classes that incorporate AR can help students become more involved. An interactive learning environment provides opportunities to implement hands-on learning approaches that can increase engagement, enhance the learning experience, and get students to learn and practice new skills.
  • Content understanding: Lack of quality content focused on education, rather than entertainment, is a noted concern among teachers hesitant to use augmented reality in education. However, existing AR technology enables teachers to create immersive educational experiences on their own to help ensure their students understand curriculum content.
  • Collaboration: As AR content is digital, it is easily shared. For example, a group of teachers can work with their students to continually refine the content. A collaborative learning environment provides students with increased motivation to learn because they are actively engaged in the educational content creation process.
  • Memory: AR is an excellent tool for bringing lessons to life and helping students remember essential details. For example, instead of just presenting photographs on a projector showcasing life in Colonial America, a teacher can use AR technology to create memorable interactive stories.

Read the full resource here 




Workplace of the Future by RE’FLEKT

The way we work is changing. While businesses tackle the challenges of increasingly competitive markets, the workforce needs to constantly adapt to new tools, changing processes and different work environments.

Emerging technology is transforming the industrial workplace. Artificial Intelligence (AI) and the Internet of Things (IoT) improve automation, data driven insights and operational efficiency. However, the technology that holds the most potential for creating our future workplace is Augmented Reality (AR).

RE’FLEKT explores the future of the industrial workplace and the role of Work Augmentation and Remote Collaboration based on AR technology. We take a close look at how businesses will work tomorrow by connecting their workforce with information and knowledge today – using Enterprise Augmented Reality.

The guide is comprehensive and comes in a number of sections:

  • How enterprise AR tackles business challenges
  • The digital AR guide
  • Manufacturing in the remote economy age
  • AR devices

You can read the full article here 




The Pandemic Pushed XR Use Beyond Fun And Games

Because the pandemic has forced many people to work, socialize, study, and shop at home, they’re using XR experiences to replace in-person ones. This year, 58.9 million people in the US will use VR at least once per month, and 93.3 million will be monthly AR users.

Though VR and AR are different technologies growing at different rates, the pandemic appears to have galvanized the market for both.

Established use cases have increased

VR and AR usage has grown as more people stay home and pursue activities aligned with crowd avoidance and social distancing, including playing video games, consuming entertainment, participating in social VR, using AR features on social networks, and experimenting with virtual try-ons, virtual shopping, and 360-degree travel videos.

For example, June 2020 research by Ipsos and the Global Myopia Awareness Coalition (GMAC) found that 58% of US children and teens spent more time with smartphones, 53% spent more time with video game consoles, and 15% spent more time with VR headsets since the pandemic began. In general, people who owned VR headsets used them more; others explored nonheadset options or considered buying headsets.

Big Tech sees big opportunity

The pandemic has turned XR into an even more important growth area for Big Tech. While Facebook is on its way to becoming the VR leader in the US with its Oculus ecosystem, it is also investing in AR. Other heavy hitters, including Apple, Google, Microsoft, and Samsung, are all reportedly racing to introduce their own VR, AR, and/or MR solutions to grow the market and capitalize on increasing demand.

5G is becoming more available

XR developers are optimistic about the rollout of 5G wireless service—both in the US and around the world. Higher-speed 5G networks are expected to eliminate many persistent technical difficulties and boost XR’s viability. In an April 2020 survey conducted by Toluna and Advertiser Perceptions on behalf of Verizon Media, 44% of US adults cited streaming VR content and 36% cited AR experiences as expected benefits of 5G technology.

Likewise, a majority of adults in South Korea, the UK, and the US found the idea of subscription-based VR and AR at least somewhat appealing, according to a January 2020 Nokia poll conducted by Parks Associates. Nearly three-quarters (73%) of respondents found a subscription to VR experiences appealing or very appealing, while 70% and 65% said the same for AR experiences and VR sports, respectively.




AR Use Cases gain ground due to COVID-19, Maturing the Tech

How has the COVID-19 pandemic driven the rise of AR?

Tuong Nguyen: I would say the pandemic has been another enormous boon for this industry. It’s put the value proposition for AR front and center for users and IT buyers. Before, when you were trying to sell AR, you had to make that business proposition: ‘Look, here’s the benefit, you can do things remotely, you can see the unseen, you can do things digitally without touching it physically.’

And the buyer would be like, ‘Why, when we can do all those things [physically]?’ But, in a time when we can’t, it becomes a little bit clearer, and that value proposition will carry on beyond the pandemic. You can help someone repair a piece of equipment when normally you would call and they would say, ‘I’m on this jobsite, I won’t be able to fly out there for another three days,’ and my task gets put off for three days. Now, they can remotely dial you in, you can draw on my screen, show me how to do things and we’re off and running.

Facebook announced it is working on AR glasses and Apple is working on an AR headset. They’re new entrants to the market, so what does this say to you about how AR use cases are maturing?

Nguyen: These are all steppingstones, and I think that announcements and introductions from companies like Facebook and Apple will be important milestones in moving us toward our goal — spreading, evangelizing the benefits of this solution.

Similar to smartphones and computing, someone had to go out there and say, ‘This is why you want a tiny computer in your pocket or purse.’ Apple did that really well with the introduction of the iPhone. That’s what I expect to see from Apple and Facebook and whoever else is doing this — to start introducing this to the consumer and the enterprise market and saying, ‘Look, this is why this is the next era of interfaces and computing.’

Should CIOs and IT professionals be seriously looking at AR use cases in 2021?

Nguyen: It depends on the application. Within the enterprise, which is where we see more of the adoption and maturity happening, there are certain industries that are benefiting from it more than others.

I will delineate it in the following way: frontline workers versus information workers. I am an information worker; I spend about 10 hours a day at my desk hammering away. Whereas frontline workers, they’re not in front of a desk, and they’re using one or both hands to do something, fix something, assemble something, pack something, etc.

AR is benefiting frontline workers more, and it’s typically [benefiting] … capital asset-intensive industries — oil and gas, energy and utilities, manufacturing, those types of industries where you have really expensive machinery.

In the past, companies were willing to fly you around the world because you’re one of the two people who can fix this [expensive machine]. They’ll put that bill up [against] $10,000 to $20,000 [for AR] because that’s a drop in the bucket in terms of the investment they made or the productivity loss due to that.

In short, IT leaders are adopting [AR], but within those certain parameters.

Where are you finding the most exciting AR use cases in the enterprise?

Nguyen: Exciting to me is something that is applicable and shows value. So, exciting to me are procedural tasks and situational videos. It’s exciting because this is what we see enterprises adopting it for.

A procedural task means task itemization. Let's say I work in a warehouse and have a [packing order]; AR delivers me that information. I'll give you a hypothetical situation. I have a headset on, and my job is to go around the warehouse, pick certain things, and put them in a box. Now I get the information delivered to me on demand. Or maybe it's a procedure for you to repair something; [AR] gives you the instructions. Whether you are a veteran at it or newer, it doesn't hurt to have that little reminder on your screen to say, 'OK, step one, two, oh, did you forget step three because you've been working a 12-hour shift?' That's procedural tasks.

Situational video is 'see what I see.' I'm on site, I call you, and it's either wait three weeks for you to come out or you can look through my video. It's kind of like a FaceTime but with perks because it's augmented reality. Now you're looking at the same thing I am, and you start to draw on my screen from 3,000 miles away. You say, 'This is the thing you should be looking at, I just circled it in red. Rotate it 180 degrees clockwise and then replace it with the following.'

Those are the two use cases and you’re seeing it being used for many things — guidance, maintenance, repairs, collaboration, inventory management, etc. That’s what organizations are deploying, and that’s what I’m excited to see more of. It’s still new, so everyone is still toe in the water. I’m excited to see more organizations recognize the benefit, but also recognize that this is the future. This is how people will interact with the world.




Roundup on AR Devices and AR Smartglasses April 2021

Whilst the consumer side has not yet been entirely successful, it looks as though producers are betting on enterprise and industrial customers.

  • Facebook‘s AR/VR research division last month showed off its futuristic wristband for controlling AR glasses. The company’s Ray-Ban smart glasses will arrive later this year, though they won’t have an integrated display. They’re considered a precursor to future glasses with full augmented-reality features.
  • Last week, Niantic CEO John Hanke teased what appears to be a see-through headset or smart glasses; he said the company is working on “new kinds of devices” that leverage its augmented-reality platform. (The WSJ reports that Niantic is developing AR glasses with chipmaker Qualcomm.)
  • Apple is expected to reveal its $1,000+ AR/VR headset in the next several months, possibly during Apple’s virtual Worldwide Developers Conference in June. It’s a forerunner to Apple’s more complicated AR glasses, due out by 2025.
  • Snap, which already sells its Spectacles smart glasses with a camera but no display, is expected to reveal its AR glasses in May at its Partner Summit. After that, Snap will ship the glasses to developers and creators.
  • Google opened up its Glass Enterprise Edition 2 AR headset, geared toward businesses and developers, for direct purchase last year. Now, the WSJ reports that Google is “likely to try a consumer play again” in the AR space.
  • While fewer than 1M AR glasses and headsets are expected to sell this year, IDC projects that figure will skyrocket to 23.4M in 2025, mostly on the business side.

You can read all about it in the original article on the Wall Street Journal https://www.wsj.com/articles/facebook-apple-and-niantic-bet-people-are-ready-for-augmented-reality-glasses-11617713387




How will Augmented Reality revolutionise the building industry?


 

Please find below the webinar recording for future reference:

 

Topic: How will Augmented Reality revolutionise the building industry?

Start Time: March 15th, 2021 18:35

 

If you didn’t catch the webinar at the time, you can watch the replay via the Meeting Recording link below:

https://us02web.zoom.us/rec/share/KKNgfCCysodDIn3jYij0ZIrGx5p-H770FZvIpAgUZFKTc1C__fR3pA2lhrFNHsDw.csXrUNbVY57uZfIU

 

Access Passcode: vT#kncB3

 

Speaker Details:

Mark Sage

Executive Director

AR for Enterprise Alliance

 




Khronos and EMVA Collaborate to Gather Requirements for Embedded Camera and Sensor API Standards

All participants will be able to discuss use cases and requirements for new interoperability standards to accelerate market growth and reduce development costs in embedded markets using vision and sensor processing and associated acceleration. If the Exploratory Group reaches significant consensus, Khronos and EMVA will work to initiate the proposed standardization projects at the appropriate organizations.

All sensor and camera manufacturers, silicon vendors, and software developers working on vision and sensor processing are invited to participate in this initiative. More details and instructions for joining the group are here.

The Embedded Camera API Exploratory Group has been created in response to industry requests. Increasingly, camera sensors are being tightly integrated with image, vision and inferencing accelerators in self-contained systems. Innovation and efficiency in the embedded vision market is becoming constrained by the lack of open cross-vendor camera control API standards to reduce development and integration costs of multiple advanced sensors and cameras. A consistent set of interoperability standards and guidelines for embedded cameras and sensors could streamline deployment by manufacturers and system integrators by enabling control of a wide range of camera sensors, depth sensors, camera arrays and ISP hardware to generate sophisticated image streams for downstream processing by diverse accelerators.

This Exploratory Group will use Khronos’ proven framework for new initiatives in collaboration with the EMVA. Any companies, universities, consortiums, open-source participants, and industry experts who are willing to sign an NDA are welcome to join, at no cost. All participants will have an equal voice in exploring industry needs for, and benefits of, creating a consensus to develop a Scope of Work (SOW) document describing the objectives and high-level direction of standardization initiatives of value to the industry. The Exploratory Group is expected to meet online over a period of several months starting on March 25, 2021.

All Exploratory Group discussions will be covered by a simple project NDA to encourage open discussions. The Group is open to all proposals and relevant topics but will not discuss detailed technical design contributions, to protect participants’ intellectual property (IP). If a SOW is agreed, Khronos and EMVA will work to initiate the standardization work at the most suitable host organizations or open source projects, using those organizations’ normal collaborative agreements and IP frameworks.

Many industry leaders have indicated an interest in joining the Exploratory Group, including ALL3D, Almalence, AMD, Apertus, AREA, Arm, Cadence, Codeplay, Collabora, EA, Facebook, Google, Holochip, HP, Huawei, LunarG, Mobica, NVIDIA, Oculus, OPPO, Qualcomm, RedHat, Texas Instruments, Ultraleap, and Valve from Khronos; as well as EMVA members and machine vision players such as Allied Vision, Basler AG, Baumer, MVTec, and Stemmer Imaging AG.

“Judging by the significant industry interest, the time seems right to organize an effort around identifying and aligning on the need for interoperability APIs for embedded cameras and sensors. This is a topic that is very relevant to Khronos as our acceleration APIs, such as OpenCL™, SYCL™, and OpenVX™ are often used to accelerate sophisticated sensor stream processing,” said Neil Trevett, Khronos Group president. “Our work is also very complementary to EMVA, and we are delighted that the two organizations are working together to bring a meaningful quorum from diverse parts of the industry into this cooperative exploratory process.”

“We are delighted to work with Khronos on this initiative to commonly understand the industry needs for the future of embedded vision,” said Dr. Chris Yates, EMVA president. “Both the EMVA and the Khronos Group have a well-established history of standardization developments which enable industry to develop new products more simply, whilst ensuring friction is reduced in the market. This Exploratory Group is an excellent approach to understanding broader industry needs and will bring together many companies and views in an open forum. We look forward to working closely with the Khronos Group and welcoming all new and existing participants to this important initiative for the vision community.”

Supportive Quotes from the Industry

“Embedded vision is a natural progression from full-sized PC-based vision systems to systems on a chip and is critically important to the future of the vision industry. The industry has seen great benefits from digital interface/interoperability standards such as GigE Vision and USB3 Vision in expanding markets, reducing costs, and simplifying technology application. It makes great sense to continue these standardization concepts at the embedded level,” shared Jeff Burnstein, president of the Association for Advancing Automation (A3), parent association to AIA – Advancing Vision+Imaging.

“Lack of API standards for advanced use of embedded cameras and sensors is an impediment to industry growth, collaboration and innovation. Enterprise AR customers and systems integrators/value added providers will benefit from greater clarity, open interfaces between modular systems and innovation in the component provider ecosystem. Standards for camera and sensor control will increase opportunities for powerful new combinations of sensor and AR compute resources, integration with existing IT, and lower cost and complexity of future solutions,” said Christine Perey, interoperability and standards program leader for the Augmented Reality for Enterprise Alliance (AREA).

“The establishment of this Exploratory Group provides a great opportunity to connect with the Khronos Group, EMVA and industry partners to ensure that together we can create the best experience for embedded cameras on all Linux platforms,” explained Laurent Pinchart, lead architect of libcamera®. “The Linux camera community has seen a need for standardisation and interoperability in the embedded camera space for more than a decade. We launched the libcamera project two years ago to address that need, initiating an ambitious effort to reach out to the industry and improve Linux camera support for mobile, embedded and desktop systems. We are eagerly looking forward to actively participating in the Exploratory Group and deepening our collaboration with all the involved parties.”

About EMVA

The European Machine Vision Association is a not-for-profit, non-commercial association representing the Machine Vision industry in Europe. The association was founded in 2003 to promote the development and use of vision technology in all sectors, and represents members from within Europe, North America, and Asia. The EMVA is open to all types of organizations having a stake in machine vision, computer vision, embedded vision or imaging technologies: manufacturers, system and machine builders, integrators, distributors, consultancies, research organizations and academia. All members – as the 100% owners of the association – benefit from the networking, cooperation, standardization, and the numerous and diverse activities of the EMVA. The EMVA is the host of four global machine vision standards: the two widely established standards GenICam and EMVA 1288, as well as the two standardization initiatives Open Optics Camera Interface (OOCI) and Embedded Vision Interface Standard (emVision).

About Khronos

The Khronos Group is an open, non-profit, member-driven consortium of over 150 industry-leading companies creating advanced, royalty-free, interoperability standards for 3D graphics, augmented and virtual reality, parallel programming, vision acceleration and machine learning. Khronos activities include 3D Commerce™, ANARI™, glTF™, NNEF™, OpenCL™, OpenGL®, OpenGL® ES, OpenVG™, OpenVX™, OpenXR™, SPIR-V™, SYCL™, Vulkan®, and WebGL™. Khronos members drive the development and evolution of Khronos specifications and are able to accelerate the delivery of cutting-edge platforms and applications through early access to specification drafts and conformance tests.

 




The Power of Augmented Reality in Construction

The following AR use cases for construction are addressed in the article:

  • Project presentation: Details and elements can be layered onto a building plan using AR, which can also provide tours and showcase 3D models. This gives both stakeholders and clients a clearer idea of the project, the building, and any installations before construction begins.
  • Progress capture: AR can track and document the progression of projects. Applications can use a device’s AR features to identify what progress has been made so far with the floorplan, taking automatic shots of each capture point. This allows for better accuracy and efficiency in progress capture.
  • Better collaboration: Teams can share 3D images and videos with off-site members using AR. Stakeholders can remotely view videos or images in greater detail, allowing for error identification.
  • Enhanced safety: If tags or labels are placed in specific hazardous areas of a construction site, AR can scan them to bring up text or 3D models detailing safety information.
  • Construction training: AR can assist educators with lifelike demos to train workers to use heavy machinery or complex equipment, letting workers see the equipment in action before arriving on-site. Hazardous materials or environments can also be demonstrated using AR, without exposing team members to risk.

Since AR use cases in construction already exist, Mixed Reality is considered the next step forward. MR combines both Augmented and Virtual Reality so that users can interact with digital elements while still being aware of their physical environment. Teams can collaborate better; they can interact with one another in the same physical room while conducting a virtual tour together. On site, MR allows workers to view instructions and information overlay for installation and repair support.

Despite AR adoption in construction lagging behind other industries, AR/VR use in the construction industry is said to see “strong growth” over the upcoming five to ten years. The two drivers of AR adoption are said to be:

  • Willingness of construction professionals to go through digital transformation
  • Maturity of AR technology itself

Examples of AR being utilised in construction already are:

  • Akular AR: This mobile app brings 3D models into the physical environment, allowing walkthroughs in the real world. The app offers a solution for construction firms to show life-sized 3D building models to stakeholders.
  • GAMMA AR: This app uses AR to overlay 3D BIM models onto the construction site. Errors can be detected before construction, limiting mistakes and back-and-forth between team members. Models and designs can also be visualised before building. It provides a solution for presenting and sharing construction models, as stakeholders can avoid errors, communicate effectively, and make smarter decisions.
  • Arvizio: AREA member Arvizio provides enterprise AR and MR solutions. Its features include processing, optimisation, import, and hybrid rendering of complex 3D models and LiDAR scans for sharing digital twins with multiple users. Use cases include spatial data management, QA inspections, on-site model alignment, design reviews, and marketing demos. Stakeholders can conduct synchronised collaborative AR and MR sessions.
  • ICT Tracker: This AR software company helps contractors streamline project installation reporting and tracking. It is an easy-to-use, model-based production app that digitises field data on an iPad. The data collected is delivered in easy-to-read reports, improving project knowledge across the entire team. BIM or 3D models can be compared against current installations, eliminating the need for manual tracking. ICT’s capture of real-time data helps teams understand installation status and identify production, cost, and scheduling issues.
  • The Wild: This is a collaboration platform that offers support for BIM 360 and Revit. An entire team can be brought into a virtual workspace to spatially communicate, add markups, and review designs. It can be accessed from VR headsets, mobile devices, or desktop. Design reviews can be sped up remotely and kept aligned throughout the process.
  • VisualLive: A range of VisualLive applications shift BIM/CAD power onto the construction site. AR and MR solutions are available on HoloLens 1 and 2, iOS, and Android, so design models can be brought onto these devices. Plugins for Navisworks and Revit allow users to bring CAD and BIM models onto the jobsite.

The article concludes by acknowledging that AR will be a big part of construction in coming years. Companies must leverage the technology by finding opportunities to use AR in projects, and researching solution providers.