
How VR and digital twins are streamlining remote work in technical professions


“Houston, we’ve had a problem here…We are venting something out into space.”

Those are two phrases you never want to speak, especially when you’re about 200,000 miles from the nearest mechanic. It was April 13, 1970, and an explosion had just taken place aboard Apollo 13. The spacecraft lost power and oxygen, and the machine that scrubbed the carbon dioxide (CO2) from the air was broken.


Three human beings in a tiny spacecraft near the moon were now totally dependent on teams of scientists back on Earth. The situation was as dire as dire could be.

But the astronauts and the scientists had three things going for them: radio communications between the spacecraft and the ground, a constant stream of telemetry data being sent back to Earth and recorded, and a ground team with an actual physical twin of Apollo 13 in the form of a training and testing simulator.

Using those advantages, NASA scientists devised a design for a CO2 scrubber that the astronauts could build from cardboard and scraps found in the spacecraft. You know how the story ends – the astronauts made it home alive.


Clearly, remote work and digital twins date back to the mid-20th century. This technology has come a long way since the early days of the pandemic, let alone the first era of the space age. 

Now, remote work is more common, Earth-side. Only about 5% of US jobs were classified as remote jobs back in 2017 and 2018, according to LinkedIn’s Global State of Remote and Hybrid Work report from January 2024. Today, LinkedIn reports that 24% of jobs (at least as it pertains to those added to LinkedIn profiles) are remote.

As this trend continues, the marriage of digital twins and technology like virtual reality, augmented reality and others could drive a new era in the ways people work together – even if the people communicating are far apart.  

Understanding the digital twin concept

Put simply, a digital twin is a digital clone of a real-world system that is updated regularly with current telemetry data. 

“In the simplest of terms, digital twins specifically should have an intelligent element such as simulation/AI, etc. to allow forecasting the future and enable decisions to be made. The digital twin system also includes the processes for triggering those decisions (often automated but there are many examples with human-in-the-loop business processes),” David McKee, ambassador and chair of the Digital Twin Consortium and a Royal Academy of Engineering enterprise fellow, said in an email. 


A twin combines a software representation of an original (whether in CAD or some other computer model) with a constant feed of wide-ranging telemetry that keeps the model up to date on the condition and status of the specific original being twinned. The twin can be used to represent and analyze aspects of the original, as well as to predict the original's future behavior and condition by running simulated tests against the twin using current data.
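To make the pattern concrete, here is a minimal sketch in Python of the three elements McKee describes: a mirrored state, a telemetry feed that keeps it current, and an intelligent element that forecasts the future. The pump, its sensor readings, and the temperature limit are all hypothetical, invented for illustration; real twins are built on far richer models.

```python
from dataclasses import dataclass

@dataclass
class PumpTwin:
    """Toy digital twin of a (hypothetical) pump: mirrors its live state
    and extrapolates future condition from incoming telemetry."""
    bearing_temp_c: float = 20.0      # latest reported bearing temperature
    temp_trend_c_per_hr: float = 0.0  # estimated rate of change

    def ingest(self, reading_c: float, hours_since_last: float) -> None:
        # Update the mirrored state and a simple linear trend estimate
        # each time a new telemetry reading arrives.
        if hours_since_last > 0:
            self.temp_trend_c_per_hr = (
                (reading_c - self.bearing_temp_c) / hours_since_last
            )
        self.bearing_temp_c = reading_c

    def hours_until(self, limit_c: float) -> float:
        # Forecast how long until the temperature limit is crossed,
        # assuming the current trend continues.
        if self.temp_trend_c_per_hr <= 0:
            return float("inf")
        return (limit_c - self.bearing_temp_c) / self.temp_trend_c_per_hr

twin = PumpTwin()
twin.ingest(60.0, 1.0)         # first reading: 60 °C
twin.ingest(62.0, 1.0)         # an hour later: 62 °C, so +2 °C/hour
print(twin.hours_until(80.0))  # hours until the 80 °C limit: 9.0
```

A production twin would replace the linear trend with simulation or machine-learning models, but the shape is the same: telemetry in, state mirrored, decisions forecast out.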

Digital twins are being used to support the management of factories and even cities. For a deep dive into digital twins and XR, see my guide.

Virtual reality check

Meanwhile, virtual reality, augmented reality, mixed reality, and extended reality all play a key role in tying digital twins and remote collaboration together. If you realize that it’s possible to simulate an entire factory, it’s not that big a leap to understand that being able to walk around a virtual version of the factory might be advantageous.

With the introduction of the $3,500 Apple Vision Pro and the $500 Meta Quest 3, XR has pretty much become practical, if still a bit cumbersome. 

Many of us have co-edited something inside of Google Docs by now. When I edit a document live with one of my editors who lives 3,000 miles away, and the changes I make are immediately visible to her, it’s a big productivity boon. Of course, it can go wrong, too. I’ve been in Google Docs frenzies where 10 people in a Zoom meeting are all editing the same document at the same time. Ouch!


When not taken to such extremes, collaborative document editing is perfect for fine-tuning a white paper or an article. But what if you’re designing a new coffee maker? Some CAD programs allow you to share screens. Zoom and its brethren also allow screen sharing, but in both of these examples, the object being modeled is still shown in flat 2D space.

Instead, imagine you're designing that coffee maker, and now the three members of the meeting are inside the same virtual kitchen. Each person can look down and under the cabinets to see how much room there is when the coffee maker's lid opens up. It's possible to rotate and adjust the coffee maker on the counter and see, just as if you were there, how it would look and how easy it would be to reach and manipulate.

That kitchen could be all virtual (VR immersion) or it could be a real kitchen with a virtual coffee pot sitting on its counter (MR). That doesn’t really matter, except to the designers at the moment. The point is that the physicality of the object is now much more visceral.

Let’s get big. Instead of a coffee pot, you’re designing a car. It’s nice to see photos and renderings of the car, but with AR and VR, you could drop that car right onto a street, walk up to it, look at it from various angles, and even sit (on a physical chair in your own space) inside the virtual cabin to experience how easy the various instruments and controls are to see, use, and reach.

Let’s get bigger. We’ll stay with the idea of designing a car, but now we’re designing a factory. In VR, you can move between stations and physically see how far the machines are from each other, how the production path moves, and even how well-lit the environment might be. And you can do it with team members, so if you point to something in the factory, they can see what you’re pointing at, all from a real-seeming perspective.

Let’s get even bigger. We’ll jump from simulations the size of a factory to ones the size of cities.

Singapore has developed the Virtual Singapore project, which contains 3D models and simulations that let the city's various sectors use digital tools to solve challenges ranging from improving parks to optimizing evacuation routes. In India, the southern coastal state of Andhra Pradesh is designing Amaravati, a $6.5 billion smart city whose digital twin draws on thousands of data sets to help manage everything from permitting to construction progress, and to inform designs that help the city mitigate its extreme climate conditions.

Factory floors and better buildings

“XR/VR/AR can be a tool/enabling technology used within digital twin systems and vice versa,” the Digital Twin Consortium’s McKee said.

In Stuttgart, Germany, Porsche created a full digital model of a factory. It used the design models to explore how its desired configuration worked with the weight-bearing capacity of the floors. Suppliers were also able to use the model to tune their equipment to the space allocated within the Porsche factory. The factory is now operational and producing cars, with the digital twin constantly guiding operational improvements.

Although this factory hasn’t yet implemented XR, it’s possible to see how vendors and Porsche engineers located around the world could come together virtually to tour the factory facilities and make design and operational decisions together without having to spend the night flying to the factory, sitting in cramped seats, and eating unsavory airline food.


Gothenburg, Sweden-based Winniio is a consulting firm specializing in digital transformation and digital twins. One of Winniio’s projects is optimizing smart heating in commercial buildings. The company has built a digital twin for schools in Sweden’s Växjö municipality (whose motto is “Europe’s Greenest City”). The project optimizes energy flow to each radiator, maximizing efficient heat distribution and reducing the buildings’ CO2 emissions.

Winniio does this using 300 wireless mesh sensors connected to more than 200 radiators. Those sensors constantly feed information to the twin. It contains 3D models of the buildings based on original drawings. The entire system is run from inside a game engine, which promotes visualization and collaboration.
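The flow Winniio describes (hundreds of sensors streaming readings into a twin that the team can then query) can be sketched in a few lines of Python. The radiator IDs, temperatures, and threshold below are invented for illustration; Winniio's actual pipeline runs inside a game engine and is far more sophisticated.

```python
class BuildingTwin:
    """Toy sketch of a building twin fed by per-radiator sensors."""

    def __init__(self) -> None:
        # Latest temperature reported by each radiator's sensor.
        self.radiator_temps: dict[str, float] = {}

    def ingest(self, radiator_id: str, temp_c: float) -> None:
        # Each incoming reading keeps the twin's mirrored state current.
        self.radiator_temps[radiator_id] = temp_c

    def cold_radiators(self, threshold_c: float) -> list[str]:
        # Flag radiators that may be underperforming, so heat flow can be
        # rebalanced before occupants notice a cold classroom.
        return sorted(
            rid for rid, temp in self.radiator_temps.items()
            if temp < threshold_c
        )

twin = BuildingTwin()
for rid, temp in [("r-101", 55.0), ("r-102", 31.5), ("r-103", 52.0)]:
    twin.ingest(rid, temp)

print(twin.cold_radiators(40.0))  # ['r-102']
```

The value comes from querying the mirrored state in one place rather than walking the building, which is exactly the gap the 2D-versus-3D-versus-VR timings below quantify.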

“In 2D, it takes a person 40 minutes to identify 1 out of 8 problems just through drawings,” said CEO and founder Nicolas Waern in an email. But it’s very hard to explain the situation to decision-makers. “In 3D, it takes a person 20 minutes to identify 2 problems.” It’s easier to show decision-makers using a visual model.

But, “In VR, a team of four people can work together. It takes eight minutes to identify eight of eight problems, understand what it leads to, and how to solve it with less effort. Why? The simple reason is that we all are used to working in 3D immersively. It’s our reality,” Waern said.

The live data fed into the digital twin lets Waern’s team not only record reality, but emulate it, simulate it, and test scenarios. 

Focusing on the solutions, not the technologies

Given the prevalence of so many buzzwords and acronyms, it’s easy to get caught up in the nomenclature and characterization of the technologies we’ve been discussing. Don’t let yourself get carried away by trying to pigeonhole all these elements. Instead, focus on the solution where a live system has its own digital twin, and where you can use that to model, predict, test, forecast, and diagnose.

At the University of Leeds, in the UK, a digital twin project not only uses simulation models and VR, but also haptics to help users further connect with the simulation.


“Academics used digital twin technology to deliver haptic interaction with VR through procedural learning,” said Hosam Al-Samarraie, professor in digital innovation design, via email.

They used tools to provide tactile sensations like force resistance and texture so that users could “feel” or “touch” virtual objects. One project allowed teams to communicate and work together virtually to solve design problems.

Using these technologies together, Leeds teams created design solutions that “redesigned the future of immersive galleries” so visitors could experience the sensations of touching and feeling virtual artifacts. 

Smart, connected ecosystems of all sizes

In a 2022 report on digital twins, Autodesk predicted that 500 cities would be operating digital twins by 2025, that 91% of Internet of Things (IoT) platforms would offer some form of digital twinning capability by 2026, and that by 2028, digital twinning would become a standard part of IoT enablement.

The ability to see and simulate key aspects of a system can lead to smarter decisions, which in turn lead to a better return on investment. By predicting operational “gotchas” and testing solutions in simulation, operations will be able to run more smoothly, and maintenance will be more proactive, pre-problematic, and fundamentally less expensive.
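As a back-of-the-envelope illustration of why predictive maintenance pays off, compare a purely reactive regime with a twin-driven one. Every number here is a hypothetical assumption of mine, not a figure from any report cited in this article:

```python
# Hypothetical costs for a fleet of machines over one year.
FAILURES = 20            # breakdowns expected over the period
UNPLANNED_COST = 10_000  # emergency repair plus downtime, per failure
PLANNED_COST = 2_000     # scheduled fix flagged in advance by the twin
DETECTION_RATE = 0.9     # share of failures the twin predicts in time

# Reactive: every failure is a surprise, paid at full price.
reactive_total = FAILURES * UNPLANNED_COST

# Predictive: most failures are caught early and fixed cheaply;
# the ones the twin misses still cost full price.
predictive_total = round(
    FAILURES * (DETECTION_RATE * PLANNED_COST
                + (1 - DETECTION_RATE) * UNPLANNED_COST)
)

print(reactive_total, predictive_total)  # 200000 56000
```

Even with a twin that catches only nine failures in ten, the expected maintenance bill drops by roughly 70% under these assumptions, which is the kind of arithmetic driving the market forecasts below.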


Huge beneficiaries of digital twins are architecture, engineering, and construction (AEC) firms, which can offer more services, more value, and longer-term strategic benefits derived from better predictive and preventative intelligence.

The global digital twin market overall was worth $10 billion in 2023 and is expected to rocket up by 61% year-over-year to $110 billion by 2028, according to analyst firm MarketsandMarkets Research. Fueling that growth will be a huge increase in the use of digital twins for predictive maintenance, which, when combined with AI, is expected to have far-reaching implications when it comes to getting ahead of costly operational problems and challenges.

VR and XR are part of that solution. As headsets come down in price (and, more importantly, in size and weight), they will see increased adoption. There is a huge difference in perception between seeing a city block on a computer monitor and walking through a virtual city block with the environment all around you.

When applied to digital twins, the visceral benefit of XR is more than just another layer of technology. That altered perspective has substantial potential to unlock observations and insights that might not be apparent on a small screen.

Overall, stay tuned. Both AI and XR are innovating at an extreme pace. ZDNET will continue to explore the implications of these technologies. Let us know what you think. Are you using XR now? What about AI? Do you think your organization might deploy a digital twin? Let us know in the comments below.

