XR with the VARJO XR-3 and Nvidia CloudXR in Autodesk VRED

Danny Tierney, July 29, 2021

2 min read

XR is the sea change that automotive designers, creatives and managers have been waiting for. Reviewing a virtual model in a real-world setting lets them see their colleagues, compare the model against existing physical properties and interact in a personal way. Whether head-mounted or tablet-based, the results are phenomenal and accessible. XR is paving the way for 100% trust in the digital decision-making process, while saving costs and time.

For this Deep Dive session, Simon Nagel and Danny Tierney describe how to set up two different types of Extended Reality devices with Autodesk VRED:

  1. The VARJO XR-3 is a cabled/tethered XR Head-Mounted Display (HMD). A high-resolution, large field-of-view professional device, it connects to a workstation with a powerful graphics card.
  2. Nvidia CloudXR is an application capable of streaming high-fidelity XR/VR content to an HMD or tablet over a 5G connection or a 5 GHz Wi-Fi network. The rendering takes place on a workstation or server with a powerful graphics card.
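
For readers who want to try this from VRED's built-in Script Editor, the snippet below sketches how the HMD display mode can be toggled from Python. It assumes the classic vrOSGWidget-style commands (setDisplayMode with the VR_DISPLAY_* constants); the exact API and menu equivalents vary between VRED versions, so treat it as a sketch rather than the definitive setup procedure.

```python
# Sketch for VRED's Script Editor: hand the viewport over to an
# OpenVR-connected HMD (e.g. a VARJO XR-3 running through Varjo Base and
# SteamVR) and switch back again. Assumes the classic vrOSGWidget
# display-mode commands; names and constants may differ in your VRED version.

def enter_hmd_mode():
    # Render to the HMD instead of the desktop viewport.
    setDisplayMode(VR_DISPLAY_OPEN_VR)

def leave_hmd_mode():
    # Return to the regular desktop viewport.
    setDisplayMode(VR_DISPLAY_PLAIN)

enter_hmd_mode()
```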

We show how each device is configured, calibrated and run using high-quality, high-complexity, and visually stunning real-time graphics. VRED is the obvious choice to achieve these results. Our session offers the best practices for each XR device, as well as insights into innovative workflows and inspirational use cases.

Finally, we demonstrate how using multiple XR devices in collaboration will change how we work together, in a world that has grown smaller yet less accessible due to modern challenges.

Extended reality (XR) is an umbrella term for the following sub-categories:

Augmented Reality (AR) is where the real world and the virtual world are combined by overlaying (augmenting) a physical environment with real-time virtual content. It can be simple, like Pokémon Go, which runs on most mobile devices, or more complex, where the content needs to be streamed from a more powerful computer or server. Positioning or tracking the content is important to maintain the relationship between the real and virtual worlds. Interaction with both worlds is limited in most AR applications.
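
As a deliberately simplified illustration of that tracking relationship, the short Python sketch below anchors a piece of virtual content to a tracked marker using 4x4 homogeneous transforms. The marker pose, offsets and frame names are invented for illustration and do not come from any particular AR toolkit.

```python
# Minimal sketch of AR anchoring: placing virtual content relative to a
# tracked pose. All values here are made up for illustration.
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# Hypothetical pose of a tracked marker in world coordinates
# (no rotation, 1.5 m in front of the user at table height).
marker_in_world = make_pose(np.eye(3), np.array([0.0, 0.75, -1.5]))

# Virtual content is authored relative to the marker, e.g. 10 cm above it.
content_in_marker = make_pose(np.eye(3), np.array([0.0, 0.10, 0.0]))

# Chaining the transforms keeps the content "attached" to the marker;
# re-evaluating this every frame with the latest tracker pose is what
# maintains the relationship between the real and virtual worlds.
content_in_world = marker_in_world @ content_in_marker
print(content_in_world[:3, 3])  # approximately [0, 0.85, -1.5]
```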

Virtual Reality (VR) is where the user is completely immersed in a virtual environment and does not see the real world at all. Interaction is mainly limited to the virtual objects in a scene. Users can still interact with the physical world, but this requires high-precision tracking and is rather uncommon. Think of Walk the Plank, where a real plank of wood is placed on the ground to enhance the experience.

Mixed Reality (MR) is the most challenging of all these categories, since it requires seamless interaction with both worlds and includes the AR technique of viewing an object in combination with natural human interaction – touch and gesturing. It combines elements of both VR and AR to achieve this interaction. Imagine a training simulator where the pilot sits in a real cockpit and the controls and HUDs are all virtual.

XR can be displayed on any screen, but some make more sense than others depending on the use case. For example, XR on a standard monitor does not make much sense, because the monitor is immobile and interaction with physical and virtual objects is impractical. XR naturally lends itself to HMDs and mobile devices.

XR can be rendered locally on the device, on a workstation or laptop connected by cable, or streamed from a data centre. Mobile devices are limited in processing power for detailed content, cabled devices restrict freedom of movement, and streamed content relies on a suitably fast, low-latency and robust network.
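
To put a rough number on "suitably fast and low-latency", the self-contained Python sketch below measures the raw TCP round-trip time to a rendering host before a streamed session is started. The host name and port are placeholders, and this says nothing about CloudXR's own protocol or encoding pipeline; it only gauges the underlying network path.

```python
# Rough network check before starting a streamed XR session. The host and
# port below are placeholders, not part of any CloudXR configuration.
import socket
import time

def tcp_round_trip_ms(host, port=443, attempts=5):
    """Average TCP connect time to the rendering host, in milliseconds."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2.0):
            pass
        samples.append((time.perf_counter() - start) * 1000.0)
    return sum(samples) / len(samples)

# Motion-to-photon budgets for comfortable XR are commonly quoted at around
# 20 ms, so the network round trip needs to sit well below that to leave
# headroom for encoding, decoding and rendering.
latency = tcp_round_trip_ms("render-server.example.com")  # placeholder host
print(f"average round trip: {latency:.1f} ms")
```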

Tags and Categories

Essential Skills, Industrial Design
