Simon Nagel and I have made two Deep Dive videos on our new VRED GPU Ray Tracing technology that we’re really excited to share with you. But rather than just dropping the videos on YouTube, we wanted to give you the story about why we did it and how we got here.
I’ve worked for 12 years in the real-time graphics industry. That industry has a holy grail: real-time ray tracing, real-time global illumination. (Our colleague, Product Manager Lukas Faeth, talks a bit about the journey to this real-time milestone here.) It’s been possible in the past, but only on systems that would fill an entire room, with all the infrastructure that entails.
For us, the real breakthrough came when we were able to do it on a reasonably small system, something that could fit in my attic office, doesn’t require constant maintenance, and delivers quick, high-fidelity real-time imagery. This is ideal for the entire automotive design process because it means you can reduce the number of physical prototypes you produce in the design and concept phase (a costly endeavour). It can be more costly to change something on the physical model than on the virtual one – the ready accessibility of a virtual model, and the speed with which you can incorporate variants, animations, and HMI integration, are strong arguments for this. But on a computer, if you’ve got reliable, physically based results (like measured materials or light simulation), you can do so much more. You have no fear of committing to something different that may not work, because when you do it virtually, you can easily undo, change, or adapt it.
I had seen real-time ray tracing before, even on a GPU – but it was so costly, big, and unreliable that it wasn’t really feasible. We took interest in this about two years ago. We thought it was promising, then we thought it was cool, and now we think it’s got a practical application that our customers are really interested in. We’ve been sharing beta and tech-preview versions of this over the last six months.
That was the basis for the videos we’re going to share with you. We wanted to show the world, without the world having to come to us, that this is possible.
Our original intention was to present this new technology in tandem with an electric car at the AIF in Munich this April. We wanted to show a German electric car, to focus on the history, to tie the story together: the AIF has its home in Munich, so we wanted a German car that has heritage, a sporting pedigree, and is electric.
The plan was to compare, one-to-one and in real time, the physical model and the virtual model of the Porsche Taycan. Part of our passion for this particular project comes from the fact that we saw where they make the car – not just virtually, not just in the design studio, but also in the factory. We supported this model from the design process right through to the actual manufacturing with various software tools from the Autodesk range.
That’s what we had organized and worked for. Then came the global pandemic.
We weren’t willing to let this passion project go, because a lot of us at Autodesk had a real emotional involvement with this vehicle and its story. So we shifted gears and adapted to this new, unforeseeable context. If we weren’t able to have the physical model, we wanted to make the virtual model as emotional, interactive, and beautiful as possible.
To ensure we managed this as well as possible, we assigned tasks, came up with a storyboard, then got to work. The storyboard highlighted this ray tracing, global illumination technology. It’s not just about beauty, it’s about physical correctness—to show how light and shadow work. These are the main features of global illumination, dynamic light and shadow: light running across the surface of a pre-production car, shadow in the gaps, headlights and rear lights, interior lighting influencing the customer’s perception of the car, etc. We wanted to tell that story visually, so that each individual person watching the video would see that this isn’t a computer game. It’s real.
We ended up using a feature of VRED called Reference Manager to facilitate how we were moving files back and forth. When we began the work, we were transferring a 6 GB file, with each individual change going from my computer to Simon’s and back again. That didn’t work well. So we switched to Reference Manager: I was assigned one component, Simon was assigned another, and we worked with a master file that referenced both. When Simon made a change, it was automatically updated in my master model; when I made a change, it was automatically updated in his. We had a central data-control model, with both of us working on it simultaneously.
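To illustrate the idea behind that workflow (this is a conceptual sketch in plain Python, not the actual VRED Reference Manager API – the file names and helper functions here are hypothetical): the master file stores only *references* (paths) to the component files, so reloading the master always resolves to each collaborator’s latest save, instead of shipping one monolithic scene file back and forth.

```python
# Conceptual sketch of reference-based collaboration.
# NOT the VRED API: save_component/load_master and the JSON
# component format are invented purely for illustration.
import json
import tempfile
from pathlib import Path

def save_component(path, geometry):
    """Each artist saves their own component file independently."""
    Path(path).write_text(json.dumps(geometry))

def load_master(master):
    """Resolve every reference to its current on-disk contents."""
    return {name: json.loads(Path(ref).read_text())
            for name, ref in master["references"].items()}

workdir = Path(tempfile.mkdtemp())
body = workdir / "body.json"          # my component
interior = workdir / "interior.json"  # Simon's component

save_component(body, {"parts": ["doors", "hood"]})
save_component(interior, {"parts": ["seats"]})

# The master file holds references, not geometry.
master = {"references": {"body": str(body), "interior": str(interior)}}

# One artist updates their component file...
save_component(interior, {"parts": ["seats", "dashboard"]})

# ...and simply reloading the master picks up the change.
scene = load_master(master)
print(scene["interior"]["parts"])  # ['seats', 'dashboard']
```

The design point is that the shared artifact stays tiny (a list of references), while the heavy component data lives in per-artist files that update independently.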
That’s the initial video we made, available here. It’s the eye-opener, the attention-grabber of what this newly accessible technology allows us to do.
The Deep Dive(s)
We knew there would be quite a few questions about how the technology works, so we wanted to offer two ways into this feature. Part 1 (below) is the theory: the figures and numbers we got from testing this feature over a few weeks. Part 2 (also below) shows how we recreated the video, demonstrating how you do it in real time. It’s the kind of thing where we can make changes and experiment without penalty. The ability to do that – to change anything, move geometry, completely transform the essence of what looks like a finished film – in real time is the holy grail.
So check out the videos Simon and I have made. They share our experience and detailed insights into the facts, figures, and theory surrounding GPU Ray Tracing.
Part 1 focuses on the need for ray tracing in the digital design process, the relevant numbers when comparing various setups, and concise observations for everyone requiring physically based full global illumination in real time.
Part 2 allows us to replicate the Taycan video as a real-time demo scenario and show the method of creating engaging and realistic content for a truly mind-blowing virtual design review.