As a young civil engineer, I spent countless hours producing presentations to help our stakeholders better understand how our civil projects would impact the existing environment. Done right, these presentations got our proposals approved more quickly. Many of you know what I mean when I say it took quite a bit of time to produce these presentations from paper drawings, photographs, and scribbled notes, only to have stakeholders scratch their heads, unclear about the intent of our designs and the impact of our projects.
Today, civil engineers can easily use virtual reality (VR) to present their projects in realistic context, giving them and their stakeholders deeper insight into project impacts. Most people assume that deploying VR is time consuming. I happen to disagree. Let’s bust two myths right from the get-go:
Myth 1: VR is time consuming.
It doesn’t take weeks to generate a compelling VR experience. I can literally do it in 45 minutes. And if I can do it in 45 minutes, so can YOU!
Myth 2: You need to be a visualization expert.
Wrong! You don’t need to be a visualization expert, and you don’t need a dedicated visualization specialist on staff who focuses exclusively on VR. Given recent technology innovations and integrations in software and tools, any designer should be capable of producing immersive VR experiences. Here’s how.
1 Capture Reality
As with any AEC project, time is of the essence. Using a drone to gather aerial survey information is a definite time saver. What might traditionally take a day or two of gathering existing conditions manually can be cut down to just a few hours by following a fairly straightforward process.
One of the first steps I take is to place ground control points (GCPs) over the site in order to tie the drone images to survey control. Then, after flying the drone to take aerial photos of the site, I use Autodesk® ReCap™ Photo to assign the GCPs to the photos, enhancing the positioning and accuracy of the mapping outputs.
With a couple of clicks, the project is sent to the cloud, where the images are stitched together based on the coordinates embedded in the images by the drone and the GCPs I assigned. This process generates a point cloud that can then be used to produce a terrain model and a geo-referenced image, helping me better understand existing conditions on the site, all in a matter of hours instead of days.
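ReCap Photo handles the georeferencing math for you, but it helps to understand what assigning GCPs actually accomplishes: the photo-derived reconstruction lives in an arbitrary local coordinate system, and the GCPs let the software fit a transform that ties it to survey control. Here's a minimal Python sketch of that idea, fitting a 2D affine transform by least squares. The point values are made up for illustration, and the real software of course does far more (3D, lens calibration, bundle adjustment):

```python
# Sketch: tie local reconstruction coordinates to survey control by
# fitting a 2D affine transform from ground control point pairs.

def solve3(A, b):
    """Solve a 3x3 linear system via Gauss-Jordan elimination."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine(local_pts, survey_pts):
    """Least-squares fit of survey = A * local + t, one output axis at a time."""
    rows = [(x, y, 1.0) for x, y in local_pts]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    coeffs = []
    for axis in range(2):
        Atb = [sum(r[i] * survey_pts[k][axis] for k, r in enumerate(rows))
               for i in range(3)]
        coeffs.append(solve3(AtA, Atb))
    return coeffs  # [[a, b, tx], [c, d, ty]]

def apply_affine(coeffs, pt):
    x, y = pt
    return tuple(a * x + b * y + t for a, b, t in coeffs)

# Three GCPs: local reconstruction coords -> surveyed coords (made-up numbers)
local  = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
survey = [(5000.0, 2000.0), (5020.0, 2000.0), (5000.0, 2020.0)]
T = fit_affine(local, survey)
print(apply_affine(T, (5.0, 5.0)))  # -> (5010.0, 2010.0)
```

With three or more well-spread GCPs, every point in the local reconstruction can be mapped into survey coordinates the same way, which is why spreading the GCPs across the site matters.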
2 Let Data Tell The Story
Once the site is mapped, the next step is to decide what message to convey in VR. Are you trying to help the public experience how a new development will impact their neighborhood? Or are you trying to get approval for a new highway project? To get my story started, I use InfraWorks to aggregate all of the data related to my project (existing conditions data, project design data, GIS data, and so on) into one model. Then I bring that aggregated model into a VR experience to make it come to life and tell the story I want to tell.
I like to call InfraWorks the “data aggregator” in this workflow. That’s because with InfraWorks, I can simply drag and drop the data from various sources into the model and configure it easily using the Data Source Configuration dialog. In this dialog, I make sure to set the data type I’m populating, whether it’s city furniture, pipelines, or something else. I also make sure the coordinate system is set correctly before I tap Close and Refresh and move to the next step. It’s that easy!
Once everything is in the model, I can export FBX files to push the model into VR. I always export a separate FBX file for each of the different items in the model, which gives me more flexibility in how I use the data than a single FBX file containing the entire model would. For instance, I might want to use just the ground and buildings in my VR and not the pipelines. If everything were in one file, I couldn’t filter out just what I want.
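To make the benefit of separate FBX exports concrete, here's a small hypothetical sketch in Python. The function names and file layout are invented for illustration (InfraWorks does the actual exporting); the point is simply that one file per layer lets each VR scene load only what it needs:

```python
# Hypothetical sketch of the per-layer export idea. export_paths is a
# stand-in for the real export step; only the naming/filtering logic
# is being illustrated here.
from pathlib import Path

MODEL_LAYERS = ["Ground", "Buildings", "Roads", "Pipelines"]

def export_paths(project, layers, out_dir="exports"):
    """One FBX path per layer instead of a single monolithic file."""
    return [Path(out_dir) / f"{project}_{layer}.fbx" for layer in layers]

def scene_layers(all_layers, exclude=()):
    """Pick the layers a particular VR scene should load."""
    return [layer for layer in all_layers if layer not in exclude]

# A street-level walkthrough that leaves out the underground utilities:
wanted = scene_layers(MODEL_LAYERS, exclude={"Pipelines"})
for path in export_paths("campus", wanted):
    print(path)  # exports/campus_Ground.fbx, campus_Buildings.fbx, ...
```

With a single combined FBX, the `exclude` step above would be impossible; every scene would carry the whole model.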
3 No Visualization Experience Required
When I started creating VR, I was certainly no visualization expert. All I had was a cursory understanding of 3ds Max, and I was still able to create a compelling VR experience. With my model prepped, all I needed to do was import it from InfraWorks into 3ds Max. While in 3ds Max, I have the option to create animations for people, vehicles, or anything else. But I can also let 3ds Max do the heavy lifting of getting the data into 3ds Max Interactive, the engine that will drive the VR experience. To get your model from 3ds Max to 3ds Max Interactive, all it takes is a click of a button. However, and this is important, before pressing send, I run a script called the Meshinator (don’t you love that name?) in 3ds Max. This script optimizes the materials in your scene for 3ds Max Interactive. You can access the Meshinator here.
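I won't speak to the Meshinator's actual internals, but conceptually a material-optimization pass does something like the following: collapse meshes that share identical material properties onto one shared material slot, so a real-time engine can batch them into fewer draw calls. This Python sketch is purely illustrative, with made-up scene data:

```python
# Conceptual sketch only: deduplicating identical materials so meshes
# that look the same can share one material slot in a real-time engine.

def optimize_materials(meshes):
    """meshes: list of (mesh_name, material_properties_dict).
    Returns (unique_materials, assignments) where meshes with identical
    properties share one material index."""
    unique = {}        # frozen properties -> material index
    assignments = {}   # mesh name -> material index
    for name, props in meshes:
        key = tuple(sorted(props.items()))
        if key not in unique:
            unique[key] = len(unique)
        assignments[name] = unique[key]
    return list(unique), assignments

scene = [
    ("curb_01",  {"diffuse": "concrete.png", "gloss": 0.2}),
    ("curb_02",  {"diffuse": "concrete.png", "gloss": 0.2}),
    ("window_a", {"diffuse": "glass.png",    "gloss": 0.9}),
]
mats, assign = optimize_materials(scene)
print(len(mats))                               # 2 materials instead of 3
print(assign["curb_01"] == assign["curb_02"])  # True: curbs share one slot
```

Fewer unique materials generally means fewer state changes per frame, which is exactly what an interactive engine wants.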
4 Create Memorable, Immersive Experiences
The workflow I just described makes the data transfer from InfraWorks to 3ds Max to 3ds Max Interactive seamless. However, there are a couple of things to do before I hit Level Send All to generate the VR.
First, I need to select the correct equipment template in 3ds Max Interactive. In my case, I use the HTC Vive template, which sets all the defaults right out of the gate and lets 3ds Max Interactive set up the scene for me and create a new Level. Creating and saving that Level ensures that all the model information will populate in the right place. This is key.
With the Level populated, I can start making decisions about how I want viewers to interact within the VR. One thing I do is create a Level Flow that lets viewers turn the ground transparent using triggers on one of the controllers, allowing them to see underground utilities such as pipe networks, fiber optics, and telecommunications lines. This view can be critical for making sure there aren’t any conflicts among the utility systems. Creating another Level Flow lets me simulate traffic flows, for a parking lot for example, right within my VR.
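Under the hood, that ground-transparency Level Flow is a tiny piece of interaction logic: a trigger press flips the ground between opaque and see-through. In 3ds Max Interactive you wire this up visually rather than in code, but the state machine it implements looks roughly like this (class name and opacity values are illustrative):

```python
# Illustrative model of a trigger-driven transparency toggle, the kind
# of logic a Level Flow wires up visually in a real-time engine.

class GroundToggle:
    def __init__(self, opaque=1.0, see_through=0.25):
        self.opaque = opaque
        self.see_through = see_through
        self.transparent = False  # ground starts fully opaque

    def on_trigger_pressed(self):
        """Flip ground transparency each time the controller trigger fires."""
        self.transparent = not self.transparent
        return self.opacity

    @property
    def opacity(self):
        return self.see_through if self.transparent else self.opaque

ground = GroundToggle()
print(ground.opacity)               # 1.0  -- opaque: normal street view
print(ground.on_trigger_pressed())  # 0.25 -- utilities now visible below
print(ground.on_trigger_pressed())  # 1.0  -- back to opaque
```

The same toggle pattern covers the traffic-simulation Level Flow too: one input event flips one piece of scene state.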
A really cool superpower of Levels is that I can easily take advantage of the data aggregated in InfraWorks to create different proposals and represent each one in the VR experience as its own Level. Now viewers can experience the project in its existing state and then walk through several different designs of the proposed project. Because the VR is built from data generated by my day-to-day activities on the project, there’s no need to create anything just for the VR experience! All that data simply immerses the viewer in a true 3D environment. How cool is that?!?
The power of the Autodesk Architecture, Engineering and Construction Collection, with ReCap Pro, InfraWorks, 3ds Max, and 3ds Max Interactive (formerly Stingray), can help you get to VR fast. It’s like VR-in-a-box! You don’t have to look outside the AEC Collection to go from physically standing on existing conditions at a work site to walking around a fully immersive VR environment inside your office. It’s that easy.
See my recent webcast where I run you through the VR workflow, step by step.