Understanding Live VR rendering
A little over a year ago, we gave our users the ability to output stereo cubemap images that can be viewed in VR devices such as the Gear VR. With the release of our Guide to VR, this process became very popular, especially in the Architectural Design and Architectural Visualization communities.
One issue with this process was that the output was only truly viewable with a VR headset, making it a challenge to see how changes made to the scene would affect the final output. For this reason, users needed a way to tie the virtual frame buffer to an output that could be seen in VR.
In the upcoming V-Ray 3.5 for 3ds Max, we introduce a way to output the Active Shade VFB to either the Oculus Rift or HTC Vive. In doing so, the user can render and view the changes in VR as they are made.
What do I need to make this work?
Currently, this works only if your VR headset is connected to the computer that is rendering. We support the Oculus Rift, as well as devices that use OpenVR, such as the HTC Vive.
Because these renders need to be very large, we recommend GPU rendering to speed things up as much as possible. You will also need a powerful GPU to drive the VR output itself, so we recommend at least two GPUs in your computer: one reserved for the VR output, with the others used for rendering. If you use the same GPU for both rendering and VR output, the output will have very poor performance, creating a negative experience and possibly cybersickness.
If your computer has only one GPU, but you have access to other computers with more, Distributed Rendering can be used, as long as you exclude your own computer from rendering so that its GPU can be used for VR output.
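The idea of reserving one GPU for the headset while the others render can be sketched as a simple partition (plain Python; the device names and function are illustrative, not part of V-Ray):

```python
# Illustrative only: split a machine's GPUs so one is reserved for the
# VR headset and the rest are handed to the renderer.
def split_devices(devices, reserved_for_vr=0):
    """Return (vr_device, render_devices), leaving one GPU free for VR output."""
    vr = devices[reserved_for_vr]
    render = [d for i, d in enumerate(devices) if i != reserved_for_vr]
    return vr, render

vr, render = split_devices(["GPU 0", "GPU 1", "GPU 2"])
print(vr, render)  # -> GPU 0 ['GPU 1', 'GPU 2']
```

With a single GPU the render list would be empty, which is exactly why Distributed Rendering to other machines becomes the fallback.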
How to set it up:
- If you have not already done so, install Oculus Rift PC runtime.
- Oculus Rift only: connect the headset to the default graphics adapter – it will not work if connected to other graphics adapters.
- In the Oculus app, go to Oculus Settings -> General -> Unknown Sources and turn it on.
- Check that the headset is working and displays its home scene correctly.
You are now ready to start rendering in 3ds Max:
- In the V-Ray Production Render settings:
- Turn off Image Filtering. If you don't, you are likely to see seams in your cubemap.
- Under Camera Type select Cube 6×1.
- In the V-Ray RT Active Shade settings:
- For Oculus: In Stereo mode select Oculus Rift (mono) or Oculus Rift (stereo), depending on your needs.
- For the HTC Vive: In Stereo mode select OpenVR (mono) or OpenVR (stereo), depending on your needs.
- In your image output select a 6×1 output. It is recommended that you choose something large enough for your display. 3000×500 is the recommended minimum.
- In your engine type, select CUDA or OpenCL. Again, CPU is not recommended as it will slow your performance.
- Under Render Device Select, make sure that the GPU driving your VR headset is not selected, as using it for rendering will degrade your VR performance.
- If you wish, you can use Distributed Rendering.
- If you wish to change the interpupillary distance, select the Advanced settings of V-Ray RT and change the eye distance.
- Press the Active Shade Render button:
- The first render will start slowly since it takes some time to load the Oculus VR library.
- Since it is an Active Shade render, you can continue to make changes to your scene, such as moving objects, changing materials, and more, and the render will update in both the VFB and the VR output.
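The Cube 6×1 output used above is a strip of six square cube faces laid side by side, so the image width is always six times its height. A quick sketch of that arithmetic (plain Python; the function name is illustrative):

```python
# A 6x1 cubemap strip places six square faces side by side,
# so width = 6 * face size and height = face size.
def cubemap_strip_size(face_px: int) -> tuple[int, int]:
    """Return (width, height) of a 6x1 cubemap strip with square faces."""
    return 6 * face_px, face_px

# The recommended minimum of 3000x500 corresponds to 500x500-pixel faces.
width, height = cubemap_strip_size(500)
print(width, height)  # -> 3000 500
```

This is why 3000×500 is the suggested floor: anything smaller gives each cube face fewer than 500×500 pixels, which looks soft in a headset.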
Working in VR can be a challenge. Few applications have an interface that allows people to build in VR, and using a desktop application can be a bit of a guessing game as to what the VR output will be. The Live VR output of V-Ray 3.5 removes several of the steps between making a change and seeing it in VR. This in turn allows for quicker iterations and a better overall VR experience.