ICVFX At Home 2 - nDisplay Config
nDisplay is how we reconcile our physical and digital environments. In the left gif, you can see the physical relationship between the TV (floating plane behind the interviewee’s chair) and the handheld camera on the right. nDisplay packages all this nicely and allows us to move the whole real-world set around the virtual environment.
Now that we have data flowing into UE5 via LiveLink and SteamVR, we are ready to set up our nDisplay Configuration. For this step we need to:
Create a virtual representation of our “video wall”. In this case just a flat plane to represent the 55in TV.
Identify the center point of our recording space; ideally this should match the SteamVR origin we defined while setting up the room.
Create a CineCam object in our scene that matches our physical camera and lens properties.
Modeling the “Video Wall”
We need a surface (or surfaces) to represent the video displays we will be using in our environment. In my case, this is a simple, flat 55in display. Use your modeling program of choice to export an FBX with UVs.
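If you want the plane to match the panel 1:1 before picking up a tape measure, the dimensions fall straight out of the diagonal and aspect ratio. Here's a minimal sketch, assuming a 16:9 panel whose 55in marketing diagonal matches its visible image area (real TVs can differ slightly, so verify against the actual screen):

```python
import math

# Size the "video wall" plane from the panel's diagonal and aspect ratio.
# Assumes a 16:9 panel and that the 55in marketing diagonal equals the
# visible image area; double-check against the physical screen.
DIAGONAL_IN = 55.0
ASPECT_W, ASPECT_H = 16, 9
IN_TO_CM = 2.54

diag_units = math.hypot(ASPECT_W, ASPECT_H)  # length of the aspect-ratio diagonal
width_cm = DIAGONAL_IN * ASPECT_W / diag_units * IN_TO_CM
height_cm = DIAGONAL_IN * ASPECT_H / diag_units * IN_TO_CM
print(f"Model the plane at roughly {width_cm:.1f} x {height_cm:.1f} cm")
# -> roughly 121.8 x 68.5 cm for a 55in 16:9 panel
```

Modeling at real-world scale pays off later: Unreal's native unit is the centimeter, so a 1:1 plane lines up with the tracked camera data without any extra scaling.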
nDisplay Settings
Once we have our video surface and studio mapped, we're ready to make an nDisplay Config. Best practice would be to make a "Stages" folder under "Content" to store configs for each of your real-world environments (home mockup space, work studio, etc.). Our config file is where we integrate our physical components into the 3D space. We will need to set up a few things here:
Import our video display surface(s)
I set the center of my room as the origin in SteamVR, so all my motion data is relative to that position. So in the nDisplay Config, I am placing the "TV" plane I modeled at the correct distance from center to reflect its physical position in the room.
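As a rough sketch of what that placement looks like in numbers: Unreal works in centimeters with X forward, Y right, and Z up, so tape-measured offsets from the room origin only need a meters-to-centimeters conversion before they go into the screen component's transform. The measurements below are hypothetical placeholders, not my actual room:

```python
# Convert tape-measured offsets (meters, relative to the SteamVR origin at the
# center of the room) into the centimeter Location values the nDisplay Config
# expects for the screen component. All measurements here are placeholders.
M_TO_CM = 100.0  # Unreal Engine units are centimeters

forward_m = 2.00  # +X: distance from room center straight toward the TV
right_m = 0.00    # +Y: lateral offset of the TV's center
up_m = 1.05       # +Z: height of the TV's center above the origin

location_cm = (forward_m * M_TO_CM, right_m * M_TO_CM, up_m * M_TO_CM)
print(f"Screen component Location (X, Y, Z): {location_cm}")  # (200.0, 0.0, 105.0)
```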
Add a “Cluster Node” in the Clusters tab
Add an “ICVFX Camera” in the Components tab
The ICVFX camera is going to determine the view that is projected to the video wall (my TV). Under its properties, I'm going to link it to the CineCamera in my scene, which is receiving LiveLink data and determining my sensor and lens settings.
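For the CineCamera side of that link, the filmback and focal length can be set by hand to match the physical camera (LiveLink can also drive them once the subject is connected). Below is a minimal editor-scripting sketch; the sensor and lens numbers are placeholders for whatever your camera actually uses, and the exact Python calls may vary slightly between engine versions:

```python
import unreal

# Hypothetical sensor/lens values -- substitute your physical camera's specs.
SENSOR_WIDTH_MM = 23.76
SENSOR_HEIGHT_MM = 13.365
FOCAL_LENGTH_MM = 24.0

# Spawn a CineCameraActor at the level origin (the tracked offset comes from LiveLink).
cam_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 0.0))

# Match the filmback and focal length to the physical camera and lens.
cine_cam = cam_actor.get_cine_camera_component()
filmback = cine_cam.filmback
filmback.sensor_width = SENSOR_WIDTH_MM
filmback.sensor_height = SENSOR_HEIGHT_MM
cine_cam.filmback = filmback
cine_cam.current_focal_length = FOCAL_LENGTH_MM
```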
We also need to add a new Cluster Node so Switchboard knows where to request rendering power when starting up the scene. Yes, this is a little convoluted, as I am only using one computer for this at-home setup. But if I were to move this project to a proper studio, the component is already there, and I can simply point it at the actual render node's IP address.
This should be enough to get us going in-scene. Next, we will need to set up Switchboard so we can send our ICVFX view to the screen!