Neural Radiance Fields in VR

I've recently been playing around with NeRFs to scan objects, mainly using the Luma AI app.

I was wondering whether it would be possible to recreate a whole room using NeRFs? A room that could then be placed in a game engine (Unity/UE), so the NeRF-captured environment could be viewed from the HMD?
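From what I understand, one practical route is to train a single room-scale NeRF and then bake its density field into a triangle mesh the engine can import (I believe this is roughly what Luma's mesh export and nerfstudio's `ns-export` do). A minimal sketch of the baking idea, where `density_fn` is a stand-in for a trained model rather than any real API:

```python
import numpy as np
from skimage import measure

def nerf_to_mesh(density_fn, bounds=(-2.0, 2.0), res=256, level=10.0):
    """Bake a NeRF's density field into a triangle mesh a game engine
    can import. `density_fn` stands in for the trained model."""
    xs = np.linspace(bounds[0], bounds[1], res)
    grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1)
    sigma = density_fn(grid.reshape(-1, 3)).reshape(res, res, res)
    # extract the level-set surface of the density volume
    verts, faces, _, _ = measure.marching_cubes(sigma, level=level)
    # rescale vertices from voxel indices back to world units
    verts = bounds[0] + verts * (bounds[1] - bounds[0]) / (res - 1)
    return verts, faces

# toy stand-in for a trained model: a spherical blob of density
density = lambda p: 20.0 * (np.linalg.norm(p, axis=-1) < 1.0)
verts, faces = nerf_to_mesh(density)
```

The catch is that a baked mesh loses the view-dependent effects that make NeRF renders look so good, which is presumably why real-time in-engine NeRF rendering is still an active area.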

Would it simply be a case of stitching multiple NeRFs of individual objects together? Or would there be another way to do this?
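My naive mental model of 'stitching' would be compositing the fields along shared rays: sample every NeRF at the same points, add their densities, and density-weight their colours before the usual volume-rendering quadrature. A rough sketch under that assumption (the `fields` callables are stand-ins for trained models, not any real API):

```python
import numpy as np

def composite_fields(fields, rays_o, rays_d, near=0.1, far=5.0, n_samples=64):
    """Render several NeRFs into one image by summing densities
    (and density-weighting colours) at shared sample points.
    Each field is a stand-in callable: points -> (density, rgb)."""
    t = np.linspace(near, far, n_samples)                           # sample depths
    pts = rays_o[..., None, :] + rays_d[..., None, :] * t[:, None]  # (..., n_samples, 3)

    sigma = np.zeros(pts.shape[:-1])
    rgb = np.zeros(pts.shape)
    for field in fields:
        s, c = field(pts)
        sigma += s                   # densities add
        rgb += s[..., None] * c      # colours blend by density
    rgb = rgb / np.maximum(sigma[..., None], 1e-8)

    # standard NeRF volume-rendering quadrature
    delta = t[1] - t[0]
    alpha = 1.0 - np.exp(-sigma * delta)
    trans = np.cumprod(np.concatenate(
        [np.ones_like(alpha[..., :1]), 1.0 - alpha[..., :-1]], axis=-1), axis=-1)
    weights = alpha * trans
    return (weights[..., None] * rgb).sum(axis=-2)                  # final colour
```

Each object NeRF lives in its own coordinate frame, though, so in practice you'd also need a per-object transform into a common room frame, and published systems that merge multiple NeRFs (Block-NeRF, for instance) do something more careful than a plain density sum.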

Also, are there any additional complexities to using it in VR? I'm assuming the headset would just become the 'camera', supplying the position and viewing direction used to query the radiance field? Any SDKs?
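That matches my mental model too: each frame the VR runtime (OpenXR, say) gives you a pose per eye, which becomes the camera-to-world matrix the NeRF is queried with. A minimal sketch, assuming a pinhole camera and NeRF's usual -z-forward convention (the pose and intrinsics below are placeholder values):

```python
import numpy as np

def pose_to_c2w(position, rotation):
    """Build the 4x4 camera-to-world matrix NeRF renderers expect
    from an HMD eye pose (3-vector position, 3x3 rotation matrix)."""
    c2w = np.eye(4)
    c2w[:3, :3] = rotation
    c2w[:3, 3] = position
    return c2w

def generate_rays(c2w, width, height, focal):
    """One ray per pixel; pinhole model, camera looking down -z."""
    i, j = np.meshgrid(np.arange(width), np.arange(height), indexing="xy")
    dirs = np.stack([(i - width / 2) / focal,
                     -(j - height / 2) / focal,
                     -np.ones_like(i)], axis=-1)
    rays_d = dirs @ c2w[:3, :3].T                    # rotate into world space
    rays_o = np.broadcast_to(c2w[:3, 3], rays_d.shape)
    return rays_o, rays_d

# per eye, per frame: pose in, rays out (placeholder pose and intrinsics)
c2w = pose_to_c2w(np.array([0.0, 1.6, 0.0]), np.eye(3))
rays_o, rays_d = generate_rays(c2w, width=1832, height=1920, focal=1000.0)
```

The VR-specific complexity, as far as I can tell, is that this has to happen twice per frame (once per eye) at 72–90 Hz, which vanilla NeRFs are nowhere near, hence the interest in fast variants like Instant-NGP or in baked representations for headset rendering.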

Fascinating new technology, and it would be interesting to hear some discussion on these limitations/possibilities…