Moon | Lytro | VR Playhouse | Lightfield Post-Production

VR Playhouse conceived the concept for what became “Moon,” the first short produced by the camera company Lytro to integrate Lightfield capture with CGI. I was the Technical Director and lead artist on both the production and post-production sides. The production had to be meticulous about the data we captured (lidar scans and extensive photography for photogrammetry) and exacting in its technical accuracy. All lighting was board controlled to ensure exact matches between the five wedges we would end up shooting. Each wedge consisted of 61 cameras rotated in 72-degree increments, with an additional 10 static cameras. I designed the rendering system in Houdini, and all rendering was done in Mantra. The lidar provided depth information for any necessary clean-up of the depth output from the camera. That depth was aligned with the CGI in Nuke using a depth-to-points conversion, then exported as an Alembic to Houdini, where we could reconcile the two worlds.
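
The depth-to-points step is conceptually a pinhole unprojection: each pixel's depth value is pushed back along its camera ray to give a 3D point that can be compared against the lidar and the CGI. The sketch below is a minimal numpy version of that math, not the Nuke node or the show's actual camera data; the focal length, film aperture, and depth convention (distance along the camera Z axis) are illustrative assumptions.

```python
import numpy as np

def depth_to_points(depth, focal_mm=35.0, aperture_mm=36.0):
    """Unproject a depth map to camera-space 3D points.

    Assumes a pinhole camera, a horizontal film aperture of
    `aperture_mm`, and depth stored as distance along the camera
    Z axis (illustrative assumptions, not the production setup).
    """
    h, w = depth.shape
    # Pixel centers in normalized coordinates (-0.5 .. 0.5).
    xs = (np.arange(w) + 0.5) / w - 0.5
    ys = (np.arange(h) + 0.5) / h - 0.5
    u, v = np.meshgrid(xs, ys)

    # Scale by the film back and divide by focal length to get
    # camera-ray directions with unit depth.
    aspect = h / w
    x_dir = u * aperture_mm / focal_mm
    y_dir = -v * aperture_mm * aspect / focal_mm

    # Push each pixel back along its ray by the sampled depth
    # (camera looks down -Z, as in Houdini and Nuke).
    points = np.stack([x_dir * depth, y_dir * depth, -depth], axis=-1)
    return points.reshape(-1, 3)

# Example: a constant 2 m depth plane at 4x4 resolution.
pts = depth_to_points(np.full((4, 4), 2.0))
print(pts.shape)  # (16, 3)
```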
Lytro Stage Panorama

Reconstruction Methodology

Both lidar and photogrammetry were used. The entire stage was scanned with a high-resolution Faro scanner, producing a point cloud of close to a billion points. This was culled down to roughly 250,000,000 points, which were then edited further in Houdini. All photogrammetry was processed in Capturing Reality's RealityCapture and then refined in Houdini through a series of proprietary tools.
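
Culling a roughly one-billion-point scan down to about 250 million points is at heart a decimation problem. The proprietary Houdini tools are not shown here; the snippet below is a generic voxel-grid decimation in numpy, assuming a simple keep-one-point-per-voxel policy and a hypothetical voxel size, just to illustrate the kind of reduction involved.

```python
import numpy as np

def voxel_decimate(points, voxel_size=0.01):
    """Keep one representative point per occupied voxel.

    `points` is an (N, 3) float array in meters; `voxel_size` is a
    hypothetical cell size chosen to hit a target point count.
    """
    # Quantize each point to an integer voxel coordinate.
    voxel_ids = np.floor(points / voxel_size).astype(np.int64)

    # np.unique over the voxel coordinates returns the index of the
    # first point encountered in each occupied voxel.
    _, keep_idx = np.unique(voxel_ids, axis=0, return_index=True)
    return points[np.sort(keep_idx)]

# Example: decimate a random 100k-point cloud spread over a 10 m cube.
cloud = np.random.rand(100_000, 3) * 10.0
thinned = voxel_decimate(cloud, voxel_size=0.25)
print(len(cloud), "->", len(thinned))
```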

Production Methodology

The stage was shot in five wedges under board-controlled lighting so that levels matched exactly from wedge to wedge. Each wedge used 61 cameras, with the array rotated in 72-degree increments to cover the full 360 degrees across the five setups, plus an additional 10 static cameras. Lidar and photogrammetry reference were captured alongside the Lightfield passes over the three-day shoot so the plates could later be registered against the CGI.
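
Covering the stage in five wedges amounts to rotating one wedge's camera array by 72 degrees four more times to sweep a full 360 degrees. The snippet below sketches that bookkeeping with numpy, assuming a Y-up world and placeholder camera positions; the real rig geometry and per-camera calibration are not reproduced here.

```python
import numpy as np

def rotation_y(degrees):
    """Rotation matrix about the world Y (up) axis."""
    r = np.radians(degrees)
    c, s = np.cos(r), np.sin(r)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

# Placeholder positions for one wedge's 61 cameras (not the real rig):
# here they are simply spread along a small arc facing the stage.
angles = np.linspace(-10, 10, 61)
wedge = np.stack([np.sin(np.radians(angles)),
                  np.zeros_like(angles),
                  -np.cos(np.radians(angles))], axis=-1) * 1.5

# Rotate the wedge in 72-degree increments to produce all five setups.
rig = np.concatenate([wedge @ rotation_y(72 * i).T for i in range(5)])
print(rig.shape)  # (305, 3) -- 5 wedges x 61 cameras
```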

Behind the Scenes

Some photographs documenting the three-day shoot.