Lytro’s “Moon” | Virtual Reality Lightfields in Production at VR Playhouse

  • Director: Ian Forrester
  • Producers: 
  • Technical Director/Art Lead: Jordan Halsey
  • Compositing Leads: Andrew Schwartz, Oren Green
  • Sound Design: Ecco Sounds
  • 3D Team: Art Perez, Andrew Stutchlick

Conception

VR Playhouse conceived what became “Moon,” the first lightfield-and-CGI integrated short ever produced by the camera company Lytro. The idea took shape in a bar in San Jose, where Ian and I were on the phone with the crew in L.A.; Dylan, DJ Turner, and Christina Heller were all integral in figuring out the entire concept.

Technical Notes 

I was the Technical Director on both the production and post-production sides, as well as the Lead Artist. The production had to be meticulous about how much information we captured and exacting in its technical accuracy. To support post-production, we captured Lidar with a Velodyne scanner and shot extensive photogrammetry; the photogrammetry software Reality Capture had just been released, which made that process much easier. We also captured extensive HDRs under each of the tightly controlled lighting scenarios.
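As a rough illustration of the HDR step (a minimal sketch, not the production pipeline itself), bracketed exposures can be merged into a radiance map with OpenCV's Debevec tools; the file names and shutter times below are placeholders:

```python
import cv2
import numpy as np

# Hypothetical bracketed exposures of the set under one lighting scenario;
# file names and exposure times are placeholders, not production data.
files = ["bracket_-2ev.jpg", "bracket_0ev.jpg", "bracket_+2ev.jpg"]
times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge to a radiance map.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

# Write a floating-point Radiance HDR suitable for image-based lighting.
cv2.imwrite("set_lighting_scenario.hdr", hdr)
```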

The Mack Sennett Studios Stage

We shot on the large stage at Mack Sennett Studios in Silver Lake. The studio was built in the early 1920s and has all the charm you would expect from such an old, well-used location. When we captured the Lidar data, we found every hole and seam in the building.

Lidar Scan of Lytro Set

Set Production

The most important part of the set was the 16’ x 16’ foam panels that formed the floor of the moon.

Lighting

All lighting had to be board-controlled to ensure exact matches between the five wedges we would end up shooting. Each wedge consisted of 61 cameras, with the rig rotated 72 degrees between wedges, plus an additional ten static cameras. I designed the rendering system in Houdini, and all rendering was done in Mantra. The Lidar provided depth information for any necessary clean-up of the depth output from the camera. The depth was lined up with the CGI in Nuke using a depth-to-points workflow and exported as an Alembic to Houdini, where we could rectify our two worlds.
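To illustrate the wedge layout, here is a minimal sketch (not the actual rig files) that lays out the five 72-degree rotations in Houdini's Python shell; the /obj/wedge_rig subnet and all node names are assumptions for the example:

```python
# Runs inside Houdini's Python shell, where the hou module is built in.
import hou

obj = hou.node("/obj")
# Assumed: a pre-built subnet holding the 61-camera wedge.
wedge_src = hou.node("/obj/wedge_rig")

for i in range(5):
    # One null per wedge position, rotated 72 degrees apart around Y.
    rot = obj.createNode("null", "wedge_rot_%d" % i)
    rot.parm("ry").set(i * 72.0)

    # Duplicate the wedge rig and parent it to the rotated null so the
    # whole 61-camera block inherits that wedge's orientation.
    copy = obj.copyItems([wedge_src])[0]
    copy.setName("wedge_%d" % i)
    copy.setFirstInput(rot)
```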
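And as a conceptual sketch of what a depth-to-points step does before the point cloud heads to Houdini as Alembic (Nuke's nodes handle this internally; the intrinsics and depth values here are placeholders):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map (camera-space Z) into 3D points,
    the same operation a depth-to-points setup performs before export."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # (H, W, 3) camera-space points

# Placeholder example: a flat plane 2 units from a hypothetical 1080p camera.
pts = depth_to_points(np.full((1080, 1920), 2.0),
                      fx=1500.0, fy=1500.0, cx=960.0, cy=540.0)
```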