Tracking and Stabilizing for Virtual Reality.
Recently I needed to track and stabilize footage in a spherical world for virtual reality, so I started building some tools and methodologies for dealing with stitched spherical footage. The footage comes from a variety of GoPro rigs, and there is also a need for 3D development and integration with 360-degree stitched spherical lat-long footage. In this case, all shot footage typically comes from a six-, ten-, or sixteen-camera rig, with Autopano doing the initial stitching.
There is surprisingly little information out there on this subject, but some great videos by Frank Reuter got me on my way, along with a few posts on the Foundry's forum.
The idea is pretty straightforward. We want to convert our spherical stitch into a rectilinear cube map, or six-pack. In Nuke this is done with the Spherical Transform node. By rotating along Y we can get the four sides, and with 90 and -90 degree rotations along X we can get the top and the bottom.
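The math behind that conversion is worth seeing once. Each pixel on a cube face corresponds to a view direction, which maps to a longitude/latitude pair, and therefore to a UV position in the lat-long image. Nuke's Spherical Transform does all of this internally; the sketch below is only illustrative, and the face orientation conventions are my own assumption, not Nuke's exact ones.

```python
import math

def face_dir(face, u, v):
    """View direction for a point on a cube face; u, v run from -1 to 1.

    The axis conventions here are illustrative (Y up, +Z forward) and
    may not match Nuke's internal layout exactly.
    """
    if face == "front":  return (u, v, 1.0)
    if face == "back":   return (-u, v, -1.0)
    if face == "right":  return (1.0, v, -u)
    if face == "left":   return (-1.0, v, u)
    if face == "top":    return (u, 1.0, -v)
    if face == "bottom": return (u, -1.0, v)
    raise ValueError("unknown face: %s" % face)

def dir_to_latlong_uv(d):
    """Map a view direction to normalized UV in an equirectangular image."""
    x, y, z = d
    lon = math.atan2(x, z)                  # -pi .. pi around Y
    lat = math.atan2(y, math.hypot(x, z))   # -pi/2 .. pi/2
    return (lon / (2 * math.pi) + 0.5, lat / math.pi + 0.5)
```

Sampling every face pixel this way and looking up the lat-long image at the resulting UV reproduces what the Spherical Transform node renders for each of the six views. For example, the center of the front face lands exactly at the center of the lat-long frame.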
Now that we have the six rectilinear images, we can pipe each one into a Project3D node, which in turn gets piped into its respective camera. Create one camera with a 45 mm focal length and a 90 mm horizontal and vertical aperture, which yields a 90-degree field of view. Duplicate the camera so there are six in total, one for each axis, and rotate them into position. I then project those cameras onto a sphere, re-creating my initial image perfectly. Next I choose the one sequence out of the six that is best suited for a 3D track. Once the footage has an acceptable track, I link the tracking data to an Axis node that drives my camera rig.
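The camera setup above can be sketched numerically. A cube face must cover exactly 90 degrees, which falls out of the standard pinhole relation between focal length and aperture, and the six cameras differ only in their rotations. The rotation signs below are an assumption that depends on your rig's orientation; treat them as a starting point, not Nuke's canonical values.

```python
import math

# Assumed (rotate_x, rotate_y) in degrees for each of the six cameras.
# Actual signs depend on how your rig and world axes are oriented.
FACE_ROTATIONS = {
    "front":  (0, 0),
    "right":  (0, -90),
    "back":   (0, 180),
    "left":   (0, 90),
    "top":    (90, 0),
    "bottom": (-90, 0),
}

def fov_degrees(focal, aperture):
    """Field of view from focal length and film-back aperture (pinhole model)."""
    return math.degrees(2 * math.atan(aperture / (2.0 * focal)))
```

With a 45 mm focal length and a 90 mm aperture, `fov_degrees(45, 90)` returns 90 degrees, confirming each camera sees exactly one cube face.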
I then create my render camera, pipe in the same tracking data, and invert the rotation curves for a nodal setup.
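The inversion itself is just a sign flip on each rotation curve: the render camera counter-rotates against the tracked motion, so the world holds still while the camera stays nodal. In Nuke you would typically do this with a negated expression link on the rotate knobs; as a plain-Python sketch (the keyframe representation here is hypothetical):

```python
def invert_rotation_keys(keys):
    """Negate every keyframe on a rotation curve.

    keys is a list of (frame, value) pairs, a stand-in for one of the
    camera's rotate curves. Negating each value counter-rotates the
    render camera, cancelling the tracked rotation for a nodal setup.
    """
    return [(frame, -value) for frame, value in keys]
```

Applying this to each of the three rotate curves gives the stabilized render camera; translation is left untouched, since a nodal setup assumes the camera only rotates.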