The R in VR — reality of Post-Production

Insider look into creating an immersive documentary

Karen Vanderborght
Virtual Reality Pop

--

May the force be with

your computer configuration

Creating and even viewing a VR production requires powerful, up-to-date computers.

Images with lots of movement need a frame rate of 48, 60, or 90 fps to ensure fluidity. For a quality image, you want to aim for a final resolution of 2K or 4K. This final video image is itself the result of an even bigger image, made up of a combination of several cameras each shooting 2K, 4K, or 1080p footage. Clearly, you need to get yourself some big, fast hard drives. Many of the platforms for watching 360º video don’t have the best screens and compression rates, so you had better start off with a clean, crisp image.
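To get a feel for why those drives need to be big and fast, here is a quick back-of-the-envelope calculation in Python. The resolution, bit depth, and frame rate are illustrative assumptions, not requirements:

```python
# Rough storage estimate for uncompressed 360º footage.
# Assumed values for illustration: a 3840x1920 equirectangular frame,
# 3 bytes per pixel (8-bit RGB), 60 fps.
width, height = 3840, 1920
bytes_per_pixel = 3
fps = 60

bytes_per_second = width * height * bytes_per_pixel * fps
gb_per_minute = bytes_per_second * 60 / 1e9

print(f"{bytes_per_second / 1e6:.0f} MB/s, ~{gb_per_minute:.0f} GB per minute")
# -> 1327 MB/s, ~80 GB per minute of footage, per take.
```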

These are the minimum specifications suggested by Kolor for its software:
http://www.kolor.com/wiki-en/Frequently_Asked_Questions_-_General

These are the latest Oculus Rift recommendations:
https://www.oculus.com/en-us/blog/the-rifts-recommended-spec-pc-sdk-0-6-released-and-mobile-vr-jam-voting/

your time

360º video takes more steps in the production process than a flat, rectangular image. So make sure to adjust your planning and reserve more time.

If a one-day shoot means three days of editing time in an ordinary production, it is safe to say that shooting 360º can double that amount, depending on the shooting conditions. Static shots are stitched faster than content with lots of movement.

This article only covers 2D 360º video; the process for 3D 360º video will be covered in a later article.

GENERAL WORKFLOW

  • Stitch the images in AVP and APG
  • Export previews and make selections
  • Edit your story together
  • Verify and finalize the stitches
  • Rotoscoping/Compositing/Color correction
  • Re-verify, ideally on your distribution platform, and finalize the stitches
  • Color correction/grading/sharpen
  • Edit image/sound (synchronization and mix)/Titles
  • Compression/conversion
  • Add interactive components (coding can start earlier in the workflow)
  • Publish/Distribute

KOLOR APG & AVP

Kolor Autopano Video and Autopano Giga are two software packages commonly used for stitching videos together.

Alternatives: VideoStitch and PTGui
http://www.video-stitch.com/

Kolor was bought by GoPro and seems to be attracting an increasing number of users.
https://gopro.com/news/lets-get-sphericalspherical-gopro-acquires-virtual-reality-start-up-kolor

The software is quite expensive, and the support on the Q&A forums is not great if you encounter bugs or need information on the latest developments.
Even with these inconveniences, the two packages offer a stable and efficient way of stitching images together.

A free app to download without hesitation is KOLOR EYES, which lets you view your 360º video the way it is meant to be seen.
http://www.kolor.com/kolor-eyes/download/#beta

A screenshot of Kolor Eyes playing an mp4 file exported from AVP (demo version).

You can use Kolor Eyes with the Oculus Rift, on your desktop computer, or on a mobile device. I am not sure if it will work with an HTC Vive.
http://www.kolor.com/kolor-eyes/#kolor-eyes-desktop

You cannot judge your stitched shots in an ordinary video player, because it will not give you the effect of a VR experience. It is best to use Kolor Eyes or YouTube 360 to watch your 360º work.

AUTOPANO VIDEO (AVP)

You will start off in AVP, importing your video footage.

Official documentation:
http://www.kolor.com/wiki-en/action/view/Autopano_Video_Documentation

The interface of AVP while exporting. Not all of the timeline options are visible.

This screenshot shows the same scene as the one used in the Kolor Eyes screenshot. The difference: AVP shows an equirectangular projection, while Kolor Eyes maps the video image onto a sphere you can navigate.
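To see what equirectangular means in numbers, here is a minimal sketch (my own illustration, not Kolor’s code) mapping a viewing direction to a pixel in the flat frame:

```python
def equirect_pixel(yaw_deg, pitch_deg, width=3840, height=1920):
    """Map a viewing direction to its pixel in an equirectangular frame.

    yaw:   -180..180 degrees, 0 = straight ahead (frame center)
    pitch:  -90..90 degrees, +90 = zenith, -90 = nadir
    Longitude maps linearly to x and latitude linearly to y; that is the
    whole projection, and why the poles get stretched across full rows.
    """
    x = int((yaw_deg + 180.0) / 360.0 * width) % width
    y = min(int((90.0 - pitch_deg) / 180.0 * height), height - 1)
    return x, y

print(equirect_pixel(0, 0))    # (1920, 960): straight ahead, dead center
print(equirect_pixel(0, 90))   # (1920, 0): the zenith, smeared over the top row
```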

The green lines (states) are the way keyframing works in AVP.

The official overview of its functions by Kolor:

AUTOPANO GIGA (APG)

The detailed work on the stitches is done in this piece of software.

Official documentation:
http://www.kolor.com/wiki-en/action/view/Autopano_Documentation

A screenshot of the demo version of APG, while trying to stitch some first tests.

APG opens multiple windows and has to be used together with AVP, so you will need one huge screen or two screens to work comfortably.

Within the documentation of APG and AVP, you can find a couple of tips for specific setups and situations, like this one on using fisheye images:
http://www.kolor.com/wiki-en/action/view/Stitching_fisheye_images

steps AVP

  • Import all the video footage from all the cameras of the same take.
  • Synchronize your cameras.
  • Choose an IN and OUT — it will limit the time it takes to calculate whatever you do later.
  • Stitch (using your own template, one of the standard models, or a pano project: a panorama already stitched in the same environment). This last method is a good way to speed up getting previews/dailies out of the same shot/scene.
    (c.Position=stable images) (c.Selection=moving images)
Is it a disco ball or a 360º video? Hard thinking during a workshop I gave at the La Presse news outlet in Montreal.

Since the images from the Z2X-C have a narrow overlap, the first results can be frightening. APG to the rescue!

  • Click on Edit to open APG (Autopano Giga). Correct the stitches via the control points and the move tool.

If you have a hard time getting a decent first stitch, jump around in the timeline and choose another moment in the scene. This changes the control points you generate and may lead to a better result.

  • Stabilization and correction of the horizon.
  • Calculate the quality of the stitch (RMS) and start correcting while creating different states (adaptive stitching).
  • Same process for the color correction and mask (if any). Both will influence the result of the blending.

Working on the mask doesn’t do much when shooting with the Z2X-C, but it can be a great help with rigs of four or more cameras, where the overlap is bigger. You might be able to save a character from a monstrous deformation or straighten the lines of the ceiling (parallax issues).

Color correction can be done in both AVP and APG, but APG offers more options for correction between the cameras. AVP does offer automated color correction, but I prefer the more sophisticated tools in grading software for that.
http://www.kolor.com/wiki-en/action/view/Autopano_-_Panorama_Editor_-_Color_Correction

It is important to understand the blending process. Start off by choosing sharp for a fixed setup and smooth for images with a lot of movement.
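Conceptually, blending feathers each camera’s contribution across the overlap zone. Here is a toy NumPy illustration of the difference between a hard seam and a smooth ramp (my own sketch of the general technique, not Kolor’s actual engine):

```python
import numpy as np

# Two fake grayscale strips from neighboring cameras, overlapping
# by `overlap` pixels and disagreeing on brightness in the seam.
overlap = 100
left = np.full((1, overlap), 200.0)
right = np.full((1, overlap), 120.0)

# "Smooth": weights ramp linearly across the overlap, which hides
# seams on moving content at the cost of some softness.
w = np.linspace(1.0, 0.0, overlap)
smooth = left * w + right * (1.0 - w)

# "Sharp": a hard cut in the middle of the overlap, which keeps detail
# crisp and works well for a fixed, well-aligned setup.
sharp = np.where(np.arange(overlap) < overlap // 2, left, right)

print(smooth[0, ::33])  # gradual 200 -> 120 ramp across the seam
print(sharp[0, ::33])   # abrupt jump at the seam
```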

Don’t fixate on the results in APG, because AVP will overhaul them. It can be frustrating at times when the stitching looks better in APG than in AVP. There is no way around it: AVP is WYSIWYG.

To understand the calculations behind the blending:
http://www.kolor.com/wiki-en/action/view/Understanding_and_using_the_rendering_engine

  • Export your work (render) in a compressed format for the previews/dailies, or as an uncompressed AVI or image sequence for the final edit.

The new version of AVP promises other export formats, like ProRes.
Who uses uncompressed AVI anyway?

Adobe Premiere/After Effects and Avid recognize image sequences as a single filmstrip, but Final Cut Pro/X does not and will require some extra steps.
I did use Final Cut X for this project, just for the sake of knowing the workflow, and look, I am still here to tell the tale.
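If you do go the Final Cut route, one of those extra steps is wrapping the image sequence into a single movie first. Here is a minimal sketch driving ffmpeg from Python, assuming ffmpeg is installed and your frames follow a frame_00001.png naming scheme (both are assumptions; adjust to your actual export):

```python
import subprocess

# Wrap a numbered image sequence into a ProRes .mov that FCP X ingests
# as a single clip. Frame pattern, frame rate, and ProRes profile are
# placeholders: match them to what you actually rendered out of AVP.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",        # your project's frame rate
    "-i", "frame_%05d.png",    # the numbered frames
    "-c:v", "prores_ks",       # ffmpeg's ProRes encoder
    "-profile:v", "3",         # 3 = ProRes 422 HQ
    "stitched_for_fcpx.mov",
], check=True)
```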

steps APG

To start the detailed work on the stitches, you want to find your inner monk.

  • Crop the camera images (especially when using fisheye lenses)
  • Create/correct the stitch Control Points

Knowing the advanced settings of the control points will make a whole lot of difference:
http://www.kolor.com/wiki-en/action/view/Autopano_Video_-_How_to_fix_the_foreground

  • Transform and relocate images in the projection with the move tool, or by manually changing the location and transformation values expressed in yaw, pitch, and roll (and FOV for location); see the sketch below.
Just some of the important windows of APG. That second monitor is not a luxury after all.
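For intuition on what those yaw, pitch, and roll values actually do, here is a minimal sketch of the standard composed rotations (an illustration of the general convention; the axis order APG uses internally may differ):

```python
import numpy as np

def rotation(yaw, pitch, roll):
    """Compose yaw (around Y), pitch (around X), and roll (around Z).

    Angles in degrees. The axis convention here is an assumption made
    for illustration, not documented APG behavior.
    """
    y, p, r = np.radians([yaw, pitch, roll])
    Ry = np.array([[np.cos(y), 0, np.sin(y)],
                   [0, 1, 0],
                   [-np.sin(y), 0, np.cos(y)]])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p), np.cos(p)]])
    Rz = np.array([[np.cos(r), -np.sin(r), 0],
                   [np.sin(r), np.cos(r), 0],
                   [0, 0, 1]])
    return Ry @ Rx @ Rz

# A 90º yaw turns the forward axis (z) into the x axis:
print(np.round(rotation(90, 0, 0) @ np.array([0.0, 0.0, 1.0])))  # [1. 0. 0.]
```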

It’s here that the real back-and-forth between AVP and APG starts. It is good to know that the control points you create are going to rule over all of your states.
You can try to work with different control points per state by saving a different pano project for each state you created, then using the “stitch as” command in AVP per state.

It is best to choose your battles. Are the imperfections really obvious? If they don’t fall in the viewing angle of your principal story line, they might be good enough.

When creating an interactive experience why not place a navigational button on the nadir or zenith?

You can recycle your work done in APG for generating previews. A .pano project of the same scene (!) can be used with the “stitch as pano” command in AVP.

A preview image, exported as an mp4 from AVP, before color correction.

Compared to

An image exported as uncompressed AVI from AVP and graded.

The blue fringes from the fisheye are too present now. It is not done yet.

IT IS NOT DONE YET.

Still to come: other corrections in compositing/editing software, audio editing and mixing, programming interactivity, and compressions/conversions/exports. And don’t forget about telling a gripping story.

The art of encoding

A neglected aspect of video production is encoding. Lots of trial-and-error tests might be necessary to reach a good balance between quality and performance when watching the experience on your chosen platform(s). This is not the moment to start cutting corners in time and effort.

Everything depends on your distribution platform, of course. Delivery will be different for a Samsung Gear VR, Oculus Rift, HTC Vive, websites (HTML5), game engines like UNITY and UNREAL, or a dedicated app written in C++.
A whole article could be dedicated to this subject alone.
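As a starting point before all that trial and error, here is one example encode driven from Python with ffmpeg (assuming ffmpeg is installed; the resolution and quality values are starting guesses to iterate on, not platform requirements):

```python
import subprocess

# One trial encode out of many: H.264 in an mp4 container, a common
# baseline for 360º playback. Test the result on your actual target
# platform and adjust CRF, resolution, and preset from there.
subprocess.run([
    "ffmpeg",
    "-i", "final_edit.mov",     # the graded master out of your NLE
    "-c:v", "libx264",
    "-preset", "slow",          # slower preset = better compression
    "-crf", "18",               # lower CRF = higher quality, bigger file
    "-pix_fmt", "yuv420p",      # widest player compatibility
    "-vf", "scale=3840:1920",   # keep the 2:1 equirectangular ratio
    "-c:a", "aac", "-b:a", "192k",
    "delivery_test.mp4",
], check=True)
```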

Fortunately, there is a comprehensive article out there that also throws in a free tool to compress/convert your videos. Thank them for me.
http://www.purplepillvr.com/best-encoding-settings-resolution-for-4k-360-3d-vr-videos/

YouTube 360

Uploading your 360º video to YouTube 360 is one of the easiest ways to distribute your work without having to deal with code.

You will need to download the free metadata tool and inject metadata into your video so that YouTube recognizes it as a 360º video.
https://support.google.com/youtube/answer/6178631?hl=en
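The linked page points to Google’s spatial media metadata injector, which exists both as a small GUI app and as a Python tool in the google/spatial-media repository. A sketch of the command-line route (the exact flags may differ per release, so check the README of the version you download):

```python
import subprocess

# Inject spherical (360º) metadata so YouTube treats the upload as a
# 360º video. Assumes the spatialmedia module from
# https://github.com/google/spatial-media is in the current directory.
subprocess.run([
    "python", "spatialmedia",
    "-i",                         # -i = inject (without it, it examines)
    "delivery_test.mp4",          # your encoded 360º video
    "delivery_injected.mp4",      # upload this file to YouTube
], check=True)
```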

One of the test shots in Kinshasa, uploaded to YouTube 360. Watch via the dedicated YouTube app when on mobile.

available PLUGINs

The latest Adobe Premiere release supports VR, and the Kolor/GoPro VR Player installs plug-ins for Premiere as well. Here are a couple more:

Skybox
A plugin for After Effects that helps you convert your compositions into a 360º sphere, mix in graphic elements, and much more.
http://www.mettle.com/product/skybox-studio/

Dashwood3d
A plugin for Premiere Pro, After Effects, and Final Cut Pro/X that lets you preview with the Oculus from within your NLE, and also helps with titles and grading.
http://www.dashwood3d.com/360vrtoolbox.php

I have seen the future and it will be

I don’t mention anything still in development or in beta.
When you need to produce a 360º video or VR project within a given timeframe, you can’t risk waiting for Godot.

Implementing interactivity will be covered in another article.
