Surround 360

Basic Info
  • VR/AR: Virtual Reality
  • Type: 360-degree video capture system
  • Subtype: Camera
  • Platform: Linux-powered PC
  • Developer: Facebook
  • Release Date: 2016
  • Price: Free (hardware design and software)
  • Website: https://github.com/facebook/Surround360
  • Requires: Camera Control Software, Stitch and Render Software
System
  • Resolution: 4K, 6K, and 8K
  • Camera: Point Grey cameras
  • Input: Controlled remotely


Introduction

In 2016, Facebook introduced its own high-quality 3D-360-degree video capture system. The hardware design, the camera control software, and the stitching algorithm code together form the Surround 360 video capture system. Both the hardware design and the software are freely available to developers on GitHub, and developers can make full use of the design and code to create their own content. The move is aimed at boosting the development of the 3D-360 video ecosystem.

Producing a stereoscopic 360-degree panoramic video without hand intervention is not an easy feat. Facebook has taken a mathematical approach, working out the left-right eye stereo disparity automatically. Video stitching carried out by hand usually takes weeks, but Facebook's Surround 360 algorithm reduces the stitching time to less than a day.
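
The full stitching code is in Facebook's GitHub repository; as a rough illustration of the idea (not Facebook's actual implementation), the horizontal component of dense optical flow between two neighbouring camera views can serve as a per-pixel estimate of that left-right disparity. The file names and flow parameters in this Python/OpenCV sketch are placeholder assumptions.

  import cv2

  # Two overlapping views from neighbouring cameras (file names are placeholders).
  left = cv2.imread("cam_left.png", cv2.IMREAD_GRAYSCALE)
  right = cv2.imread("cam_right.png", cv2.IMREAD_GRAYSCALE)

  # Dense optical flow from the left view to the right view.  The horizontal
  # component of the flow field is a per-pixel estimate of the left-right
  # stereo disparity between the two views.
  flow = cv2.calcOpticalFlowFarneback(left, right, None,
                                      0.5,   # pyramid scale
                                      4,     # pyramid levels
                                      21,    # window size
                                      3,     # iterations
                                      7,     # poly_n
                                      1.5,   # poly_sigma
                                      0)     # flags
  disparity = flow[..., 0]   # horizontal shift in pixels, one value per pixel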

The current standard for 360 video is 4K, but the Surround 360 system can output up to 8K, rendering 4K, 6K, or 8K video for each eye. With the help of Facebook's Dynamic Streaming technology, users can play the high-resolution video on the Samsung Gear VR headset.

Facebook’s goal was to bring a revolution in the 3D-360 camera ecosystem by developing a system that can be used for capturing, editing, and streaming quality 3D-360 video. Virtual reality content creators can use Surround 360 to produce professional-grade video content for the audience.

The Surround 360 VR Capture System

The Objective

Explaining their motivation for developing Surround 360, the Facebook team said that existing 3D-360 cameras were unreliable, available only on special request, or out of reach of the general public. Most 3D-360 cameras were inefficient, not sturdy enough, or required stitching that took too long because the process had to be done by hand.

Facebook set out to design a 3D-360 camera that captures, edits, and renders efficiently and reliably every time. The task at hand wasn't easy, but after overcoming many challenges Facebook managed to achieve what was thought impossible just a few years earlier and created the Surround 360 system.

The Three Interconnected Components of the 3D-360 System

The Surround 360 is a high-quality, end-to-end video capture system made up of three main components. The components are interconnected; a flaw in one would affect the performance of the others.

The system is designed so that anyone can modify the off-the-shelf hardware and the open-source software to develop their own video capture system. Facebook gives developers full access to its technology.

The Three Major Components

  • Hardware (Computer and Camera)
  • Camera Control Software (To synchronize the captured content)
  • Stitch and Render Software

Hardware

Before designing the system's hardware (i.e., the camera rig), the team first laid down its requirements. To produce the desired result, the hardware had to satisfy every point in the list below; compromising on any one of them would affect the reliability and/or quality of the system.

  • The multiple cameras in the 3D-360 system must capture the scene simultaneously, with any delay between them kept under 1 microsecond. If the cameras are not synchronized, stitching the frames into one complete image becomes a tedious task.
  • Every camera should have a global shutter, which ensures that all of a frame's pixels are exposed to the scene at the same time.
  • All the cameras in the system must be able to run for several hours without overheating.
  • If the cameras stay fixed in position, capturing frames becomes much easier and the chance of error drops drastically; hence the cameras and rig must be sturdy and firm.
  • One of the Facebook team's goals was a system that is easy to use, replicate, and repair, so the rig must also be easy to assemble.

The Surround 360 strictly adheres to these hardware requirements. The Point Grey cameras don't overheat even after long hours of shooting, and they come with global shutters. To keep the cameras and rig from moving, the cameras are mounted on an aluminum chassis, and to protect them from damage they are housed in a powder-coated steel exterior.
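
As a minimal sketch of the synchronization requirement above (in Python, assuming per-frame timestamps are available from the cameras; the camera names, values, and handling are illustrative, not part of Facebook's software), a capture pipeline can verify that the timestamps recorded for one frame index stay within the allowed spread before accepting the frame:

  # Hypothetical timestamps (in seconds) for one frame index across the rig;
  # in a real rig these would come from the cameras' hardware trigger unit.
  frame_timestamps = {
      "cam00": 12.0000000,
      "cam01": 12.0000003,
      "cam02": 11.9999998,
      # ... one entry per camera in the rig
  }

  SYNC_TOLERANCE_S = 1e-6   # 1 microsecond, the tolerance quoted above

  def frames_are_synchronized(timestamps, tolerance=SYNC_TOLERANCE_S):
      """Return True if the spread between the earliest and latest capture
      timestamp for this frame is within the allowed tolerance."""
      spread = max(timestamps.values()) - min(timestamps.values())
      return spread <= tolerance

  if not frames_are_synchronized(frame_timestamps):
      print("Frame rejected: cameras drifted out of sync")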

Camera Control Software

After the hardware, the focus shifts to controlling, capturing, and storing the content. The Facebook team chose a Linux-powered PC with sufficient bandwidth to control the hardware and store the content transferred from the cameras to the hard disks.

To ensure a steady stream of frames is captured and none are dropped, the 3D-360 system needs a solid transfer rate and a reliable disk system. The Surround 360 requires a transfer rate of at least 17 GB/s and uses an eight-way RAID Level 5 disk array to cope with it.
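
The sketch below shows how such a throughput requirement can be estimated and why writes are spread across multiple disks. The camera count, sensor resolution, bit depth, frame rate, and per-disk rate are illustrative assumptions rather than the published Surround 360 specifications, so the printed figure will not necessarily match the rate quoted above.

  # Back-of-the-envelope throughput estimate with assumed parameters.
  num_cameras   = 17            # assumed rig size
  width, height = 2048, 2048    # assumed sensor resolution (pixels)
  bits_per_px   = 8             # raw Bayer data, one 8-bit sample per pixel (assumed)
  fps           = 30            # assumed capture rate

  bytes_per_frame  = width * height * bits_per_px / 8
  bytes_per_second = bytes_per_frame * fps * num_cameras
  print(f"Required sustained write rate: {bytes_per_second / 1e9:.1f} GB/s")

  # A single disk cannot sustain a rate like this on its own, which is why
  # the capture PC spreads writes across a multi-disk RAID array.
  single_disk_rate = 0.5e9      # assumed ~0.5 GB/s per disk
  disks_needed = bytes_per_second / single_disk_rate
  print(f"Roughly {disks_needed:.1f} disks' worth of write bandwidth needed")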

The team designed the system so that users can control the cameras remotely through a simple UI in an HTML browser. The Surround 360 uses custom capture software to control each camera's frame rate, analog sensor gain, shutter speed, and exposure, and the same software globally synchronizes the cameras. The system stores the unprocessed Bayer data to guarantee the highest quality throughout the rendering process.
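
As a minimal sketch of what shared per-camera capture settings might look like, assuming a hypothetical camera API (the class, field names, and methods below are illustrative and not the actual Surround 360 control software):

  from dataclasses import dataclass

  @dataclass
  class CaptureSettings:
      # The per-camera parameters mentioned above; field names are illustrative.
      frame_rate: float = 30.0      # frames per second
      shutter_ms: float = 5.0       # shutter speed in milliseconds
      analog_gain_db: float = 0.0   # analog sensor gain
      exposure_ev: float = 0.0      # exposure compensation
      keep_raw_bayer: bool = True   # store unprocessed Bayer data

  def configure_rig(cameras, settings: CaptureSettings):
      """Push one shared settings object to every camera so the whole rig
      captures with identical parameters, then arm a common trigger."""
      for cam in cameras:
          cam.configure(settings)        # hypothetical per-camera call
      for cam in cameras:
          cam.arm_hardware_trigger()     # hypothetical global-sync trigger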

Stitching Software

The 3D-360 system ultimately depends on the stitching software to achieve the desired outcome. It is the most important as well as the most complex part of the Surround 360 system; the brilliantly high-quality videos the system produces are the work of the stitching algorithm. The software ensures the images do not lose pixel quality as they pass through the various stages of processing. This is crucial because once pixel quality drops, it is gone for good.

The Surround 360 system takes several steps to process the captured data (a condensed sketch follows the list):
  • The raw Bayer data from each camera is converted to gamma-corrected RGB.
  • Lens distortion is removed and the image is reprojected into a polar coordinate system; these two steps make up the intrinsic image correction.
  • Extrinsic corrections are then performed to fix any camera misalignment.
  • Finally, the most important part of the algorithm, the optical flow computation, works out the left-right eye stereo disparity and produces separate views for the left and right eye.
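
Below is a condensed, hedged sketch of the first two steps in Python with OpenCV; the file name, gamma value, and calibration numbers are placeholders, and the real implementation lives in the Surround360 GitHub repository.

  import cv2
  import numpy as np

  # Placeholder input: one camera's raw Bayer frame as a single-channel 8-bit image.
  raw = cv2.imread("cam05_raw.png", cv2.IMREAD_GRAYSCALE)

  # Step 1: demosaic the Bayer pattern and apply gamma correction to get RGB.
  rgb = cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)
  gamma = 1.0 / 2.2
  rgb = np.clip(255.0 * (rgb / 255.0) ** gamma, 0, 255).astype(np.uint8)

  # Step 2 (intrinsic correction): undo lens distortion using the camera's
  # calibration.  The intrinsics and distortion coefficients are placeholders.
  K = np.array([[1000.0, 0.0, rgb.shape[1] / 2],
                [0.0, 1000.0, rgb.shape[0] / 2],
                [0.0, 0.0, 1.0]])
  dist = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])
  undistorted = cv2.undistort(rgb, K, dist)

  # Steps 3 and 4 (extrinsic correction, reprojection into the shared polar
  # coordinate system, and optical-flow disparity between neighbouring
  # cameras) would follow, as sketched earlier in this article.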

Playing the 3D-360 Video Content

The Surround 360 system outputs 4K, 6K, and 8K video for each eye. The content can be viewed on the Gear VR headset, which relies on Facebook's Dynamic Streaming technology to play the 6K and 8K videos. The captured content can also be played on other virtual reality headsets such as the Oculus Rift.
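
In broad terms, view-dependent streaming of this kind keeps full resolution only near the current viewing direction. The toy sketch below illustrates a player picking the stream variant whose high-resolution region is closest to the viewer's yaw; the variant layout and function names are assumptions for illustration, not Facebook's actual encoding.

  # Toy illustration of view-dependent streaming: the server stores several
  # variants of the same video, each with full resolution centred on a
  # different yaw angle, and the player requests the closest one.
  VARIANT_YAWS_DEG = [0, 45, 90, 135, 180, 225, 270, 315]

  def pick_variant(viewer_yaw_deg: float) -> int:
      """Return the index of the variant centred closest to the viewer's yaw."""
      def angular_distance(a, b):
          d = abs(a - b) % 360
          return min(d, 360 - d)
      return min(range(len(VARIANT_YAWS_DEG)),
                 key=lambda i: angular_distance(viewer_yaw_deg, VARIANT_YAWS_DEG[i]))

  print(pick_variant(100.0))   # -> 2, the variant centred at 90 degrees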

Seamless stereoscopic 360 panoramic videos captured with the Surround 360 system can be shared in the Facebook News Feed. The videos appear monoscopic in the feed, but the downloaded files are in stereo format.
