
DreamWorks Reveals Glimpse of 360 Degree 'Super Cinema' Rendering for VR Films


A 90 minute movie requires 200 terabytes of data.

Source: http://www.roadtovr.com/dreamworks-revea...

Let’s talk a bit about computer graphics. What’s important to understand is the difference between real-time rendering and pre-rendered content—if you already know the difference, skip this section.

Real-time rendering is generally required for interactive content like videogames; since the player has the option to move and look in any direction, the game must draw one frame at a time, rendering (or ‘calculating’) what each frame should look like based on where the user is looking. But computers can only do this so fast, and things slow down when the graphics become more complex. (Arguably) the minimum acceptable rate to perceive smooth motion is 30 frames per second (although virtual reality has been shown to demand much higher rates). Any slower than that and you start to see more of a slideshow than fluid video.
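
To make that concrete, here is a minimal sketch of a real-time render loop. The helper functions and the 90 Hz target are illustrative assumptions, not anything from the article; the point is simply that every frame must be drawn within a fixed time budget based on the viewer's current orientation.

```python
import time

def poll_head_orientation():
    """Placeholder: read the viewer's current orientation (yaw, pitch) from the headset."""
    return (0.0, 0.0)

def render(orientation):
    """Placeholder: draw one frame of the scene from the given viewpoint."""
    pass

TARGET_FPS = 90                   # VR headsets typically demand 90 Hz; 30 fps is a bare minimum
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~11 ms per frame at 90 fps, ~33 ms at 30 fps

for _ in range(300):              # a few seconds of simulated playback
    start = time.perf_counter()
    render(poll_head_orientation())            # must complete within the frame budget
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET:
        pass  # scene too complex for real time: motion degrades toward a slideshow
    time.sleep(max(0.0, FRAME_BUDGET - elapsed))
```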

Much of today’s computer generated imagery, especially in the film world, is so complex that computers can’t render it anywhere close to 30 frames per second—in some cases a more accurate unit would be frames per hour. Transformers (2007), for example, had CGI so complex that it took 38 hours to render one frame in some cases, according to director Michael Bay.
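
Some rough arithmetic shows why those per-frame times matter. The 24 fps rate below is a conventional film assumption (not stated in the article); the 90-minute runtime and 200 terabyte figure come from the article's own claim, and dividing them is only a crude average.

```python
FPS = 24                       # assumed conventional film frame rate
RUNTIME_MIN = 90               # the article's "90 minute movie"
frames = RUNTIME_MIN * 60 * FPS
print(frames)                  # 129,600 frames

# At the Transformers-era 38 hours per frame, rendering sequentially would take:
print(frames * 38)             # 4,924,800 hours, i.e. centuries on a single machine,
                               # which is why studios spread frames across render farms

# The claimed 200 terabytes averaged over that many frames:
print(200e12 / frames / 1e9)   # roughly 1.5 GB of stored data per frame
```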

Fortunately, since film has no user input, the view is identical from one viewing to the next, so CGI frames can be rendered ahead of time, a process aptly called ‘pre-rendering.’ While each frame may take anywhere from minutes to hours to render, the frames are essentially saved as still images which can then be compiled into a video and played back at the desired fluid framerate.
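
As a sketch of that workflow (hypothetical helper names, and ffmpeg as one common way to assemble the stills; the article does not describe DreamWorks' actual pipeline):

```python
from pathlib import Path

def render_frame_offline(index):
    """Placeholder: the expensive offline render of one frame; may take minutes or hours."""
    return b""                                  # encoded image bytes

out_dir = Path("frames")
out_dir.mkdir(exist_ok=True)

for i in range(240):                            # 10 seconds' worth at 24 fps, for illustration
    (out_dir / f"frame_{i:06d}.png").write_bytes(render_frame_offline(i))

# The numbered stills are then compiled into ordinary video at the target frame rate, e.g.:
#   ffmpeg -framerate 24 -i frames/frame_%06d.png movie.mp4
```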

The challenge for a company like DreamWorks Animation, which primarily creates pre-rendered CGI films, is supporting headtracking in virtual reality while maintaining the same high-fidelity visuals that the company is known for. With user view control like headtracking, the usual method of pre-rendering is impossible because there’s no telling what direction the user will want to look and thus which frames to generate.

One option to clear this hurdle is to pre-render 360 degree 3D frames and project them onto a virtual sphere around the user, affording the headtracking interactivity that’s critical to VR without requiring each viewer to own a supercomputer to render those complex scenes in real time. This also enables VR experiences on lower-end hardware that lacks desktop-class computing power. InnerspaceVR, for instance, is creating such experiences in CryEngine and pre-rendering them for playback on less powerful devices like Gear VR.
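
A common way to store such 360 degree frames is an equirectangular panorama (the article doesn't say which projection DreamWorks uses, so treat this as an illustrative assumption). At playback, the player only needs to map the headset's current orientation to coordinates in the pre-rendered image rather than re-render the scene:

```python
import math

def direction_to_equirect_uv(yaw, pitch):
    """Map a head orientation (radians) to normalized (u, v) coordinates
    in an equirectangular image covering 360 x 180 degrees."""
    u = (yaw + math.pi) / (2 * math.pi)   # longitude -> horizontal position
    v = (math.pi / 2 - pitch) / math.pi   # latitude  -> vertical position
    return u % 1.0, min(max(v, 0.0), 1.0)

# Each display frame, the headset reports yaw/pitch and the player samples the
# pre-rendered panorama around that point instead of re-rendering the scene:
print(direction_to_equirect_uv(0.0, 0.0))         # looking straight ahead -> (0.5, 0.5)
print(direction_to_equirect_uv(math.pi / 2, 0.0)) # looking 90 degrees right -> (0.75, 0.5)
```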

Stashed in: Virtual Reality!, Storage!, Computer Graphics


Cool but I can imagine VR would be very disconcerting for a feature length film.

"A 90 minute movie requires 200 terabytes of data." Whoa. 
