The Truth Behind LivePlace's Photo-Realistic 3D World And OTOY's Rendering Engine

Last week we posted a video that presented LivePlace, a 3D world with an incredible amount of detail. The impressive technology behind it is called OTOY, a streaming platform that allows developers to generate movie-quality renders “in the cloud”, which can then be streamed to more modestly powered computers and even mobile phones. For more information on OTOY, see our intro post here.

The video was available to the public at LivePlace.com alongside the ambiguous headline “Live or Virtually Live?”, but apparently nobody was supposed to find it. Soon after we published the post, LivePlace removed the video from its servers. Brad Greenspan, the entrepreneur behind MySpace and owner of LivePlace, says that the site was never meant to be seen by the public, explaining that it was for internal mockups, viral videos, and “something similar to a Funny or Die episode.” That explanation doesn’t sit well with me, but it’s unlikely we’re going to get anything more substantial out of Greenspan.

So what about that 3D virtual world – is it a sham?

Jules Urbach, founder of OTOY, explains that while he can’t comment on what LivePlace is doing (or why it released the video), virtual worlds running on the rendering engine in the video are on the way. He says the video isn’t representative of his system’s capabilities (which have actually improved since the footage was shot), and is just a number of random clips spliced together by LivePlace:

“The 14 mins of real time rendering in this material is streaming live to a Treo 700 at 240 kbps. This was captured in March 2007; the server was running an ATI RX 1900 GPU. The tech has improved massively since then (as has the HW we now run on). There was never any intention to show any part of this to the public until we could include voxel rendering and Lightstage based characters. I think anyone who liked what they saw will find the final project much more impressive.

The whole aim of our work last month on the Ruby demo for AMD was to show that the quality of offline and real time work is identical starting with this generation of GPUs. The following presentations this month are just introducing Lightstage and how it makes characters (or any CG object) look 100% real in those real time environments.

The virtual worlds these technologies are going to be applied to were not meant to be discussed until later this year, after one further announcement regarding the server side platform being developed for OTOY.

We had nothing to do with editing or leaking this video and can’t comment on anything other than the OTOY technology, since this project is still under NDA.”

One concern readers had, beyond the lack of consistency seen in the video, is the possibility that it contains material pirated from other artists. The video begins with a brief clip of cars that is apparently taken from an artist’s portfolio and was originally created years ago. As it turns out, the footage is old, but Jules Urbach explains that the artist is now part of the OTOY team:

“JJ has been working with OTOY/JulesWorld on almost all of our major projects over the past 3 years (some of which are still under NDA). I couldn’t be prouder to count him as a great friend and partner.

JJ’s studio, BLR, is always properly credited on all videos that our clients let us put our logos on, whether it is for a real time project or linear VFX work. You can see the BLR logo on the real time Transformers OTOY clip that was on TechCrunch a few weeks back (originally from Daily Variety), and you will see it again in a November print ad campaign featuring our work.

Note: The VW beetle you see in the very beginning of the BCN street scene is one of JJ’s first CG models and is his ‘baby’. It has appeared in nearly everything we’ve done together – from our ‘Bumblebee’ Transformers ad for Paramount, to our most recent Ruby voxel demo for AMD (you can find it on the right side of the street). It is also in one of the images from the TechCrunch piece on OTOY last month (rendered in real time on a 512 MB R770, pre-voxel renderer).”

So what’s the bottom line? LivePlace doesn’t seem to have anything to do with the video provided or the city described, and shouldn’t have posted the footage in the first place. The impressive OTOY technology behind it is real, but we will have to wait to see what products will be taking advantage of it.

Here are more technical details Jules has provided:

– We store voxel data in several ways, including geometry maps (see our Siggraph or Iceland presentations, where we show this method applied to the Lightstage 5 structured light data, courtesy of Andrew Jones, ICT Graphics Lab).

– The datasets from the BCN and Ruby city scenes contain up to 64 data layers per voxel, including diffuse albedo, Fresnel reflectance values, irradiance data, UV coordinates (up to 8 sets), normals, and, for static scenes, lookup vectors for 1-20 bounces of light from up to 252 evenly distributed viewpoints. It is important to note that this data is always 100% optional, as the raycaster can compute it procedurally when the voxels are close and reflection precision is more important than speed; however, with cached reflectance data, you might see the scene rendering at 100s-1000s of fps when the scene isn’t changing.
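
To make that layer list a little more concrete, here is a rough sketch of what a per-voxel record along these lines might look like. The struct, field names, and layout are purely our own illustration, not OTOY’s actual format:

```cpp
// Illustrative only: a guess at how the per-voxel layers listed above
// (albedo, Fresnel reflectance, irradiance, UV sets, normals, and optional
// cached bounce vectors) might be laid out. Not OTOY's actual format.
#include <array>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Cached light-bounce directions for one of the ~252 precomputed viewpoints.
struct CachedViewpoint {
    std::array<Vec3, 20> bounceDirs;   // lookup vectors for 1-20 bounces
};

struct VoxelSample {
    Vec3 diffuseAlbedo;                // base colour layer
    float fresnelReflectance;          // reflectance term
    Vec3 irradiance;                   // baked incoming light
    std::array<Vec2, 8> uv;            // up to 8 UV coordinate sets
    Vec3 normal;                       // surface normal
    // Only baked for static scenes; when absent, the raycaster computes
    // reflections procedurally instead of reading cached data.
    std::optional<std::vector<CachedViewpoint>> cachedViewpoints;
};
```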

– A note on raytracing vs. rasterization: amplifying the tree trunk in Fincher’s Bug Snuff demo to 28 million polys using the GPU tessellator turned out to be faster than rendering a 28 million voxel point cloud for this object. So there is a threshold, at roughly 100 million polys, where voxels become faster than rasterization, at least in our engine, on R7xx GPUs, using full precision raycasting at 1280×720. Below that point, traditional rasterization using the GPU tessellator seems to be faster for a single viewport.
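
The note implies a simple crossover heuristic. Here is a minimal sketch of what that decision could look like in code; the 100 million figure comes from Jules, everything else (function, names) is assumed:

```cpp
// A minimal sketch of the crossover heuristic described above: below a
// complexity threshold, tessellate and rasterize; above it, raycast voxels.
// The ~100M figure comes from the note; the function and names are assumed.
#include <cstdint>

enum class RenderPath { TessellateAndRasterize, RaycastVoxels };

RenderPath choosePath(std::uint64_t effectivePolyCount) {
    // Per the note: on R7xx-class hardware at 1280x720, voxel raycasting
    // only wins once an object's effective polygon count passes ~100 million.
    constexpr std::uint64_t kCrossover = 100'000'000ULL;
    return effectivePolyCount < kCrossover
               ? RenderPath::TessellateAndRasterize
               : RenderPath::RaycastVoxels;
}
```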

– The engine can convert a 1 million poly mesh into voxel data in about 1/200th of a second on R770 (60 fps on R600 and 8800 GTX). This is useful for baking dense static scenes that are procedurally generated once, or infrequently, on the GPU. That is why some of the OTOY demos require the GPU tessellator to look right.
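
For readers wondering what “converting a poly mesh into voxel data” involves, here is a naive CPU sketch that point-samples triangles into an occupancy grid. OTOY does this on the GPU and far faster; the grid resolution, sampling density, and all names here are assumptions for illustration only:

```cpp
// Naive CPU voxelization sketch: sample points across each triangle and mark
// the grid cells they land in. Real engines do this on the GPU with far more
// care; this only shows the basic idea.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 a, b, c; };

// Occupancy grid: true where a surface sample landed.
struct VoxelGrid {
    std::size_t res;                 // voxels per axis
    float cellSize;                  // world-space size of one voxel
    std::vector<bool> occupied;      // res^3 cells
    VoxelGrid(std::size_t r, float cs) : res(r), cellSize(cs), occupied(r * r * r, false) {}
    void mark(const Vec3& p) {
        auto idx = [&](float v) {
            return std::min(res - 1, static_cast<std::size_t>(std::max(0.0f, v / cellSize)));
        };
        occupied[(idx(p.z) * res + idx(p.y)) * res + idx(p.x)] = true;
    }
};

void voxelize(const std::vector<Triangle>& mesh, VoxelGrid& grid, int samplesPerEdge = 8) {
    for (const Triangle& t : mesh) {
        // Barycentric sampling over the triangle surface.
        for (int i = 0; i <= samplesPerEdge; ++i) {
            for (int j = 0; j <= samplesPerEdge - i; ++j) {
                float u = float(i) / samplesPerEdge;
                float v = float(j) / samplesPerEdge;
                float w = 1.0f - u - v;
                grid.mark({t.a.x * u + t.b.x * v + t.c.x * w,
                           t.a.y * u + t.b.y * v + t.c.y * w,
                           t.a.z * u + t.b.z * v + t.c.z * w});
            }
        }
    }
}
```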

– Hard shadows in OTOY were done using rasterization until we got R770 in May. Now hard shadows, like reflections, can be calculated using raycasting, although shadow masks are still very useful, and raycasting with voxel data can still give you aliasing.

– We can use the raycaster with procedurally generated data (Perlin-generated terrain or clouds, spline-based objects, etc.). At Jon Peddie’s Siggraph event, we showed a deformation applied in real time to the Ruby street scene. It was resolution independent, like a Flash vector object, so you could get infinitely close to it with no stair-stepping effects, and likewise, the shadow casting would work the same way.
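
Raymarching a procedural field is the textbook way to get this kind of resolution independence, since the surface is evaluated per ray rather than stored as triangles. Below is a generic heightfield raymarcher sketch, not OTOY’s raycaster; the noise function, step size, and constants are placeholders:

```cpp
// Generic sketch: march a ray through a procedural heightfield until it dips
// below the surface. Because the terrain is evaluated analytically per ray,
// there is no fixed mesh resolution, hence no stair-stepping up close.
#include <cmath>

struct Vec3 { float x, y, z; };

// Placeholder noise; a real implementation would use Perlin or simplex noise.
float noise2D(float x, float z) {
    return 0.5f * (std::sin(x * 0.37f) * std::cos(z * 0.41f) + 1.0f);
}

float terrainHeight(float x, float z) {
    // A couple of octaves, as in typical fractal terrain.
    return 4.0f * noise2D(x * 0.1f, z * 0.1f) + 1.0f * noise2D(x * 0.5f, z * 0.5f);
}

// Returns the hit distance along the ray, or a negative value if no hit.
float raymarchTerrain(Vec3 origin, Vec3 dir, float maxDist = 200.0f) {
    float t = 0.0f;
    const float step = 0.25f;            // fixed step; real engines adapt this
    while (t < maxDist) {
        Vec3 p{origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        if (p.y < terrainHeight(p.x, p.z))
            return t;                     // surface crossed: shade here
        t += step;
    }
    return -1.0f;                         // no hit
}
```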

– The voxel data is grouped into the rough equivalent of ‘triangle batches’ (which can be indexed into per object or per material groups as well). This allows us to work with subsets of the voxel data in much the same way we do with traditional polygonal meshes.
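
Here is a rough sketch of what such batch-style indexing might look like, with per-object and per-material lookups; the structure and names are our assumptions rather than OTOY’s implementation:

```cpp
// Illustrative grouping of voxel data into contiguous "batches" that can be
// looked up per object or per material, so a renderer can touch a subset of
// the voxels much like a triangle batch. Not OTOY's actual structures.
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

struct VoxelBatch {
    std::uint32_t firstVoxel;   // offset into the global voxel buffer
    std::uint32_t voxelCount;   // number of voxels in this batch
};

struct VoxelSceneIndex {
    std::vector<VoxelBatch> batches;                                         // all batches
    std::unordered_map<std::string, std::vector<std::uint32_t>> byObject;    // object name -> batch ids
    std::unordered_map<std::string, std::vector<std::uint32_t>> byMaterial;  // material name -> batch ids

    // Gather the batches for one object, e.g. to update or re-render it alone.
    std::vector<VoxelBatch> batchesForObject(const std::string& name) const {
        std::vector<VoxelBatch> out;
        auto it = byObject.find(name);
        if (it == byObject.end()) return out;
        for (std::uint32_t id : it->second) out.push_back(batches[id]);
        return out;
    }
};
```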

– The reflections in the March 2007 ‘Treo’ video are about 1/1000th as precise/fast as the raycasting we now use for the Ruby demo on R770/R700.

– One R770 GPU can render 100 or more viewports at the quality and size shown in the ‘Treo’ video. When scenes are entirely voxel based, the number of simultaneous viewports is less important than the total rendered area of all the viewports combined.
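
In other words, the budget is pixels, not viewports. A back-of-the-envelope sketch, using a made-up per-GPU pixel budget purely to show the arithmetic:

```cpp
// Toy calculation: if the constraint is total rendered area, the number of
// viewports a GPU can serve is just the pixel budget divided by the area of
// each viewport. The budget figure below is hypothetical.
#include <cstdint>

std::uint32_t maxViewports(std::uint32_t viewportWidth,
                           std::uint32_t viewportHeight,
                           std::uint64_t pixelBudgetPerGpu) {
    std::uint64_t perViewport =
        static_cast<std::uint64_t>(viewportWidth) * viewportHeight;
    return static_cast<std::uint32_t>(pixelBudgetPerGpu / perViewport);
}
// e.g. with a hypothetical 10M-pixel budget, a 320x240 viewport costs 76,800
// pixels, so roughly 130 such viewports fit; halve the area, double the count.
```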

– The server side rendering system currently consists of machines using 8x R770 GPUs (8 GB VRAM, 1.5 kW per box).