OTOY Developing Server-Side 3D Rendering Technology

Imagine you could play video games – and immerse yourself in virtual worlds – with 3D graphics comparable to those found in blockbuster films like Transformers or WALL•E. And then imagine you could experience and control those graphics in real time from any internet-enabled device, whether it be a desktop computer, a set-top box or even an iPhone.

Sound far-fetched? It doesn’t to Jules Urbach, founder and CEO of a Los Angeles-based company called OTOY, who has been working with microprocessor manufacturer AMD since 2006 to make the idea of server-side graphics processing a reality. If all goes as planned, 3D rendering will become just another computer task that jumps from the client to the cloud. Call it gaming as a service (GaaS) if you will. No more Xboxes, no more PlayStations, and no more souped-up PC towers. Just a monitor, some controls, and a way to receive and display frames generated by a powerful server farm.

But let’s take a step back for a second. Before it’s even possible to deliver movie-quality graphics through a thin client, there must be a way to produce those graphics in real time. Movie producers have the luxury of knowing ahead of time exactly how they want their frames to look. Visual effects studios like Industrial Light & Magic don’t have to respond to user inputs, so they can spend hours rendering each and every frame. Game engines, however, must respond to user behavior and serve up new frames almost instantaneously; at 60 frames per second, that leaves roughly 16 milliseconds per frame. That constraint is perhaps the main reason in-game graphics have lagged behind their big-screen counterparts for years.
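To make that gap concrete, here is a quick back-of-the-envelope comparison. The four-hours-per-frame figure for film rendering is an illustrative assumption, not a number from Urbach or AMD:

```typescript
// Back-of-the-envelope frame-time budgets (illustrative numbers only).
const MS_PER_SECOND = 1000;

// A game targeting 60 fps must finish each frame in about 16.7 ms.
const gameBudgetMs = MS_PER_SECOND / 60;

// Assume a film renderer spends 4 hours on a single frame (hypothetical).
const filmBudgetMs = 4 * 60 * 60 * MS_PER_SECOND;

console.log(`Game frame budget: ${gameBudgetMs.toFixed(1)} ms`);       // 16.7 ms
console.log(`Film frame budget: ${filmBudgetMs.toLocaleString()} ms`); // 14,400,000 ms
console.log(`Film gets ~${Math.round(filmBudgetMs / gameBudgetMs).toLocaleString()}x more time per frame`);
```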

Just the other week, however, AMD announced an initiative called Cinema 2.0 that promises to narrow the gap between movies and games with its new RV770 GPU. To demonstrate the power of AMD’s new consumer graphics cards, Urbach and his art teams in Spain, Canada and the US pulled together a set of videos that approximate the CGI you’d expect from the movies. He took us through an overview of that work here:

Most of his demos focus on recreating Autobots and Decepticons from the Transformers movie. And the results are very impressive, even if they don’t quite match those found on the big screen. The stills at the bottom of this post come from voxel-based animations that were rendered in real time, such as the one embedded at the top of this post.

But rendering machinery poses far fewer challenges than producing humanoid models convincing enough to suspend disbelief. To achieve the organic in addition to the inorganic, Jules has worked on a project called LightStage that takes panoramic shots of real humans in motion and turns them into animated 3D models. Watch below as Jules explains how LightStage works:

All of this extends what technologists have already done to mimic reality within virtual experiences. Urbach’s bold and particularly innovative proposal is that he can deliver these experiences through the browser. We’ve seen 3D games delivered through the browser before, but this time it’s very different.

First of all, OTOY-powered graphics can potentially go far beyond those found on any consumer device because they aren’t actually rendered by whatever hardware is sitting on your desk, resting in your hand, or lying on your living room floor. In the video below, Urbach shows how AMD graphics cards installed on the server (rather than the client) can be hooked up to work in parallel and deliver highly complex graphics from afar, in the form of pure frames.
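OTOY hasn’t published how its servers are organized, but as a rough mental model you can picture a farm that fans per-user render jobs out across a pool of GPUs and ships back compressed frames. The sketch below is purely hypothetical; every name in it is ours, not OTOY’s:

```typescript
// Hypothetical sketch of a server-side render farm fanning per-user
// jobs out across a pool of GPUs. None of these names come from OTOY.

interface RenderJob {
  userId: string;      // one connected player or viewer
  sceneState: unknown; // camera, inputs, and world state for this frame
}

interface Gpu {
  id: number;
  renderFrame(job: RenderJob): Promise<Uint8Array>; // a compressed frame
}

class FrameFarm {
  private next = 0;
  constructor(private gpus: Gpu[]) {}

  // Round-robin each job to the next GPU; a real scheduler would also
  // weigh load, scene locality, and per-user session affinity.
  async render(job: RenderJob): Promise<Uint8Array> {
    const gpu = this.gpus[this.next];
    this.next = (this.next + 1) % this.gpus.length;
    return gpu.renderFrame(job);
  }
}
```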

The main limitations are bandwidth and server power (i.e. how fast the client can receive frames generated by the server, and how fast the server can generate those frames for all of its concurrent users). Urbach claims that his technology can deliver up to 220 frames per second (fps), which is overkill for most monitors and for the human eye. As for lag, he sees 12–17 milliseconds of latency on the West Coast (where his current test server is located) and 100 ms in Japan. The compression codec used to reach these levels of performance was developed in-house, albeit with help from AMD’s engineering team.
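Those numbers invite a little napkin math. Assuming a 1280×720 stream at 60 fps and a hypothetical 100:1 compression ratio (neither figure comes from OTOY), the bandwidth works out roughly like this:

```typescript
// Napkin math for streamed frames. Every figure here is an assumption
// for illustration; OTOY has not published these numbers.
const width = 1280;
const height = 720;
const bytesPerPixel = 3;      // uncompressed 24-bit RGB
const fps = 60;
const compressionRatio = 100; // assumed codec efficiency

const rawBitsPerSecond = width * height * bytesPerPixel * 8 * fps;
const streamedMbps = rawBitsPerSecond / compressionRatio / 1e6;

console.log(`Uncompressed: ${(rawBitsPerSecond / 1e6).toFixed(0)} Mbps`);  // ~1327 Mbps
console.log(`At ${compressionRatio}:1: ~${streamedMbps.toFixed(1)} Mbps`); // ~13.3 Mbps
```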

The second main difference is that Urbach’s technology doesn’t require any browser plugin whatsoever (although it can take advantage of plugins, too). OTOY-powered graphics can be delivered via Ajax, Flash, Java, or ActiveX. Surprisingly, the Ajax-powered version running in Safari is the fastest. That’s good news for future iPhone 3G owners, since this graphics delivery system is compatible with that device (and with any other device that has a full-featured browser, such as Android handsets). The possibilities here make one wonder whether the days of the PSP and other dedicated portable gaming devices are numbered. And they suggest that the future of the web-enabled cell phone is bright indeed.
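OTOY hasn’t documented its Ajax transport, so the following is only a toy illustration of the plugin-free idea, written with today’s fetch API: the browser repeatedly pulls the latest server-rendered frame as a compressed image and swaps it into the page. The endpoint is made up:

```typescript
// Toy sketch of plugin-free, frames-over-HTTP delivery in a browser.
// The endpoint and protocol are hypothetical; OTOY's actual transport
// is unpublished.

const FRAME_URL = "https://example.com/session/42/frame"; // hypothetical

async function streamFrames(img: HTMLImageElement): Promise<void> {
  // A real client would pace itself against the server's frame rate;
  // this loop simply pulls the latest frame as fast as it can.
  while (true) {
    const response = await fetch(`${FRAME_URL}?t=${Date.now()}`);
    const blob = await response.blob(); // one compressed frame

    // Swap the new frame in, then release the previous one.
    const previous = img.src;
    img.src = URL.createObjectURL(blob);
    if (previous.startsWith("blob:")) URL.revokeObjectURL(previous);
  }
}

streamFrames(document.querySelector<HTMLImageElement>("#frame")!);
```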

Expect one of the first commercial implementations of LightStage and this server-side rendering technology to come in the form of a virtual world. Urbach also plans to release a full suite of developer tools to those who want to leverage OTOY for their own applications.

OTOY is privately funded, although much of its GPU-based hardware will be provided by AMD. More stills from real-time renders and LightStage results are provided below.