GDC 2012 Part II

Welcome to the second part of Meant to be Seen's coverage of GDC 2012!  Special thanks to MTBS' Field Writer Kris Roberts.  Not only is Kris formerly a Senior Game Designer at Rockstar Games, he is an avid stereoscopic 3D gamer in his own right.

Uncharted 3 Drake's Deception
The Cameras of Uncharted 3

When I was working on the Midnight Club games, I ended up being the designer responsible for setting up and adjusting all the gameplay cameras, so I was interested in this talk to begin with - and the fact that Uncharted 3 was also a really spectacular stereoscopic 3D game was a real bonus.

Thursday started with a programming session presented by Travis McIntosh - Uncharted 3 lead programmer from Naughty Dog.  Travis was the lead programmer on all three Uncharted games, and was also the point person for the camera system.  He made it clear that there were many people who worked on the cameras and it wasn't as though he did everything himself.  The talk started with an overview of how they approached the camera systems: rather than trying to make a few smart cameras, they developed many cameras that each did simple things.  They had quite a few types and a 'camera stack' system that took care of blending between them and deciding which of the currently available cameras to use at any given point in the game.

There was an emphasis on keeping the camera under player control as much as possible, particularly in combat situations but also allowing the designers and artists to adjust how they were set up to provide visual emphasis or gameplay visibility in a fluid and intuitive sense.  The camera stack often had a number of potential cameras and depending on player input, object collision, scene setup and scripting it would pick and transition pretty intelligently.
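The "many simple cameras plus a stack" idea is easy to picture in code.  Here is a minimal sketch (in Python, with purely illustrative names - not Naughty Dog's actual code) of a stack that picks the highest-priority active camera and starts a blend whenever the winner changes:

```python
# Hypothetical camera stack: many simple cameras, each flagging
# whether it wants control; the stack picks the highest-priority
# active one and blends across switches.  Illustrative only.

class Camera:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority
        self.active = False  # set by scripting, collision, input, etc.

class CameraStack:
    def __init__(self, blend_time=0.5):
        self.cameras = []
        self.current = None
        self.blend_time = blend_time
        self.blend_remaining = 0.0

    def push(self, camera):
        self.cameras.append(camera)

    def update(self, dt):
        # Pick the highest-priority camera that wants control.
        candidates = [c for c in self.cameras if c.active]
        best = max(candidates, key=lambda c: c.priority, default=None)
        if best is not self.current:
            self.current = best
            self.blend_remaining = self.blend_time  # start a new blend
        self.blend_remaining = max(0.0, self.blend_remaining - dt)
        return self.current
```

Each game system just toggles its camera's `active` flag, and the stack resolves the winner every frame - which matches the spirit of "lots of dumb cameras, one smart arbiter."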

The 3D specific content in the talk was the last point of his discussion, and it was clear that a great deal of effort and care was invested in getting the stereo presentation looking as good as possible. They took a dual rendering approach which was made easier by the fact that the game already needed to support split screen.  To get and maintain a fast enough frame rate, there were art optimizations for geometry, particles and effects that also needed to be made for the stereo version, and this took quite a bit of work.  

One thing that was particularly interesting was that they chose to toe in the cameras rather than maintaining parallel projection.  Another big decision was not to have any out of screen effects.  They chose to have Drake (the main character) always be at the zero plane, and they had to check and set the zero plane distance every frame to guard against the times when an object might suddenly appear between the camera and Drake.  When they started working on support for stereo 3D, the intention was to give the artists and designers control over the parameters, and they expected to make adjustments on many shots and settings per camera.  Eventually they found that too much variation and too many dynamic changes were distracting and confusing to the player, so they pulled back from that and applied general settings which the user could slightly adjust in the game's options.  These adjustments would impact the entire game.
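The per-frame zero plane adjustment he described can be sketched roughly like this - a hypothetical helper (illustrative names and numbers, not Naughty Dog's) that aims the convergence plane at the hero, caps it at the nearest occluder reported by the z-buffer, and rate-limits the change so the plane doesn't jump between frames:

```python
# Illustrative sketch: keep the hero at the zero (screen) plane each
# frame, guarding against an object suddenly appearing between the
# camera and the character.  All values are assumptions.

def zero_plane_distance(hero_dist, nearest_occluder_dist,
                        min_dist=0.5, max_step=0.2, previous=None):
    # Aim the convergence plane at the hero, but never farther than
    # the nearest object the z-buffer reports in front of him.
    target = min(hero_dist, nearest_occluder_dist)
    target = max(target, min_dist)  # keep a sane minimum distance
    if previous is None:
        return target
    # Rate-limit the change so the plane doesn't snap frame to frame.
    step = max(-max_step, min(max_step, target - previous))
    return previous + step
```

Called once per frame with the hero's distance and a depth sampled from the z-buffer, this covers all three "Do" items from the session: modulate the zero plane with player distance, safeguard against wild distortions, and use the z-buffer to set the distance.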

In summary, here are the stereoscopic 3D Do's and Don'ts that were presented at the session:

Don't:
Change the distortion.
Give the animators control over 3D settings per shot.
Mix heavy FOV/depth of field and 3D together.
Allow items to jut from the screen.
Use a lot of high contrast scenery.

Do:
Modulate the zero plane with the player distance.
Put in safeguards to prevent wild distortions.
Use the z-buffer to adjust the zero plane distance.

MTBS NOTE: While these were artistic choices made by this particular game developer, many gamers want the flexibility to have combined depth and pop-out effects, and there are visual trade-offs with toeing in the cameras versus parallel camera rendering.  We recommend that game developers experiment according to what would work best for their particular game(s), and not limit themselves to any one artistic guideline.
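To make the toe-in versus parallel trade-off a little more concrete, here is a tiny numeric sketch (my own illustrative math, not from the talk): a toed-in rig rotates each eye toward a shared convergence point, which introduces keystone distortion for off-center points, while a parallel rig keeps the cameras aligned and instead converges by shifting each eye's frustum asymmetrically.

```python
# Illustrative stereo rig math - not from the GDC session.
import math

def toe_in_angle(eye_separation, convergence_dist):
    # Each toed-in camera rotates inward by the angle subtended by
    # half the eye separation at the convergence distance.
    return math.atan2(eye_separation / 2.0, convergence_dist)

def parallel_frustum_shift(eye_separation, convergence_dist, near):
    # For parallel cameras, shift each eye's frustum on the near
    # plane so the two views line up at the zero plane instead.
    return (eye_separation / 2.0) * (near / convergence_dist)
```

With toe-in, that rotation is what produces vertical parallax toward the screen edges; the parallel approach avoids it at the cost of needing asymmetric projection matrices - which is the trade-off the note above refers to.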

Jonathan Haswell - CEO
Nico Rondet - Chief Instructor

Simraceway Demo
I love driving and racing.  When I come up to San Francisco for GDC I almost always drive up from San Diego on Route 1 along the coast and relish every minute of it.  Spending weekends driving around the cones at Qualcomm Stadium in autocross competitions is a blast and something I would encourage anyone to get involved with (look for events in your region).

When it comes to video games, I really appreciate good driving games and particularly great simulations.  The years I spent working on the cars, cameras and controller stuff for the Midnight Club games were some of the best times of my career, and playing with cars for a living is hard to beat.

I had heard about Simraceway a week or so before GDC and was excited to learn that they were going to be at the conference.  You can download the game for free and take it for a spin yourself.  If you enjoy serious, accurate and competitive motorsports, I think you will like it too!

Nico Rondet, Chief Instructor, Simraceway

As far as stereoscopic 3D goes, the game has not been developed with specific support, but it runs really well on my system with the default Nvidia 3D Vision driver.  When I talked to Jonathan and Nico, I tried to encourage them to take a good look at their product in stereo and support it natively.  Out of the box it is pretty satisfactory, and with a little work it could be awesome in 3D!

In addition to the game itself, which supports all the standard force feedback and articulated controllers and seats, they also have an innovative steering wheel specifically designed for the game. 

The SRW-S1 is a handheld wheel that uses motion control sensors and has all the inputs and controls at your fingertips (and there are a LOT of buttons and dials).  Honestly I was a little dubious about the idea of combining a really accurate and realistic racing simulation with a hand held motion controlled wheel, but it works.  I only drove it for a couple laps at their booth, but it was immediately obvious that the solution was both accurate and responsive.  It's clearly better than using a standard two stick controller, and way more portable than a full seat and wheel rig!

Power VR
David Harold - Director of PR, Imagination Technologies Group

Power VR Display at GDC 2012
The semiconductor technology produced by Imagination Technologies is used in a very wide range of mobile electronics and is at the heart of the major advances in their graphics capabilities.

MasterImage 3D Autostereoscopic 3D Demo
At GDC, their focus was on their Power VR Graphics IP and trying to share information with game developers and distribute their SDK as much as possible.  The more applications that push the hardware limits of mobile systems with features like stereoscopic 3D and augmented reality, the happier these guys are because that's what really shows off and drives their development.  It was also great to see the MasterImage 3D autostereoscopic display showcased!

InfiniteZ zSpace
Michael Vesely - Founder
Pat Quan - VP Business Development

Michael Vesely, Founder, and Pat Quan, VP Business Development for InfiniteZ at GDC 2012
zSpace absolutely blew me away, and I don't think I've ever said that about a product demo at a trade show.

It's going to be hard to get the impact of the product across in words, but the nuts and bolts of the system combine a 3D display panel with head tracking and a stylus to enable you to interact with objects in a 3D simulation as if they were real.

Most likely, the biggest part of what makes the illusion so convincing is the real-time head tracking and how the display adjusts to every movement you make to keep the displayed image correctly projected.  I have looked at systems that use head tracking to adjust and calibrate autostereo displays, and demos that use head tracking to produce holographic-like displays - but this was the best and most responsive system I have seen so far!  I suspect it is looking at trackers on the glasses you are wearing to precisely calculate the position of your eyes.
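That head-coupled effect - recomputing the projection from the tracked eye position so the scene stays anchored to the screen - can be sketched as an off-axis frustum calculation.  This is a simplified, illustrative version of the standard generalized perspective projection idea, not InfiniteZ's actual code:

```python
# Illustrative off-axis frustum from a tracked eye position.
# Coordinates are screen-centered; eye z is the (positive) distance
# from the screen plane.  Names and units are assumptions.

def off_axis_frustum(eye, screen_w, screen_h, near):
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_w / 2.0 - ex) * scale
    right  = ( screen_w / 2.0 - ex) * scale
    bottom = (-screen_h / 2.0 - ey) * scale
    top    = ( screen_h / 2.0 - ey) * scale
    return left, right, bottom, top
```

As the viewer's head moves, the frustum skews the opposite way, which is what keeps virtual objects locked in place relative to the physical display - per eye, this also gives you stereo for free.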

The second most important part is probably the way the stylus lets you reach in and interact with the objects in the display.  Clearly, the accuracy and responsiveness of the system in knowing where the stylus is and how it's oriented are critical.  It uses a ray projected from the tip when you press a button to let you select and interact with things.  It's amazing how it looks like a laser is really shooting out from the stylus.  It's not like it's roughly in the right place and a frame or two behind in the update as you move it around - no, it looks like a beam coming out of the physical device in your hand.  As you move your head and your hand to see how it looks, it continues to be convincing.

Pat Quan, VP Business Development for InfiniteZ at GDC 2012

The last piece that really brings the system together is the resolution and quality of the passive 3D display panel.  Normally, a passive LCD display sacrifices resolution and looks pixelated - especially when you get close to it.  This one doesn't, and it's really amazing how each eye sees a full resolution image.  Unlike a consumer display that needs to look good for both 2D and 3D, theirs is strictly for 3D and is just spectacular.  The details are proprietary, but they are using two panels and are not having problems with viewing angle, color, or intensity.

Even though it's expensive and primarily geared towards commercial applications, the zSpace interface really shows where things are headed with the convergence of display and user interaction.  I think this will open up great opportunities for games that we have never seen before (or felt like we could touch!).

Kris Roberts completes his GDC coverage tomorrow with a lot more to share!  Be sure to comment below!