By Kris Roberts
While the press made it seem that the Nintendo 3DS was the most exciting development at GDC 2011, there were additional gems at the show! Today, I’ll share my experiences with Sony, Blitz Games Studios, DDD, Starcraft II, iGO3D, and more.
When the expo floor opened I spent some time in the Sony booth checking out the Mortal Kombat demo they had running in stereo 3D. It was fun, but the guy I was playing against was a lot better and knew the attack combos. In spite of not doing so well with the game it was interesting to see the 3D presentation.
The environment itself presented a fairly static scene, while the distant background geometry had good depth and the fighters stood out in the foreground. As good as it looked, I’m not sure that stereo 3D really added much to this particular gameplay.
I also took a whack at playing Killzone 3 using the Sony Move with their rifle attachment. Move is Sony’s answer to Nintendo’s Wii motion controller. I watched another fellow play for a bit and was pretty sure I understood the controls before my turn came. There was a little initial clumsiness while I got the hang of it, but after a few minutes it actually got quite fun.
Sadly, the display was a standard 2D television, and I was honestly disappointed that they didn’t have the combination of the Move controller and stereo 3D on display. I have now played with both parts separately: Killzone 3 in stereo 3D at home, and here at GDC with the Move controller. I really want to see how the Move affects the 3D and whether the two really do work well together.
Lamenting this, I had a chance to talk to David Coombes, Platform Research Manager for Sony, who has been a long-time evangelist for 3D within the company as well as in outreach and developer support. David indicated they decided not to show the combined Killzone 3 Move and stereo 3D demo on the floor because it is overwhelming and might be a little too intense for some players. That made me want to try it even more!
David did a great job of giving me an overview of the Sony developer support program, which now has a stronger focus on reaching out to independent developers. For example, the cost of a PS3 devkit has come down quite a bit, and they are enthusiastic about helping people get started with game development. This was really encouraging to hear, and it could be a great thing for smaller projects experimenting with progressive stereo 3D development on the PlayStation 3.
Blitz Games Studios
Andrew Oliver, CTO & Co-Founder
Meeting Andrew was terrific; his experience with game development in general, and stereo 3D in particular, is impressive. I remember playing Invincible Tiger: The Legend of Han Tao on my Xbox and being surprised that they released it with stereo support on all platforms – even while the standards were still undecided. Hearing from Andrew how it was developed on some of the earliest 3D TVs made the accomplishment even more impressive.
While the press has been going crazy over glasses-free displays, Andrew is most animated and excited about auto-stereoscopic 3D on mobile devices, such as LG’s Optimus 3D smartphone. Blitz Games is continuing to embrace stereoscopic 3D technology – this time through their BlitzTech mobile game engine, which supports all major mobile platforms, including Android.
Looking forward, Andrew is convinced that parallax barrier displays on small screens work well enough now, and that they will improve to support larger and larger displays over the next few years. An interesting observation: mobile devices only need to support one viewer at a time, so they can conceivably do a lot to improve their presentation with head tracking and awareness of where the viewer is in relation to the screen. In contrast, larger displays with multiple viewers will be limited to glasses-based solutions for some time. Auto-stereoscopic 3D on large displays with multiple viewers remains a big challenge because it requires new formats for describing the 3D scene information to the display, and having the display render with something approaching a continuous sequence of parallax barriers.
Dynamic Digital Depth
Simon Kwok, Software Engineer
Lawrence Wang, VP of Business Development
I was really interested in meeting DDD and learning more about their product as an alternative stereo 3D solution on the PC. Among gamers, DDD is best known for their TriDef Experience software, which includes stereoscopic 3D drivers and a 3D media player. TriDef’s popularity has been growing because it supports both AMD and Nvidia graphics cards.
The key features for developers working with their drivers include:
1) Auto focus of the cameras with management of the focal point
2) Virtual 3D as a depth buffer solution.
When implemented properly, auto-focus helps solve the challenge of inconsistent camera angles during a game. A typical problem: your stereo settings look good in one scene, then suddenly become uncomfortable when the camera angle changes, because the separation and convergence settings are no longer appropriate.
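To make the idea concrete, here is a minimal sketch of how an auto-focus scheme along these lines might work – the function names, sampling strategy, and smoothing factor are my own illustration, not DDD’s actual implementation. The idea is to sample scene depths each frame and ease the convergence plane toward the dominant depth, so a hard camera cut doesn’t leave the settings out of whack.

```python
# Hypothetical auto-convergence sketch (illustrative, not DDD's algorithm):
# sample depths each frame and ease the convergence plane toward the
# median sampled depth so the stereo setup adapts when the camera changes.

def median(values):
    s = sorted(values)
    return s[len(s) // 2]

def auto_converge(current_convergence, depth_samples, smoothing=0.1):
    """Move the convergence distance a fraction of the way toward the
    median sampled depth, avoiding sudden jumps between frames."""
    target = median(depth_samples)
    return current_convergence + smoothing * (target - current_convergence)

# Example: the camera cuts from a distant vista to a close-up; over
# successive frames the convergence plane eases toward the new depths.
convergence = 50.0                       # from the previous, distant scene
close_up_depths = [2.0, 2.5, 1.8, 2.2, 2.1]
for _ in range(30):                      # roughly half a second at 60 fps
    convergence = auto_converge(convergence, close_up_depths)
```

The smoothing step matters as much as the target: snapping convergence instantly on every cut would be just as jarring as leaving it fixed.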
TriDef’s “Virtual 3D” or depth buffer approach makes their solution attractive for GPUs and systems that are not right on the bleeding edge, and it might even produce better visual quality in some circumstances. Another interesting point is that the game profiles containing the various settings are available to the users themselves, and the community actively shares them in a very open way. This accelerates driver compatibility, and even opens the door to enhancements that DDD’s own developers may have missed.
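For readers unfamiliar with the depth buffer approach, here is a generic “2D-plus-depth” sketch of the concept – this is the general technique, not TriDef’s actual algorithm, and the function names and the 12-pixel maximum shift are my own illustration. Instead of rendering the scene twice, the second eye’s view is synthesized by shifting each pixel of the rendered image horizontally by a parallax amount derived from its depth:

```python
# Generic 2D-plus-depth sketch (not TriDef's actual algorithm): derive a
# second eye's view by shifting each pixel horizontally by a parallax
# amount that depends on its depth relative to the convergence plane.

def parallax_shift(depth, convergence, max_shift_px=12.0):
    """Pixels at the convergence depth get zero shift; nearer pixels
    shift one way (negative parallax, 'popping out'), farther pixels
    the other (positive parallax, 'into the screen')."""
    return max_shift_px * (1.0 - convergence / depth)

def reproject_row(colors, depths, convergence):
    """Build one scanline of the second eye's image by shifting pixels
    from the rendered view. Gaps are left as None here; a real
    implementation would fill these holes from neighboring pixels."""
    width = len(colors)
    out = [None] * width
    for x in range(width):
        nx = x + int(round(parallax_shift(depths[x], convergence)))
        if 0 <= nx < width:
            out[nx] = colors[x]
    return out
```

This is why the approach is cheap – one render pass plus an image-space warp – and also why its quality can vary: the occluded areas behind foreground objects were never rendered, so the holes have to be filled in by guesswork.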
Stereoscopic 3D Demystified: From Theory to Implementation in STARCRAFT II
Speaker/s: Dominic Filion (Blizzard Entertainment) and Samuel Gateau (NVIDIA)
Track / Format: Programming / Sponsored Session
Description: Stereoscopy is the study of techniques for creating the illusion of depth in an image. This field of study has been around since the 19th century, but has recently seen a resurgence in both movies and video games. This presentation will explore the theory of stereoscopy with a focus on computer graphics and real-time applications, to provide the required understanding of stereoscopy and how to make it work most effectively for end users. This will include a discussion on artistic concepts, with a comprehensive overview of the rendering techniques and challenges, including examples of how these are solved with NVIDIA’s stereoscopic solution, 3D Vision. In addition, Dominic Filion will present a real world example of the challenges encountered during the implementation of stereo in STARCRAFT II.
The sad truth is I failed to get in and see the presentation.
I thought I had allowed myself enough time, but when I got to the room, the line went down the hall and around the corner – and the presentation room itself was not very large. I waited and hoped, but the room filled up about five people ahead of me. I was really disappointed, but it was also very good to see the overwhelming interest; having far more people show up to a session than can fit in the room is always a good sign.
The silver lining was that after the presentation ended I was able to introduce myself to the presenters, and Samuel Gateau was graciously willing to meet with me and chat about the session and his experience as a key member of the Nvidia Dev Tech group.
I had actually seen a previous talk Samuel presented at GDC 2009 that covered much of the same conceptual ground: the standard terms and definitions of stereo 3D, and an explanation of how the illusion of depth is created.
He explained that the rest of the talk focused on the changes recommended for a game engine to support stereo 3D with as little drama as possible: setting up the projection, monitoring the draw calls, and managing the render surfaces. Particular care needs to be taken with deferred shading or any other post-processing.
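The projection setup is the most mechanical of those changes, and a common way to do it is the parallel-axis asymmetric frustum: each eye’s camera is offset sideways by half the interaxial separation, and the frustum is skewed horizontally so both views line up at the convergence plane. The sketch below shows the near-plane frustum bounds for that approach; the function and parameter names are my own illustration, not from the talk.

```python
import math

# Off-axis stereo projection sketch: each eye uses the mono projection
# with an extra horizontal frustum skew so the two views converge at the
# chosen convergence (screen) plane. Names here are illustrative.

def stereo_frustum(fov_y_deg, aspect, near, convergence, separation, eye):
    """Return (left, right, bottom, top) near-plane frustum bounds for
    eye = -1 (left) or +1 (right)."""
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect
    # Skew the frustum opposite to the eye's sideways offset, scaled
    # down from the convergence plane to the near plane.
    skew = eye * (separation / 2.0) * (near / convergence)
    return (-half_w - skew, half_w - skew, bottom, top)
```

The returned bounds would feed a glFrustum-style projection, paired with translating each eye’s camera sideways by half the separation; only the horizontal bounds differ between the eyes.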
From the specific details of the Starcraft II work, it sounded like the actual integration and support for stereo 3D went fairly smoothly, with most of it done in about three weeks. The slides from both his previous and current presentations are available online.
My last meeting of the day was with Andrew Hogue, a professor at the University of Ontario Institute of Technology (UOIT). The university has an undergraduate program in game development that currently has around 250 students.
Andrew has worked on a number of interesting projects involving stereo 3D, ranging from his thesis project, in which underwater robots constructed 3D models of submerged barges, to VR cave setups and novel laser-based head-tracking solutions he engineered.
His current project is very topical: researching the effects of stereo 3D on users, investigating claims of perceptual influences, and establishing guidelines. A lot of ideas get tossed around as “best practices”, but if you follow them faithfully you may not end up with a very compelling 3D presentation. Understanding how various stereo 3D effects impact viewers can give game designers valuable insight into how to use those effects (how often, how severely, and for how long) to engage players and maximize the impact of stereo 3D.
The majority of Andrew’s work is funded by the Government of Ontario (Canada), and a press release went out while I was at GDC. In addition to several Ontario universities, the industry partners include The S-3D Gaming Alliance, Electronic Arts Canada, Digital Extremes, Bedlam Games, Big Blue Bubble, and more.
Kris will have even more to share tomorrow!