GDC 2013 in 3D, Part I

Each year, Kris Roberts covers GDC like few others: he is an experienced game developer and long-time VR/3D enthusiast!  Some of Kris' industry highlights include being a Senior Game Designer for Rockstar Games San Diego, and a talented Game Designer with Sony Online Entertainment.  Each year, we are honored to get his unique perspective on things.  Of course, in the true spirit of MTBS, all the coverage has been shared in 3D!  So, take it away Kris!

Leaving GDC this year, I kept thinking about this quote:

“More and more now, there's all kinds of electronic goodies which are available for people like us to use if we can be bothered, and we can be bothered." - Roger Waters, Pink Floyd Live at Pompeii 1972

The emergence of audio electronics in synthesizers, amplifiers, multi-track recording equipment and distortion effects changed music. The changes didn’t just happen spontaneously, but there do appear to have been points in time when whole collections of technological innovations came together and opened new doors for expression in music as an art form, doors that simply had not existed before.

It feels like we are on the cusp of significant change in video games, at the very core of how players experience the medium. The general interest and enthusiasm for virtual reality, specifically the Oculus Rift, at the conference was overwhelming. It has always seemed like VR could be great, and we have seen bits and pieces now and then for years, but this year, for the very first time, my impression from the general developer community is that not only is it going to be awesome, it is actually going to happen. Starting right now.

Before the week was out I was seeing updates and tweets from friends and colleagues who have received their Rift devkits – and the excitement is contagious. While at GDC, it was often the topic of conversation in the hallways, between sessions and at parties. I would ask people if they had seen “it”, and without being any more specific they started telling me about what they thought about virtual reality. These are the people who will make it happen, they are getting new electronic goodies, and they can be bothered to figure out how to use them.

In the conference program there were multiple sessions focused on virtual reality. They were all very well attended, generally standing room only, and many of the attendees - maybe the majority - confirmed that they participated in the Rift Kickstarter or pre-ordered devkits.

In the first part of this series (which isn't completely written yet!), I'm going to talk about the presentations put together by Oculus VR.

Running the VR Gauntlet – VR Ready, Are You?
Nate Mitchell and Michael Antonov (Oculus VR, Inc.)
DESCRIPTION: Virtual reality may be poised to revolutionize the way we play our favorite games, but creating a great VR game is surprisingly challenging. Developers have to carefully consider latency, user input, rendering performance, UI design, and overall user experience. We'll discuss what developers need to know about supporting the Oculus Rift, how to tackle the major technical hurdles associated with truly immersive virtual reality, and what we've learned so far from building a new platform for VR games.

Nate started off the talk with a general overview of what Oculus is all about - building a platform for VR games. The Oculus Rift is a devkit for game developers that includes a head-mounted display, head tracking, computer interface and software SDK.

Many of us have seen VR demos in the past, various head-mounted displays and other pieces of the puzzle, but for the first time the supporting technology is finally here: lightweight mobile components, MEMS sensors, high-resolution mobile screens, fast GPUs. All of this is coming together to allow for affordable hardware that we can use to start seeing what virtual reality can really be.

There are lots of differences between the development of a VR game and developing games for traditional monitors. The inputs, gameplay and storytelling are unique in VR and the ultimate goal is immersion: the player should feel as though he or she is a part of the virtual world.

This really is just the beginning though, there is so much to learn and the message was repeated many times that we are at day zero. The development Oculus has done so far has been focused primarily on the production of the platform of the Rift hardware and SDK.  It's the game developers getting the equipment who are going to be the pioneers discovering what they can do with it. This is a community driven effort and we are going to learn a lot from each other.

Michael took over on the technical side, explaining how immersion requires linking the motion input to your senses: player interaction has to match up with expected reality. With the Rift, that means head tracking input and rendering with the goal of maximizing performance and minimizing motion-to-photon latency.

On the game design side, immersion means bringing the player into a different world. Key to that is avoiding the standard conventions that would break the player out of the experience. Keeping the player in control of at least their point of view, and being careful about how information is displayed, are really important.

In terms of VR input, the Rift today provides head tracking from a system with gyro, accelerometer and magnetic sensitivity. The SDK lets you query orientation to get a quaternion which you can apply to the view transformation to look around. Using that information with a model of the head and neck helps a lot to position the cameras and make the motion more natural and convincing. Most of the demos and games so far are combining orientation from the head tracking and a controller – roll and pitch from the Rift, yaw combined with the controller input. More advanced input systems would include translation of the head tracking system (not in the current Rift), hand tracking, and full body kinematics.
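The orientation-combining approach described above can be sketched in a few lines. This is an illustrative sketch only, not the Oculus SDK's API: the quaternion helpers, the `combined_orientation` function, and the neck-model offset values are all assumptions made for demonstration.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ax, ay, az = axis
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), ax * s, ay * s, az * s)

def quat_mul(a, b):
    """Hamilton product a*b: apply b's rotation first, then a's."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (v' = q * v * q^-1)."""
    w, x, y, z = q
    r = quat_mul(quat_mul(q, (0.0, v[0], v[1], v[2])), (w, -x, -y, -z))
    return (r[1], r[2], r[3])

def combined_orientation(body_yaw, hmd_quat):
    """Compose controller yaw (about world Y) with the tracked HMD orientation."""
    q_yaw = quat_from_axis_angle((0.0, 1.0, 0.0), body_yaw)
    return quat_mul(q_yaw, hmd_quat)

# Neck model: pivot the eye point about the base of the neck, so head
# rotation also translates the camera slightly, as a real head would.
NECK_TO_EYE = (0.0, 0.15, -0.09)  # metres up and forward (illustrative values)

def eye_position(base_pos, orientation):
    offset = rotate(orientation, NECK_TO_EYE)
    return tuple(b + o for b, o in zip(base_pos, offset))
```

With the HMD at identity and a 90-degree body yaw from the controller, the forward vector (0, 0, -1) swings to (-1, 0, 0), and the neck offset moves the camera with every head turn rather than rotating it in place.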

For stereo rendering, the 7” devkit display is a 1280x800 panel, which gets split into 640x800 per eye. The virtual cameras use parallel Z axes, a field of view (FOV) computed from the HMD, and custom eye separation adjustments. To correct for lens distortion, it's best to render at a much higher resolution of 2170x1360 and then warp the image in a post-process shader:

r' = k0*r + k1*r^3 + k2*r^5 (where r is the radius from the lens center)

The increased render size for the relatively low resolution of the HMD panel was something I hadn't realized before, and it's a little daunting to think how large the render target would have to be to support the significantly higher resolution panels we hope to see in the future, especially given the performance requirements of maintaining 60Hz without tearing or stuttering. One suggestion Michael had was moving the sensor sampling onto the render thread, which in their experiments yielded roughly a 20ms improvement in motion latency.
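The distortion polynomial from the talk is easy to see in action. The sketch below is a Python stand-in for the post-process fragment shader; the coefficient values are assumed for illustration (real values come from the headset's calibration data, not from me):

```python
import math

# Illustrative coefficients; actual values are supplied per-device by the SDK.
K0, K1, K2 = 1.0, 0.22, 0.24

def distort_radius(r):
    """The polynomial from the talk: r' = k0*r + k1*r^3 + k2*r^5."""
    return K0 * r + K1 * r**3 + K2 * r**5

def warp_uv(u, v, cx=0.5, cy=0.5):
    """Map an output texel (u, v) to its source coordinate in the oversized
    render target, scaling each point radially away from the lens center."""
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    if r == 0.0:
        return (u, v)  # the lens center maps to itself
    scale = distort_radius(r) / r
    return (cx + dx * scale, cy + dy * scale)
```

Because the scale factor grows with radius, texels near the edge of the view sample from well outside the nominal eye viewport; that is exactly why the scene must be rendered larger (2170x1360) than the 1280x800 panel it ends up on.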

Switching to Nate's part of the presentation, the focus shifted from the technical considerations to the realization that building VR games is fun! In many ways it is different from the way games have been made for traditional monitor/controller systems.

In a VR simulation, the player has a much more accurate intuition for the relative scale of themselves and the environment. Whereas in 2D games player size has been set arbitrarily or to accommodate other considerations, players in VR games know right away if they are four feet tall or a giant. Aspects of level design and art direction that exaggerate scale or sense of speed so they come across on a monitor may well be overwhelming in VR.

Providing information to players is also particularly challenging. In a conventional game, 2D screen-space HUD elements are the common way of showing data to the player, but in VR that becomes problematic. It may be tempting to just shrink an existing HUD and place it where the player can't help but look at it in the center of the screen, but that certainly hurts the sense of immersion and introduces conflicts and visual contradictions between the depth of UI elements and the scene. If a static HUD element is drawn at any fixed depth, the player will see it double whenever their convergence shifts to other elements in the world. Alternate approaches that present the information players need on more natural objects within the scene seem promising, but they are new territory in UI design.
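The depth conflict can be made concrete with a little disparity arithmetic. The sketch below uses assumed values for interpupillary distance and focal length (not figures from the talk) to show why a HUD pasted at the same pixel position in both eye views fights with the scene:

```python
def disparity_pixels(depth_m, ipd_m=0.064, focal_px=600.0):
    """Horizontal disparity in pixels between the two eye images for a point
    straight ahead at depth_m, with parallel-axis stereo cameras.
    ipd_m (eye separation) and focal_px are illustrative assumptions."""
    return focal_px * ipd_m / depth_m

# A HUD element drawn at identical pixel coordinates in both eye viewports
# has zero disparity, which the visual system reads as "infinitely far away".
hud_disparity = 0.0

# A scene object two metres away has a noticeably different disparity,
# so converging on one makes the other appear doubled.
scene_disparity = disparity_pixels(2.0)
```

With these assumed numbers, the 2 m object carries about 19 pixels of disparity while the screen-space HUD carries none; glancing between them forces a convergence jump, which is exactly the doubling effect described above. Rendering the UI as a quad at a sensible depth in the world removes the contradiction.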

Simulation sickness and individual player tolerance for intense VR experiences is also a serious consideration. While there is no single widely accepted cause of simulation sickness, many players do experience symptoms to various degrees particularly when they start with VR. The factors of latency, tracking precision, game content and the player themselves are all involved and something that developers need to be aware of. There are lots of things that players do in games that would make any person nauseous if they attempted them in the real world, and effective VR can have the same result.

If VR is about putting the player into the world, what types of games provide the best experiences? Nate presented two ends of the spectrum with Call of Duty and Flower – both would be very powerful experiences in VR and obviously very different.

We are at the very beginning.  This is Day Zero for virtual reality gaming.

Virtual Reality: The Holy Grail of Gaming
Palmer Luckey (Oculus VR, Inc.)
DESCRIPTION: For years, developers have strived to make immersive virtual worlds, and gamers have spent countless billions on the systems that play them best. Software, hardware, and input devices have all leapt forward, but the connection between the player and the virtual world has remained limited. We've dreamed of stepping inside of our games, but the best we've been able to do is puppet characters through a tiny window! Technological progress in a variety of fields has finally brought immersive virtual reality within reach of gamers. We'll discuss VR's false starts, what's different this time, and why virtual reality is poised to revolutionize the way we play games.

Palmer's session was great in terms of conveying the obvious enthusiasm and excitement he has for virtual reality, applying that within the context of video games to push the medium forward into new territory.

From his point of view, gaming is about sharing experiences. We do have other mediums for sharing experiences: literature, music, theater, film and television. The example he used to compare them was trying to share the experience of skydiving. He could talk to you about skydiving, write a book, make a movie or even make a game about it. Each would be effective at conveying different parts of the experience and let you know something about what it was like, but none of them would give you a feeling of what it really is. A virtual reality simulation of skydiving, however, could transcend the limitations of the other mediums and let you experience something much closer to firsthand.

In all creative mediums, content innovation is driven by technology innovation. Conventional game design is limited by its technology: traditional games are built from rectangular flat screens, abstract input devices and puppet characters. Virtual reality gaming has the potential to be the ultimate medium. Books can't reach out and touch you. Movies cannot react to you. Traditional games cannot put you into the world. Virtual reality opens new doors for sharing experiences.

What new tools/possibilities should game designers be excited for? That's really the question for the developers here at GDC. So far the effort within Oculus has been to develop the Rift devkit, and it's the game developers who are getting their hands on them now who are going to have the creative input on designing the experiences and gameplay in virtual reality.

What we see so far from the first demos and ports is that there are many areas games have already touched on that are open for much more in virtual reality: immersion, sensations (falling, space, scale, flying), and emotions.

Games designed for VR will change the way we think about gaming as a medium. The VR technology itself is in its infancy. Great VR is uncharted territory. There is a great deal of work to be done, hard problems we already know about, and challenges we have yet to expose – but all of this is super exciting!

What emotions do you want your games to invoke?  Happiness, exhilaration, fear, panic?

The game developers of today will be the first developers with the opportunity to explore VR.

More GDC 2013 coverage to come!  Please comment and visit regularly!