GDC 2013 in 3D, Part I

Running the VR Gauntlet – VR Ready, Are You?
Nate Mitchell and Michael Antonov (Oculus VR, Inc.)
DESCRIPTION: Virtual reality may be poised to revolutionize the way we play our favorite games, but creating a great VR game is surprisingly challenging. Developers have to carefully consider latency, user input, rendering performance, UI design, and overall user experience. We'll discuss what developers need to know about supporting the Oculus Rift, how to tackle the major technical hurdles associated with truly immersive virtual reality, and what we've learned so far from building a new platform for VR games.


Nate started off the talk with a general overview of what Oculus is all about - building a platform for VR games. The Oculus Rift is a devkit for game developers that includes a head-mounted display, head tracking, computer interface and software SDK.

Many of us have seen VR demos in the past, various head-mounted displays and other pieces of the puzzle, but for the first time the supporting technology is finally here: lightweight mobile components, MEMS sensors, high-resolution mobile screens, and fast GPUs. All of this is coming together to allow for affordable hardware that we can use to start seeing what virtual reality can really be.

There are lots of differences between developing a VR game and developing games for traditional monitors. The inputs, gameplay and storytelling are unique in VR, and the ultimate goal is immersion: the player should feel as though he or she is part of the virtual world.


This really is just the beginning, though; there is so much to learn, and the message was repeated many times that we are at day zero. The development Oculus has done so far has focused primarily on producing the platform: the Rift hardware and the SDK. It's the game developers getting the equipment who are going to be the pioneers discovering what they can do with it. This is a community-driven effort, and we are going to learn a lot from each other.

Michael took over on the technical side, explaining how immersion requires linking motion input to your senses: player interaction has to match up with expected reality. With the Rift, that means head-tracking input and rendering, with the goals of maximizing performance and minimizing motion-to-photon latency.

On the game design side, immersion is bringing the player into a different world. Key to that is avoiding the standard conventions that would pull the player out of the experience. Keeping the player in control of at least their point of view, and being careful about how information is displayed, are really important.

In terms of VR input, the Rift today provides head tracking from a system with gyroscope, accelerometer and magnetometer sensors. The SDK lets you query the orientation as a quaternion, which you can apply to the view transformation to look around. Using that information with a model of the head and neck helps a lot to position the cameras and makes the motion more natural and convincing. Most of the demos and games so far combine orientation from the head tracking with a controller – roll and pitch from the Rift, yaw from the Rift combined with the controller input. More advanced input systems would add positional tracking of the head (not in the current Rift), hand tracking, and full-body kinematics.
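The head-and-neck model idea can be sketched roughly as follows. This is a minimal illustration, not the Oculus SDK API: the quaternion layout (w, x, y, z), the function names, and the neck-to-eye offset values are assumptions chosen for the example, not values from the devkit.

```python
# Sketch: positioning the virtual camera from a head-tracking quaternion
# plus a simple neck model. Instead of pivoting the camera in place, the
# eyes sweep through an arc around the base of the neck, which reads as
# much more natural head motion.

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using v' = v + w*t + u x t, where t = 2 * (u x v)."""
    w, x, y, z = q
    u = (x, y, z)
    t = tuple(2.0 * c for c in cross(u, v))
    ut = cross(u, t)
    return tuple(v[i] + w * t[i] + ut[i] for i in range(3))

def eye_position(head_quat, neck_base, neck_to_eye=(0.0, 0.075, -0.08)):
    """Rotate a fixed neck-to-eye offset (illustrative values, in meters)
    by the head orientation and add it to the neck pivot position."""
    off = quat_rotate(head_quat, neck_to_eye)
    return tuple(neck_base[i] + off[i] for i in range(3))
```

With the identity quaternion the camera simply sits at the neck base plus the offset; as the head rotates, the same offset is swung around the pivot.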


For stereo rendering, the 7” devkit display is a 1280x800 panel, which gets split into 640x800 per eye. The virtual cameras use parallel Z axes, a field of view (FOV) computed from the HMD's optics, and custom eye-separation adjustments. To account for the lens distortion correction, it's best to render at a much higher resolution, around 2170x1360, and then correct the image in a post-process shader:

k0*r + k1*r^3 + k2*r^5 (where r is the radius from the lens center)
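As a rough sketch of what that post-process warp computes: for each pixel, the polynomial maps the undistorted distance from the lens center to the radius to sample in the oversized render target. The coefficient values below are illustrative placeholders, not the devkit's calibrated constants.

```python
# Sketch of the radial distortion function from the talk:
# distorted radius = k0*r + k1*r^3 + k2*r^5,
# equivalently r scaled by (k0 + k1*r^2 + k2*r^4).
# Coefficients here are placeholder values for illustration only.

def distort_radius(r, k0=1.0, k1=0.22, k2=0.24):
    """Map an undistorted radius r (distance from the lens center)
    to the radius to sample from the higher-resolution render."""
    r2 = r * r
    return r * (k0 + k1 * r2 + k2 * r2 * r2)
```

Because the scale factor grows with r, pixels far from the lens center sample proportionally farther out, which is why the source render has to be so much larger than the panel.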

The increased render size for the relatively low-resolution HMD panel was something I hadn't realized before, and it's a little daunting to consider how large the render would have to be to support the significantly higher-resolution panels we hope to see in the future, especially with the performance requirement of maintaining 60Hz without tearing or stuttering. One suggestion Michael had was moving the sensor sampling onto the render thread, which in their experiments showed a ~20ms improvement in motion latency.

Switching to Nate's part of the presentation, the focus shifted from the technical considerations to the realization that building VR games is fun! In many ways it is different from the way games have been made for traditional monitor/controller systems.

In a VR simulation, the player has a much more accurate intuition for the relative scale of themselves and the environment. Where in 2D games player size has often been set arbitrarily or to accommodate other considerations, players in VR games know right away if they are four feet tall or a giant. Other aspects of level design and art direction that exaggerate scale or sense of speed so they come across on a monitor may well be overwhelming in VR.


Providing information to players is also particularly challenging. In a conventional game, 2D screen-space HUD elements are the common way of showing data to the player – but in VR that becomes problematic. It may be tempting to just shrink an existing HUD and place it where the player can't help but look at it in the center of the view, but that certainly affects the sense of immersion and introduces potential conflicts or visual contradictions between the depth of UI elements and the scene. If a static HUD element is drawn at any fixed depth, the player will see it doubled when their convergence shifts to other elements in the world. Alternate approaches that show the information players need on more natural objects within the scene seem promising, but this is new territory in UI design.

Simulation sickness and individual player tolerance for intense VR experiences are also serious considerations. While there is no single widely accepted cause of simulation sickness, many players do experience symptoms to various degrees, particularly when they start with VR. Latency, tracking precision, game content and the players themselves are all factors, and something developers need to be aware of. There are lots of things players do in games that would make any person nauseous if they attempted them in the real world, and effective VR can have the same result.

If VR is about putting the player into the world, what types of games provide the best experiences? Nate presented two ends of the spectrum with Call of Duty and Flower – both would be very powerful experiences in VR, and obviously very different.

We are at the very beginning.  This is Day Zero for virtual reality gaming.