Oculus VR at GDC Part II

Kris Roberts is back to cover the second GDC 2014 Oculus VR Session: Developing Virtual Reality Games and Experiences.

Tom Forsyth | Software Architect, Oculus VR

Virtual reality is significantly different from monitor-based games in many ways. Many choices that are merely stylistic for traditional games become incredibly important in VR, while others become irrelevant. Working on the Team Fortress 2 VR port taught me a lot of surprising things. Some of these lessons are obvious when looking at the shipped product, but there are many paths we explored that ended in failure, and minor subtleties that were highly complex and absolutely crucial to get right. In this talk, I'll focus on a handful of the most important challenges that designers and developers should be considering when exploring virtual reality for the first time.

Tom started his presentation with a quick history of Oculus, an overview of the specs for the DK2, and some interesting statistics about just how fast their developer community has grown. In March 2013 they shipped the first 10,000 Kickstarter and initial-order devkits, and over the rest of the year 55,000 more devkits shipped. Interestingly, though, 70,000 developers are registered on the Oculus dev portal, which means there are five thousand registered developers who don't have a devkit!

Before getting into the meat of the content of his talk, Tom asked the audience to allow him to do a little "preaching" and the message was loud and clear: be kind to your players. His feeling is that as developers we tend to get used to the VR we are working on and build up a tolerance to aspects or issues which can be jarring and uncomfortable for our users. It's important to keep in mind that everyone responds to VR differently and that care needs to be taken to keep the intensity down so that the experience is enjoyable for the majority of players. He suggests having options that allow eager players to turn up effects and movement if that's what they want, but to have the default be low and make it easy for players to change and find the level that is best for them.

VOR-Gain Explained

The vestibulo-ocular reflex (VOR) is the adaptation that keeps our eyes fixed on an object even while our head moves. It's a smooth motion of the eye muscles driven by the inner ear's sensitivity to rotation: it's involuntary, happens whether we are seeing anything or not (eyes closed or in the dark), and usually gives a 1:1 compensation between head rotation and eye motion. The tuning of the system is also extremely slow, on the order of weeks, and is most commonly experienced in the real world when people get a new eyeglass prescription. VOR-gain can be thought of as the ratio between head motion and eye response. Like new glasses, VR can change that proportion and mess with the way our brain responds to the difference in VOR-gain, and it's almost always unpleasant. To preserve VOR-gain, the simulation must render images that match the HMD and user characteristics. Unlike in a desktop game, FOV is not an arbitrary choice; it must be calculated from the physical pitch of the display and the user's IPD. The SDK helps you match this precisely with data from the user configuration tool, and we are discouraged from changing the settings no matter how tempting that may be.
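To make the geometry concrete, here is a rough sketch (in Python, for clarity) of how an HMD's field of view falls out of physical measurements rather than designer choice. This is not the Oculus SDK API; the parameter names and numbers are invented for illustration, and in practice the SDK computes this from the display and the user's profile.

```python
import math

def horizontal_fov_radians(screen_half_width_m, eye_to_screen_m, pupil_offset_m):
    """FOV follows from physical geometry: how much of the panel the eye can
    actually see through the lens. It is not a free design parameter."""
    nasal    = math.atan((screen_half_width_m - pupil_offset_m) / eye_to_screen_m)
    temporal = math.atan((screen_half_width_m + pupil_offset_m) / eye_to_screen_m)
    return nasal + temporal  # asymmetric frustum: the two half-angles differ

# Invented example numbers, NOT real DK2 measurements:
fov = horizontal_fov_radians(0.037, 0.041, 0.004)
print(f"horizontal FOV: {math.degrees(fov):.1f} degrees")
```

Note that because the pupil is usually off-center relative to the visible screen area, the frustum is asymmetric, which is one more reason a hand-picked desktop-style FOV value can't be correct.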

Moving on to the IPD, Tom explained that it's more complex than most people think. Instead of just being the distance between the eyes, it actually has two components per eye: nose-to-pupil distance and eye relief (the distance from the lens surface to the pupil), and neither of these is related to the dimensions of the HMD. It was interesting to note that they are seldom symmetrical between the two eyes. Taken together, the components form a center-to-eye vector which is set during user configuration and stored in the user profile. This center eye position is roughly where players "feel" they are and is a good place to use for positional things like audio, line-of-sight checks, and the origin for reticule/crosshair ray-casts. Within the application, there should be an easy way for users to reset their position when they are in a neutral forward pose, done by calling sensor->Recenter().
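A small sketch of how those per-eye components might combine into a center eye position. The data layout and the coordinate convention (+x right, +z forward, origin at the nose) are assumptions for illustration; the real values come from the user's profile via the SDK, and the two eyes are often not symmetric.

```python
def interpupillary_distance(left, right):
    """Full IPD is just the two nose-to-pupil distances added together."""
    return left["nose_to_pupil"] + right["nose_to_pupil"]

def center_eye_position(left, right):
    """The 'center eye' -- roughly where the player feels their head is -- is
    a good origin for audio, line-of-sight checks and crosshair ray casts."""
    lateral = (right["nose_to_pupil"] - left["nose_to_pupil"]) / 2  # 0 if symmetric
    depth   = (left["eye_relief"] + right["eye_relief"]) / 2        # average relief
    return (lateral, 0.0, -depth)  # pupils sit behind the lens plane

# Invented, slightly asymmetric measurements in metres:
left  = {"nose_to_pupil": 0.031, "eye_relief": 0.011}
right = {"nose_to_pupil": 0.033, "eye_relief": 0.012}
print(interpupillary_distance(left, right), center_eye_position(left, right))
```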

Although Tom was emphatic about not messing with the user's settings, scaling them uniformly is a way of effectively changing the world scale, and that is something he suggests we do experiment with. In general, most users find that reducing the world scale reduces the overall intensity, since it scales down all motions and accelerations, but don't go too far or convergence can get tricky.
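The key word is uniformly, which a quick sketch makes clear. The field names below are invented, not an SDK structure: the point is simply that every head-model measurement gets the same factor.

```python
def apply_world_scale(head_model, k):
    """Scale every measurement *uniformly* by k. Mixing scaled and unscaled
    measurements would change VOR-gain, which is exactly what we must not do.
    With k = 2 the player effectively has a giant's head, so the world -- and
    every motion and acceleration in it -- feels half its normal scale."""
    return {name: value * k for name, value in head_model.items()}

profile = {"ipd": 0.064, "eye_height": 1.70, "neck_to_eye": 0.10}
print(apply_world_scale(profile, 2.0))
```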

One question that every VR application needs to answer is: how tall is the player? The SDK provides a value for eye height off the ground, calculated from the height the user entered during configuration. Sometimes that makes sense to use, and other times it doesn't. If your game is about being a character of a particular stature, the player's real-life size may not be a good value to use. In other applications, using the player's real size may help them feel comfortable and ease them into presence. Another interesting observation is the phenomenon of "floor dragging": your brain's sense of how far away the floor is. The same VR experience can feel very different with the player seated as opposed to standing up!

Animating the player character presents a set of problems that nearly every game is going to have to consider. There are often unavoidable transition animations when you enter/exit vehicles, get up after being knocked down, interact with elements in the world, and the like. There is a temptation to animate the camera as you would in a desktop game, but in Tom's experience from TF2 this almost never works well for the player. In practice, his advice is to almost always do snap cuts, or fade out and fade back in, while never taking camera control away from the player.

Meathook Avatars

Animating the player's avatar can have a strong positive impact, especially with first-person actions like high fives or calling for a medic in TF2. But these animations need to play without moving the camera position: the virtual camera should always move with the player's real head, and the position of the avatar's head should coincide with the camera position. To accomplish this, Tom suggests an approach he calls "meathook avatars". The idea is pretty simple: find the avatar's animated head position, eliminate (scale to zero) the avatar's head geometry, and then move the body so it lines up with the player's virtual camera position. Visualize it as hanging the animated body of the avatar from a meathook located at the camera position.
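A minimal sketch of the meathook idea, using an invented world-space avatar representation for illustration: play the body animation as usual, but re-root the skeleton each frame so the animated head lands exactly on the tracked camera, and scale the head geometry to zero so it can never clip into the player's view.

```python
def meathook_attach(avatar, camera_world_pos):
    avatar["head_scale"] = 0.0  # hide the head mesh entirely
    # Move the whole body so the animated head coincides with the camera.
    avatar["root_pos"] = tuple(
        cam - head for cam, head in zip(camera_world_pos, avatar["head_local"])
    )
    return avatar

avatar = {"root_pos": (0, 0, 0), "head_local": (0.0, 1.7, 0.1), "head_scale": 1.0}
meathook_attach(avatar, camera_world_pos=(2.0, 1.65, -3.0))
print(avatar["root_pos"])  # the body now hangs from the camera like a meathook
```

Run every frame, this keeps the camera authoritative: the animation can do whatever it likes to the body, but it can never drag the player's view along with it.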

The last couple of topics Tom covered had to do with maintaining framerate. For a normal game, a fluctuating framerate can be annoying, but in VR it will almost certainly break the player's sense of presence. Rendering a stereo scene at the higher resolution required by the DK2, at 75 FPS, is challenging for even the strongest PCs and GPUs today, and the main costs are draw calls and fillrate.

This is not news to developers who have worked on stereoscopic projects in the past, but for many people working in VR, doing 3D is new as well. For good VR, the trick of rendering 2D plus depth doesn't work very well, and it is strongly recommended to do two renders. In general that results in twice as many draw calls, but a number of things can be done once: culling, animation, shadows, some distant reflections/effects, and certain deferred lighting techniques. Fill rate on the DK2 is set by the 1080x1920 frame buffer (and don't change this!), but the camera-eye render is typically 1150x1450 per eye, determined by the user's face and eye position (set by the profile and SDK). The advice is that it's okay to change the size of the virtual camera renders, but not the framebuffer size; the distortion correction pass will resample and filter them anyway. It's also okay to dynamically scale the render size every frame: if you have lots of particles or explosion effects that frame, drop the size. The SDK supports this use case explicitly.
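A sketch of what that per-frame scaling might look like. The framebuffer (and hence the distortion pass) stays at the fixed panel resolution; only the per-eye render target shrinks or grows. The thresholds, step sizes, and the 13.3 ms budget (1/75 s for a 75 Hz panel) are invented heuristics, not SDK behaviour; a real engine would query its GPU timers and hand the result to the SDK.

```python
BASE_EYE_TARGET = (1150, 1450)  # roughly the DK2's recommended per-eye size

def update_render_scale(scale, last_gpu_frame_ms, budget_ms=13.3):
    if last_gpu_frame_ms > budget_ms:
        scale -= 0.05          # heavy frame (particles, explosions): shrink fast
    elif last_gpu_frame_ms < budget_ms * 0.8:
        scale += 0.01          # headroom: creep back toward full resolution
    return min(max(scale, 0.5), 1.0)  # never below half, never above native

def scaled_eye_target(scale):
    w, h = BASE_EYE_TARGET
    return (int(w * scale), int(h * scale))

scale = update_render_scale(1.0, last_gpu_frame_ms=16.0)
print(scale, scaled_eye_target(scale))
```

The asymmetry (shrink fast, grow back slowly) is a common pattern for this kind of controller: a single dropped frame hurts presence far more than a few frames of slightly softer image.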

Lessons Learned

In conclusion, my impression was that both the VR talks were well attended and well received. The Oculus guys have made a lot of progress this year from the initial devkit to the DK2 and the introduction of the Sony HMD means developers will have more platform options for their VR projects. These are certainly amazing days to be involved with game development, and the fact that virtual reality equipment is being developed to a higher quality than ever by some of the very smartest people makes it that much more exciting. We are almost there...