Oculus Rift Development Kit Review


Software

After plugging everything in, I ran into a couple of problems. The first was minor: a single pixel stuck on neon green. The bigger issue was that head tracking wasn't working at all. Thankfully, after four hours of digging around, the fix turned out to be disabling the macro software that came with my AZiO mouse, a solution Logitech users should also take note of.

Oculus Rift Screencap

Now I can get started! I'm still playing around with what's out there, but for the purposes of this review, I focused on the included Oculus demos and Valve's Half-Life 2 and Team Fortress 2. The demos were quick, but I think they provided a fair taste of the effect and let me gauge my aptitude for VR. I was happy to see that I didn't need much practice to get my "VR legs".

Tiny Room: A straightforward demo that provides a small sampling of clipping-free geometry: two chairs, two tables, some posts and a shelf. This simple and visually minimal environment is a perfect place to focus on the Rift's actual capacity for stereoscopy. The plain geometry creates simple and identifiable perspective, which it turns out is perfectly suited to the low resolution of the developer model.

Tuscany: My visual cortex says I can touch this plant, but my cerebral cortex says it's not real.
Tuscany: The Tuscany demo is a fantastic example of what an HMD can achieve. The stereoscopic effect is pronounced, and objects visibly pop and have real dimensionality. The environment itself is quick and dirty, with muddy textures in the distance and freely available 3D props, but none of those shortcomings register once you're inside the Rift. That's partly due to the Rift's low resolution, but also because of the palpable sense of immersion created by the head tracking.

I found Tuscany to be an interesting demo of the Rift's potential, partly because it gets its interpupillary distance (IPD) setting from an in-house configuration tool packaged with the developer kit. So far, Tuscany is the least visually strenuous experience I've had on the Rift: my eyes naturally relax while retaining focus, an experience I hope to have in games designed to support the Rift from the ground up. It also has the most spatial depth of the software I chose to test.
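As a side note on what that IPD setting actually does: in a typical stereo renderer, each eye's camera is simply offset by half the IPD along the head's right axis. Here is a minimal sketch of the idea in Python; it's purely illustrative, not the Oculus SDK's code, and the function name and values are mine:

import numpy as np

def eye_positions(head_position, head_right_axis, ipd_m=0.064):
    # Hypothetical helper: offset each eye camera by half the
    # interpupillary distance along the head's local right axis.
    half = ipd_m / 2.0
    right = head_right_axis / np.linalg.norm(head_right_axis)
    return head_position - right * half, head_position + right * half

# Example: head 1.7 m up, right axis +X, 64 mm IPD.
left_eye, right_eye = eye_positions(np.array([0.0, 1.7, 0.0]),
                                    np.array([1.0, 0.0, 0.0]))
print(left_eye, right_eye)  # approx. [-0.032, 1.7, 0.0] and [0.032, 1.7, 0.0]

Getting that offset to match your actual eyes is exactly what the configuration tool is for, and it goes a long way toward explaining why Tuscany feels so comfortable.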

Team Fortress 2: I have to commend Valve for their post-release VR enhancements. Their configuration tool is both easy to use and functional, and adding free Oculus support to a free-to-play game this quickly sets a standard that today's customers should hold all developers (and publishers!) to.

Through a console command, I entered Valve's configuration utility. The utility has you adjust the periphery of each eye's screen, moving all four sides through two passes. Alongside calibration, there are plenty of options for look and aim control. I settled on head and mouse look with a small aiming keyhole.

By turning my mouse DPI down, I was able to combine head movements and mouse adjustments to almost replicate the freedom of motion I typically enjoy in the game. It feels natural: you look with your head, and the mouse controls your arm until it reaches the keyhole's edge, at which point it rotates your body instead. This is a control scheme I'll be looking to use in first-person shooters in the future. Decoupling the gun feels natural, and if the HUD could be projected out into the game world, it would be perfect.
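For anyone curious how such a keyhole behaves, here is a rough sketch in Python. It's purely illustrative and not Valve's actual code; the keyhole width and sensitivity values are made up:

KEYHOLE_DEG = 15.0  # hypothetical half-width of the aiming keyhole

def apply_mouse_yaw(body_yaw, gun_offset_yaw, mouse_dx, sensitivity=0.02):
    # Returns updated (body_yaw, gun_offset_yaw) in degrees, where
    # gun_offset_yaw is the gun's yaw relative to the body.
    desired = gun_offset_yaw + mouse_dx * sensitivity
    if abs(desired) <= KEYHOLE_DEG:
        # Inside the keyhole: only the gun moves.
        return body_yaw, desired
    # Past the keyhole edge: pin the gun to the edge and push the
    # remaining rotation into the body.
    edge = KEYHOLE_DEG if desired > 0 else -KEYHOLE_DEG
    return body_yaw + (desired - edge), edge

# Small mouse motions move only the gun; a big flick turns the body.
body, gun = 0.0, 0.0
body, gun = apply_mouse_yaw(body, gun, mouse_dx=200)   # gun moves to 4 degrees
body, gun = apply_mouse_yaw(body, gun, mouse_dx=2000)  # gun pins at 15, body turns
print(body, gun)  # 29.0 15.0

In a full implementation, the aim direction would be the body yaw plus the gun offset, with the tracked head orientation driving the camera independently, which is what makes the gun feel decoupled from your view.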

Overall, TF2's 3D depth effects aren't as strong as I had hoped. When I sit still and gawk at an interesting piece of geometry, the depth is tangible. While moving and focusing on play, it's too easy to ignore the stereoscopy, and perhaps even preferable to do so. In this case, the fast pace of a competitive FPS didn't lend itself well to an immersive experience, and I found it easier to play with full mouse look while using the head tracking to refine my aim.

I think there is room for exploration here. With some time spent acclimating to an HMD as another control input, I predict that accuracy and situational awareness will improve, which would make a lighter, higher-resolution HMD a competitive advantage in action gaming.

Half-Life 2: Is the Oculus Rift a future home for masochists?
Half-Life 2: To get the game working, all I had to do was copy the Oculus Rift configuration settings created in TF2 and paste them into the HL2 config file. It was with Half-Life 2 that the Rift's low resolution became a real problem.

The biggest issue is that model geometry loses its detail and turns into a pixelated soup as you look deep into the distance. Trying to make out enemies more than five meters away is like trying to aim at an old TV through a magnifying glass: characters become nothing more than moving blobs. I expect this problem will be resolved in the consumer model, since its resolution will be much higher than what we have today. For me, it's a problem I can ignore for now, but it could be a major issue for others using the Oculus Rift development kit.
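To put rough numbers on why distant characters dissolve, here is a quick back-of-the-envelope calculation in Python. It assumes the commonly cited developer kit figures of a 1280x800 panel shared between two eyes and roughly 90 degrees of horizontal field of view per eye; those figures are assumptions for illustration, not measurements from my unit:

import math

pixels_per_eye_h = 1280 / 2          # ~640 horizontal pixels per eye (assumed)
fov_h_deg = 90.0                     # assumed horizontal FOV per eye
px_per_degree = pixels_per_eye_h / fov_h_deg

def pixels_on_target(target_width_m, distance_m):
    # Approximate horizontal pixels covering a target at a given distance.
    angular_size_deg = math.degrees(2 * math.atan(target_width_m / (2 * distance_m)))
    return angular_size_deg * px_per_degree

# A roughly shoulder-width (0.5 m) enemy at 5 m and at 20 m:
print(round(px_per_degree, 1))          # ~7.1 pixels per degree
print(round(pixels_on_target(0.5, 5)))  # ~41 pixels wide
print(round(pixels_on_target(0.5, 20))) # ~10 pixels wide

At around seven pixels per degree, even a nearby enemy only gets a few dozen pixels across, so it's no surprise that anything past a few meters reads as a blob.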

Half-Life 2: Nine years later, and the world feels fuller than ever.
Setting aside the limitations of draw detail, the experiences I've had playing Half-Life 2 on the Rift have been excellent. The sense of your character's height relative to the world is fantastic. Normally when I play an FPS, I tend to map my character's eyes to the top of the screen, which makes the game world feel too small. In contrast, the Rift and its head tracking create a proper sense of my place in the environment. The result is that swinging the crowbar has never felt so engaging, and I can watch my character's arm move independently. On a side note, as a fan of horror, I'm looking forward to Ravenholm.

Half-Life 2 had occasional problems with its head tracking support. For example, during the speedboat gunning section, the gun can get stuck positioned sideways and becomes impossible to aim with.

Separate from the VR gaming experiences themselves, I'm most excited about Valve's demonstrated commitment to this technology, and I'm looking forward to what they come up with in the not-too-distant future.