Oculus Rift Development Kit Review


After plugging everything in, I ran into a couple of problems. The first was minor: a single stuck pixel glowing neon green. The bigger issue was broken head tracking. Thankfully, after four hours of digging, I found the fix: disabling the macro software that came with my AZiO mouse, something Logitech users should also take note of.

Oculus Rift Screencap

Now I can get started! I'm still playing around with what's out there, but for the purposes of this review, I focused on the included Oculus demos, and Valve's Half-Life 2 and Team Fortress 2. The demos were quick, but I think they provided a fair taste of the effect and let me gauge my aptitude for VR. I was happy to see that I didn't need much practice to get my "VR legs".

Tiny Room: A straightforward demo that provides a small sampling of clipping-free geometry: two chairs, two tables, some posts and a shelf. This simple and visually minimal environment is a perfect place to focus on the Rift's actual capacity for stereoscopy. The plain geometry creates simple and identifiable perspective, which it turns out is perfectly suited to the low resolution of the developer model.

Tuscany: My visual cortex says I can touch this plant, but my prefrontal cortex says it's not real.
Tuscany: The Tuscany demo is a fantastic example of what an HMD can achieve. The stereoscopic effect is pronounced; objects visibly pop and have real dimensionality. The environment itself is quick and dirty, with muddy distant textures and stock 3D props, but none of that registers inside the Rift. This is partly due to the Rift's low resolution, but also because of the palpable sense of immersion created by the head tracking.

I found Tuscany to be an interesting demo of the Rift's potential, partly because it reads its interpupillary distance (IPD) setting from the configuration tool packaged with the developer kit. So far, Tuscany is the least visually strenuous experience I've had on the Rift: my eyes naturally relax while retaining focus, an experience I hope to have in games designed to support the Rift from the ground up. Of the software I tested, it also shows the most spatial depth.

Team Fortress 2: I have to commend Valve for their post-release VR game enhancements. Their configuration tool is both easy to use and functional, and adding free Oculus support to a free game this quickly sets a standard that today's customers should hold all developers (and publishers!) to.

With a console command, I entered Valve's configuration utility. The utility has you adjust the periphery of each eye's screen, walking you through all four sides twice. Alongside calibration, there are many options for look and aim control. I settled on head look plus mouse look, with a small aiming keyhole.

By turning my mouse DPI down, I was able to combine head movements and mouse adjustments to almost replicate the freedom of motion I typically enjoy in the game. It feels natural: you look with your head, and the mouse steers your arm until it reaches the keyhole's edge, at which point it rotates your body instead. This is a control scheme I'll be looking for in first-person shooters in the future. Decoupling the gun from the view feels natural, and if the HUD could be projected out into the game world it would be perfect.
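The keyhole behavior can be sketched roughly as follows. This is a simplified model of the idea, not Valve's actual implementation; the function name and the keyhole angle are illustrative assumptions.

```python
# Hypothetical sketch of a "keyhole" aim scheme: the mouse steers the
# weapon freely inside a small cone around the view center; once the
# aim reaches the keyhole's edge, further mouse movement rotates the
# body instead. Angles are in degrees and purely illustrative.

KEYHOLE_HALF_ANGLE = 10.0  # how far the gun may stray from view center


def apply_mouse(aim_yaw, body_yaw, mouse_dx):
    """Return updated (aim_yaw, body_yaw) for a horizontal mouse delta.

    aim_yaw is the gun's offset from the view center; body_yaw is the
    player's overall facing. Head look (from the HMD) is independent.
    """
    aim_yaw += mouse_dx
    if aim_yaw > KEYHOLE_HALF_ANGLE:
        # Past the right edge: the excess spills into a body turn.
        body_yaw += aim_yaw - KEYHOLE_HALF_ANGLE
        aim_yaw = KEYHOLE_HALF_ANGLE
    elif aim_yaw < -KEYHOLE_HALF_ANGLE:
        # Past the left edge: same spill-over, in the other direction.
        body_yaw += aim_yaw + KEYHOLE_HALF_ANGLE
        aim_yaw = -KEYHOLE_HALF_ANGLE
    return aim_yaw, body_yaw
```

Small mouse movements only nudge the gun; large ones saturate the keyhole and carry the rest into a body rotation, which matches the feel described above.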

Overall, TF2's 3D depth effects aren't as strong as I'd hoped. When I sit still and gawk at an interesting piece of geometry, the depth is tangible; while moving and focusing on play, the stereoscopy is easy to ignore, and ignoring it may even be preferable. The fast pace of a competitive FPS didn't lend itself to an immersive experience, and I found it easier to play with full mouse look while using the head tracking to refine my aim.

I think there is room for exploration here. With time spent acclimating to an HMD as another control input, I predict players will develop greater accuracy and situational awareness, which would make a lighter, higher-resolution HMD a competitive advantage in action gaming.

Half-Life 2: Is the Oculus Rift a future home for masochists?
Half-Life 2: To get the game working, all I had to do was copy the Oculus Rift configuration settings created in TF2 and paste them into the HL2 config file. It was with Half-Life 2 that the Rift's low resolution became a real problem.
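That copy-and-paste step could be scripted along these lines. This is a minimal sketch, not anything shipped with the games: the function name, the file paths, and the assumption that VR settings share a common "vr_" cvar prefix are all mine, and Valve's actual cvar names may differ.

```python
# Hypothetical sketch: carry VR-related settings from TF2's config file
# over to Half-Life 2's. The "vr_" prefix is an assumed convention for
# picking out the relevant lines; real cvar names may differ.

def copy_vr_settings(tf2_cfg_path, hl2_cfg_path, prefix="vr_"):
    """Append every line starting with `prefix` from one config file
    to another, returning how many lines were copied."""
    with open(tf2_cfg_path) as src:
        vr_lines = [ln for ln in src if ln.lstrip().startswith(prefix)]
    with open(hl2_cfg_path, "a") as dst:  # append: keep existing settings
        dst.writelines(vr_lines)
    return len(vr_lines)
```

Appending rather than overwriting preserves whatever bindings the HL2 config already has, which mirrors the manual paste described above.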

The biggest issue is that model geometry loses its detail and dissolves into pixelated soup in the distance. Trying to make out enemies more than five meters away is like trying to aim at an old TV through a magnifying glass: characters become nothing more than moving blobs. I expect the consumer model will resolve this, since its resolution will be much higher than what we have today. For me, it's a problem I can ignore for now, but it could be a major issue for others using the Oculus Rift development kit.
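The blob effect comes down to angular resolution, and a back-of-the-envelope check shows why. The figures below are assumptions on my part: the commonly cited 1280x800 developer-kit panel split into 640 horizontal pixels per eye, roughly a 90-degree horizontal field of view, and an illustrative half-meter-wide target.

```python
import math

# Back-of-the-envelope angular resolution for the Rift developer kit.
# Panel width is the commonly cited dev-kit figure; the per-eye FOV
# is an approximation.
panel_width_px = 1280              # full panel, shared by both eyes
per_eye_px = panel_width_px // 2   # 640 horizontal pixels per eye
fov_deg = 90                       # assumed horizontal FOV per eye

px_per_degree = per_eye_px / fov_deg   # ~7.1 pixels per degree

# An illustrative human-sized target, 0.5 m wide at 25 m, subtends:
target_deg = math.degrees(2 * math.atan(0.25 / 25))  # ~1.15 degrees
target_px = target_deg * px_per_degree               # ~8 pixels wide
```

At roughly seven pixels per degree, a distant enemy occupies only a handful of pixels, so "moving blobs" is about all the panel can render; a consumer model with several times the pixel density would shrink the problem proportionally.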

Half-Life 2: Nine years later, and the world feels fuller than ever.
Setting aside the limitations of draw detail, the experiences I've had playing Half-Life 2 on the Rift have been excellent. The sense of your character's height relative to the world is fantastic. Normally when I play an FPS, I tend to map my character's eyes to the top of the screen, which makes the game world feel too small. The Rift and its head tracking, in contrast, create a proper sense of my place in the environment. The result: swinging that crowbar has never felt so engaging, and I can watch my character's arm move independently of my view. On a side note, as a fan of horror, I look forward to Ravenholm.

Half-Life 2 had occasional problems with its head-tracking support. For example, during the airboat gunning section, the gun can get stuck sideways and become impossible to aim.

Beyond the individual gaming experiences, I'm most excited about Valve's demonstrated commitment to this technology, and I'm looking forward to what they come up with in the not-too-distant future.