MTBS Tries Out Oculus Touch at E3 Expo


If all goes to plan, I'm going to try out the full CV1 demo this morning, so my only experience so far has been with the Toy Box demo using Oculus' new VR controllers. It was a standalone demo, so I figure it deserves an article of its own.

The demo had you standing on a five-foot floor mat facing away from the computer monitors. Ahead of you were two tracking cameras mounted high up, roughly 45 degrees apart. Putting on the CV1 and the half-moon controllers was easy enough; Oculus did a great job of simplifying the HMD especially. Right away, I can tell you that the tracking for both the HMD and the controllers was excellent. I'm going to reserve judgement on image quality until I see the full-blown demos later; today I just want to focus on input.

The controllers use two types of sensors to detect what your fingers are doing. They have traditional buttons and triggers, and they also have capacitive sensors similar to the ones your smartphone uses to detect whether your fingers are touching its surface. These are best used for hand gestures like waving, thumbs up, and pointing.
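To make that concrete, here's a minimal sketch of how touch/no-touch readings from capacitive pads might be mapped to gestures. The field names and gesture rules are my own illustration, not the actual Oculus SDK:

```python
from dataclasses import dataclass

@dataclass
class TouchState:
    """Hypothetical per-frame readout; these fields are illustrative,
    not real Oculus API names."""
    trigger_touched: bool  # capacitive pad on the index trigger
    thumb_touched: bool    # capacitive pad under the thumb buttons
    grip_pressed: bool     # physical grip button

def infer_gesture(s: TouchState) -> str:
    # Capacitive pads only report touch/no-touch, so gestures are
    # inferred from which surfaces the fingers have left.
    if not s.thumb_touched and s.trigger_touched:
        return "thumbs_up"   # thumb raised, index curled
    if s.thumb_touched and not s.trigger_touched:
        return "pointing"    # index extended off the trigger
    if not s.thumb_touched and not s.trigger_touched:
        return "open_hand"   # e.g. waving
    return "rest"            # all fingers on the controller
```

The point is that the hardware never sees your actual fingers; it only sees which surfaces they're resting on and infers the rest.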

As I mentioned, I was really happy with the tracking and how it detected exactly where my hands were in virtual and physical space. I was also amazed that the size of my hands in VR seemed to be one to one. When I asked how this was accomplished, they explained that the hand sizing is based on the average size of a human adult and on where fingers are likely to be located given how the controller is designed. So if I had tiny hands or extremely large hands, I don't think that one-to-one experience would have been quite as good.

Since the majority of the industry has been very focused on picking up the fine movements of your digits and letting you grab things as you naturally would, I had similar expectations of Oculus. Instead, they have taken a far simpler approach: grabbing is an on-off type of motion, fingers can be either extended or closed with nothing in between, and there are buttons and mini-sticks for everything else.
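In code terms, that on-off grab could be as simple as thresholding an analog grip input into a binary state. This is a sketch of the concept only; the threshold value is an assumption, not Oculus' actual cutoff:

```python
GRIP_THRESHOLD = 0.5  # assumed cutoff; the real firmware value is unknown

def grab_state(grip_trigger: float) -> bool:
    """Collapse an analog grip reading (0.0-1.0) into the on-off grab
    described above: either you're holding the object or you aren't."""
    return grip_trigger >= GRIP_THRESHOLD
```

Everything below the threshold reads as an open hand, everything above as a full grab - which is exactly why there's no in-between.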


The software demo was an excellent fit because it required you to pick up toys and manipulate them, and a virtual buddy interacted with you from another room - a great test bed for this kind of input. Unfortunately, I struggled through the whole thing (MTBS' Kris Roberts is my witness!). It was constant trial and error as I attempted to pick things up, because I often failed to grab the object I was aiming for. In the fifteen minutes I spent with it, I never got that aha moment of easily manipulating the environment; I was instead a virtual klutz. I've used other VR controllers that I had a much easier time with, so something about this was disconnected from what my body expected.

Now remember that I mentioned there were two tracking cameras in the room. The CV1 has been promoted as having one camera, so I'm hypothesizing that a second camera will ship alongside the controllers to widen the room's coverage. I don't know why both cameras were placed on one side of the room instead of on opposite sides to maximize detection range. Maybe it's because a camera can only track one device at a time; I don't know. I asked about occlusion and was told the controllers have magnetometers and accelerometers to compensate; I didn't get a chance to test this.
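The idea behind that compensation is sensor fusion: while the camera sees the controller, its measurement corrects the estimate; during occlusion the system falls back to dead reckoning on the inertial sensors, which drifts quickly. Here's a toy one-axis-per-component sketch of the concept - my own illustration with an assumed blend factor, not Oculus' actual filter:

```python
def predict_position(pos, vel, accel, dt, camera_pos=None, blend=0.98):
    """One step of a toy sensor-fusion loop (illustrative only).

    pos, vel, accel: 3-element lists (meters, m/s, m/s^2)
    dt: timestep in seconds
    camera_pos: optical fix when the camera sees the controller, else None
    """
    # Dead reckoning: integrate the accelerometer into velocity, then position.
    vel = [v + a * dt for v, a in zip(vel, accel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    if camera_pos is not None:
        # Blend the optical fix back in to cancel accumulated IMU drift.
        pos = [blend * c + (1 - blend) * p for c, p in zip(camera_pos, pos)]
    return pos, vel
```

With the camera blocked (`camera_pos=None`), errors compound every step, which is why IMU-only tracking can only bridge short occlusions.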

While input is the main attraction, the half-moons are in desperate need of some serious haptics. I mean SERIOUS haptics. I asked about this during the session, and the demo attendant explained that you feel a nudge as you pick things up. After she mentioned it, I did eventually take notice. I know that Oculus can't break the laws of physics and deliver a Novint Falcon level of haptics with untethered controllers, but they really need to take it up a level - especially if they are calling this Oculus Touch. If they could make it possible to experience things like texture, or give a physical sense of an object's shape, that would be impressive. Their hardware might already be capable of this with some creative software programming.

So how would I rate the half-moon controllers? I think that if I had more than fifteen minutes, I would have eventually mastered them - and maybe that's the point: I would have had to master them. It wasn't the pick-up-and-go experience I expected from a VR controller where my physical hands and virtual hands speak the same language.

Kris Roberts had a blast, and the rest of the press seems super content with Oculus' Touch controllers, which makes my experience a possible exception to the rule. Still, Oculus likes to say that input is hard, and I'm hopeful we will see further improvements and innovations as they get closer to launch. Better yet, maybe we will see multiple controller options so people can interact however they do best.

As it is, I'll describe Oculus Touch as very promising. Pardon the pun, but these ain't no Xbox controllers! Oculus has definitely come up with something custom and innovative for the VR space. They just didn't immediately mesh with me as I expected them to. When I get more time with them, or with future revisions, I'll revisit this - I really don't think I had enough time to make a fair long-haul assessment.