MTBS Tries Out Oculus Touch at E3 Expo


If all goes to plan, I'm going to try out the full CV1 demo this morning, so far my only experience has been with the Toy Box demo using Oculus' new VR controllers. It was a standalone demo, so I figure it deserves an article of its own.

The demo had you standing on a 5' floor mat facing away from the computer monitors. Ahead of you were two tracking cameras mounted high up, approximately 45 degrees apart. Putting on the CV1 and the half-moon controllers was easy enough; Oculus did a great job of simplifying the HMD especially. Right away, I can tell you that the tracking for both the HMD and the controllers was excellent. I'm going to reserve judgement on the image quality until I see the full-blown demos later; I just want to focus on input today.

The controllers use two types of sensors to detect what your fingers are doing. They have traditional buttons and triggers, and they also have capacitive sensors similar to what your smartphone uses to detect whether or not your fingers are touching the device's surface. This is best used for hand gestures like waving, thumbs up, pointing, etc.
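As a rough illustration of the idea (this is a hypothetical sketch, not Oculus' actual SDK), a few simple rules over per-finger capacitive contact readings are enough to infer the gestures described above:

```python
# Hypothetical sketch of capacitive gesture detection - not Oculus' real API.
# Each finger's capacitive sensor reports True while that finger rests on the
# controller's surface and False once it is lifted away.

def detect_gesture(touching):
    """Map per-finger contact states to a coarse hand gesture."""
    thumb, index = touching["thumb"], touching["index"]
    if not thumb and index:
        return "thumbs up"      # thumb lifted, index still on the trigger
    if thumb and not index:
        return "pointing"       # index extended off the trigger
    if thumb and index:
        return "closed hand"    # everything in contact with the controller
    return "open hand"          # nothing touching: fingers splayed

print(detect_gesture({"thumb": True, "index": False}))  # pointing
```

The point is that the sensors only report touch versus no-touch per finger, so gestures are necessarily a small discrete set rather than full finger tracking.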

As I mentioned, I was really happy with the tracking and how it detected exactly where my hands were in virtual and physical space. I was also amazed that the size of my hands in VR seemed to be one-to-one. When I asked how this was accomplished, they explained that the hand sizing is based on the average size of a human adult and where fingers are likely to be located given how the controller is designed. So if I had tiny hands or extremely large hands, I don't think that one-to-one experience would have been quite as good.

As the majority of the industry has been very focused on picking up the fine movements of your digits and working to let you grab things as you naturally do, I had similar expectations of Oculus. Instead, they have taken a far simpler approach: grabbing is an on-off type of motion, fingers are either extended or closed with nothing in between, and there are buttons and mini-sticks for everything else.
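Put another way (again a hypothetical sketch with an assumed threshold, not Oculus' code), the hand collapses to discrete poses driven by the grip trigger rather than a continuous curl per finger:

```python
# Hypothetical sketch of the on-off grab model described above.
GRIP_THRESHOLD = 0.5  # assumed cutoff on the analog grip trigger (0.0-1.0)

def grab_state(grip_trigger):
    """Binary grab: the hand is either closed or open, nothing in between."""
    return "closed" if grip_trigger >= GRIP_THRESHOLD else "open"

print(grab_state(0.9))  # closed
print(grab_state(0.1))  # open
```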


The software demo was excellent because it required you to pick up toys and manipulate them, and you had a virtual buddy interacting with you from another room - it was a great test bed for this. Unfortunately, I struggled through the whole thing (MTBS' Kris Roberts is my witness!). It was consistent trial and error as I attempted to pick things up because I often failed to grab the actual object I was aiming for. In the fifteen minutes I worked with it, I never got that aha moment of easily manipulating the environment and was instead a virtual klutz. I've used other VR controllers that I had a much easier time with, so something about this was disconnected from what my body expected.

Now remember that I mentioned there were two tracking cameras in the room. The CV1 has been promoted as having one camera, so I'm hypothesizing that a second camera will be added alongside the controllers to widen the room's coverage. I don't know why both cameras were placed on one side of the room instead of on opposite sides to maximize detection range. Maybe a camera can only work with one device at a time; I don't know. I asked about occlusion and was told the controllers have magnetometers and accelerometers to compensate; I didn't get a chance to test this.
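To illustrate what that compensation might look like (a simplified sketch under my own assumptions, not Oculus' tracking code), the inertial sensors can dead-reckon a controller's position while the camera loses sight of it:

```python
# Hypothetical occlusion fallback: trust the camera when it can see the
# controller; otherwise integrate the accelerometer to estimate position.
# Real trackers fuse both continuously and correct for drift; this does not.

def update_position(pos, vel, accel, dt, camera_pos=None):
    """One tracking step; accel is assumed gravity-compensated (m/s^2)."""
    if camera_pos is not None:
        return list(camera_pos), vel       # optical fix available
    # Occluded: simple dead reckoning from the accelerometer (drifts quickly)
    vel = [v + a * dt for v, a in zip(vel, accel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel
```

Because accelerometer error is integrated twice, the estimate drifts within a second or two, which is why this only plausibly bridges brief occlusions rather than replacing the cameras.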

While input is the main attraction, the half-moons are in desperate need of some serious haptics. I mean SERIOUS haptics. I asked about this during the session, and the Oculus rep explained that you feel a nudge as you pick things up. After she mentioned it, I did eventually take notice. I know that Oculus can't break the laws of physics and deliver a Novint Falcon level of haptics with untethered controllers, but they really need to take it up a level - especially if they are calling this Oculus Touch. If there is a way they can make it possible to experience things like texture or get a physical sense of an object's shape, that would be impressive. Their hardware might already be capable of this with some creative software programming.

So how would I rate the half-moon controllers? I think that with more than fifteen minutes, I would have eventually mastered them - and maybe that's the point. I would have had to master them; it wasn't the pick-up-and-go experience I expected from a VR controller where my physical hands and virtual hands speak the same language.

Kris Roberts had a blast, and the rest of the press seems super content with Oculus' Touch controllers, which makes my experience a possible exception to the rule. Still, Oculus likes to say that input is hard, and I'm hopeful we will see further improvements and innovations as they get closer to launch. Or better yet, maybe we will see multiple controller options so people can interact however they do best.

As it is, I'll describe Oculus Touch as very promising. Pardon the pun, but these ain't no Xbox controllers! Oculus has definitely come up with something custom and innovative for the VR space. They just didn't immediately mesh with me as I expected them to. When I get more time with them or with future revisions, I'll revisit this - I really don't think I had enough time to make a fair assessment for the long haul.