Meet the HDMI LLC Family!

In this video, Steven Venuti, President of HDMI LLC, talks about HDMI 2.0, VR, and new technologies for gamers.  Lots of fun!

MTBS Interviews Tactical Haptics: Part 2!

William Provancher, Tactical Haptics

A few weeks ago, we had the opportunity to interview Dr. William Provancher, Founder of Tactical Haptics.  His team has been working on an exciting new controller (the "Reactive Grip" controller) for gamers and VR enthusiasts, and today marks the launch of their Kickstarter!

William is back to share the latest details, and highlights some content development ideas for future haptics professionals!

MTBS: Today is the big day! How much money do you need to raise and what can people expect for their investment?

WP: We're looking to raise $175,000. For their investment, people will receive one or more of the following: project updates, access to our developer forums, a Tactical Haptics "Get a Grip" T-shirt, Reactive Grip controller(s), or even a day with us to see our past designs and related research.

MTBS: When would you like to see Tactical Haptics controllers in people's hands?

WP: We are targeting delivery of our first production development kits in October 2014. We'll need several months after the Sixense STEM design is finalized to ensure proper integration of the STEM modules into our design before our manufacturing can begin. Early-delivery development kits will also be available and are planned to ship in April 2014.

Tactical Haptics' Controller Prototype

MTBS: How much do you expect Tactical Haptics controllers to cost via Kickstarter?

WP: Our Reactive Grip development kits will list for $179 for a 1-handed development kit and $349 for a 2-handed development kit.

MTBS: What are the updated Tactical Haptics controller specs?  How do they compare to earlier models?

WP: Our formal feature list is written as follows:
  • Features three responsive and independently driven sliding plates in the device handle for immersion in games and VR that goes well beyond "buzz"
  • Compact, ambidextrous design for comfortable left or right hand use (allows 2-player gaming with a 2-handed development kit)
  • 8 traditional controller buttons, 1 analog thumbstick, 1 analog trigger
  • Compatibility with the Sixense STEM tracking system
  • Mounting points to connect your own tracking solution
  • SDK support for Unity and C++ games and projects

The controller is also less than half the weight and two-thirds the size of the Orange device that we demoed at GDC earlier this year.

MTBS: Up until now, the console market has been locked down to the Xbox, Sony and Nintendo world - and most if not all the supported peripherals are determined behind closed doors. Valve's Steambox is based on Linux, and I would venture that hardware compatibility will follow a somewhat more open model. Is Valve's Steambox strategically important to Tactical Haptics? Why or why not?

WP: I don't know if Steambox will be strategically important to Tactical Haptics, but any platform that is open like the PC will certainly be beneficial to new peripherals.

MTBS: You recently visited Valve to show off your controller prototype. What did they think? Do you think some opportunities will come to fruition from this? What would you have liked to accomplish from your meeting?

WP: There was some enthusiasm for our technology, but I'd prefer not to speculate on what will result from the meeting. My goal was to make their developers aware of our technology and the sense of presence it can provide in a virtual or gaming environment. However, I was told that their VR group was exclusively focused on commercially available, standard peripherals at this time, so things may take a while to percolate. I've since had some other interesting meetings with game developers that have shown more promise, so we'll see what the future holds... ;-)

MTBS: Software practically MAKES the hardware. After people get their kit, what start-up software will they have access to?

WP: Development kits are scheduled to start shipping in Oct. 2014. When they do, we'll have SDK support for Unity and C++ games and projects. We'll also make source for several of our demos available so that people have examples of implementing our touch feedback in several gaming scenarios.

Tactical Haptics' Controller

MTBS: Since we spoke last, castAR has really been storming the world not only with a really cool method for augmented reality, but they have excellent VR potential as well. Their glasses tracking techniques supposedly have 7mm positional accuracy which strikes me as being very impressive; at least on paper! Do you think your controllers' positioning would benefit from their technology? Do you foresee any compatibility issues?

WP: Our haptic controllers will be "STEM-ready" but are tracker-independent, so they could also use optical tracking methods like those used by Technical Illusions in their castAR system.

MTBS: When most people buy your controller, this will be the first haptics experience they have outside the confines of a vibrating joystick. What are some content creation lessons you have learned that should help future content makers?

WP: The key is to pick some type of real-world physical interaction and mimic it in the game. Whether it is a simplified version of the interaction or an attempt to represent complex physical interactions, if the user/gamer has a good mental model for that interaction - that is, if they can imagine how it would feel - it will be more successful when implemented. We've demonstrated this type of physical interaction through demos that range from fishing, to melee combat, to projectile weapons. This also works for interactions as simple as groping for a wall to find your way "in the dark."

MTBS: There is a lot of excitement around your product, so let's be optimists and say that your Kickstarter gets funded, the tech lands in developers' hands, people get a taste of what they can do with your controller...what then? What is the long term hurdle you'd like to overcome beyond this first round of technology development?

WP: The real key is to get beyond the next hurdle, which is having AAA game developers integrate support for this type of touch feedback into their games. We need to end the chicken-and-egg scenario where developers won't touch a new game peripheral until there is an install base, and consumers won't buy new peripherals until lots of great game content exists. We hope to rally support to break this cycle by getting gamers and VR enthusiasts involved at the ground level with our Kickstarter.

MTBS: VR has come and gone before. Haptics devices have existed in different forms as well - though yours works differently than what has been on the market to date. Why will VR succeed this time around? What are the markers that have convinced you that this is the right direction for your career and the industry at large?

WP: I think the required technologies have simply come of age, and the experience is now good enough at a low enough price point to draw a substantial number of people into VR. That influx will cause VR to explode, simply because so many more people will be involved... which in turn will cause companies to see this as a viable market and drive further hardware innovation, performance improvements and cost reductions.

Thanks for visiting MTBS, and congratulations on getting this exciting Kickstarter launched.  I'm certain countless gamers and enthusiasts have been impatiently waiting to help back this effort, and we wish you much success.  Good luck!

MTBS Interviews InfinitEye

Something MTBS has always been proud of is the brilliant minds that frequent our forums and help build up the immersive technology industry from scratch.  While Palmer Luckey's Oculus Rift has captured plenty of headline space, Lionel Anton is also in the VR running with his do-it-yourself InfinitEye head-mounted display (HMD).  Featuring a super-high field of view and affordable components, there is a lot of potential behind his team's VR work in progress - check it out!

MTBS: Welcome to the interrogation chair, Lionel! How did you first get interested in VR?

Lionel: I can't remember exactly how I got interested in VR in the first place, but I have a memory of a video game magazine from my childhood with a picture of someone wearing an HMD and a data glove, interacting with a virtual world.  The graphics would seem absolutely awful nowadays! I knew then that I wanted to experience that kind of immersion, but I never had the chance to even get close to it until I decided to make my own stereoscopic high field of view (FOV) display, a concept I posted on the MTBS3D forums in August 2011.

MTBS: HMDs have been around for decades. What suddenly made them more practical than before?

Lionel: There are many reasons, but the main one is that the new generation of HMDs is really better and cheaper than before. Thanks to the fast development of mobile devices, we have better screens and better tracking chips, all at really low prices. There's also the fact that with more powerful GPUs, the correction of optical flaws can be managed on the software side. This allows us to simplify the optics and use tablet-sized screens, in contrast to the complex optics of older HMDs that were built around microdisplays.

MTBS: For years, the consumer resolution limit for affordable HMDs was 640x480 pixels per eye. What was the fundamental change in how HMDs are put together that broke this barrier?

Lionel: The first affordable HMDs to break the 640x480 barrier were the Sony HMZ-T* series, based on small OLED displays, and the ST1080, based on LCOS displays - but none of them are designed for VR due to their limited FOV. On the other hand, the boom in mobile devices pushed manufacturers to improve the size and resolution of their panels, which then started to be used by do-it-yourselfers (DIYers) for their homemade HMDs. That was the fundamental change that led to the emergence of true VR HMDs like the Oculus Rift and InfinitEye.

MTBS: Tell us about InfinitEye! How did this come about?

Lionel: Everything started back in 2006 while I was building a DIY projector. I bought an old slide projector in which I found an aspheric lens used for light collimation. Playing with it, I noticed how easy it was to focus on close objects when looking through it. Having been passionate about stereoscopy for a long time, I immediately thought about making a high FOV 3D display; in other words, an HMD. However, until 2010 this was just an idea, and screens were quite expensive anyway. Then I found a 7.2" HD LCD and the same kind of lenses on the internet, which I used to design my first wide FOV HMD. I shared my work in progress (WIP) on the MTBS3D forums in August 2011, and left it there until I saw how enthusiastic people were about VR during the Oculus Rift Kickstarter campaign.

I was seeing a lot of people making their DIY Rifts and I knew that a 90° field of view wouldn't be enough to satisfy my FOV addiction. I definitely wanted to design an HMD with a bigger FOV that could get the interest of the DIY community so I started doing research on a way to achieve what seemed to be unreachable, a field of view of at least 180°. Since flexible screens are unavailable for DIYers, two screens at an angle are necessary to reach that number, so I ordered two 5.6" 1280x800 panels and a ton of different lenses on the internet - including Fresnel lenses that were made out of polymethyl methacrylate (PMMA). Fresnel lenses used to be very bad due to smearing and bad light transmission, but not these ones because PMMA is a very clear material.  I ended up stacking them up to get enough magnification and I made some calculations to optimize the angle between the screens to get the best compromise between stereoscopic overlap and peripheral vision.

Later, when Oculus had to switch to a 7" panel for their developer kit, I heard that the 5.6" screens would no longer be available, so I ordered 7" screens and they turned out to be a better fit. I posted the design in the MTBS forums in February 2013, iterated a bit (there were three lenses stacked per eye at the beginning; now there are two), and asked for ideas on the name.  MTBS member MSat proposed "InfinitEye" and I liked it.

MTBS: On MTBS, you've described InfinitEye as an open source DIY ("do it yourself") effort. What does this mean? What are the benefits and limitations of open source HMDs? Can hardware really be open source?

Lionel: That's what I wanted at the beginning. I wanted the community to improve the design, share their ideas and start something like an open source SDK for it. Unfortunately, I realized that although people were enthusiastic about it, only one person tried to build his own prototype based on my concept: the Dual Portal project from Hannibalj2. Then a friend of mine, Stephane, who saw the potential of the InfinitEye in its early stages, convinced me to enter the Samsung contest to try to get funding to make the technology available to people through a manufactured product. I think open source hardware can be a good thing, sharing knowledge between a lot of talented people to improve the technology, but it's possibly not what most people are waiting for when it comes to HMDs. My view is that they expect an end product, which is a really different thing and requires a lot of personal and financial investment that naturally conflicts with the open source philosophy.

MTBS: Are you at this alone? How big is your team? Who's involved?

Lionel: Fortunately I'm not alone in this; two very good friends and I form what we unofficially call the InfinitEye team. We don't have a company yet, and since all three of us have full-time jobs, we run the project in our free time.

First there was only me. I'm a Software Engineer with seven years of professional experience in image processing and compression - I've been coding for about 16 years though. As a hobby, I've always been involved in personal projects, sometimes related to software, sometimes to hardware (even if I'm much more a software guy), the current one being the InfinitEye which combines both aspects.

Then Stephane joined me. He has a strong technical background in IT and works as an international Project Manager for a famous aircraft manufacturing company on behalf of an IT services company. He is the one who pushed me to try to make something out of the InfinitEye project. Without him, I would have left the project as it was in February and moved on to something different. Within the project, he's focused on the management and communications parts, building up a network and raising funds.

Later, Robin joined us to take care of the software part of the project. He's a talented Software Engineer and Software Architect specializing in 3D. He notably developed the 3D game engine we are using for our demos and has implemented all the specific algorithms related to head-tracking and distortion correction.

MTBS: This looks like a big honking HMD! Is it as heavy as it looks? How much does it weigh compared to other products on the market?

Lionel: Yes, it can be seen as a little bulky, but I guess that's not really a problem since it's not as heavy as it looks. Actually, the InfinitEye is only an empty box with two screens on one side, Fresnel lenses on the opposite side, and air in between.  The two screens weigh 160g and the four lenses about 90g. The total weight, including the case, light-blocking foam and elastic band, is about 390g, which is approximately the same as the Oculus Rift developer kit (DK). The new prototype with a different head mount weighs 100g more, but it is actually more comfortable since the weight is better distributed over the head.

MTBS: What is its current resolution? Are you aiming for more? Will panels be easy enough to find?

Lionel: The resolution of our current prototypes is 1280x800 per eye, but what really matters in terms of resolution in HMDs is the angular resolution. With the InfinitEye, the 1280 pixels are spread over a 150° FOV for each eye, which gives an angular resolution of 8.5px/°. Ideally we'd like to source 7" LCDs with a 4:3 aspect ratio to increase the vertical FOV (for instance 1920x1440), but I don't think such panels exist, so our best target would be WUXGA panels (1920x1200) like the ones used in the new Nexus 7. This would lead to an angular resolution of 12.5px/° for the InfinitEye Full HD, compared to 10.6px/° for the Oculus Rift HD.
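Lionel's angular-resolution figures are easy to sanity-check. The sketch below simply divides horizontal pixels by horizontal FOV; real lenses spread pixels non-uniformly, and the naive average for the 1920-pixel panel comes out slightly higher than the 12.5px/° quoted above, presumably because some pixels are lost to distortion correction or overlap.

```python
# Back-of-the-envelope angular resolution: average pixels per degree.
# Real optics spread pixels non-uniformly across the FOV, so these are
# rough averages, not exact per-pixel values.

def angular_resolution(h_pixels, h_fov_deg):
    """Average horizontal pixels per degree of field of view."""
    return h_pixels / h_fov_deg

# InfinitEye prototype: 1280 pixels over a 150-degree per-eye FOV
print(round(angular_resolution(1280, 150), 1))   # -> 8.5
# WUXGA panel (1920 horizontal pixels) over the same FOV: the naive
# average is ~12.8, a bit above the 12.5px/deg quoted in the interview
print(round(angular_resolution(1920, 150), 1))   # -> 12.8
```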

MTBS: Field of View (FOV) is a major selling point for you. How much FOV does InfinitEye offer, and how does this compare to what we are actually capable of seeing? What is "horizontal stereoscopic FOV" and why is this different from the bigger number?

Lionel: I would say that for a device whose only purpose is to fake the sense of vision, the FOV is the key factor in getting true immersion. To give you an idea, a 60° FOV or less is like looking at the virtual world through a window. With a 90° FOV like the Oculus Rift, you are closer to the window and you start feeling a great sense of immersion, but with the 210° FOV that the InfinitEye offers, you actually are inside the world with the window behind you. Horizontal stereoscopic FOV is the part of the vision where the two eyes see the same objects in the virtual world and the brain can make a 3D representation of them. Because of the nose, humans have a limited stereoscopic vision of between 90° and 120°. Everything beyond that is only seen by one eye on each side and is called peripheral vision. The human FOV is typically 180° when looking straight forward, and up to 270° when turning the eyes left and right. The InfinitEye headset tries to mimic that as much as possible with its 210° total FOV and 90° stereoscopic overlap.
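The relationship between per-eye FOV, stereoscopic overlap and total FOV in a dual-screen design comes down to one line of arithmetic: the two eyes' fields combine, but the shared overlap region is counted only once. A minimal check, using the figures quoted above (a purely geometric sketch that ignores lens distortion):

```python
# Total horizontal FOV of a dual-screen HMD: the two per-eye fields
# combine, minus the stereoscopic overlap they share.

def total_fov(per_eye_fov_deg, overlap_deg):
    """Combined horizontal FOV of two angled displays, in degrees."""
    return 2 * per_eye_fov_deg - overlap_deg

# InfinitEye: 150 degrees per eye with a 90-degree stereo overlap
print(total_fov(150, 90))  # -> 210
```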

Fresnel to Conventional Lens Comparison
MTBS: You repeatedly talk about Fresnel lenses.  What is a Fresnel lens? What characteristics does it bring to the InfinitEye and why is this important?

Lionel: A Fresnel lens is a type of lens invented by the French physicist Augustin Fresnel at the beginning of the 19th century. Its main advantage over conventional lenses is its flatness, which allows it to be relatively large without a significant weight gain.

The InfinitEye design takes advantage of this feature to maximize the size of the lenses without making them overly thick, so the FOV is not restricted by the lenses' edges, unlike other HMDs. The largest dimension of the lenses is 125mm, while conventional lenses of the same size could not even be used to achieve the required magnification, since they would have to be thicker than the distance between the eyes and the screens - and I can't imagine how much they would weigh.

MTBS: What technology are you using for head tracking? Are there ideas you've been playing with?

Lionel: We are using an off-the-shelf component with 3 DOF (degrees of freedom) tracking, the YEI 3-Space Sensor Embedded. It is a really fast component, capable of streaming data from its three-axis gyro, accelerometers and compass at a 1000Hz rate. In the videos we have shown so far on my YouTube channel, we used the orientation computation embedded in the chip, but the perceived latency wasn't really good due to the 250Hz refresh rate and the lack of prediction. However, Robin has recently implemented the sensor fusion algorithm on the computer side, along with orientation prediction based on angular acceleration to minimize the error. We will soon post a new video showing our improvements on this topic.

We are thinking about other solutions because the YEI chip is quite expensive, but we haven't had much time to play with other ideas for the moment.
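The orientation prediction Robin implemented isn't public, but the general idea of latency-hiding prediction can be sketched in one dimension: extrapolate the most recent orientation sample forward by the expected sensor-to-display latency, using the gyro's angular velocity and acceleration. This is an illustrative sketch with assumed sample values, not the InfinitEye code; real head trackers predict a full 3D quaternion orientation.

```python
# One-dimensional sketch of latency-hiding orientation prediction:
# extrapolate the latest angle forward using angular velocity and
# angular acceleration (simple kinematics). Illustrative only --
# real implementations predict a full quaternion orientation.

def predict_angle(angle_deg, ang_vel_dps, ang_acc_dps2, latency_s):
    """Predict an angle latency_s seconds ahead of the last sample."""
    return angle_deg + ang_vel_dps * latency_s + 0.5 * ang_acc_dps2 * latency_s ** 2

# Assumed example: head at 10 deg, turning at 120 deg/s, accelerating
# at 200 deg/s^2, predicting 20 ms ahead
print(round(predict_angle(10.0, 120.0, 200.0, 0.020), 2))  # -> 12.44
```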

MTBS: What is positional tracking? Do you think positional tracking will be an important part of your HMD's future design?

Lionel: The movements of an object can be defined along six degrees of freedom, three rotations and three translations. Positional tracking is the tracking of the three translations (X, Y and Z axis) that are missing in the kind of chips used by InfinitEye or Oculus to track the movements of the head. There is no doubt that positional tracking is really important to get a true immersion and help reduce nausea felt by users when the whole virtual world is moving along with them when strafing or leaning down. Of course we would like to include a six DOF tracking solution in future versions of the design.

MTBS: What are your thoughts on Sixense's STEM System? In particular, do you think their STEMs (their positional tracking devices) are adequate for use as a head-tracking add-on for HMDs like yours? Why or why not?

Lionel: We didn't get the chance to test the STEM system, but if it is as good as they say, it could even replace the YEI chip for full six DOF tracking - I don't know if it is accurate enough, though. There's also the castAR optical tracking with infrared LEDs, which seems to be really accurate, but it requires a line of sight between the HMD and the markers, which could be a showstopper.

MTBS: The Oculus Rift, which was also founded on MTBS, had the benefit of magic. John Carmack was posting in the MTBS forums and connected with Palmer here, Oculus got instant backing that almost seemed to fall from the sky - they pretty much had every advantage. What have InfinitEye's biggest challenges been so far, and how can the industry and gamers help?

Lionel: You're right. Thanks to John Carmack at E3 2012, the public became aware of Palmer's work and then the magic started. Our biggest challenge, since we don't have a guy like Carmack on our side, is to get people to know about the InfinitEye and convince them that the dual-screen HMD technology works really well and can provide a whole new level of immersion. Any help is welcome to achieve this; for instance, we were recently invited to a VR event that should take place in a few months.

MTBS: One of the byproducts of the Oculus Rift is the software development around it. Native VR games, VR drivers, new content...there seems to be a lot of software support we never had before with devices like this. Is Oculus content compatible with InfinitEye? What VR or 3D formats are best suited for InfinitEye?

Lionel: Oculus-ready games will not natively be compatible with InfinitEye because we don't use the same tracker and we require a much wider scene rendering, with four cameras versus two. However, I guess that since the Oculus Rift's FOV is totally included in the InfinitEye's FOV and the trackers have similar characteristics, Rift games could be (illegally) patched to run on the InfinitEye headset with a limited 90° FOV.

Vireio Perception 2.0 Logo

For a developer, adapting games that are designed with VR in mind should not be very difficult, hopefully nothing more than a recompilation with our display and tracking methods.  We are also thinking about contributing to the open source Vireio Perception drivers to make them compatible with the InfinitEye.

MTBS: How are you connecting the InfinitEye to a PC? Is there more than enough bandwidth to push the imagery through at a high enough frame rate?

InfinitEye Controller Box
Lionel: We use two HDMI cables plugged into the graphics card through DVI-HDMI adapters, plus a USB cable to connect the YEI chip. For the moment we only need two 1280x800@60Hz links, but each HDMI connection can carry much more bandwidth. We would like to develop a single controller board to drive the two screens simultaneously from a single video input (e.g. DisplayPort), so the computer would see one big screen instead of two.
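A quick sanity check supports Lionel's point about headroom. Under the simplifying assumptions of 24 bits per pixel, active pixels only, and no blanking or encoding overhead, each 1280x800@60Hz link needs only about 1.5 Gbit/s, while a single HDMI 1.4 link can carry roughly 8 Gbit/s of video data:

```python
# Rough uncompressed video bandwidth per link: active pixels only,
# 24 bits per pixel, ignoring blanking intervals and encoding overhead.

def link_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate video bandwidth in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

per_link = link_gbps(1280, 800, 60)
print(round(per_link, 2))  # -> 1.47, well under HDMI 1.4's ~8 Gbit/s
```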

MTBS: Is there anything about current VR software development that goes against the grain of what InfinitEye needs? What changes should developers consider if any?

Lionel: Not really; a recompilation with the InfinitEye SDK and probably some minor changes should be enough. As an example, in FPS games, when the player is damaged, the surrounding red blood effect indicating the level of energy could be moved further into peripheral vision, and the flashes indicating the direction of incoming shots could be more accurate. Regarding the InfinitEye SDK, we will probably release the distortion correction, tracking and prediction methods as a small SDK under an open source license.

MTBS: What is your next step? What do you need to get your product launched, and how are you going about making that happen?

Lionel: Our next step is to show the prototype to as many people as possible, starting with journalists, to get feedback and press coverage. Paul James from RoadToVR is flying to Toulouse very soon and will be the first to test the prototype in person; it's going to be a very important day for us. Then we would like to attend VR events to get even more feedback from people and see how enthusiastic they are. Afterwards, we will either decide to start a crowdfunding campaign to make a consumer product or target other markets.

MTBS: Can fellow gamers and users recreate your work? What are the costs involved? Do you have to be an electrical engineer to do it?

Lionel: Anyone with DIY skills and approximately $450 can make an InfinitEye V1, which is very close to the current design.  The plans have been available on the MTBS forums since February, but keep in mind that the design is under a Creative Commons license with the obligation to share any related work and give credit to the original author.

MTBS: As a VR hardware maker, what software and products are you most looking forward to and why?

Lionel: We are not defining ourselves as a "VR hardware maker" yet, since the project still consists of early prototypes and demo software. We are looking forward to any developments in products related to VR, especially the Omni, Cyberith, Avegant, castAR, PrioVR, the Sixense STEM, Sony's yet-to-be-announced PS4 VR system and, of course, the Oculus Rift DK2 and consumer version. We are passionate about VR and we're happy to be living in these exciting times!

Thank you for joining us, Lionel!  We will be tracking your progress very closely!  Fellow MTBS inventors and immersive tech movers and shakers are encouraged to reach out to us for articles and interviews.

More to come!