MTBS: Welcome to the interrogation chair, Lionel! How did you first get interested in VR?
Lionel: I can’t remember exactly how I first got interested in VR, but I have a memory of a childhood video game magazine with a picture of someone wearing an HMD and a data glove, interacting with a virtual world. The graphics would seem absolutely awful nowadays! I knew then that I wanted to experience that kind of immersion, but I never had the chance to even get close to it until I decided to build my own stereoscopic high field of view (FOV) display, a concept I posted on the MTBS3D forums in August 2011.
MTBS: HMDs have been around for decades. What suddenly made them more practical than before?
Lionel: There are many reasons, but the main one is that the new generation of HMDs is significantly better and cheaper than before. Thanks to the fast development of mobile devices, we have better screens and better tracking chips, all at really low prices. There’s also the fact that with more powerful GPUs, the correction of optical flaws can be handled in software. This lets us simplify the optics and use tablet-sized screens, instead of the complex optics that older HMDs needed for their microdisplays.
MTBS: For years, the consumer resolution limit for affordable HMDs was 640×480 pixels per eye. What was the fundamental change in how HMDs are put together that broke this barrier?
Lionel: The first affordable HMDs that broke the 640×480 barrier were the Sony HMZ-T* series, which are based on small OLED displays, and the ST1080, which is based on LCOS displays – but neither is designed for VR due to its limited FOV. On the other hand, the boom in mobile devices pushed manufacturers to improve the size and resolution of their panels, which then started to be used by do-it-yourselfers (DIYers) for their homemade HMDs. That was the fundamental change that led to the emergence of true VR HMDs like the Oculus Rift and the InfinitEye.
MTBS: Tell us about InfinitEye! How did this come about?
Lionel: Everything started back in 2006 while I was building a DIY projector. I bought an old slide projector in which I found an aspheric lens used for light collimation. Playing with it, I noticed how easy it was to focus on close objects looking through it. Having been passionate about stereoscopy for a long time, I immediately thought about making a high FOV 3D display – in other words, an HMD. Until 2010, however, this was just an idea, and screens were quite expensive anyway. Then I found a 7.2″ HD LCD and the same kind of lenses on the internet, which I used to design my first wide FOV HMD. I shared my work in progress (WIP) on the MTBS3D forums in August 2011 and left it there until I saw how enthusiastic people were about VR during the Oculus Rift Kickstarter campaign.
I was seeing a lot of people building their DIY Rifts, and I knew that a 90° field of view wouldn’t be enough to satisfy my FOV addiction. I definitely wanted to design an HMD with a bigger FOV that could interest the DIY community, so I started researching a way to achieve what seemed unreachable: a field of view of at least 180°. Since flexible screens are unavailable to DIYers, two screens at an angle are necessary to reach that number, so I ordered two 5.6″ 1280×800 panels and a ton of different lenses on the internet – including Fresnel lenses made of polymethyl methacrylate (PMMA). Fresnel lenses used to be very bad due to smearing and poor light transmission, but not these, because PMMA is a very clear material. I ended up stacking them to get enough magnification, and I made some calculations to optimize the angle between the screens for the best compromise between stereoscopic overlap and peripheral vision.
Later I heard that the 5.6″ screens would no longer be available when Oculus had to switch to a 7″ panel for their developer kit, so I ordered 7″ screens and they turned out to be a better fit. I posted the design on the MTBS forums in February 2013, iterated a bit (originally there were three lenses stacked per eye; now there are two), and asked for ideas on the name. MTBS member MSat proposed “InfinitEye” and I liked it.
MTBS: On MTBS, you’ve described InfinitEye as an open source DIY (“do it yourself”) effort. What does this mean? What are the benefits and limitations of open source HMDs? Can hardware really be open source?
Lionel: That’s what I wanted at the beginning. I wanted the community to improve the design, share their ideas, and start something like an open source SDK for it. Unfortunately, I realized that even though people were enthusiastic about it, only one person tried to build a prototype based on my concept: the Dual Portal project from Hannibalj2. Then a friend of mine, Stephane, who saw the potential of the InfinitEye at an early stage, convinced me to enter the Samsung contest to try to get funding to make the technology available as a manufactured product. I think open source hardware can be a good thing, sharing knowledge among a lot of talented people to improve the technology, but it’s possibly not what most people are waiting for when it comes to HMDs. My view is that they expect an end product, which is a really different thing and requires a lot of personal and financial investment that inevitably conflicts with the open source philosophy.
MTBS: Are you at this alone? How big is your team? Who’s involved?
Lionel: Fortunately I’m not alone in this; two very good friends and I form what we unofficially call the InfinitEye team. We don’t have a company yet, and since all three of us have full-time jobs, we run the project in our free time.
At first it was only me. I’m a software engineer with seven years of professional experience in image processing and compression – I’ve been coding for about 16 years, though. As a hobby, I’ve always been involved in personal projects, sometimes related to software, sometimes to hardware (even if I’m much more of a software guy); the current one is the InfinitEye, which combines both.
Then Stephane joined me. He has a strong technical background in IT and works as an international project manager for a famous aircraft manufacturer on behalf of an IT services company. He is the one who pushed me to make something out of the InfinitEye project; without him, I would have left the project as it was in February and moved on to something different. Within the project, he focuses on management and communications, building up a network and raising funds.
Later, Robin joined us to take care of the software side of the project. He’s a talented 3D software engineer and software architect. Notably, he developed the 3D game engine we are using for our demos and implemented all the specific algorithms related to head-tracking and distortion correction.
MTBS: This looks like a big honking HMD! Is it as heavy as it looks? How much does it weigh compared to other products on the market?
Lionel: Yes, it can look a little bulky, but that’s not really a problem since it’s not as heavy as it looks. The InfinitEye is essentially an empty box with two screens on one side, Fresnel lenses on the opposite side, and air in between. The two screens weigh 160g and the four lenses about 90g. The total weight, including the case, light-blocking foam and elastic band, is about 390g, which is approximately the same as the Oculus Rift development kit (DK). The new prototype with a different head mount weighs 100g more, but it is actually more comfortable since the weight is better distributed over the head.
MTBS: What is its current resolution? Are you aiming for more? Will panels be easy enough to find?
Lionel: The resolution of our current prototypes is 1280×800 per eye, but what really matters in an HMD is the angular resolution. With the InfinitEye, the 1280 pixels are spread over a 150° FOV for each eye, which gives an angular resolution of 8.5 px/°. Ideally we’d like to source 7″ LCDs with a 4:3 aspect ratio to increase the vertical FOV (for instance 1920×1440), but I don’t think such panels exist, so our best target would be WUXGA panels (1920×1200) like the ones used in the new Nexus 7. This would lead to an angular resolution of 12.5 px/° for the InfinitEye Full HD, compared to 10.6 px/° for the Oculus Rift HD.
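The angular resolution figures quoted here come from a simple division; a quick sketch (Python, using the figures from this interview) makes the calculation explicit:

```python
def angular_resolution(px_across, fov_deg):
    """Pixels per degree: horizontal pixel count divided by the horizontal FOV."""
    return px_across / fov_deg

# InfinitEye prototype: 1280 px per eye spread over a 150-degree FOV
print(round(angular_resolution(1280, 150), 1))  # → 8.5 px/°
```

The same formula shows why a wide-FOV design needs a higher-resolution panel than a narrow one to look equally sharp.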
MTBS: Field of view (FOV) is a major selling point for you. How much FOV does InfinitEye offer, and how does this compare to what we are actually capable of seeing? What is “horizontal stereoscopic FOV” and why is this different from the bigger number?
Lionel: I would say that for a device whose only purpose is to fool the sense of vision, FOV is the key factor for true immersion. To give you an idea, a 60° FOV or less is like looking at the virtual world through a window. With a 90° FOV like the Oculus Rift’s, you are closer to the window and start feeling a great sense of immersion, but with the 210° FOV that the InfinitEye offers, you actually are inside the world, with the window behind you. Horizontal stereoscopic FOV is the part of the visual field where both eyes see the same objects in the virtual world, so the brain can build a 3D representation of them. Because of the nose, human stereoscopic vision is limited to between 90° and 120°. Everything beyond that is seen by only one eye on each side and is called peripheral vision. The human FOV is typically 180° when looking straight ahead, and up to 270° when turning the eyes left and right. The InfinitEye headset tries to mimic that as much as possible with its 210° total FOV and 90° stereoscopic overlap.
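The relationship between per-eye FOV, stereoscopic overlap and total FOV is easy to verify – a sketch, using the 150° per-eye and 90° overlap figures given in this interview:

```python
def total_horizontal_fov(per_eye_fov_deg, overlap_deg):
    """Two eye frustums placed side by side; the overlapping region is counted once."""
    return 2 * per_eye_fov_deg - overlap_deg

# InfinitEye: 150 degrees per eye, 90 degrees of stereoscopic overlap
print(total_horizontal_fov(150, 90))  # → 210
```

This also shows the design trade-off Lionel mentions: for a fixed per-eye FOV, increasing the overlap (more stereo) shrinks the total FOV (less peripheral vision), and vice versa.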
MTBS: You repeatedly talk about Fresnel lenses. What is a Fresnel lens? What characteristics does it bring to the InfinitEye and why is this important?
Lionel: A Fresnel lens is a type of lens invented by the French physicist Augustin Fresnel at the beginning of the 19th century. Its main advantage over conventional lenses is its flatness, which allows it to be relatively large without a significant gain in weight.
The InfinitEye design takes advantage of this to maximize the size of the lenses without making them overly thick, so the FOV is not restricted by the lenses’ edges, unlike other HMDs. The largest dimension of the lenses is 125mm; conventional lenses of the same size could not even achieve the required magnification, since they would have to be thicker than the distance between the eyes and the screens – and I can’t imagine how much they would weigh.
MTBS: What technology are you using for head tracking? Are there ideas you’ve been playing with?
Lionel: We are using an off-the-shelf component with 3 DOF (degrees of freedom) tracking, the YEI 3-Space Sensor Embedded. It is a really fast component, capable of streaming data from its three-axis gyro, accelerometer and compass at a 1000Hz rate. In the videos posted so far on my YouTube channel, we used the orientation computation integrated in the chip, but the perceived latency wasn’t great due to its 250Hz refresh rate and the lack of prediction. Robin has recently implemented the sensor fusion algorithm on the computer side, along with orientation prediction based on angular acceleration to minimize the error. We will soon post a new video showing our improvements on this topic.
We are considering other solutions because the YEI chip is quite expensive, but we haven’t had much time to play with other ideas for the moment.
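Orientation prediction of the kind Robin implemented can be illustrated with a minimal sketch – this is not the InfinitEye code, just the standard constant-acceleration extrapolation applied per axis:

```python
def predict_orientation(theta, omega, alpha, dt):
    """Extrapolate orientation angles (rad) forward by dt seconds:
    theta' = theta + omega*dt + 0.5*alpha*dt^2, applied per axis,
    where omega is angular velocity (rad/s) and alpha angular
    acceleration (rad/s^2)."""
    return [t + w * dt + 0.5 * a * dt * dt
            for t, w, a in zip(theta, omega, alpha)]

# Predict 20 ms ahead, roughly the render latency being compensated
theta = [0.0, 0.10, 0.0]   # current yaw/pitch/roll estimate (rad)
omega = [0.0, 0.50, 0.0]   # rad/s, from the gyro
alpha = [0.0, 1.00, 0.0]   # rad/s^2, estimated from successive gyro samples
print(predict_orientation(theta, omega, alpha, 0.02))
```

Rendering the predicted pose instead of the last measured one is what hides the sensor-to-photon latency; the further ahead you predict, the larger the error when the head motion changes abruptly.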
MTBS: What is positional tracking? Do you think positional tracking will be an important part of your HMD’s future design?
Lionel: The movements of an object can be described along six degrees of freedom: three rotations and three translations. Positional tracking is the tracking of the three translations (X, Y and Z axes), which are missing from the kind of chips used by InfinitEye or Oculus to track head movements. There is no doubt that positional tracking is really important for true immersion; it helps reduce the nausea users feel when the whole virtual world moves along with them as they strafe or lean down. Of course we would like to include a six DOF tracking solution in future versions of the design.
MTBS: What are your thoughts on Sixense’s STEM System? In particular, do you think their STEMs (their positional tracking devices) are adequate for use as a head-tracking add-on for HMDs like yours? Why or why not?
Lionel: We haven’t had the chance to test the STEM System, but if it is as good as they say, it could even replace the YEI chip for full six DOF tracking – I don’t know if it is accurate enough, though. There’s also the castAR optical tracking with infrared LEDs, which seems to be really accurate but requires a line of sight between the HMD and the markers, which could be a showstopper.
MTBS: The Oculus Rift, which was also founded on MTBS, had the benefit of magic. John Carmack was posting in the MTBS forums and connected with Palmer here, Oculus got instant backing that almost seemed to fall from the sky – they pretty much had every advantage. What have InfinitEye’s biggest challenges been so far, and how can the industry and gamers help?
Lionel: You’re right. Thanks to John Carmack at E3 2012, the public became aware of Palmer’s work, and then the magic started. Our biggest challenge, since we don’t have someone like Carmack on our side, is to get people to know about the InfinitEye and convince them that the dual-screen HMD technology works really well and can provide a whole new level of immersion. Any help is welcome in achieving this; for instance, we have recently been invited to a VR event that should take place in a few months.
MTBS: One of the byproducts of the Oculus Rift is the software development around it. Native VR games, VR drivers, new content…there seems to be a lot of software support we never had before with devices like this. Is Oculus content compatible with InfinitEye? What VR or 3D formats are best suited for InfinitEye?
Lionel: Oculus-ready games will not natively be compatible with the InfinitEye because we don’t use the same tracker and we require a much wider scene rendering, with four cameras versus two. However, since the Oculus Rift’s FOV is entirely contained within the InfinitEye’s FOV and the trackers have similar characteristics, I guess Rift games could be (illegally) patched to run on the InfinitEye headset with a limited 90° FOV.
For a developer, adapting games that are designed with VR in mind should not be very difficult, hopefully nothing more than a recompilation with our display and tracking methods. We are also thinking about contributing to the open source Vireio Perception drivers to make them compatible with the InfinitEye.
MTBS: How are you connecting the InfinitEye to a PC? Is there more than enough bandwidth to push the imagery through at a high enough frame rate?
Lionel: We use two HDMI cables plugged into the graphics card through DVI-HDMI adapters, and a USB cable to connect the YEI chip. For the moment we only need two 1280×800@60Hz links, but each HDMI connection can carry much more bandwidth. We would like to develop a single controller board to drive the two screens simultaneously with a single video input (e.g. DisplayPort), so the computer would see one big screen instead of two.
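The headroom on each link is easy to estimate with back-of-the-envelope arithmetic. A sketch, ignoring blanking intervals; the ~8 Gbit/s comparison figure is an assumption on my part (approximate usable video bandwidth of an HDMI 1.3 link), not something stated in the interview:

```python
def raw_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw active-pixel video bitrate in Gbit/s, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# One InfinitEye link: 1280x800 at 60 Hz, 24-bit color
per_link = raw_bitrate_gbps(1280, 800, 60)
print(round(per_link, 2))  # → 1.47, far below the ~8 Gbit/s an HDMI 1.3 link can carry
```

So even a future 1920×1200 panel per eye at 60Hz (about 3.3 Gbit/s raw) would fit comfortably on a single modern link, which is why a one-cable controller board is plausible.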
MTBS: Is there anything about current VR software development that goes against the grain of what InfinitEye needs? What changes should developers consider if any?
Lionel: Not really; a recompilation with the InfinitEye SDK and probably minor changes should be enough. As an example, in FPS games, when the player is damaged, the red surrounding blood indicating the level of energy could be moved further out into the peripheral vision, and the flashes indicating the direction of incoming shots could be more accurate. Regarding the InfinitEye SDK, we will probably release the distortion correction, tracking and prediction methods as a small SDK under an open source license.
MTBS: What is your next step? What do you need to get your product launched, and how are you going about making that happen?
Lionel: Our next step is to show the prototype to as many people as possible, starting with journalists, to get feedback and press coverage. Paul James from RoadToVR is flying to Toulouse very soon, and he will be the first to test the prototype in person – it’s going to be a very important day for us. Then we would like to attend VR events to get even more feedback from people and see how enthusiastic they are. Afterwards, we will either start a crowdfunding campaign to make a consumer product or target other markets.
MTBS: Can fellow gamers and users recreate your work? What are the costs involved? Do you have to be an electrical engineer to do it?
Lionel: Anyone with DIY skills and approximately $450 can make an InfinitEye V1, which is very close to the current design. The plans have been available on the MTBS forums since February, but keep in mind that the design is under a Creative Commons license with the obligation to share any related work and credit the original author.
MTBS: As a VR hardware maker, what software and products are you most looking forward to and why?
Lionel: We don’t define ourselves as a “VR hardware maker” yet, since the project is still at the early prototype and demo software stage. We are looking forward to any developments in VR-related products, especially the Omni, Cyberith, Avegant, castAR, PrioVR, the Sixense STEM, Sony’s yet-to-be-announced PS4 VR system and of course the Oculus Rift DK2 and consumer version. We are passionate about VR and happy to live in these exciting times!
Thank you for joining us Lionel! We will be tracking your progress very closely! Fellow MTBS inventors and immersive tech movers and shakers are encouraged to reach out to us at press@mtbs3D.com for articles and interviews.
More to come!