Nvidia's Light Field HMD at SIGGRAPH 2013

And so begins MTBS' coverage of SIGGRAPH 2013!  Today, Kris Roberts checks out Nvidia's Light Field HMD prototype.  While still at the proof-of-concept stage, this new display technique holds a lot of promise for VR's future.

I was really excited to see Nvidia's research prototype HMD. A light-field display has a number of significant advantages over conventional display techniques that are very attractive for virtual reality, and I very much wanted to see how it looked for myself.

The demonstration equipment they had on display was basically divided into two groups. One was a working real-time stereoscopic HMD prototype built from off-the-shelf components, using a pair of small microlens-covered 1440x720 OLED panels and a 3D-printed housing. The other was a set of film slides with a loose microlens sheet to demonstrate what the display could look like at much higher resolution.

With the goal of producing perceptions indistinguishable from reality, a light-field display has the unique property of letting the viewer's eye decide what to focus on in the image. With a conventional display, either the entire scene is in focus, or the focus is determined by the rendering/photographic system. A light-field display presents something much more natural and realistic by letting the viewer decide not only which part of a scene to converge on, but also which part to focus on – and the areas not in focus blur out exactly as they do in reality. Another really interesting aspect of this approach is that the display itself can be calibrated to accommodate the flaws in a user's vision, eliminating the need to wear corrective lenses under the HMD!

The stereoscopic prototype did demonstrate the focus aspect of the display very well with scenes that had fish swimming in an aquarium. It was really cool to switch between the close and distant fish and see them go in and out of focus. In my view, this plays an important part in tricking my mind into thinking what I'm seeing is actually real and not just a flat image being held in front of my eye.

Another advantage is the size, particularly the thickness of the display assembly. With a normal HMD there are one or more lenses in front of the image panel that require some significant distance to focus properly – and the result is a large and often heavy piece of equipment. With the light-field approach, both the lens membrane and the image panel are thin and light, and require a focal distance measured in millimeters. The demonstration prototype was about 1 centimeter thick. Since they were using components from an off-the-shelf HMD, they chose to keep it simple and mount the controlling electronics on top of the eyepieces, but that hardware could be relocated to a package in your pocket or elsewhere – it does not need to be on the headset at all. Even with that extra bulk, the entire unit was still much smaller and lighter than any other HMD I have seen.

The primary shortcoming of the system in my opinion is the effective resolution of the image seen by the user. With the 720p panels in the stereoscopic prototype, I was told the image you perceive is in the range of 200p – and honestly, that seemed generous. The color, contrast and stereoscopic depth were all reasonably good, but my impression of the resolution of the actual image was very low. So, how fine a resolution would be required to meet or exceed the perceived resolution of the ultra-realistic HMD we would all like to have? Well, the demonstration slides they were using were actual film with a resolution of 3000dpi, and they looked pretty good – but not flawless in clarity. So with the best contemporary mobile device screens in the ~350dpi range, it seems like it will be some time before we have affordable panels that are large enough to provide a satisfactory field of view and fine enough to have an acceptable perceived resolution.
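The trade-off here can be sketched with some back-of-envelope arithmetic: a microlens display spends some of its panel pixels on angular samples under each lenslet, so the perceived spatial resolution is roughly the panel resolution divided by the pixels-per-lenslet count. The numbers below are my own illustrative assumptions, not measured values from the prototype, but they land in the neighborhood of the ~200p figure quoted above.

```python
# Rough estimate of perceived resolution for a microlens light-field display.
# Assumption: each lenslet covers a small square of panel pixels, and those
# pixels encode angle rather than additional spatial detail.

def effective_resolution(panel_pixels, pixels_per_lenslet):
    """Perceived spatial resolution along one axis: panel pixels divided
    by the pixels spent on angular samples under each lenslet."""
    return panel_pixels // pixels_per_lenslet

panel_height = 720       # vertical resolution of the prototype's OLED panel
angular_samples = 4      # assumed panel pixels per lenslet, per dimension

print(effective_resolution(panel_height, angular_samples))  # 180
```

Under that assumption, a 720p panel yields only ~180 perceived lines – consistent with why the prototype's image looked so coarse despite a respectable panel.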

Another difference which may be a significant factor for the light-field approach is the nature of the rendering process. Unlike a traditional single view display, a light-field display uses many small views of the scene. The GPUs and rendering pipelines we have today have been developed and optimized for a single output image, and their suitability for a system that requires potentially thousands of simultaneous views may not be ideal.

The stereoscopic prototype on display was running on a consumer-level graphics card, rendering a 1440x720 image composed of 144 individual views which I believe were each 80x80. I'm not sure how well that will scale to the ultra-high number of views that would be required to produce a really convincing high-resolution light-field display, but Douglass was jovial when talking about how Nvidia is, after all, a rendering company and ideally positioned to solve those problems.
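The tiling itself is straightforward to picture: the panel is carved into a grid of small viewports, and the scene is rendered once per viewport with the camera offset slightly within the eye's pupil. The sketch below uses the prototype's panel size and the remembered 80x80 tile size as hypothetical inputs (at those dimensions an 18x9 grid of 162 tiles fits, so one of the numbers I was given is likely approximate).

```python
# Sketch of tiling many small sub-views into one light-field panel image.
# Panel and tile dimensions are hypothetical, loosely based on the prototype.

def view_grid(panel_w, panel_h, tile):
    """Return (x, y, w, h) viewports, one per sub-view, laid out
    left-to-right, top-to-bottom across the panel."""
    cols, rows = panel_w // tile, panel_h // tile
    return [(c * tile, r * tile, tile, tile)
            for r in range(rows) for c in range(cols)]

viewports = view_grid(1440, 720, 80)
print(len(viewports))   # 162 sub-views fit at 80x80
print(viewports[0])     # (0, 0, 80, 80)

# In a real renderer, each viewport would get its own slightly offset
# camera so the microlens array can reconstruct the light field.
```

The pain point the article raises is visible even in this toy: every frame requires rendering the scene once per viewport, so the per-frame cost grows with the view count rather than the pixel count.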

So in practice, what is easily available now with a light-field display falls quite a bit short of the image quality we can see with current traditional HMD displays (and resolution is often cited as one of the main areas for improvement in those). I am very glad to have had the opportunity to see the prototype, and I do think this approach has tremendous potential and unique advantages – we just need ultra-high-resolution panels and rendering hardware that can pump out a tremendous number of tiny views.

This is just the beginning!  Come back regularly for a lot more SIGGRAPH 2013 coverage!