
The Rest of SIGGRAPH 2013



Foveated 3D Display
Mark Finch – Microsoft Research

I have always been curious about eye tracking and its implications for user interfaces, input, and displays. The project that Mark Finch has been working on with Microsoft Research was fascinating to see. Their system concentrates rendering quality in a 3D display on the region where your vision is sharpest: the fovea, a remarkably small area compared to the overall field of view. The fovea is described as being about the size of your thumbnail when your arm is fully stretched out in front of you.


Their demonstration had a standard PC connected to nine 1920x1200 displays and an off-the-shelf eye tracking device. The software they developed uses the gaze information to dynamically change which area of the scene is rendered at the highest quality.

One thing that was really compelling about the demonstration system was watching other people use it. It was obvious where the system thought they were looking – the clear, high-resolution area moved around the screen, and the contrast with the rest of the display was easy to see. But when I sat down and it was working for me, it was shocking how I could not tell it was working that way! Wherever I looked was indeed sharp, and the rest of the image did not appear to lack visual quality. The clear advantage is performance: by concentrating rendering effort only on the area it knows the user is seeing clearly, the system drives the overall 5760x3600 image at a higher frame rate than if it tried to produce the same quality over the entire display.
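The core idea can be sketched in a few lines: given a gaze point, render a small full-resolution foveal region, a medium-quality ring around it, and the whole screen at low quality. This is only an illustrative sketch – the region sizes and the three-layer split here are my assumptions, not the actual parameters of the Microsoft Research system.

```python
def foveated_layers(gaze_x, gaze_y, screen_w, screen_h):
    """Return three nested rectangles (x, y, w, h) centred on the gaze
    point: a small full-resolution foveal region, a medium region, and
    the full screen as the low-resolution periphery.  The size ratios
    (1/8 and 1/3 of the screen) are illustrative guesses."""

    def clamp_rect(cx, cy, w, h):
        # Keep the rectangle fully on screen while staying centred
        # on the gaze point when possible.
        x = max(0, min(int(cx - w / 2), screen_w - w))
        y = max(0, min(int(cy - h / 2), screen_h - h))
        return (x, y, w, h)

    fovea = clamp_rect(gaze_x, gaze_y, screen_w // 8, screen_h // 8)
    mid = clamp_rect(gaze_x, gaze_y, screen_w // 3, screen_h // 3)
    periphery = (0, 0, screen_w, screen_h)
    return fovea, mid, periphery


# Example for the demo's 5760x3600 wall with the gaze at the centre:
layers = foveated_layers(2880, 1800, 5760, 3600)
```

Each frame, the renderer would redraw the three layers at decreasing resolution and composite them, so only a small fraction of the pixels ever get full-quality shading.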


Autostereoscopic Projector Array Optimized for 3D Facial Display
XueMing Yu – USC Institute for Creative Technologies and Activision


I'm usually a little skeptical of systems that promise holographic autostereoscopic displays, particularly ones that claim to support multiple viewers, but the projector array system on display by XueMing Yu and his colleagues from USC does look very good.

Their demonstration system uses 72 pico projectors arranged on a parabola, all shining on a vertically anisotropic lenticular screen. Viewers are identified and their positions tracked with a Microsoft Kinect, and the system warps its multiperspective rendering according to who will see each column of projected pixels.
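As I understand it, the per-viewer warping amounts to deciding, for each projector, which tracked viewer will see its light and rendering that projector's image from that viewer's perspective. The sketch below is my guess at the general assignment rule, using simple one-dimensional viewing angles; the actual system does a more sophisticated per-column warp.

```python
def assign_projector_views(projector_angles, viewer_angles):
    """For each projector (given by its horizontal angle on the parabola,
    in degrees), pick the index of the tracked viewer whose viewing angle
    is closest, so that projector's image can be rendered from that
    viewer's perspective.  A simplified stand-in for the real per-column
    multiperspective warp."""
    return [
        min(range(len(viewer_angles)),
            key=lambda i: abs(viewer_angles[i] - a))
        for a in projector_angles
    ]


# Three projectors at -30, 0, and +30 degrees; two viewers tracked
# by the Kinect at -20 and +25 degrees:
assignment = assign_projector_views([-30, 0, 30], [-20, 25])
```

With 72 projectors and an anisotropic screen, each viewer only ever sees the narrow fan of projectors aimed near their own angle, which is what makes per-viewer stereo possible without glasses.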

The actual display area is fairly small, but it surprised me how well it produced the illusion of there being a real object – especially as the viewer moves around to look at it from various angles.