The Rest of SIGGRAPH 2013



Foveated 3D Display
Mark Finch – Microsoft Research

I have always been curious about eye tracking and its implications for user interfaces, input, and displays. The project that Mark Finch has been working on with Microsoft Research was really fascinating to see. Their system concentrates rendering quality in a 3D display on the region where your vision is sharpest: the fovea, which covers a remarkably small area compared to the overall field of view. The fovea is often described as being about the size of your thumbnail when your arm is fully stretched out in front of you.


Their demonstration had a standard PC connected to nine 1920x1200 displays and an off-the-shelf eye tracking device. The software they have developed uses the eye tracker's estimate of where the user is looking to dynamically move the area of the scene that is rendered at the highest quality.

One thing that was really compelling about the demonstration system was watching other people use it. It was obvious where the system thought they were looking: the clear, high-resolution area moved around the screen, and the contrast with the rest of the display was easy to see. But when I sat down and it was tracking my own eyes, it was shocking how invisible the effect became. Wherever I looked was indeed sharp, and the rest of the image did not appear to lack visual quality. The clear advantage of the system is performance: by rendering at full quality only in the area it knows the user is seeing clearly, it drives the overall 5760x3600 image at a higher frame rate than if it tried to produce the same quality over the entire display.
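The idea can be sketched in a few lines: render a small full-resolution region around the gaze point, a larger region at reduced resolution, and the whole screen at low resolution, then composite. This is a minimal illustration in Python; the layer sizes and downsampling factors are my own placeholder values, not the figures from the actual demo.

```python
def foveated_layers(gaze_x, gaze_y, screen_w, screen_h,
                    fovea_px=200, mid_px=600):
    """Return three nested render regions centered on the gaze point.

    Each entry is ((x0, y0, x1, y1), scale): a screen-space rectangle
    and the fraction of native resolution to render it at. The sizes
    and scales here are illustrative assumptions.
    """
    def clamp_rect(cx, cy, half):
        # Keep the rectangle on screen.
        return (max(0, cx - half), max(0, cy - half),
                min(screen_w, cx + half), min(screen_h, cy + half))

    return [
        (clamp_rect(gaze_x, gaze_y, fovea_px // 2), 1.0),   # foveal: full res
        (clamp_rect(gaze_x, gaze_y, mid_px // 2),   0.5),   # middle: half res
        ((0, 0, screen_w, screen_h),                0.25),  # peripheral: quarter res
    ]

# Gaze at the center of the demo's 5760x3600 display.
layers = foveated_layers(2880, 1800, 5760, 3600)
total = sum(int((x1 - x0) * s) * int((y1 - y0) * s)
            for (x0, y0, x1, y1), s in layers)
print(total, "pixels shaded vs", 5760 * 3600, "at full resolution")
```

Even with these toy numbers, the three layers together shade well under a tenth of the pixels of a full-resolution frame, which is where the frame-rate headroom comes from.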


Autostereoscopic Projector Array Optimized for 3D Facial Display
XueMing Yu – USC Institute for Creative Technologies and Activision


I'm usually a little skeptical of systems that promise holographic autostereoscopic displays, particularly ones that say they support multiple viewers - but the projector array system on display by XueMing Yu and his colleagues from USC does look very good.

Their demonstration system uses 72 pico projectors arranged along a parabola, all shining on a vertically anisotropic lenticular screen. Viewers are identified and their positions tracked with a Microsoft Kinect, and the system warps the multi-perspective rendering so that each column of projected pixels is correct for the viewer who will see it.
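The core geometric trick is that the anisotropic screen scatters light vertically but preserves horizontal angle, so at any screen column each eye sees light from (roughly) one projector: the one whose ray through that point best lines up with the viewer's direction. Here is a hedged 2D top-down sketch of that assignment; the flat linear projector layout is a simplification I made up, not the demo's actual parabolic mount.

```python
import math

def visible_projector(screen_x, viewer_pos, projector_positions):
    """Pick which projector a viewer sees at one screen column.

    Projectors sit behind the screen (z < 0), the viewer in front
    (z > 0), and the screen lies along z = 0. We choose the projector
    whose ray through (screen_x, 0) is closest in horizontal angle to
    the direction from that point to the viewer.
    """
    vx, vz = viewer_pos
    view_angle = math.atan2(vx - screen_x, vz)  # screen point -> viewer
    best, best_err = None, float("inf")
    for idx, (px, pz) in enumerate(projector_positions):
        # Direction of the projector's ray continuing past the screen point.
        proj_angle = math.atan2(screen_x - px, -pz)
        err = abs(proj_angle - view_angle)
        if err < best_err:
            best, best_err = idx, err
    return best

# Hypothetical layout: 72 projectors spread along x at z = -1.
projectors = [(-1.0 + i * (2.0 / 71), -1.0) for i in range(72)]
print(visible_projector(0.0, (0.0, 1.0), projectors))
```

Running this per screen column, per tracked viewer, tells the renderer which perspective each column of pixels must be drawn from, which is exactly why the Kinect tracking matters: move the viewer and the column-to-projector assignment shifts with them.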

The actual display area is fairly small, but I was surprised by how well it produced the illusion of a real object being there, especially as the viewer moves around to look at it from various angles.