
The Rest of SIGGRAPH 2013



Foveated 3D Display
Mark Finch – Microsoft Research

I have always been curious about eye tracking and its implications for user interfaces, input, and displays. The project that Mark Finch has been working on with Microsoft Research was fascinating to see. Their system concentrates rendering quality in a 3D display on the region where your vision is sharpest: the fovea, which covers a remarkably small area compared to the overall field of view. The fovea is often described as covering about the size of your thumbnail held at arm's length.


Their demonstration used a standard PC connected to nine 1920x1200 displays and an off-the-shelf eye tracking device. The software they developed uses the gaze information to dynamically move the area of the scene that is rendered at the highest quality.

One thing that was really compelling about the demonstration was watching other people use it. It was obvious where the system thought they were looking: the clear, high-resolution area moved around the screen, and the contrast with the rest of the display was easy to see. But when I sat down and had it working for me, it was shocking how invisible the effect became. Wherever I looked was indeed sharp, and the rest of the image did not appear to be lacking in visual quality. The clear advantage is that by concentrating rendering effort only on the area the user actually sees clearly, the system can drive the full 5760x3600 image at a higher frame rate than if it tried to produce the same quality over the entire display.
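To get a feel for how little of a display the fovea actually covers, here is a back-of-the-envelope sketch in Python. The viewing distance, pixel pitch, and the roughly two-degree foveal angle are my own assumed numbers for illustration, not figures from the talk:

```python
import math

def foveal_radius_px(view_dist_mm, px_pitch_mm, fovea_deg=2.0):
    """Radius, in pixels, of the screen region covered by the fovea
    (assumed here to span about 2 degrees of visual angle) for a
    viewer at the given distance from the display."""
    radius_mm = view_dist_mm * math.tan(math.radians(fovea_deg / 2))
    return radius_mm / px_pitch_mm

# Assumed numbers: a 1920x1200 panel with ~0.27 mm pixel pitch,
# viewer seated about 600 mm away.
r = foveal_radius_px(600, 0.27)
print(f"foveal radius ~ {r:.0f} px")
```

With these assumptions the sharply seen region is only a few dozen pixels across, out of a 5760x3600 wall of pixels, which is why rendering the periphery at reduced quality can save so much work.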


Autostereoscopic Projector Array Optimized for 3D Facial Display
XueMing Yu – USC Institute for Creative Technologies and Activision


I'm usually a little skeptical of systems that promise holographic autostereoscopic displays, particularly ones that claim to support multiple viewers, but the projector array system shown by XueMing Yu and his colleagues from USC does look very good.

Their demonstration system uses 72 pico projectors arranged along a parabola, all shining on a vertically anisotropic lenticular screen. Viewers are identified and their positions tracked with a Microsoft Kinect, and the system warps a multi-perspective rendering so that each column of projected pixels is correct for the viewer who will see it.
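The viewer-to-projector geometry can be sketched very roughly. This toy function picks the projector whose optical axis is angularly closest to a tracked viewer; the projector count matches the demo, but the arc span and angle convention are illustrative assumptions, and the real system assigns a perspective per screen column rather than one whole projector per viewer:

```python
def nearest_projector(viewer_angle_deg, n_projectors=72, arc_deg=180.0):
    """Index of the projector (0 = leftmost) whose optical axis is
    closest to a viewer's horizontal angle, where 0 degrees is
    straight ahead and the array spans arc_deg. Illustrative only."""
    spacing = arc_deg / (n_projectors - 1)
    idx = round((viewer_angle_deg + arc_deg / 2) / spacing)
    # Clamp viewers outside the arc to the nearest edge projector.
    return max(0, min(n_projectors - 1, idx))
```

A tracker like the Kinect would feed each viewer's angle into a mapping like this every frame, so that the columns of pixels headed toward that viewer carry the correct eye's perspective.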

The actual display area is fairly small, but I was surprised by how well it produced the illusion of a real object being present, especially as you move around to look at it from various angles.