
The Rest of SIGGRAPH 2013



Foveated 3D Display
Mark Finch – Microsoft Research

I have always been curious about eye tracking and its implications for user interfaces, input, and displays. The project that Mark Finch has been working on with Microsoft Research was really fascinating to see. Their system focuses the rendering quality of a 3D display on the region where your vision is sharpest: the fovea, which is a remarkably small area compared to the overall field of view. The fovea is often described as being about the size of your thumbnail when your arm is fully stretched out in front of you.


Their demonstration had a standard PC connected to nine 1920x1200 displays and an off-the-shelf eye tracking device. The software they developed uses the information about where the user is looking to dynamically change which area of the scene is rendered at the highest quality.

One thing that was really compelling about the demonstration system was watching other people using it. It was obvious where the system thought they were looking – the clear, high-resolution area moved around the screen, and the contrast with the rest of the display was easy to see. But when I sat down and it was working for me, it was shocking how I could not tell it was working that way! Wherever I looked was indeed sharp, and the rest of the image did not appear to be lacking in visual quality. The clear advantage of the system is that, by concentrating detail only on the area it knows the user is seeing clearly, it can render the overall 5760x3600 image at a higher frame rate than if it tried to produce the same quality over the entire display.
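To make that trade-off concrete, here is a minimal sketch of how a gaze-driven renderer might decide where to spend its effort. It is not the Microsoft Research implementation; the foveal radius, the single low-resolution periphery, and the names (render_scale, approx_shading_cost) are all illustrative assumptions. The arithmetic just shows why shading a small sharp region around the gaze point is so much cheaper than shading the full 5760x3600 image everywhere.

```python
import math

DISPLAY_W, DISPLAY_H = 5760, 3600    # 3x3 wall of 1920x1200 panels

FOVEA_RADIUS_PX = 400                # illustrative size of the sharp region
PERIPHERY_SCALE = 0.25               # render the periphery at quarter resolution

def render_scale(px, py, gaze_x, gaze_y):
    """Resolution scale at which to shade a pixel (or tile), given the gaze point."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    return 1.0 if dist <= FOVEA_RADIUS_PX else PERIPHERY_SCALE

def approx_shading_cost():
    """Rough fraction of full-resolution shading work needed for one frame."""
    total = DISPLAY_W * DISPLAY_H
    fovea = math.pi * FOVEA_RADIUS_PX ** 2
    periphery = total - fovea
    # Quarter resolution in each dimension means 1/16 of the shading work.
    return (fovea * 1.0 + periphery * PERIPHERY_SCALE ** 2) / total

if __name__ == "__main__":
    # Comes out to roughly 9% of a brute-force 5760x3600 render.
    print(f"approx. shading cost: {approx_shading_cost():.1%} of full resolution")
```

A real renderer would use smoothly blended nested layers rather than a hard circular cutoff, but even this crude split shows where the frame-rate headroom comes from.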


Autostereoscopic Projector Array Optimized for 3D Facial Display
XueMing Yu – USC Institute for Creative Technologies and Activision


I'm usually a little skeptical of systems that promise holographic autostereoscopic displays, particularly ones that claim to support multiple viewers - but the projector array system shown by XueMing Yu and his colleagues from USC does look very good.

Their demonstration system uses 72 pico projectors arranged along a parabola, all shining on a vertically anisotropic lenticular screen. Viewers are identified and their positions tracked with a Microsoft Kinect, and the system warps its multiperspective rendering according to which viewer will see each column of projected pixels.
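As a rough sketch of what that per-column assignment might look like (my own illustration, not the USC ICT code): treat the setup as a top-down 2D problem, assume the anisotropic screen reflects each projector ray like a horizontal mirror while scattering it vertically, and assume a hypothetical render_column() call that draws one pixel column from a chosen eye position. The tracked viewer positions stand in for what the Kinect provides in the real system.

```python
import math

def reflect_off_screen(projector_xy, column_x):
    """Horizontal direction a projector ray leaves the screen at a given column.

    2D top-down model: the screen lies along the x-axis at z = 0 and is treated
    as a horizontal mirror (the vertical scatter of the screen is ignored here).
    """
    px, pz = projector_xy
    dx, dz = column_x - px, 0.0 - pz        # incoming ray toward (column_x, 0)
    length = math.hypot(dx, dz)
    dx, dz = dx / length, dz / length
    return dx, -dz                          # mirror about the screen normal (0, 1)

def viewer_for_column(projector_xy, column_x, viewers):
    """Pick the tracked viewer closest to the reflected ray for this column."""
    rx, rz = reflect_off_screen(projector_xy, column_x)
    best, best_error = None, float("inf")
    for vx, vz in viewers:
        tx, tz = vx - column_x, vz          # direction from the column to the viewer
        length = math.hypot(tx, tz)
        cos_angle = max(-1.0, min(1.0, (rx * tx + rz * tz) / length))
        error = math.acos(cos_angle)        # angular mismatch, in radians
        if error < best_error:
            best, best_error = (vx, vz), error
    return best

def render_projector_frame(projector_xy, column_xs, viewers, render_column):
    """Render one projector's frame, choosing a viewpoint per pixel column."""
    for col, column_x in enumerate(column_xs):
        eye = viewer_for_column(projector_xy, column_x, viewers)
        render_column(col, eye)             # hypothetical per-column draw call
```

In the real system each column is only visible over a narrow horizontal angle, so a column whose reflected ray points at nobody could fall back to a default viewpoint; this sketch simply picks the nearest tracked viewer.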

The actual display area is fairly small, but it surprised me how well it produced the illusion of there being a real object – especially as the viewer moves around to look at it from various angles.