MTBS' VR Settings Guide

Differences Between Stereoscopic 3D Displays and Head Mounted Displays

Neil Schneider, CEO of MTBS, in front of 150" Panasonic 3D Ready plasma.
The beauty of a 3D monitor or television is that there is plenty of room to customize the experience. By moving the cameras apart (the "separation") and adjusting the convergence to your liking, you can have a depth experience where you look straight into the scene as through a car windshield, an out-of-screen experience where things seem to reach out and grab you, or a mixture of both. S-3D monitors really are magic computer game windows once you know how to play with the separation and convergence settings for the best 3D effects.
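The interplay between separation and convergence can be sketched with a little arithmetic. This is an illustrative model only (the function name and the convergence-distance convention are mine, not from any particular game or driver): with parallel cameras whose images are shifted so that points at the convergence distance land exactly on the screen plane, the on-screen parallax of a point follows from the separation and the point's depth.

```python
# Illustrative sketch: how separation and convergence (expressed here as a
# convergence distance) translate into on-screen parallax for a point at
# depth z. Positive parallax = "inside the screen"; negative = "out of
# screen". Units are arbitrary but consistent (e.g. centimeters).

def screen_parallax(separation, convergence_dist, z):
    """Parallax of a point at depth z, for parallel cameras whose images
    are shifted so points at convergence_dist have zero parallax."""
    return separation * (1.0 - convergence_dist / z)

# A point at the convergence distance sits right on the screen plane:
print(screen_parallax(6.5, 200.0, 200.0))       # 0.0
# Farther points get positive parallax (behind the screen)...
print(screen_parallax(6.5, 200.0, 800.0) > 0)   # True
# ...and closer points get negative parallax (popping out of the screen).
print(screen_parallax(6.5, 200.0, 100.0) < 0)   # True
```

Pulling the convergence distance closer pushes more of the scene into negative parallax, which is exactly the "reach out and grab you" effect described above.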

Oculus Rift PhotoCrop Contest Entry

Head mounted displays need to be handled very differently. The principles are similar, but our personal biology and eye placement suddenly become a lot more important. We can still get a rich 3D experience, but we no longer have the flexibility to spread the separation apart the same way, and the convergence setting is no longer about choosing what is inside the screen and what's grabbing hold of our collar!

We have to remember that the whole goal behind an HMD is to be an immersive display that closely resembles how we see things in real life. In life, we don't have cameras, we have eyeballs! An HMD's lenses are practically pressed up against our pupils which means we can't move our eyes further apart or closer together (that would be a gross trick), and unlike a 3D display which can be placed a distance away, HMDs place greater pressure on our eyes to rotate according to what they see.

Therein lies the problem! When we put our HMD on, each eye is seeing a separate image that we can't easily compare, and we could be inadvertently forcing our eyes to diverge in uncomfortable directions without even realizing it. This is a no-no!

Therefore, the separation of the game's cameras is determined by the distance between our eyes. This measurement is known as the Interpupillary Distance (IPD). When choosing our game settings, we have to make sure our eyes are never forced to look at something separated beyond our IPD, which would make them point outward.
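As a rough, hypothetical sketch of that rule (the function name and the numbers are mine, not from any SDK): if the rendered disparity of an object ever exceeds the viewer's IPD, each eye has to rotate outward past parallel, which is the uncomfortable divergence the text warns about.

```python
import math

# Illustrative model: eyes sit ipd_mm apart, looking at left/right images
# drawn disparity_mm apart on a virtual screen screen_dist_mm away.
# Positive result = eyes rotate inward (converge); zero = parallel, as
# for an object at infinity; negative = eyes forced to diverge outward.

def eye_angle_deg(disparity_mm, ipd_mm, screen_dist_mm):
    """Per-eye rotation from straight ahead, in degrees."""
    return math.degrees(math.atan2((ipd_mm - disparity_mm) / 2.0, screen_dist_mm))

print(eye_angle_deg(0.0, 64.0, 1000.0))   # near object: eyes rotate inward (> 0)
print(eye_angle_deg(64.0, 64.0, 1000.0))  # disparity equals IPD: parallel (0.0)
print(eye_angle_deg(70.0, 64.0, 1000.0))  # beyond IPD: divergence (< 0)
```

This is why an object at infinity should be rendered with a disparity no greater than the IPD: at exactly the IPD the eyes look straight ahead, and anything beyond it has no real-world equivalent.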

Also, with a 3D monitor we were happy to go crazy with our convergence and have objects fly out of our screens because it's fun and it looks great! With an HMD, this technique no longer makes sense because the lenses are up against our eyes, which means there is no screen for things to fly out of. We instead see things as we would in real life with our own eyes. We will still want to avoid that flying hatchet, mind you - we just don't have to use the same tricks to get the same results! Forcing a high convergence setting would not only be unnecessary, it would make the experience painful to watch because our eyes would be forced to point inward.

Convergence is still an important adjustment, just not for the same reasons. Stereoscopic 3D effectiveness is best demonstrated with up-close objects. If I were to take a 3D picture of an apple on a table with a stereoscopic 3D camera and overlap the images, it would be very clear that each view is very different. It's these differences that make the apple look interesting. If I took a picture of the same apple from far away, the two views of the apple would look as though they could completely overlap. The aspects of 3D where you see the curvature of objects and fine detailing really happen up close, though you still maintain that chasm-like depth for objects in the distance.
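The apple example can be put into numbers. Assuming a 64 mm IPD (a common average, not a figure from this guide), the angular difference between the two eyes' views of a point shrinks roughly as one over its distance, which is why the stereo effect is dramatic up close and nearly vanishes far away:

```python
import math

# Sketch of why stereo differences fade with distance: the angular
# disparity between the two eyes' lines of sight to a point shrinks
# roughly as 1/distance. A 64 mm IPD is assumed for illustration.

def angular_disparity_deg(ipd_m, z_m):
    """Angle subtended at a point z_m away by two eyes ipd_m apart."""
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * z_m)))

print(angular_disparity_deg(0.064, 0.4))   # apple at arm's length: ~9 degrees
print(angular_disparity_deg(0.064, 10.0))  # same apple far away: well under 1 degree
```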

With an HMD, the purpose of the convergence setting is to make sure we see that fine detailing up close and still benefit from the depth in the distance, all without making our eyes twist and turn in painful ways.