
Possibly the REAL Story of ILMxLab!

ILMxLab, a new subsidiary of Lucasfilm, has been earning a lot of press because it is introducing augmented reality, virtual reality, and stereoscopic 3D components to the Star Wars franchise and developing new forms of storytelling. In our opinion, this is just a small part of a much grander picture. It's clearly a far older and more thought-out process than a sudden ILMxLab branding!

John Gaeta is Lucasfilm's Creative Director of New Media and Experiences, and we happened upon an interview he did in 2011 where he effectively spelled out where things are headed and why. The whole thing makes for great viewing, and things get extra juicy at the eighteen-minute mark.

Gaeta starts by acknowledging the limitations of cinema. A core issue with cinema innovation is that it isn't a continual enhancement of the technology. Innovations are created for a specific movie, the film is released, and maybe those innovations are used again in the next franchise when needed. Innovation is effectively a staggered series of starts and stops as movie studios move from film to film and franchise to franchise. Over the course of 30+ years, the biggest innovations have effectively run their course, and it's time for the next step in cinematic storytelling.

While film will continue in a fixed-perspective form, a completely new class of storytelling is also being created. It's no longer just about what the camera sees; viewers will be able to take a "god view", look around the scene, and pick up nuances of the story they wouldn't otherwise get. While this may resemble concepts we take for granted in video games, the goal is to add true-to-life fidelity to the mix.

One point he highlighted that I found very interesting is that at its core, film is baked imagery. The filmed content is fixed, and while the special and visual effects can be elaborate, they too are baked onto the image, never changing once recorded. The future is dynamic content, or "omni-capture", using real-time engine technology. The physical actors are universally captured and can then manifest in VR media as perfect holo-digital equivalents. In the interview, Gaeta calls this "volumetric video", though he likely meant to say "volumetric cinematography".

While the Bullet Time effects in The Matrix exemplified the physics-breaking magic that can occur in a virtual world versus the real world, they were no more than a hack of the readily available and limited camera technology of the day. Using technology like the Kinect and other devices, all the gesture and body capture data is already there to change the virtual world around us according to how our bodies influence it.

Remembering that this interview was done in 2011, Gaeta insisted that this vision isn't pie in the sky. He expected holographic viewing technologies to be available within five years, and the processing power needed to make true-to-life rendering possible within ten. Amusingly, when an audience member quipped about having an elephant in her house, Gaeta was quick to point out that she would have that elephant within three years!

Pretty close, John!

Of course, Gaeta highlighted the potential evils of the technology. In addition to storytelling, we are also looking at the development of a new culture, even a new "punk" culture, based on this media. The cartoon metaverse will eventually be replaced with true-to-life imagery as technology allows, and core to making many of these new experiences possible are cameras and immeasurable amounts of data collected from each user. The vendors will effectively know everything about you. Back in 2011, Gaeta referenced Facebook and noted that he had made a conscious choice not to create a Facebook account. This new era will be far more intrusive than that.

I'm not doing John justice. Best you watch the video for yourself and share your thoughts! Very interesting!