Possibly the REAL Story of ILMxLab!

ILMxLab, a new subsidiary of Lucasfilm, has been earning a lot of press because they are introducing augmented reality, virtual reality, and stereoscopic 3D components to the Star Wars franchise and developing new forms of storytelling. In our opinion, this is just a small part of a much grander picture. It's clearly a far older and more thought-out process than a sudden ILMxLab branding would suggest!

John Gaeta is Lucasfilm's Creative Director of New Media and Experiences, and we happened upon an interview he did in 2011 in which he effectively spelled out where things are headed and why. The whole thing makes for great viewing, and things get extra juicy at the eighteen-minute mark.

Gaeta starts by acknowledging the limitations of cinema. A core issue with cinema innovation is that it's not a continual enhancement of the technology. Innovations are created for a specific movie, the film is released, and maybe those innovations are used again in the next franchise when needed. Innovation is effectively a staggered series of starts and stops as movie studios move from film to film and franchise to franchise. Over the course of 30+ years, the biggest innovations have largely run their course, and it's time for the next step in cinematic storytelling.

While film will continue in a fixed-perspective form, a completely new class of storytelling is also being created. It's no longer just about what the camera sees; viewers will be able to have a "god-view", look around the scene, and pick up nuances of the story they otherwise wouldn't. While this may resemble concepts we take for granted in video games, the goal is to add true-to-life fidelity to the mix.

One point he highlighted that I found very interesting is that if we look at the core of film, it's baked imagery. The filmed content is fixed, and while the special and visual effects can be elaborate, they too are baked onto the image, never changing once recorded. The future is dynamic content, or "omni-capture", using real-time engine technology. Physical actors are universally captured and can then manifest in VR media as perfect holo-digital equivalents. In the interview, Gaeta calls this "volumetric video", though he likely meant to say "volumetric cinematography".

While the Bullet Time effect in The Matrix exemplified the physics-breaking magic that can occur in a virtual world versus the real world, it was no more than a hack of the readily available and limited camera technology of the time. With technology like the Kinect and other devices, all the gesture and body-capture data is already there to change the virtual world around us according to how our bodies influence it.

Remembering that this interview was done in 2011, Gaeta insisted that this vision isn't pie in the sky. He expected holographic viewing technologies to be available within five years, and the processing power needed to make true-to-life rendering possible within ten. Amusingly, when an audience member quipped about having an elephant in her house, Gaeta was quick to point out that she would have that elephant within three years!

Pretty close, John!

Of course, Gaeta also highlighted the potential evils of the technology. In addition to storytelling, we are looking at the development of a new culture, even a new "punk" culture, based on this media. The cartoon metaverse will eventually be replaced with true-to-life imagery as technology allows, and core to making many of these new experiences possible are cameras and immeasurable amounts of data collected from each user. The vendors will effectively know everything about you. In 2011, Gaeta used Facebook as his reference point, noting that he made a conscious choice not to create a Facebook account. This new era will be far more intrusive than that.

I'm not doing John justice; best to watch the video for yourself and share your thoughts. Very interesting!