Today, MTBS is joined by David Cole, founding partner of NEXT3D. Among other things, NEXT3D has taken a direct interest in virtual reality by developing techniques for recording real life VR experiences. David shares his story, explains how their technology works, and tips his hand on things to come!
MTBS: Hi David! Tell us about Next3D. What do you guys do?
We’re heavily involved in stereoscopic compression. We have a lot of IP and several products built on a process for preserving stereoscopic quality in significantly compressed content.
MTBS: I know you have been watching the Oculus story unfold, but I understand you have prior VR experience as well. Please tell us about your earlier work.
I was in the right place at the right time in the 90s (So Cal) for a ring-side seat to the first VR revolution (you remember, Jaron Lanier, great white-guy dreadlocks…). It was a heady time. Artists and computer scientists, Hollywood directors paying attention, Timothy Leary telling all that VR was better than LSD. Basically, any world, any application that could be imagined was suddenly theoretically possible. Venture and strategic capital was flowing into startups like Virtual I/O.
My company built electro-optics (in fact, we built 3D LCD shutter glasses and the SEGA HMD). We KNEW that HMDs were going to suck for a while and that rendering capacities were gonna be terribly inadequate. But that wasn’t enough to diminish the promise…until it was. Almost overnight the VR bubble burst. Money dried up, start-ups failed and everyone ran for the door (which turned out to be the internet).
I ran for Location-Based Entertainment. I developed an immersive ride called Cyberfin that simulated swimming with dolphins. It had some mild success – sold to aquariums and dolphinariums. In fact, some of the DNA in the Cyberfin stereoscopic ride controller is in Next3D’s compression technology.
In any case, I’m sure that there are a lot of hearts and minds out there that never gave up on the VR dream. Those folks, coupled with all of the new talent that is attracted to the promise of VR, add up to a spring-loaded rebirth of a revolution. Only this time, the consumer version doesn’t suck.
MTBS: Now that the Oculus Rift is around the bend, what kinds of things do you think we will be able to do that we haven’t until now? Looking back at your previous work, why is Oculus a game changer for you?
The Rift: it’s all about field-of-view and tracker latency. That’s the revolution. Narrow FOV is a deal killer because it totally blows the illusion of immersion. And now it’s a solved problem. Tracker lag is a curse that has been lifted. Combine that with GPUs that can deliver well over three billion triangles per second for less than $400. What’s that Eric Gullichsen quote, “reality is rendered at eight million shaded polygons per second”?
MTBS: Have modern movies reached their technological limit? I mean, yes, we can have bigger screens, better special effects, and effective 3D…but is the medium of the white screen in front of an audience reaching its innovative peak? Why or why not?
No way. We all get to watch while GREAT technical moviemakers and GREAT creative minds (sometimes in the same body, e.g. James Cameron) are going to spend hundreds of millions of dollars to take us to worlds unknown. It’s brute force innovation. HUGE budgets buying HUGE on-screen firework shows. Framerates are going up, resolution is increasing and laser projectors are gonna…well…use frick’n laser beams to BLAST the movie into our retinas. And wait ’til Doug Trumbull gets a hold of audiences in a dark room again!
I truly believe that the 3D rebirth in cinema is part of a sea change in entertainment that includes virtual reality and total immersion. There is probably more innovation in entertainment happening now than ever before.
MTBS: What is Full-Court? How does it work?
Full-Court is a method of acquiring a 180-degree or 360-degree video image that is rendered orthostereoscopically in an HMD. Simply put, it’s like being there. Everything is right-sized and the characteristics of the stereo (parallax and roundness) are all AS IF you were standing exactly where the camera is. When you look around, the view tracks exactly as if you were “on location”.
We start with a very high resolution stereo camera rig with 180-degree fish-eye lenses. Currently, we’re supporting RED Scarlets & Epics and Sony F55s. We are very hopeful to add support for GoPro Hero3’s in 2.7K mode, as soon as GoPro releases the 3D sync kit. (BTW – David Newman from GoPro says, “It is coming. Certainly we expect to be in the market well before consumer launch of Oculus Rift.”).
The video is then processed to extract depth information and transformed to maintain orthostereo alignment throughout the viewing area. The resulting video and depth information is placed in a transport stream and transmitted to the end-user where it’s reconstructed in a proprietary player.
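As a purely illustrative sketch of the pipeline described above (none of these names or structures are Next3D’s actual format, which is proprietary), the idea is that each color frame travels alongside its extracted depth information so the player can reconstruct the orthostereo view on the other end:

```python
from dataclasses import dataclass

# Hypothetical packet in a transport stream: a compressed color frame
# paired with the depth map the player needs for reconstruction.
# All names here are illustrative assumptions, not Next3D's API.

@dataclass
class FramePacket:
    timestamp_ms: int
    color: bytes   # compressed stereo video frame
    depth: bytes   # per-pixel depth information for re-projection

def pack(timestamp_ms: int, color: bytes, depth: bytes) -> FramePacket:
    """Bundle color and depth so they stay in sync end-to-end."""
    return FramePacket(timestamp_ms, color, depth)

packet = pack(0, b"<video frame>", b"<depth map>")
assert packet.color == b"<video frame>" and packet.depth == b"<depth map>"
```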
MTBS: Am I correct that it’s always based on a static camera? Why can’t the camera move around?
The camera cannot pan or tilt, unless there is a foreground element that is panning and tilting with the view. An example of that would be an airplane cockpit or the dashboard of a car. The camera CAN dolly forward and back. Here’s why…imagine that we’re panning the camera to the left when you decide to turn your head to the left – the result is that your view turns twice as fast. It’s a recipe for barf-o-vision.
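The doubling effect David describes can be sketched in a few lines (an illustrative model, not Next3D code): any rotation baked into the footage adds to the viewer’s own head rotation, so the mapping between head movement and view movement stops being one-to-one.

```python
# Why a baked-in camera pan breaks head tracking: the yaw the viewer
# perceives is their head rotation PLUS any pan recorded into the
# footage. Numbers are illustrative.

def apparent_yaw(head_yaw_deg: float, baked_pan_deg: float) -> float:
    """Perceived view rotation = head rotation + recorded camera pan."""
    return head_yaw_deg + baked_pan_deg

# Viewer turns 30 degrees left while the recorded camera also pans
# 30 degrees left: the world appears to swing 60 degrees.
assert apparent_yaw(30.0, 30.0) == 60.0

# With a locked-off camera the mapping stays one-to-one.
assert apparent_yaw(30.0, 0.0) == 30.0
```

A dolly move avoids this because translation doesn’t compound with the viewer’s head rotation the way a recorded pan does.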
MTBS: Are you able to zoom in and out with a camera using Full-Court, or does its nature prevent that? Would digital zoom work?
We maintain the orthostereo correlation between subject and viewer, which results in a one-to-one apparent zoom (a 7-foot-tall basketball player, 10 feet in front of the camera, appears to be a 7-foot-tall basketball player, 10 feet in front of the viewer). There ARE exceptions, however. We’ve found some very cool “Alice in Wonderland” effects that break the orthostereo in a cool way.
There is a need to include non-ortho content, however. Imagine a basketball game in Full-Court. You’re gonna WANT to see the televised close-ups, replays, hoop-mount cam views, etc. For that, we include a skybox. You just have to look up at the scoreboard and you can see all the non-ortho content (just as if you were sitting at the venue). Interestingly, we know when the viewer is looking at the scoreboard (thanks to head tracking), so we don’t have to service scoreboard rendering until the viewer is looking.
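The gaze-gated rendering idea above can be sketched as follows. This is a minimal illustration under assumed numbers (the scoreboard position, the field-of-view margin, and all names are hypothetical, not Next3D’s implementation): the head tracker’s pitch tells the player whether the scoreboard region is even in view, and the non-ortho pass is skipped when it isn’t.

```python
# Gaze-gated rendering sketch: skip updating the non-ortho "scoreboard"
# skybox region until the head tracker says the viewer can see it.
# SCOREBOARD_PITCH_DEG and VISIBLE_MARGIN_DEG are assumed values.

SCOREBOARD_PITCH_DEG = 60.0   # scoreboard sits well above the horizon
VISIBLE_MARGIN_DEG = 25.0     # half of an assumed vertical field of view

def scoreboard_visible(head_pitch_deg: float) -> bool:
    """True when the scoreboard falls inside the viewer's vertical FOV."""
    return abs(head_pitch_deg - SCOREBOARD_PITCH_DEG) <= VISIBLE_MARGIN_DEG

def render_frame(head_pitch_deg: float) -> list:
    passes = ["ortho_video"]            # the Full-Court sphere always renders
    if scoreboard_visible(head_pitch_deg):
        passes.append("scoreboard")     # only serviced when looked at
    return passes

assert render_frame(0.0) == ["ortho_video"]                  # looking ahead
assert render_frame(55.0) == ["ortho_video", "scoreboard"]   # looking up
```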
MTBS: I understand it’s still a work in progress. What challenges are you working to overcome?
Getting producers beyond demo-ware. It’s very hands-on. One very practical issue is that MOST of our producers are still waiting on their Oculus Dev Kits. We only have two ourselves, so far (and are damn lucky to have ’em). So, we’re all over the place trying to help producers get their heads around shooting in Full-Court.
Also, there are gaping holes in the plans for content distribution, business models, consumption patterns and expected demographics. It’s taking a real leap of faith for a content producer to really sink time into producing something in Full-Court. Especially with NO LAUNCH DATE for the consumer version.
Technically, things are going very well. The player is solid. The ortho-calibration is GREAT for RED cams with any of the three fish-eye lenses that we support. We’re working now on a player that is embedded within a VR world; the “lobby” for the player is a world.
MTBS: I immediately see the potential with sporting events or anything where a static camera is needed to view a wide area – a rock concert, an event…maybe even for mall or airport security. How would it be for movie making? Could a VR angle change the way we tell…or the way we experience…stories? How so?
We’re just beginning to get a feel for this. The question of moving a narrative forward when the producer cannot direct the viewer’s attention is a big, hairy, mind-bender of an issue. We have a few very bold experiments happening right now. One is likely to result in the first released content in Full-Court. It’s a reality show that is shot entirely in Full-Court. The Executive Producer is a very big thinker and I think he’s cracked the nut.
MTBS: Do you think we will see new types of movies and stories thanks to VR film technology like this? Any guesses on what kinds of experiences we will see?
I think we’ll see a lot of the obvious stuff – rock concerts, sports, theater…both live and canned. Then, I’m not sure. The most exotic blend of content that we’ve had producers kicking around is a hybridization of VR and Full-Court video. That could diverge from the “norm” pretty darn quick. You know, there were GREAT minds thinking, talking, debating about non-linear narrative back in the first VR revolution. I’m hoping we see that kind of innovative thinking put in motion now.
MTBS: Do you believe in ghosts? Prove it!
Ah…if there was one question I’d like to dodge 😉 So…I’ve spent a little time involved in a 3D paranormal reality production called Anomaly. It’s required quite a bit of my attention and as a result, I’ve been in some of America’s most allegedly haunted locations. I would be lying if I told you that my skepticism isn’t fading a little. I don’t have proof, but of the list of most often reported phenomena (voices, footsteps, shadows, things moving), I’ve directly experienced them all. I’ll offer you this challenge – spend an hour or so on the third floor of Castle Warden Hotel (now Ripley’s Museum in St. Augustine, FL) in the middle of the night by yourself. Let me know how your objectivity fares.
MTBS: Recognizing that Oculus hasn’t actually announced a consumer release date for the Rift, when is NEXT3D aiming to have marketable solutions for content makers?
We aim to have a decent content catalog ready for the consumer release. We already have content production guides, encoders, beta-players (with an SDK for customization) and licensing for producers. We expect that a few pieces of content will be available to the dev community pretty soon.
Great stuff! Share your thoughts below! What do you think of VR experiences based on real life movie captures?