
Novel Way to Shoot VR With GoPro Cameras

While there is seemingly endless enthusiasm for 360-degree cameras, especially ones that can record in stereoscopic 3D, the availability of such tools is actually quite limited. It's not just that most camera rigs are still in continual development and largely experimental; the tools needed to effectively cut and edit this kind of content are also very processor intensive. If it takes several hours of processing time to encode a regular video, imagine encoding a film stitched from 14 or more cameras, or a full-resolution 360-degree master.

This is the conundrum faced by Toronto independent filmmakers Elli Raynai and Alexander Kondratskiy in their quest to record virtual reality on a tiny budget. The result was "I am You", a short film that puts the HMD user in the first-person perspective of a character. Here is Elli's explanation of how they got it to work with just two GoPro cameras:

I am You Movie Theater Scene
"We shot the first half of the film with a regular DSLR camera and built a 3D cinema from scratch. This way a viewer who hasn't experienced VR before could get a 'comfortable' introduction to the platform. Then halfway through the film the viewer gets pulled into the body of the character and they see the remaining part of the movie from their perspective. We used two GoPros, which gave us an approximate FOV of 120 degrees; I don't know how much we had vertically, but it wasn't much. We shot that part in 3D, but not 360 of course, as we chose a different approach for immersion and directing the viewer's attention. In that way when the actor moved left or up and down, so did the screen. So basically the viewer has to follow it, which makes them feel like they are there."

The advantage of shooting 360-degree video is that the HMD user can look anywhere in the scene and get the full experience. Remember that the headset only displays 90 to 120 degrees of field of view at a time, depending on the hardware. So while a full 360 degrees of content is sent to the user, full head tracking works because the viewer only ever sees a smaller sliver of it at once. This is how it's possible to remotely enjoy a basketball game, for example.
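To make that "sliver" idea concrete, here is a minimal sketch (not from any actual player; the function name and the default FOV value are illustrative) of how a viewer's yaw could select the visible horizontal band of an equirectangular 360-degree frame:

```python
def visible_columns(frame_width, yaw_deg, fov_deg=110):
    """Return the (start, end) pixel columns of an equirectangular
    360-degree frame that fall inside the viewer's horizontal FOV.
    The range wraps around the frame edge (the 360-degree seam)."""
    deg_per_px = 360.0 / frame_width           # angular width of one pixel column
    center = (yaw_deg % 360.0) / deg_per_px    # column the viewer is facing
    half = (fov_deg / 2.0) / deg_per_px        # half the visible span, in columns
    start = int(center - half) % frame_width
    end = int(center + half) % frame_width
    return start, end
```

For a 3600-pixel-wide frame and a 120-degree FOV, only 1200 columns are visible at once; head tracking simply slides this window around the full frame.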

The challenge is that without a 360-degree camera, you only have a fraction of the material to work with when making your VR film, so it is no longer possible to offer head tracking. Instead, the HMD user is forced to watch footage that is completely disconnected from where their head is pointed, which can easily cause nausea and discomfort.

I am You First Perspective
The solution lies in how the content is captured, along with additional positional data. The filmmakers were still limited to 120 degrees of FOV, but while the scene was being captured by the stereoscopic 3D GoPros on the actor's head, he was also wearing an Oculus Rift that recorded head-tracking data. The filmmakers then created a digital 360-degree map and, using this head-tracking data, placed the image where it belonged according to what the actor experienced. The viewer just has to keep a watchful eye on the scene's placement in this 360-degree environment and move their head to follow it.
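As a rough illustration of that remapping step (a hypothetical sketch, not the filmmakers' actual pipeline; the function name and canvas dimensions are assumptions), each frame's recorded head orientation can be converted into a placement position for the 120-degree patch on an equirectangular 360-degree canvas:

```python
def patch_placement(yaw_deg, pitch_deg, canvas_w=4096, canvas_h=2048):
    """Map a recorded head orientation (yaw and pitch, in degrees) to
    the pixel where the center of the captured patch lands on an
    equirectangular 360-degree canvas. Yaw 0 maps to the left edge;
    pitch 0 maps to the vertical center, positive pitch moves up."""
    x = int(((yaw_deg % 360.0) / 360.0) * canvas_w)
    y = int(((90.0 - pitch_deg) / 180.0) * canvas_h)
    return x, y
```

Replaying these per-frame placements positions the footage in the virtual sphere exactly where the actor was looking, which is what lets the viewer's own head tracking line up with the scene.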

Very interesting experiment! The film was also featured by Canada's CBC. Congratulations!