Novel Way to Shoot VR With GoPro Cameras


While there is seemingly endless enthusiasm for 360-degree cameras, especially ones that can record in stereoscopic 3D, the availability of such tools is actually quite limited. It's not just that most of the camera rigs are in continual development and largely experimental; the tools needed to effectively cut and edit this kind of content are also extremely processor intensive. For example, if it takes several hours of processing time to encode a regular video, imagine what it takes to encode a film based on 14+ cameras or a full-resolution 360-degree solution.

This is the conundrum faced by Toronto independent filmmakers Elli Raynai and Alexander Kondratskiy in their quest to record virtual reality on a tiny budget. The result was "I am You", a short film that puts the HMD user in the perspective of the film's main character. Here is Elli's explanation of how they got it to work with just two GoPro cameras:

I am You Movie Theater Scene
"We shot the first half of the film with a regular DSLR camera and built a 3D cinema from scratch. This way a viewer who hasn't experienced VR before could get a 'comfortable' introduction to the platform. Then halfway through the film the viewer gets pulled into the body of the character, and they see the remaining part of the movie from their perspective. We used two GoPros, which gave us an approximate FOV of 120 degrees; I don't know how much we had vertically, but it wasn't much. We shot that part in 3D, but not 360 of course, as we chose a different approach for immersion and directing the viewer's attention. That way, when the actor moved left or up and down, so did the screen. So basically the viewer has to follow it, which makes them feel like they are there."

The advantage of shooting 360-degree video is that the HMD user can move their head all over the scene and get the full experience. Remember that the viewer only sees 90 to 120 degrees of field of view at a time, depending on the hardware they are using; so while a full 360 degrees of content is being sent to the user, they have full head tracking capability because they only ever need to see smaller slivers of it at a time. This is how it's possible to remotely enjoy a basketball game, for example.

The challenge is that if you don't have a 360-degree camera, you only have a fraction of the material to work with when making your VR films, so it's no longer possible to offer head tracking. What happens instead is that the HMD user is forced to watch experiences that are completely disconnected from where their head is pointed, and this can easily cause nausea and discomfort.

I am You First Perspective
The solution is in how the content was captured, along with additional positional data. The filmmakers were still limited to 120 degrees of FOV, but while the scene was being captured by the stereoscopic 3D GoPros on the actor's head, the actor was also wearing an Oculus Rift that recorded head tracking data. The filmmakers then created a digital 360-degree map and, using this head tracking data, placed the image where it should be according to what the actor experienced. The final viewer just has to keep a watchful eye on the placement of the scene within this 360-degree environment and move their head to follow it accordingly.
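The placement step described above can be sketched in a few lines of code. This is a minimal illustration, not the filmmakers' actual pipeline: it assumes an equirectangular 360-degree canvas and per-frame yaw/pitch values from the recorded head-tracking data, and computes the pixel rectangle where the 120-degree GoPro frame should sit. All dimensions and function names here are hypothetical.

```python
# Hypothetical sketch: position a 120-degree frame inside a 360-degree
# equirectangular canvas using recorded head-pose angles.

CANVAS_W, CANVAS_H = 3840, 1920   # assumed equirectangular canvas size
FOV_H, FOV_V = 120.0, 60.0        # approximate horizontal/vertical FOV of the rig

def frame_placement(yaw_deg, pitch_deg):
    """Return (left, top, width, height) in canvas pixels for one video frame.

    yaw_deg:   -180..180 degrees, 0 = straight ahead
    pitch_deg: -90..90 degrees,   0 = level
    """
    # The frame covers FOV/360 (or FOV/180 vertically) of the canvas.
    frame_w = int(CANVAS_W * FOV_H / 360.0)
    frame_h = int(CANVAS_H * FOV_V / 180.0)
    # Centre of the frame in canvas coordinates.
    cx = (yaw_deg + 180.0) / 360.0 * CANVAS_W
    cy = (90.0 - pitch_deg) / 180.0 * CANVAS_H
    # Horizontal placement wraps around the 360-degree seam;
    # vertical placement is clamped to the canvas.
    left = int(cx - frame_w / 2) % CANVAS_W
    top = max(0, min(CANVAS_H - frame_h, int(cy - frame_h / 2)))
    return left, top, frame_w, frame_h

# Actor looking straight ahead: the frame sits in the centre of the canvas.
print(frame_placement(0.0, 0.0))
```

Replaying the recorded pose for each frame this way is what lets the viewer "chase" the image around the sphere, as described above.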

Very interesting experiment! The film was also featured by Canada's CBC.  Congratulations!