Novel Way to Shoot VR With GoPro Cameras


While there is seemingly endless enthusiasm for 360-degree cameras, especially cameras that can record in stereoscopic 3D, such tools remain hard to come by. It's not just that most of the camera rigs are still in development and largely experimental; the tools needed to effectively cut and edit this kind of content are also very processor intensive. If it takes several hours of processing time to encode a regular video, imagine what it takes to encode a film stitched from 14 or more cameras, or a full-resolution 360-degree capture.

This is the conundrum Toronto independent filmmakers Elli Raynai and Alexander Kondratskiy faced in their quest to record virtual reality on a tiny budget. The result was "I am You", a short film that puts the HMD user in the perspective of the character wearing the cameras. Here is Elli's explanation of how they got it to work with just two GoPro cameras:

I am You Movie Theater Scene
"We shot the first half of the film with a regular DSLR camera and built a 3D cinema from scratch. This way a viewer who hasn't experienced VR before could get a 'comfortable' introduction to the platform. Then halfway through the film the viewer gets pulled into the body of the character and they see the remaining part of the movie from their perspective. We use two gopro's which gave us an approximate FOV of 120 degrees, I don't know how much we had vertically, but it wasn't much. We shot that part in 3D, but not 360 of course, as we chose a different approach for immersion and directing the viewer's attention. In that way when the actor moved left or up and down, so did the screen. So basically the viewer has to follow it, which makes them feel like they are there."

The advantage of shooting 360-degree video is that the HMD user can move their head anywhere in the scene and get the full experience. The viewer only sees 90 to 120 degrees of field of view at a time, depending on their hardware, so while a full 360 degrees of content is delivered, head tracking works because the headset only ever needs to display a small sliver of it at once. This is how it's possible to remotely enjoy a basketball game, for example.
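To make that geometry concrete, here is a minimal sketch in Python of the idea that the headset only ever shows a slice of the full panorama: given an equirectangular 360-degree frame and the viewer's current yaw, it extracts roughly the band the headset would display. The frame layout, function name, and 110-degree FOV value are illustrative assumptions, not anything from the film's pipeline.

```python
# Minimal sketch: the headset displays only a yaw-centered slice of a 360 frame.
import numpy as np

def visible_slice(equirect_frame: np.ndarray, yaw_deg: float, fov_deg: float = 110.0) -> np.ndarray:
    """Return the horizontal band of an equirectangular 360-degree frame
    centered on the viewer's yaw (frame spans -180..180 degrees across its width)."""
    h, w = equirect_frame.shape[:2]
    px_per_deg = w / 360.0
    center = int(((yaw_deg + 180.0) % 360.0) * px_per_deg)
    half = int(fov_deg / 2.0 * px_per_deg)
    cols = [(center + off) % w for off in range(-half, half)]  # wrap around the seam
    return equirect_frame[:, cols]

# Example: a 4K-wide equirectangular frame, viewer looking 45 degrees to the right.
frame = np.zeros((2048, 4096, 3), dtype=np.uint8)
view = visible_slice(frame, yaw_deg=45.0)
print(view.shape)  # roughly 110 degrees' worth of columns
```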

The challenge is that if you don't have a 360-degree camera, you only have a fraction of the material to work with when making your VR film, so it's no longer possible to offer head tracking. Instead, the HMD user is forced to watch footage that is completely disconnected from where their head is pointed, and this can easily cause nausea and discomfort.

I am You First Perspective
The solution lies in how the content was captured, along with additional positional data. The filmmakers were still limited to roughly 120 degrees of FOV, but while the scene was being captured by the stereoscopic 3D GoPros on the actor's head, he was also wearing an Oculus Rift that recorded head-tracking data. The filmmakers then created a digital 360-degree map and, using that head-tracking data, placed the footage where it should appear according to what the actor experienced. The final viewer just has to keep a watchful eye on where the scene sits in this 360-degree environment and move their head to follow it.
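Below is a hedged sketch of that idea, not the filmmakers' actual code: each frame of the limited-FOV GoPro footage is pinned onto a 360-degree canvas at the yaw and pitch the actor's Rift reported, so the viewer has to turn their head to keep the scene in view. The data layout, sizes, and function name are assumptions made for illustration.

```python
# Sketch: place limited-FOV footage inside a 360-degree map using the
# head orientation recorded from the headset the actor was wearing.
import numpy as np

def place_footage(canvas: np.ndarray, footage: np.ndarray,
                  yaw_deg: float, pitch_deg: float) -> None:
    """Paste one frame of limited-FOV footage onto an equirectangular 360-degree
    canvas, centered where the actor's headset was pointing. `footage` is assumed
    to be pre-scaled to the pixel area its FOV covers on the canvas."""
    H, W = canvas.shape[:2]
    fh, fw = footage.shape[:2]
    cx = int(((yaw_deg + 180.0) % 360.0) / 360.0 * W)   # yaw -> column on canvas
    cy = int((90.0 - pitch_deg) / 180.0 * H)            # pitch -> row on canvas
    rows = np.clip(np.arange(cy - fh // 2, cy - fh // 2 + fh), 0, H - 1)
    cols = np.arange(cx - fw // 2, cx - fw // 2 + fw) % W  # wrap at the seam
    canvas[np.ix_(rows, cols)] = footage

# Per-frame usage (illustrative): yaw/pitch come from the recorded Rift data.
canvas = np.zeros((2048, 4096, 3), dtype=np.uint8)
footage_frame = np.full((683, 1365, 3), 128, dtype=np.uint8)  # ~60 x 120 degrees
place_footage(canvas, footage_frame, yaw_deg=30.0, pitch_deg=-10.0)
```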

A very interesting experiment! The film was also featured by Canada's CBC. Congratulations!