 Improved Pre-Warping for Wide Angle, Head Mounted Displays 
One Eyed Hopeful

Joined: Mon Jun 10, 2013 11:54 am
Posts: 5
Hi folks,

I would like to introduce you to our new research paper regarding improvements to the optical distortion compensation that is used in wide angle HMDs like the Oculus Rift.

http://blog.qwrt.de/improved-pre-warpin ... ngle-hmds/
(https://twitter.com/Da_Pohl/status/374856819581022208)

Image

Please let me know your thoughts on this.

Thanks!
Daniel


Tue Sep 03, 2013 7:18 am
Certif-Eyed!

Joined: Sun Mar 25, 2012 12:33 pm
Posts: 661
Woah, that is certifiably AWESOME.

I'd always wondered why images in the rift were inexplicably blurry, and now I know why!

Help me understand the Object Space distortion. You first tessellate the scene so there's a bunch of vertices. Then you warp each vertex according to a barrel warp with a vertex shader? Would this be as easy as using a vertex shader on a tessellated scene? Or does it need to be done with raytracing?

Thank you for pinning down the cause of this issue (and for finding a solution!)

EDIT: I couldn't tell, does rendering onto a mesh provide any benefits? Would anisotropic filtering help in this case?

EDIT: Here's the Bilinear to Object Space Comparison (what we are currently getting vs. what we WANT):
Image
EDIT: In case you can't tell, it's an animated GIF switching between the two modes. Open it in a new browser tab and zoom in for the full effect.


Last edited by zalo on Wed Sep 04, 2013 5:33 pm, edited 1 time in total.



Wed Sep 04, 2013 4:27 pm
Two Eyed Hopeful

Joined: Thu Mar 05, 2009 9:14 pm
Posts: 85
:woot

I want this implemented ASAP, Call the president so he can get on it ASAP. This is a VR emergency.


Wed Sep 04, 2013 5:06 pm
Sharp Eyed Eagle!

Joined: Sun Jan 06, 2013 4:54 am
Posts: 450
yep, warp the world rather than the image. good stuff. i suggested it at one time and people said it couldn't be done 'cos no engines supported it, which was true until now.

EDIT: i have a question; how is the performance of this method compared to the image warping hack?


Last edited by PasticheDonkey on Thu Sep 05, 2013 7:19 am, edited 1 time in total.



Thu Sep 05, 2013 6:52 am
Sharp Eyed Eagle!

Joined: Fri Jan 18, 2013 7:13 am
Posts: 427
Location: Gothenburg, Sweden
Really interesting read!


Thu Sep 05, 2013 7:19 am
One Eyed Hopeful

Joined: Mon Jun 10, 2013 11:54 am
Posts: 5
Thanks for your feedback!

Quote:
Help me understand the Object Space distortion. You first tessellate the scene so there's a bunch of vertices. Then you warp each vertex according to a barrel warp with a vertex shader? Would this be as easy as using a vertex shader on a tessellated scene? Or does it need to be done with raytracing?


For object-space correction there are multiple methods:
1) vertex shader
2) tessellation shader
3) sampling the scene differently, e.g. casting barrel-distorted rays in a ray tracer or voxel ray caster

1) The vertex shader approach takes existing vertices and looks at where they end up after the ModelView and Projection matrices have been applied. Using the same equation as in the post-processing pixel shader, the vertices are moved according to their distance from the lens distortion center to simulate the barrel distortion. The downside is that a low-poly mesh does not have enough vertices to approximate the barrel distortion nicely.
2) Using a tessellation shader instead, new vertices can be created on the fly. That way there is more detail and the approximation of the barrel distortion will be better. However, it is hard to know in advance how many new vertices will be needed, so this can become performance-intensive and inefficient. (For 1) and 2) there are Linux samples from Philip Rideout: http://github.prideout.net/barrel-distortion/)
3) Here all vertices are left at their original positions and the sampling itself is changed. In DX/OGL you sample on a regular, rectangular grid. With a custom-written renderer (rasterizer, ray tracer, voxel ray caster, ...) the sampling can be changed to barrel-distorted sample points.
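To make variant 1) concrete, here is a minimal CPU-side sketch of the radial warp in Python. The polynomial coefficients K are illustrative placeholders (not the real Oculus SDK values), and in practice this would run per vertex in a GLSL/HLSL vertex shader after the ModelView-Projection transform:

```python
# Illustrative distortion coefficients (NOT the actual Oculus SDK values).
K = (1.0, 0.22, 0.24, 0.0)

def warp_vertex(x, y, cx=0.0, cy=0.0):
    """Radially displace a post-projection vertex (x, y) around the lens
    center (cx, cy) using the same polynomial the post-process pixel
    shader uses: r' = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6)."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = K[0] + r2 * (K[1] + r2 * (K[2] + r2 * K[3]))
    return (cx + dx * scale, cy + dy * scale)

# The lens center is a fixed point; vertices further out are moved further.
print(warp_vertex(0.0, 0.0))   # stays at the center
print(warp_vertex(0.5, 0.0))   # x scaled by 1 + 0.22*0.25 + 0.24*0.0625 = 1.07
```

A low-poly mesh samples this curve too coarsely between vertices, which is exactly why variant 2) inserts extra vertices where the warp bends most.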

Quote:
I couldn't tell, does rendering onto a mesh provide any benefits? Would Aniostropic filtering help in this case?

In general, distortion meshes are a good choice if the graphics architecture does not support shaders (e.g. older mobile phone chips) and no custom-written renderer is used.

First, the case without anisotropic filtering: the samples would be taken in a bilinear fashion, so for a very highly tessellated mesh the quality converges towards that of the post-processing pixel shader with bilinear texture filtering.

Distortion mesh using anisotropic filtering: interesting idea; I have not tested this myself yet. The way AF works (http://en.wikipedia.org/wiki/Anisotropic_filtering), multiple texture samples are taken "at oblique viewing angles with respect to the camera". For the center area of the distortion mesh, which is the most important area when looking through the lens, there will probably be no difference, as it faces the camera almost directly. The outer areas are angled slightly away from the camera, so there AF would probably be a bit better than plain bilinear filtering. With the Oculus A-cup lenses I am not able to see the outermost regions, so the total benefit might be rather small. Still, thanks for the idea; I might run some more tests on this.

Quote:
how is the performance of this method compared to the image warping hack?

There are a lot of performance numbers in the pdf linked from the blog.

Thanks!
Daniel


Mon Sep 09, 2013 7:20 am
One Eyed Hopeful

Joined: Sat Nov 10, 2012 4:27 am
Posts: 27
Good work on this, I had wondered how long it would take to see something like this, and how practical it would be to achieve in a rasteriser rather than the more obvious raycasting application. Pleased to see that it's achievable without a prohibitive performance hit too. Congratulations, and keep up the good work :)

A question though: what are the implications for antialiasing? Would we see an increase in sample cost for non-post-process methods? If so, how significant an increase?


Mon Sep 09, 2013 10:27 am
Petrif-Eyed

Joined: Sat Sep 01, 2012 10:47 pm
Posts: 2708
Paul Bourke has been pushing this approach since 2004:
the solution to that problem is to distort the object geometry BEFORE RENDERING, so that the final image looks correct with no added "pre-warp" pixel distortion
The technique involves creating a mirror or lens as a geometric primitive that is placed in front of a camera. The mirror is designed precisely so that the captured image, when projected, will look correct on the intended projection screen.
pre-warp is done too late in the rendering pipeline and is responsible for the blurring. There are threads at OculusVR where I have been pushing people to use in-game fisheye lens modelling (as proposed and demonstrated by Paul Bourke in 2004), instead of doing it later and causing blur. There is a real-time Rift ray-tracing project over there, that uses this method very effectively. No blur whatsoever in the final pre-warped images.
This method of modelling in-game pre-warp before rendering has been proven superior to the current method of 2-D warping the final framebuffer images in the past, and this thread shows that it is also better for our current Rift system.

@elchtest: Thanks for advancing the state-of-the-art in Rift pre-warp another step in the right direction!

_________________
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.


Mon Sep 09, 2013 12:28 pm
Cross Eyed!

Joined: Fri May 03, 2013 12:48 am
Posts: 189
Is this more or less resource intensive for the computer?


Tue Sep 10, 2013 1:25 am
One Eyed Hopeful

Joined: Tue Jan 15, 2013 9:44 pm
Posts: 5
Is this similar to what the spherical mirror projection guys are doing? The meshwrap application? Or am I completely wrong?

EDIT: didn't see cyber's post; looks like it is.


Thu Sep 12, 2013 11:47 pm
One Eyed Hopeful

Joined: Mon Aug 27, 2012 10:35 am
Posts: 7
Location: QC, Canada
WOW!!!


Fri Sep 13, 2013 10:29 am
Sharp Eyed Eagle!

Joined: Sun Jan 06, 2013 4:54 am
Posts: 450
Ziggurat wrote:
Is this more or less resource intensive for the computer?

from what i read in the pdf it's more resource intensive (but not too significantly so), but worth it, particularly since the detail improvement affects the most sensitive area of your eye when looking straight ahead.


Fri Sep 13, 2013 11:00 am
Cross Eyed!

Joined: Mon Mar 15, 2010 8:16 am
Posts: 100
elchtest wrote:
I would like to introduce you to our new research paper regarding improvements to the optical distortion compensation that is used in wide angle HMDs like the Oculus Rift.

Nice work!

After seeing that it's being published with the ACM, I can only conclude that SIGGRAPH will be even more awesome than usual in the coming years, once we start seeing researchers use the Oculus Rift and other gear that makes VR cheap. There has of course been VR work done before, but I imagine that high costs have hampered people.


Wed Sep 18, 2013 5:03 am
Petrif-Eyed

Joined: Sat Sep 01, 2012 10:47 pm
Posts: 2708
...
pre-warp is done too late in the rendering pipeline and is responsible for the blurring. There are threads at OculusVR where I have been pushing people to use in-game fisheye lens modelling (as proposed and demonstrated by Paul Bourke in 2004), instead of doing it later and causing blur. There is a real-time Rift ray-tracing project over there, that uses this method very effectively. No blur whatsoever in the final pre-warped images.
...
Hmm... MTBS3D seems to be a bit borked after the restoration. The above link goes to the WRONG forum (f=14 instead of f=140) and shows the wrong post. Also, searching for keywords in the above quote does not find the missing post...

Neil, where might I find that missing post that WAS at the above link (as quoted)?

EDIT: Even though I originally did a copy/paste on the link above, it seems to be missing a trailing '9' (p=12952 instead of p=129529). Adding the missing character makes it work:
http://www.mtbs3d.com/phpBB/viewtopic.php?f=140&t=17715&p=129529#p129529
However, that does not explain why the missing digit links to a wrong post in a wrong forum (try clicking it). Also, I normally test my links, so I believe that old link actually worked in the past... Strange stuff.

_________________
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.


Mon Nov 11, 2013 6:09 pm
Terrif-eying the Ladies!

Joined: Tue Jan 29, 2013 2:05 am
Posts: 910
In the world of digital audio, at the studio or end-user manipulation level, each round of digital-to-digital manipulation makes the final signal sound more like hammered fecal matter; it turns further into indistinct/distorted hash. Until recently, and probably right up to this date, the best-sounding audio albums have been handed off to the mastering studios on pure analog tape: 2-inch-wide, 30-inch-per-second analog tape. At my last recollection, about 70-80% of the albums at Bob Ludwig's 'Masterdisk' mastering studio came in to him in that format. (Bob is one of the most respected mastering engineers in the audio business; look at your best-sounding records and discs, and you will see his name on some of them.)

As a matter of fact, what some in that world have turned to, to stop 'numerical remainders' from falling between the subsequent bit gaps and slowly accumulating into error, is to convert to analog tape, do the transform, and then return to digital.

This problem can be ameliorated by giving the digital manipulation stage an order of magnitude more bit depth. However, the accumulation of error is inevitable; it can only be ameliorated temporarily, one stage at a time.

Ameliorated, that is, via a very careful and considered understanding of exactly what is being done at the digital/numerical level, what the final output is, and what that means to the ear-brain combination.

In the case of visual media manipulated at the digital level, as a direct comparison to this issue in digital audio, and with the overlaid grid issue of digital signal reproduction (i.e. the pixels of a display), this accumulation of error is very much a concern. Especially when it is handled as a staged render, with real-time manipulation of said data.

Any skipping of manipulation stages in favor of a more refined (and targeted) single stage can, if properly handled, increase perceived image fidelity. The audio industry's experience with digital audio shows this to be abundantly true.

In this case... in a complex digital visual environment, the best bet is to target ..(KBK shuts his trap) whoops.....well.....I'll leave that alone, and hopefully Oculus is doing the right thing.

_________________
Intelligence... is not inherent - it is a point in understanding. Q: When does a fire become self sustaining?


Wed Nov 20, 2013 11:35 am
Petrif-Eyed

Joined: Sat Sep 01, 2012 10:47 pm
Posts: 2708
After reading KBK's post drift from audio to displays, I thought I would add a little to the mix:
Apple Retina display at 26x:
Image

Magazine typography at 26x:
Image

Apple Retina at 375x:
Image

Magazine at 400x:
Image
Steve Jobs wrote:
“It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around to 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.”
Although Steve Jobs' claim that the human eye can't perceive detail beyond 300 pixels per inch was immediately debunked, to this day almost everyone believes that what he said is true. I became interested in this topic after seeing what I considered to be obvious differences between the highest-resolution smartphone displays. If Jobs' claim were true, this shouldn't have been possible.
...
The chart many eye doctors still use to determine whether you have 20/20 vision is a crude method that dates back to 1862. Eye charts were created to test vision, but we’re talking about something that goes beyond just text. We’re trying to determine whether a human can see the pixels on a display — and more importantly whether there is a benefit of using displays with resolutions higher than 300 PPI.
...
When a mobile display goes under a microscope, it's easy to see major differences between the types of pixels used. The size varies, the shape varies, the placement varies. Even the color varies, because some displays now include white pixels (in addition to RGB). Some have PenTile displays, others don't. Even the types of displays used on popular smartphones vary: companies like Samsung use OLED displays, while Apple uses LCD displays. Each has its own advantages and disadvantages. You can even see differences in the pixels on Samsung phones that have Super AMOLED displays.
Regarding KBK's post, once audio or video is digitized, that "bit drift" issue he talks about should not occur (ever) unless you convert back to analog (or PWM) and redigitize. Bits are bits, and with sufficient FEC (Forward Error Correction) they should not decay. However, digital media is subject to decay, and so are data storage formats; that is why backup copies are good. Also, beware of lossy compression formats like MP3. For archival preservation, lossless audio (and video) compression is recommended, so that editing/remixing/color correction can be done later without significant loss of quality. Even JPEG for photos is lossy, which is why I shoot raw photos.

But that "bit drift" and "error accumulation" cannot happen in the strictly lossless digital domain. That can only happen when converting back and forth between analog and digital, much like the decay from a multi-generational Xerox photocopy.

And staying strictly analog is not so good either, because each copy adds new analog noise (such as tape hiss) to the mix.

_________________
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License. Image


Last edited by geekmaster on Thu Nov 21, 2013 6:49 am, edited 1 time in total.



Wed Nov 20, 2013 6:22 pm
Certif-Eyable!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 990
@geekmaster - I think you hit the nail on the head with the main issue of digital these days: lossy compression. Since MP3 came about, everyone started accepting audio that loses a little quality for a great reduction in size. This is all well and good, but then you decide that you would rather have your files in WMA format instead, so you convert them all over. Now you realize that it would have been better to have them at a higher bitrate to begin with, but there's no getting back the uncompressed audio from the source, and you've ended up with something that sounds like it's been processed by a Nintendo 64.
I think the main thing is: if you do use lossy compression, you should never convert it to another lossy format or mix it with other audio unless you are happy with something sounding terrible.


Wed Nov 20, 2013 9:10 pm
Petrif-Eyed

Joined: Sat Sep 01, 2012 10:47 pm
Posts: 2708
android78 wrote:
@geekmaster - I think you hit the nail on the head with the main issue of digital these days: lossy compression. Since MP3 came about, everyone started accepting audio that loses a little quality for a great reduction in size. This is all well and good, but then you decide that you would rather have your files in WMA format instead, so you convert them all over. Now you realize that it would have been better to have them at a higher bitrate to begin with, but there's no getting back the uncompressed audio from the source, and you've ended up with something that sounds like it's been processed by a Nintendo 64.
I think the main thing is: if you do use lossy compression, you should never convert it to another lossy format or mix it with other audio unless you are happy with something sounding terrible.
Always convert to any lossy (or lossless) digital format from the ORIGINAL digitized bits, stored only in LOSSLESS compression formats since originally being digitized. Copying lossless digital data does not accumulate generational losses. Format conversion cannot recover data that was discarded during lossy compression. And analog recording cannot be copied by any method without adding noise.
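The "no generational loss" point is easy to demonstrate: a lossless codec round-trips to bit-identical data no matter how many generations you run. A small sketch using zlib (any lossless codec behaves the same way):

```python
import hashlib
import zlib

def generations(data: bytes, n: int) -> bytes:
    """Run data through n lossless compress/decompress 'generations'."""
    for _ in range(n):
        data = zlib.decompress(zlib.compress(data))
    return data

original = b"PCM samples or pixel data, it makes no difference" * 1000
copy = generations(original, 50)

# Fifty generations later the bits are identical: no cumulative error.
assert hashlib.sha256(copy).digest() == hashlib.sha256(original).digest()
```

Run the same experiment through a lossy codec (MP3, JPEG) and each generation discards a little more, which is exactly the analog-style decay described above.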

Regarding "bit drift": that can apply both to the time domain (jitter) and to the spatial domain (magnetic migration). Magnetic migration can cause digital bits to fade when stored on magnetic tape, via a process similar (or identical) to "print-through" between tape reel layers. In audio recording, print-through can be heard as pre- and post-echoes, delayed by the amount of tape between successive layers on a spool. Print-through can also destroy or weaken digital bits recorded on tape, but it can be minimized by fast-forwarding and rewinding every tape spool in your collection every six months. You have been doing that with your archival mag tape collection (both digital and audio), right?

Also, there are stories of archival mag tape vaults where all the tapes on the lower racks were corrupted. This was tracked down to the magnetic field from floor polishing machines too close to the tapes on the lower racks. Exposing mag tape to an oscillating magnetic field lowers the coercivity (magnetic retention strength), causing premature tape print-through or even erasure (randomized magnetic domains).

There is a tremendous wealth of data from decades of space flight stored on mag tape. I hope they are still doing maintenance on all of that (periodic fast-forward and rewind to "randomize" print-through). The maintenance process essentially converts print-through to white noise, which is much less objectionable and easier to filter. However, the magnetic domains do decay over time, even on hard drives. That is why long-term hard drive archival storage should also be maintained (read and rewrite every sector on the drive).

Anyway, even non-magnetic phase-change storage media (CDs and DVDs) experience bit decay over time. With digital data you can use FEC (forward error correction), such as convolutional codes with Viterbi decoding, to minimize damage caused by archival bit rot.
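As a tiny illustration of how FEC counters bit rot, here is the classic Hamming(7,4) code sketched in Python: 4 data bits become a 7-bit codeword, and any single flipped bit in storage can be located and corrected. (This is the simplest textbook FEC; real archival systems use much stronger codes.)

```python
def hamming74_encode(nibble):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]   # covers codeword positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]   # covers positions 2, 3, 6, 7
    p4 = d[1] ^ d[2] ^ d[3]   # covers positions 4, 5, 6, 7
    return [p1, p2, d[0], p4, d[1], d[2], d[3]]

def hamming74_decode(bits):
    """Correct up to one flipped bit, then extract the 4 data bits."""
    b = list(bits)
    s1 = b[0] ^ b[2] ^ b[4] ^ b[6]
    s2 = b[1] ^ b[2] ^ b[5] ^ b[6]
    s4 = b[3] ^ b[4] ^ b[5] ^ b[6]
    pos = s1 + 2 * s2 + 4 * s4   # syndrome = 1-based position of the error
    if pos:
        b[pos - 1] ^= 1
    return b[2] | (b[4] << 1) | (b[5] << 2) | (b[6] << 3)

# Flip any one bit of the stored codeword; the data still comes back intact.
word = hamming74_encode(0b1011)
word[4] ^= 1                      # simulate one bit of "bit rot"
assert hamming74_decode(word) == 0b1011
```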

Another problem is that old digital media archives were written in ad hoc storage protocols, some of which were not documented and saved. Or they are on magnetic tapes for tape drives that only exist in museums. For what future archaeologists may think of our wealth of archived information, think of how "Heechee Prayer Fans" were treated in the Heechee/Gateway series of science fiction books by Frederik Pohl:
Quote:
... The Heechee are a fictional alien race from the science fiction works of Frederik Pohl. The Heechee are portrayed as an exceedingly advanced star-travelling race that explored Earth's solar system millennia ago and then disappeared without a trace before humankind began space exploration ... "Prayer fans" that appear and unfold like traditional fans but are actually computer-related equipment containing the minds of dead Heechees and, later, dead humans. The fans also are used for data storage.
In those books, vast amounts of incredibly valuable information were lost because these information storage devices were sold to tourists as trinkets and souvenirs, with no understanding of their purpose or value. That could happen to us one day.

EDIT: I had a bad image URL in my previous post. Go back and compare the 26x images above, now that I have inserted the correct URL. Clearly, you can see a quality difference between the text displayed on the Apple Retina display and the magazine page, despite claims by Steve Jobs that to the human eye they were indistinguishable (at arm's length). Remember that in an HMD we use magnifying lenses, and the display is only a couple of inches from your eyes, so we will need something approaching the resolution of typography (or better) before we cannot tell the difference between VR and RL (Real Life).
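The "indistinguishable at arm's length" argument comes down to angular resolution, and a quick back-of-the-envelope check shows why HMDs are so far from it. (The ~60 pixels-per-degree figure for 20/20 vision and the HMD numbers below are rough textbook/illustrative values, not measurements of any specific headset.)

```python
import math

def pixels_per_degree(ppi: float, viewing_distance_in: float) -> float:
    """How many pixels fit inside one degree of visual angle."""
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Jobs' scenario: 300 PPI at 12 inches gives roughly 63 px/deg, right at
# the ~60 px/deg that 20/20 vision (1 arcminute acuity) can resolve.
print(round(pixels_per_degree(300, 12), 1))

# An early HMD, very roughly: ~640 horizontal pixels per eye spread over
# a ~90 degree field of view is only about 7 px/deg, an order of
# magnitude short of print-like sharpness.
print(round(640 / 90, 1))
```

Magnifying lenses stretch a fixed pixel count over a huge field of view, so the px/deg budget collapses even when the panel itself is "retina" class.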

_________________
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.


Thu Nov 21, 2013 7:10 am