Clearing up misconceptions
- Binocular Vision CONFIRMED!
- Posts: 337
- Joined: Mon Jan 21, 2013 12:30 pm
Clearing up misconceptions
Hey guys! I'm a new account, but I've been lurking for a few weeks. I felt I really needed to clear up some misinformation I keep seeing repeatedly. Note: I've never used the Oculus Rift, but people seem to be getting more and more confused over basic computer/display operation. The mods might want to make a sticky thread with similar basic information. This is going to be as non-technical as possible.
Stereoscopic (aka stereo 3d) - How people with two working eyes perceive depth. Your brain combines both images to give you depth perception. 3d displays use varying methods to achieve this. Some people (including myself) have poor depth perception but view 3d displays just fine; I'm still researching why this is so.
Parallax effect - This is the wiki definition. Parallax is a displacement or difference in the apparent position of an object viewed along two different lines of sight, and is measured by the angle or semi-angle of inclination between those two lines. Sounds confusing, but here's a great example. Also shows off advanced headtracking.
[youtube]http://www.youtube.com/watch?v=Jd3-eiid-Uw[/youtube]
"Depth of field" (aka focus) - Can be simulated in the game engine: objects away from the point of focus will blur. The Rift's lenses naturally do this with the pixels away from the center as well.
Separation - The distance between the lenses/images is the main source of depth and distance. Parallax enhances this when you're moving. Too much separation and you'll get eye strain.
Convergence - Really only noticeable when you're close to objects (at least with shutter glasses). This gives the "pop-out" effect, like something is really coming towards you. Too much and it will cause focus/eye strain issues.
Refresh Rate - Measured in Hz (hertz); every display has a horizontal and a vertical refresh rate (vertical is considered more important). The higher the refresh rate, the less perceived flicker and eye strain. Refresh rate is important for realistic motion, and it also determines your maximum perceived frame rate: a 60 Hz display can only show you up to 60 fps, even if the game is running at 200 fps. You get diminishing returns on both refresh rate and fps due to the limits of the human eye. Refresh rate has NO EFFECT on graphics performance.
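To make the cap concrete: the frames you can actually perceive are limited by whichever is lower, the game's frame rate or the display's refresh rate. A purely illustrative sketch (not tied to any real API):

```python
def perceived_fps(game_fps: float, refresh_hz: float) -> float:
    """A display can never show more frames per second than it refreshes."""
    return min(game_fps, refresh_hz)

# A 60 Hz display caps a 200 fps game at 60 visible frames per second,
# while a game running below the refresh rate is unaffected.
print(perceived_fps(200, 60))  # 60
print(perceived_fps(45, 60))   # 45
```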
FPS - Stands for both first person shooter and frames per second (people have trouble with the latter).
Frames per second is how often THE GAME OR MOVIE updates frame "movement" per second. Most movies are set to 24 fps, but cameras can capture up to thousands of frames per second. A 2d photograph captures one frame; most 3d camera setups capture two frames (one for each eye).
Games work differently. Depending on your hardware, software, driver, and in-game configurations, you can get wildly different frame rates. In almost all games, higher fps is better. A common misconception is that the Oculus or other displays slow down game performance (aka fps). Displays themselves have zero impact on performance; unplug your display during a game stress test and you should get identical results. What does affect performance is rendering resolution and output mode: higher resolutions and 3d output lower fps. There are exceptions, though. Running a console emulator at 1080p won't affect the fps, as only your display resolution is changed, not the in-game resolution. Another example is hardware 2d-to-3d converters, found in 3d displays/converter boxes. These simulate 3d within the display/box itself and, in theory, shouldn't affect game fps. Perceived frame rate (what you actually see) is determined by a display's refresh rate, video cable technology (HDMI, etc.), and of course game performance.
Latency (aka lag) - The time it takes one system to communicate with another, usually measured in milliseconds (ms). Lower is generally better.
Pre-warping - Means the image is rendered "warped" and then corrected by special lenses. This gives VR displays their immersive characteristics.
Last edited by Direlight on Sat Feb 02, 2013 2:49 am, edited 3 times in total.
- cybereality
- 3D Angel Eyes (Moderator)
- Posts: 11407
- Joined: Sat Apr 12, 2008 8:18 pm
Re: Clearing up misconceptions
Hey, welcome to the forum man!
Interesting topic, but I think you are preaching to the choir here. I'm pretty certain most people here know the difference between FPS and refresh rates.
- Binocular Vision CONFIRMED!
- Posts: 337
- Joined: Mon Jan 21, 2013 12:30 pm
Re: Clearing up misconceptions
Don't want to single anyone out, but some people in the Kickstarter thread were getting confused. Also, some things aren't very obvious or common knowledge. Your average person may not know what the parallax effect is, and certainly not things like pre-warping. I had to look that one up myself.
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: Clearing up misconceptions
A higher refresh rate means on average lower latency, however, as the screen will check more often whether there is a new frame to output.
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
Paladia wrote: A higher refresh rate means at average lower latency, however. As the screen will more often check if there is a new frame to output.

nope, it just means it will normally get the material (whatever its frame rate) on the screen in less time. 120hz tvs don't check for a new frame every tick. they just reduce flicker even more and allow stupid interpolation.
there are monitors that can accept 120fps over display port tho.
did you know that cinema is 72 hz but 24fps? well more accurately it was.
- Two Eyed Hopeful
- Posts: 65
- Joined: Tue Dec 25, 2012 7:20 pm
Re: Clearing up misconceptions
http://www.roadtovr.com/virtual-reality ... erminology - these guys have a glossary, perhaps you should suggest to them to add some of your definitions.
- Okta
- Golden Eyed Wiseman! (or woman!)
- Posts: 1515
- Joined: Tue Feb 12, 2008 5:22 am
Re: Clearing up misconceptions
TS, have you seen this misinformed youtube thread http://www.youtube.com/watch?v=qyPSsC_qf0Y that I have been trying to correct? What's amazing is that the video is from an Ouya rep, and he has no clue what he's talking about. The Rift has NO positional tracking and does not fill the entire view.
That wii tracking video is misrepresentative of the Rift dev kit and should not be included in any Rift representation at the moment. Positional tracking is intended for the consumer release, which at the moment is a forming idea that may still never happen if some disaster occurs; that's a big reason I jumped into the dev kit.
"I did not chip in ten grand to seed a first investment round to build value for a Facebook acquisition."
Notch on the FaceDisgrace buyout.
- Binocular Vision CONFIRMED!
- Posts: 337
- Joined: Mon Jan 21, 2013 12:30 pm
Re: Clearing up misconceptions
It's just showing the parallax effect, and it's relevant as it's confirmed for the commercial release. This is an Oculus forum, not developer kit only. You get parallax on any display; headtracking makes it stronger.
Oh yeah...
Thanks for the welcome!
Oh yeah...
Thanks for the welcome!
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: Clearing up misconceptions
@Direlight - Thanks for the contribution. While I think most of this is pretty common knowledge to the long-time guys on here, it's good for those who might be a bit new to this to have it put in simple terms.
Your description is pretty good, but for Latency, this could be a bit abstract.
I would add that:
When talking about the latency affecting an HMD, generally we are talking about the time between when a movement is made by the head and the time when the display shows the new position. Generally this is:
(1/Sensor refresh rate)+(1/Render Frame Rate)+(1/Display Refresh Rate)
The above isn't totally correct, since sometimes the frame on the display won't be totally in sync with the renderer, so the delay caused by the renderer can be doubled. Also, since new displays have a faster pixel update rate than the refresh rate, it is likely that the frame will be fully displayed in much less time than the full 1/refresh. But it can be used as an approximation.
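The approximation above can be turned into a quick calculator. This is only a rough sketch of the stated formula; the example rates are made up, and real pipelines add the buffering and sync delays already noted:

```python
def estimated_latency_ms(sensor_hz: float, render_fps: float, display_hz: float) -> float:
    """Approximate motion-to-photon latency: one period per pipeline stage."""
    seconds = 1.0 / sensor_hz + 1.0 / render_fps + 1.0 / display_hz
    return seconds * 1000.0

# Example: a 250 Hz tracker feeding a 60 fps renderer on a 60 Hz panel.
print(round(estimated_latency_ms(250, 60, 60), 1))  # 37.3
```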
@Okta - That really baffles me. It's obviously not what the rift will be like.
- Certif-Eyable!
- Posts: 1139
- Joined: Tue Sep 18, 2012 10:32 pm
Re: Clearing up misconceptions
Paladia wrote: A higher refresh rate means at average lower latency, however. As the screen will more often check if there is a new frame to output.

PasticheDonkey wrote: nope, it just means it will get the material (whatever its frame rate) on the screen in less time normally.

Actually yep. TVs and cinemas are strange, and talk about refresh rate differently (perhaps dishonestly) since they are usually dealing with material that has a limited fixed rate. But for everything else, e.g. the Oculus Rift, a higher refresh rate means a lower latency.
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: Clearing up misconceptions
2EyeGuy wrote: But for everything else, e.g. the Oculus Rift, a higher refresh rate means a lower latency.

I like the way the Oculus Rift is everything that isn't TV or cinema.
It's also likely to mean less blur due to faster switching of the pixels, but that's not a given. With cinema, it's also important to remember that a refresh rate higher than the frame rate is important for 3D, especially for panning shots: because the eyes are shown in sequence within a frame, if you only show each eye once per frame, panning shots will actually shift the perceived depth. I believe cinemas usually show one eye twice and the other once each frame, alternating which is shown twice for the next frame:
Frame 1 - LRL
Frame 2 - RLR
Frame 3 - LRL
...
It also reduces flicker.
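The alternating triple-flash pattern above is easy to sketch as a small generator (purely illustrative; actual projector systems vary):

```python
def triple_flash(num_film_frames: int) -> list[str]:
    """Flash each 24 fps film frame three times, alternating which eye leads."""
    # Even-numbered frames flash L R L, odd-numbered frames flash R L R,
    # so each eye gets its "double" showing on alternating frames.
    return ["LRL" if frame % 2 == 0 else "RLR" for frame in range(num_film_frames)]

print(triple_flash(4))  # ['LRL', 'RLR', 'LRL', 'RLR']
```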
- Sharp Eyed Eagle!
- Posts: 425
- Joined: Sat Dec 22, 2007 3:38 am
Re: Clearing up misconceptions
2EyeGuy wrote: But for everything else, e.g. the Oculus Rift, a higher refresh rate means a lower latency.

A theoretically lower latency. It is easily possible for a 60Hz display to have lower latency than a 120Hz (true updating rate) display, if the 120Hz display has a high processing latency.
And even then, there's the latency of the game itself. This may vary wildly; see The Tech Report's "Inside the Second" article.
Rendered Frames Per Second, display updating rate, display refresh rate, and frame latency (from input time to the result of that input appearing on the display) are related, but there is no hard relation between any of them.
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
more importantly most displays have limits based on the HDMI spec they are using. so even if they are 120hz and the game can run faster than 60fps, they still only show 60fps.
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
android78 wrote: I like the way the Oculus rift is everything that isn't TV or cinema.

it annoys me that cinema 3d has temporal sync problems. causes some weird artefacts at times that you won't get on the rift.
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: Clearing up misconceptions
PasticheDonkey wrote: more importantly most displays have limits based on the HDMI spec they are using.

Considering the Rift, like most PC monitors, uses DVI, that isn't an issue, as Dual Link DVI supports 120 Hz.
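The single-link vs dual-link difference comes down to pixel clock: single-link DVI tops out at 165 MHz, dual-link at roughly twice that. A back-of-envelope check (the ~20% blanking overhead is an assumed ballpark, not an exact timing standard):

```python
def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    blanking_overhead: float = 0.20) -> float:
    """Approximate pixel clock required, padding active pixels for blanking."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

SINGLE_LINK_DVI_MHZ = 165  # single-link TMDS clock limit

# 1080p at 120 Hz blows past single-link DVI; the dev kit's
# 1280x800 panel at 60 Hz fits with plenty of headroom.
print(round(pixel_clock_mhz(1920, 1080, 120)))  # 299
print(round(pixel_clock_mhz(1280, 800, 60)))    # 74
```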
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
but the screen it uses currently doesn't.
- Binocular Vision CONFIRMED!
- Posts: 337
- Joined: Mon Jan 21, 2013 12:30 pm
Re: Clearing up misconceptions
60 fps is pretty smooth regardless. Frame rate only starts to bother me under 30 fps. We need a high refresh rate for extended use, since the Oculus is so close to our eyes.
- Delryn
- Two Eyed Hopeful
- Posts: 99
- Joined: Fri Jan 11, 2013 10:35 am
Re: Clearing up misconceptions
One thing people don't mention much is convergence.
A common problem with 3d is that your eyes are converged on a screen that is showing 3d images both nearer and farther than the actual screen is.
I know the OR has your eyes focus and converge on infinity, but how does this pan out if something is shown inches from my face? My eyes usually aren't focused and converged at infinity if I'm reading a book.
GPU: NVidia GTX 680|CPU: Core i5-2400|RAM: 16gig|SSD: 120GB|HDD: 1.25TB|Liquid Cooled CPU and GPU
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: Clearing up misconceptions
PasticheDonkey wrote: but the screen it uses currently doesn't.

No, but Palmer has said he wants 120 Hz for the consumer version; he even said a higher refresh rate was one of the main reasons he is looking for a different board, so hopefully he'll manage that. John Carmack has also been pushing him for a higher refresh rate display.
There are also some health concerns with 60 Hz running an inch from your eye. I for one know that if I look at a low refresh rate screen for a longer period of time, my eyes start to physically hurt, and I can't believe that it would be good for the eyes in the long run.
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
Delryn wrote: One thing people don't mention much is convergence. A common problem with 3d is that your eyes are converged on a screen that is showing 3d images both nearer and farther than the actual screen is...

convergence is naturalistic in the rift's case; the focus isn't.
- Two Eyed Hopeful
- Posts: 85
- Joined: Mon Aug 13, 2012 5:55 pm
Re: Clearing up misconceptions
However, in real life people focus and converge at the same time, and call this whole process "focusing". There was one hands-on video where a guy said that one of the first things he noticed in the rift was that he could "focus" on closer and farther away objects just like in real life. Ok, so on a pedantic level this isn't correct, but in terms of qualitative experience it means that having convergence without focus can produce a pretty compelling illusion of having both.
- Randomoneh
- Binocular Vision CONFIRMED!
- Posts: 227
- Joined: Wed Oct 17, 2012 12:42 pm
Re: Clearing up misconceptions
Pyry wrote: However, in real life people focus and converge at the same time, and call this whole process "focusing"...

But you don't have natural convergence - eyes are almost always parallel, yes? Accommodation (what you call "focus") is also not natural. Eyes are always accommodated for infinity, yes?
This member owns things.
- Delryn
- Two Eyed Hopeful
- Posts: 99
- Joined: Fri Jan 11, 2013 10:35 am
Re: Clearing up misconceptions
The eyes will be converged and focused on infinity, so that is a natural focus. There's no accommodation there. However, I think the brain will conflict with the eyes when the brain sees something that it thinks is inches away even though the eyes are focused at infinity.
- Two Eyed Hopeful
- Posts: 79
- Joined: Fri Sep 28, 2012 1:44 am
Re: Clearing up misconceptions
Does the Rift focus your eyes at true infinity?
- Randomoneh
- Binocular Vision CONFIRMED!
- Posts: 227
- Joined: Wed Oct 17, 2012 12:42 pm
Re: Clearing up misconceptions
Delryn wrote: The eyes will be converged and focused on infinity, so that is a natural focus.

It would not be natural, since our convergence is parallel only when we're looking into the distance.
- Delryn
- Two Eyed Hopeful
- Posts: 99
- Joined: Fri Jan 11, 2013 10:35 am
Re: Clearing up misconceptions
Randomoneh wrote: It would not be natural since our usual convergence is parallel only when we're looking at distance.

That's what I mean by converged/focused on infinity. When we look at the horizon, we are converged and focused on infinity. The Rift's lenses and dual screens will let our eyes focus as if they were looking at the horizon.
My question is, what will be the effect on a person using the Rift while looking at an object perceived as near?
- Binocular Vision CONFIRMED!
- Posts: 209
- Joined: Tue Nov 23, 2010 5:18 pm
Re: Clearing up misconceptions
Delryn wrote: My question is, what will be the effect on a person using the Rift while looking at an object perceived as near?

It never feels near enough to actually change your focus, is the answer. The optics are set up so you have to focus on/near infinity to see the image clearly, so even if your focus changed, you'd no longer see it clearly.
Running my DIY headset this weekend and watching 3D videos, like the Thriller one where a zombie points a rifle at you and walks real close and another stabs at you with a sword, you get the feeling that they're close but not right in your face.
- Delryn
- Two Eyed Hopeful
- Posts: 99
- Joined: Fri Jan 11, 2013 10:35 am
Re: Clearing up misconceptions
German wrote: It never feels near enough to actually change your focus, is the answer. The optics are setup so you have to focus on/near infinity to see it clearly, so even if your focus changed, you'd no longer see the image clearly.

Interesting, thank you.
- One Eyed Hopeful
- Posts: 30
- Joined: Fri Nov 09, 2012 2:32 pm
Re: Clearing up misconceptions
Randomoneh wrote: But you don't have natural convergence - eyes are almost always parallel, yes?

No, for closer objects, due to 3D separation, won't your eyes naturally converge inwards?

Randomoneh wrote: Accomodation (what you call "focus") is also not natural. Eyes are always accomodated for infinity, yes?

Yes. I have one question/concern though.
In 3D TVs etc, the stereo images are overlapped on a single screen. This makes your eyes, by default, converge slightly inward. You can then create a "pop-out effect" or a "window effect" by changing the direction of object separation. Objects that overlap perfectly will appear at the same distance as the TV screen.
In the Rift however, your eyes do not converge at a single screen. So where do they put the "default" convergence? Also at infinity? If so, it means the Rift only sports a "pop-out" effect as the perceived depth of the screen is already at max distance (parallel eyes).
My concern then is, could a sloppy developer force my eyes to converge "negatively" (pointing outwards)?
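To put numbers on that worry: the vergence angle needed to fuse a point follows from simple geometry, and it falls to zero (parallel eyes) at infinity. An image separation wider than the viewer's IPD would demand a negative angle, i.e. diverging eyes, which they essentially cannot do. A sketch with an assumed 64 mm IPD:

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Total inward rotation of both eyes to fixate a point dead ahead."""
    # Each eye turns in by atan((ipd/2) / distance); the vergence
    # angle between the two lines of sight is twice that.
    return 2 * math.degrees(math.atan((ipd_m / 2) / distance_m))

# A finger at 25 cm demands strong convergence; 100 m is
# effectively infinity, with the eyes almost parallel.
print(round(vergence_angle_deg(0.064, 0.25), 1))   # 14.6
print(round(vergence_angle_deg(0.064, 100.0), 2))  # 0.04
```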
- Two Eyed Hopeful
- Posts: 85
- Joined: Mon Aug 13, 2012 5:55 pm
Re: Clearing up misconceptions
Randomoneh wrote: But you don't have natural convergence - eyes are almost always parallel, yes? Accomodation (what you call "focus") is also not natural. Eyes are always accomodated for infinity, yes?

Your eyes will be accommodated at infinity, or wherever the optics happen to project the screen, but you'll converge to whatever makes the two images align in a way that 'makes sense' to your brain. For example, in real life, if you hold your finger up to your face but 'focus' on the background, you'll see two images of your finger, and likewise if you 'focus' on your finger you'll see two overlaid images of the background, because you're changing the convergence of your eyes to either align the two eyes' images of the background, or of your finger. You can do the same thing with any stereo display, including the rift, because you can consciously control your convergence to align different parts of the two images.
- Diorama
- Binocular Vision CONFIRMED!
- Posts: 273
- Joined: Mon Jan 28, 2013 10:37 am
- Location: Brighton, UK (Sometimes London)
Re: Clearing up misconceptions
In the spirit of the thread title, can I just make sure I understand what is going on here?
What I think I have learned (presented as bullet points):
- The human eye uses two methods for working out the distance of objects/ mentally producing 3 dimensional images.
- One is the angle the eyes are pointed together (convergence?), with extreme examples being looking at your nose, where your eyes are angled/converged inward to see the close object, versus looking at stars in the night sky, where your eyes are effectively parallel.
- The other is the reshaping of the lens in your eyeball (focus) depending on the distance of the object, the depth of field effect that makes the background blurry if you focus on your close up finger.
- The Oculus simulates the first of these (convergence) with its stereoscopic pair of images, but can not simulate the focus/DoF effect as the physical screen is always the same distance from your eyes thus your lens does not re-adjust to view close or far objects in the Oculus world.
- Thus nearby objects will look nearby (an object 1 metre away will appear to be exactly a metre away) but without 100% perfect veracity as the focus change that a very close object requires in the real world will not be needed.
Right?
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
it doesn't need focus to tell distance, but it's used to having to focus when converging, because close things that you need to toe your eyes in to see also need to be focused, because they are closer. in the rift it'll be different than normal but may not be a problem.
we work out distance with stereoscopy, parallax, and perspective just making things of known size smaller in the distance.
- Two Eyed Hopeful
- Posts: 85
- Joined: Mon Aug 13, 2012 5:55 pm
Re: Clearing up misconceptions
That's about right, but with two subtleties:
When you look at your finger in front of your face, the background appears 'blurry' for two reasons. First, each eye has changed the shape of its lens (accommodated) to optically focus on the finger, which brings the background out of optical focus. Second, the image you perceive in your mind is the combination of both eyes, and so the background you perceive when you converge both eyes on your finger is actually a blend of two offset views of the background. The rift will produce this second type of 'blurring', but not the first.
The second subtlety is that the inability to actually optically focus won't make things look wrong, so much as it might make things feel wrong, since your optical focus (reshaping the lens) won't be where your brain expects it to be.
- PasticheDonkey
- Sharp Eyed Eagle!
- Posts: 450
- Joined: Sun Jan 06, 2013 4:54 am
Re: Clearing up misconceptions
your brain has to be able to cope with some incongruity tho because focus is slower than convergence.
- One Eyed Hopeful
- Posts: 30
- Joined: Fri Nov 09, 2012 2:32 pm
Re: Clearing up misconceptions
Diorama wrote: In the spirit of the thread title, can I just make sure I understand what is going on here?

I believe you're right, right, right, right and right.
![Smile :)](./images/smilies/icon_e_smile.gif)
- Convergence and focus are important, but they are only two of the many cues your brain uses to create a sensation of depth.
- True, there is no focus, but there is still "double vision".
-
- Petrif-Eyed
- Posts: 2708
- Joined: Sat Sep 01, 2012 10:47 pm
Re: Clearing up misconceptions
Besides focus (accommodation) and convergence, you also judge depth by parallax (subtle sideways head movement to look around objects). Focus and parallax are of paramount importance for those who do not have stereo vision (from blindness in one eye, from "lazy eye", or from neurological or developmental problems). I personally know several people who cannot see depth on my LG passive TV, so it must be a common problem.
Thankfully, the Rift's head tracking will provide parallax, but it would take holographic imaging (or eye-tracking adjustable lenses) to provide focal depth cues. Parallax alone can be very immersive, especially with a wide FoV. Stereopsis is mainly for nearby objects anyway.
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License. ![Image](http://i.creativecommons.org/l/by-sa/3.0/80x15.png)
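The point about head tracking providing parallax can be sketched numerically: shifting the virtual camera by the tracked head offset makes near objects move more across the view than far ones. A minimal sketch using a pinhole projection (the function names and constants are illustrative assumptions, not Rift SDK calls):

```python
# Sketch: how positional head tracking yields motion parallax.
# A tracked sideways head offset is applied to the virtual camera position;
# near points then shift more in the image than far points.

def project_x(point_x: float, point_z: float, cam_x: float) -> float:
    """Pinhole projection of a point's x coordinate onto an image plane
    at unit distance, for a camera displaced sideways by cam_x."""
    return (point_x - cam_x) / point_z

head_offset = 0.05  # metres of sideways head movement (illustrative)

near_shift = abs(project_x(0.0, 0.5, head_offset) - project_x(0.0, 0.5, 0.0))
far_shift = abs(project_x(0.0, 10.0, head_offset) - project_x(0.0, 10.0, 0.0))
print(near_shift > far_shift)  # near objects sweep across the view faster
```

This differential shift is the parallax depth cue, and it works with one eye closed, which is why it matters so much for viewers without stereo vision.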
-
- One Eyed Hopeful
- Posts: 30
- Joined: Fri Nov 09, 2012 2:32 pm
Re: Clearing up misconceptions
Re-reading my post on page 2, I realize it was maybe not the best explanation. I hope this makes it clearer:
![Image](http://i.imgur.com/xgPac6J.jpg)
To the left, a cross eyed view. No problem there.
Next, a typical 3DTV / 3D cinema configuration, still no problem as your eyes still have to converge inwards (and by a good margin) by default.
And lastly, parallel views as in the Oculus Rift. What concerns me are those red lines; What prevents a developer from forcing my eyes to converge outwards? Could it be harmful? Or will my eyes simply refuse to do it?
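The geometry behind those red lines can be stated as a simple rule: if a point's left and right projections are drawn further apart on the display than the viewer's interpupillary distance (IPD), the eyes would have to rotate outward to fuse them, which no real-world object ever demands. A minimal sketch, with an assumed typical IPD (the value and names are illustrative):

```python
# Sketch of the geometry in the image above: on-screen separation of
# corresponding points beyond the IPD forces the eyes to diverge.

IPD_M = 0.064  # typical interpupillary distance, ~64 mm (assumed)

def forces_divergence(screen_separation_m: float, ipd_m: float = IPD_M) -> bool:
    """True if a left/right image pair is drawn further apart than the IPD,
    a configuration no real-world object can produce."""
    return screen_separation_m > ipd_m

print(forces_divergence(0.05))  # False: normal convergence
print(forces_divergence(0.08))  # True: eyes would have to rotate outward
```

Separation exactly equal to the IPD corresponds to an object at infinity (parallel gaze); anything beyond it is the "outward" case the post asks about.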
Last edited by voliale on Fri Feb 01, 2013 1:53 pm, edited 1 time in total.
- Diorama
- Binocular Vision CONFIRMED!
- Posts: 273
- Joined: Mon Jan 28, 2013 10:37 am
- Location: Brighton, UK (Sometimes London)
Re: Clearing up misconceptions
I would imagine that there are two possibilities (excellent image by the way).
Either a) it's not a huge deal and therefore, well, it's not a huge deal,
or, more likely, b) it IS a huge deal and thus never appears in a game, because it is caught so early in development, or the SDK simply doesn't allow it.
Either way, I don't see it becoming a major problem (I wonder if it would be used intentionally as a 'freak out', say if your character is concussed/takes drugs?).
After all, what's to stop developers just not putting any lights in their games (lol Doom 3) or making all objects have the same texture? Nothing is actually stopping them, but it's stupid and would be caught very early in the process.
I hope. Because my eyes do not bend that way. Can't even see Magic Eye pics unless they are cross-eye. :(
-
- Two Eyed Hopeful
- Posts: 85
- Joined: Mon Aug 13, 2012 5:55 pm
Re: Clearing up misconceptions
There is no way the SDK could prevent negative convergence without injecting itself very deep into the rendering process**, and since the Rift just connects by HDMI and shows up as a regular display anyway, a developer could forgo the SDK altogether and directly display a negative-convergence pair of images.
However, the same is true of 'standard' 3D displays: it's entirely possible to produce negative convergence on (for example) a 3D monitor. There's been some speculation that it might be harmful long term, but it's also really uncomfortable, so if your 3D display is causing you discomfort, stop using it or adjust the settings.
** Even then a malicious developer could just put billboards in front of the two virtual cameras with a negative convergence image pair textured on them
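Even though the SDK can't enforce it, an engine can trivially guard against the problem itself. A minimal sketch of such a defensive clamp, under the assumption of a fixed IPD (the names and values are illustrative, not part of any real Oculus API):

```python
# Sketch: a clamp an engine (not the SDK) could apply so that no
# corresponding left/right image pair is ever drawn further apart than
# the viewer's IPD, which is the condition that forces divergence.

IPD_M = 0.064  # assumed interpupillary distance in metres

def clamp_separation(separation_m: float, ipd_m: float = IPD_M) -> float:
    """Limit the on-screen separation of a point's two projections so it
    never exceeds the IPD (i.e. never beyond optical infinity)."""
    return min(separation_m, ipd_m)

print(clamp_separation(0.05))  # within limits, unchanged
print(clamp_separation(0.09))  # clamped down to the IPD
```

A clamp like this caps separation at optical infinity, so the worst a buggy scene can demand is parallel gaze rather than outward rotation.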
-
- Binocular Vision CONFIRMED!
- Posts: 337
- Joined: Mon Jan 21, 2013 12:30 pm
Re: Clearing up misconceptions
Adding these.
"Depth of field" (aka focus) - can be simulated in the game engine. Objects not near the center will blur, Rift lenses naturally do this with the pixels as well.
Separation - between the lenses/images, is the main source of depth and distance. Parallax will enhance this when your moving. Too much separation and you'll get eye strain.
Convergence - really only noticed when your close to objects (at least with shutter glasses). This gives you "pop-out" effect, like it's really coming towards you. Too much and it will give you focus/eye strain issues.
"Depth of field" (aka focus) - can be simulated in the game engine. Objects not near the center will blur, Rift lenses naturally do this with the pixels as well.
Separation - between the lenses/images, is the main source of depth and distance. Parallax will enhance this when your moving. Too much separation and you'll get eye strain.
Convergence - really only noticed when your close to objects (at least with shutter glasses). This gives you "pop-out" effect, like it's really coming towards you. Too much and it will give you focus/eye strain issues.
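The interplay of separation and convergence above can be sketched with two virtual pinhole cameras: points nearer than the convergence plane get negative parallax ("pop-out"), points beyond it get positive parallax (behind the screen). All names and numbers here are illustrative assumptions, and toe-in is modelled as a simple image-space shift rather than a true camera rotation:

```python
# Sketch: separation and convergence with two virtual pinhole cameras.
# The sign of a point's left/right disparity determines whether it
# appears in front of ("pop-out") or behind the convergence plane.

def eye_projection_x(point_z: float, eye_x: float, convergence_z: float) -> float:
    """Project a point on the centre line (x=0, depth point_z) for an eye
    at sideways offset eye_x, with both views shifted so the convergence
    plane at convergence_z has zero disparity. Image plane at z = 1."""
    shift = eye_x / convergence_z       # re-centres the convergence plane
    return (0.0 - eye_x) / point_z + shift

sep = 0.064                             # camera separation (~IPD, assumed)
conv = 2.0                              # convergence plane at 2 m
left, right = -sep / 2, sep / 2

# Nearer than 2 m: negative disparity (pop-out). Beyond 2 m: positive.
near_disp = eye_projection_x(1.0, right, conv) - eye_projection_x(1.0, left, conv)
far_disp = eye_projection_x(4.0, right, conv) - eye_projection_x(4.0, left, conv)
print(near_disp < 0 < far_disp)
```

Increasing `sep` scales both disparities up (more depth, more potential eye strain), while pulling `conv` closer pushes more of the scene into the pop-out region, matching the trade-offs described above.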
Last edited by Direlight on Sat Feb 02, 2013 2:48 am, edited 1 time in total.