Oculus will be on stage at with Engadget at CES

EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

I'd put the calibration buttons inside the casing, requiring a screwdriver to access them. So often, I've seen people 'calibrate' their displays based on what they're used to seeing on their TVs. Unfortunately, almost every TV sold is set to defaults that are absolutely awful: massively oversaturated, contrast turned up far too high, dynamic brightness, edge-enhancement, anything that will make the display look 'punchy' next to others in a brightly lit showroom but awful for actually sitting down to watch something. Trying to match your HMD to this will only result in it looking worse.
Unlike CRTs, LCDs don't drift out of calibration to any significant degree (unless you've got a calibrated colour-matched professional graphics display that has to be absolutely spot-on), so setting the correct brightness and contrast at the factory should be sufficient, and it allows developers to work to a known target.
zeroxygen
Two Eyed Hopeful
Posts: 64
Joined: Sat Dec 29, 2012 6:08 pm

Re: Oculus will be on stage at with Engadget at CES

Post by zeroxygen »

For a non-consumer device I would suggest putting as many dials on it as possible. This one is for people who almost certainly know how to tweak standard monitor settings.
upsilandre
One Eyed Hopeful
Posts: 8
Joined: Wed Aug 08, 2012 8:49 am

Re: Oculus will be on stage at with Engadget at CES

Post by upsilandre »

zeroxygen
Two Eyed Hopeful
Posts: 64
Joined: Sat Dec 29, 2012 6:08 pm

Re: Oculus will be on stage at with Engadget at CES

Post by zeroxygen »

That should be a portion of the rendering that isn't observable with the headset on, in the gap between the warped views.
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK

Re: Oculus will be on stage at with Engadget at CES

Post by TheHolyChicken »

EdZ wrote:I'd put the calibration buttons inside the casing, requiring a screwdriver to access them. So often, I've seen people 'calibrate' their displays based on what they're used to seeing on their TVs. Unfortunately, almost every TV sold is set to defaults that are absolutely awful: massively oversaturated, contrast turned up far too high, dynamic brightness, edge-enhancement, anything that will make the display look 'punchy' next to others in a brightly lit showroom but awful for actually sitting down to watch something. Trying to match your HMD to this will only result in it looking worse.
Unlike CRTs, LCDs don't drift out of calibration to any significant degree (unless you've got a calibrated colour-matched professional graphics display that has to be absolutely spot-on), so setting the correct brightness and contrast at the factory should be sufficient, and it allows developers to work to a known target.
This sounds ideal to me for the consumer device. How many players do you think thought a game was a bit too dark, meddled with their monitor brightness, and then thought other games were washed out?

Take 'Amnesia: The Dark Descent', for example; it exposed extremely high-end settings that were just there 'for fun' (certain shadowing settings). Lo and behold, many users blindly overrode the recommended settings and then complained the game was "badly optimised" and had "crappy performance".

Given that the environmental viewing conditions will be IDENTICAL for everybody, it makes sense to me to pretty much lock it down.
Sometimes I sits and thinks, and sometimes I just sits.
crespo80
Binocular Vision CONFIRMED!
Posts: 314
Joined: Wed May 23, 2012 6:46 am

Re: Oculus will be on stage at with Engadget at CES

Post by crespo80 »

TheHolyChicken wrote: Given that the environmental viewing conditions will be IDENTICAL for everybody, it makes sense to me to pretty much lock it down.
Yeah, calibration makes sense for a standard monitor, which can be used in very different environments, light-wise.
For the same reason, I can understand calibration even on a "standard" HMD like the Vuzix or the Sony, because some ambient light enters and can modify the image.
But with the Rift, every single unit is viewed in the same environment, so the light conditions are exactly the same. There's no need for adjustments on the consumer unit (maybe on the dev kit, for testing purposes); just keep it simple.
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK

Re: Oculus will be on stage at with Engadget at CES

Post by TheHolyChicken »

A couple of further thoughts:

I'm reminded of when I was making Portal 2 maps using the easy fun map tool Valve released. One section of my map was quite dark but, as everything was still CLEARLY visible to me, I thought it was fine. I didn't give it a second thought. Fine on my big, bright, beautiful 27" monitor, that is... in the feedback I discovered, to my displeasure, that some players abandoned my map because there was a section that was "completely pitch black" or "impossible" etc. Should the experience be changed or diluted because some people have terrible screens? Many games are.

Have you ever looked at a Wii game and wondered why none of the GUI elements are right at the edge of the screen? They're always inset a little from the borders. It's actually one of the many requirements/restrictions for games on that platform: no important GUI element is permitted to be outside a certain area of screen space, due to the small number of players who would be plugging their Wii consoles into awful screens that would cut part of the image off.
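That "safe area" rule is easy to sketch. Here's a hypothetical helper (the 90% fraction is my assumption for illustration, not a documented Wii requirement) that computes the rectangle GUI elements must stay inside:

```python
def title_safe_rect(width, height, safe_fraction=0.9):
    """Return (left, top, right, bottom) of the central region that keeps
    GUI elements clear of overscan cropping on bad screens."""
    margin_x = round(width * (1 - safe_fraction) / 2)
    margin_y = round(height * (1 - safe_fraction) / 2)
    return (margin_x, margin_y, width - margin_x, height - margin_y)

# e.g. on a 640x480 output, GUI stays inside (32, 24, 608, 456)
```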

How many games have you played where they wanted you to attempt to calibrate the gamma? Wouldn't it be great if you could be guaranteed you're getting the experience the game devs desired? Having a fixed viewing experience is a great thing in this case - I would support those screen controls being removed or concealed.
Sometimes I sits and thinks, and sometimes I just sits.
2EyeGuy
Certif-Eyable!
Posts: 1139
Joined: Tue Sep 18, 2012 10:32 pm

Re: Oculus will be on stage at with Engadget at CES

Post by 2EyeGuy »

upsilandre wrote:What is it?
strange warping
It looks like it's rendering the scene as a single side-by-side 3D image before warping, and then probably rendering that image to the screen only once with a pixel shader that does the warping. The pixel shader takes in the coordinates of each pixel, then probably has a branch for which side of the screen the pixel is on, and scales the coordinates, then does a texture lookup at the new coordinates in the image. Where the coordinates are outside the edges of the image it returns black (you could change that behaviour by using clamp, BTW), but when the coordinates are within the image (even if now on the wrong side) then it uses the colour at that point. So when it renders the right edge of the left eye, some of the left edge of the right eye gets used instead of black.

It probably doesn't matter, since the lenses stretch out the corners of the image into a pincushion shape (undoing the warping), where they will probably be outside your field of view and you can't see them.
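The behaviour 2EyeGuy describes can be sketched in code. This is a guess at the logic, not the actual Oculus shader: the scale factor and eye centring are made up, and a plain 2D list stands in for the texture. The key point is that the fetch is done against the whole side-by-side image, so a left-eye edge pixel can land in the right-eye half instead of returning black.

```python
def warp_lookup(image, width, height, px, py, scale):
    """Per-pixel lookup for a side-by-side stereo image. Coordinates are
    scaled about the pixel's own eye centre, but the texture fetch runs
    against the WHOLE image, reproducing the cross-eye leak described."""
    eye_w = width // 2
    on_left = px < eye_w                      # branch on screen half
    cx = eye_w // 2 if on_left else eye_w + eye_w // 2
    cy = height // 2
    sx = int(cx + (px - cx) * scale)          # scale about the eye centre
    sy = int(cy + (py - cy) * scale)
    if 0 <= sx < width and 0 <= sy < height:
        return image[sy][sx]                  # may be the other eye's pixel!
    return 0                                  # outside the whole image: black
```

Clamping the scaled coordinates to the pixel's own half would remove both the black border and the cross-eye leak, which is roughly the clamp change suggested above.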
2EyeGuy
Certif-Eyable!
Posts: 1139
Joined: Tue Sep 18, 2012 10:32 pm

Re: Oculus will be on stage at with Engadget at CES

Post by 2EyeGuy »

TheHolyChicken wrote: I'm reminded of when I was making Portal 2 maps using the easy fun map tool Valve released.
That reminds me, I should be finishing my Style Changer mod for that easy fun map tool. It's a bad sign that people are now using the past tense, and my mod isn't finished. But the 1950s, 1960s, 1970s, and 1980s underground styles are finished and working.

https://dl.dropbox.com/u/101772879/Styl ... anTest.zip
marbas
Binocular Vision CONFIRMED!
Posts: 247
Joined: Sat Aug 11, 2012 4:41 am

Re: Oculus will be on stage at with Engadget at CES

Post by marbas »

Engadget wrote:We like virtual reality headsets, but we also refuse to accept any such headsets with displays below 7-inches in size.
We do?
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

Maybe it's just me, but I think the controls are a good thing - especially for something that sits so close to your eyes. I'm actually more sensitive to bright lighting than most people I know (I constantly squint in bright sunlight), so having the ability to control the settings is a good thing. Sure, you can sort of adjust it in software, but you'll never get the colors right without control over the backlight. What would be nice is a single "brightness" control that actually affects both brightness and contrast so that each one doesn't have to be set individually. This might not be a big deal for the dev-kit, but would definitely be a nice feature for the consumer version.
troffmo5
One Eyed Hopeful
Posts: 49
Joined: Mon Aug 06, 2012 4:50 am

Re: Oculus will be on stage at with Engadget at CES

Post by troffmo5 »

Looking at the first video in the Verge article, I noticed that the borders of the lenses are not flat.
Please tell me that inside there is a screw and you can rotate the lenses to adjust the distance from the screen!
distance.png
I think the mechanism of adjusting the IPD is almost clear :D
IPD.png
cold, warm or hot ;)
Mart
Two Eyed Hopeful
Posts: 79
Joined: Fri Sep 28, 2012 1:44 am

Re: Oculus will be on stage at with Engadget at CES

Post by Mart »

I understand the need to reduce the backlight brightness - I for one am very sensitive to bright light - but what's the point of allowing the end-user to reduce the contrast ratio of the display?
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

MSat wrote:Maybe it's just me, but I think the controls are a good thing - especially for something that sits so close to your eyes. I'm actually more sensitive to bright lighting than most people I know (I constantly squint in bright sunlight), so having the ability to control the settings is a good thing. Sure, you can sort of adjust it in software, but you'll never get the colors right without control over the backlight. What would be nice is a single "brightness" control that actually affects both brightness and contrast so that each one doesn't have to be set individually. This might not be a big deal for the dev-kit, but would definitely be a nice feature for the consumer version.
Due to a holdover from ancient TV systems (and poor labelling even then), 'brightness' and 'contrast' have nothing whatsoever to do with image brightness or contrast. 'Brightness' determines the level at which the darkest possible input will be emitted (i.e. if you turn 'brightness' up, black areas will become grey, and nothing will be black. If you turn it down, areas that should be grey will be black, and you will lose information in dark areas), and 'contrast' determines how bright the brightest input level will be (i.e. if you turn 'contrast' down, white will emit less light and you will diminish the available dynamic range. If you turn 'contrast' up, light grey areas will be displayed as completely white, and you will lose information in bright areas). Neither actually affects the backlight level, which is a separate setting.
The optimum setup (outside of the TV world, which has yet more crazy holdovers) is to have an input of 0 produce the least amount of light physically possible with your display technology, an input of 255 produce the desired maximum brightness (usually measured as light emitted per unit panel area, but for the Rift that will probably be light emitted per unit solid angle), and every step in between to be a distinct change in output. If you want things darker than this, you can modify them in software equally as well as (and probably better than) you can in hardware. 255 should already be the brightest acceptable emission level, so making things even brighter shouldn't be allowed, to prevent eye strain. If you instead made 255 the brightest possible level emitted by the display, you would then rely on game developers to ensure the safe/comfortable light limit is not breached. Enforcing this limit in software alone not only invites workarounds, but makes it hard to emit a known desired brightness.
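EdZ's description of the two controls can be put in a toy model. This is a sketch with made-up numbers, not any real panel's transfer function: 'brightness' shifts the black level, 'contrast' scales the white level, and neither touches the backlight.

```python
def display_output(value, brightness=0.0, contrast=1.0):
    """Fraction of maximum light emitted for an 8-bit input 'value'.
    'brightness' shifts the black level, 'contrast' scales the white
    level; the backlight would be a separate multiplier on top."""
    v = brightness + contrast * (value / 255.0)
    return max(0.0, min(1.0, v))      # crushed shadows / clipped highlights

display_output(0, brightness=0.1)     # brightness up: black turns grey (0.1)
display_output(40, brightness=-0.2)   # brightness down: shadow detail crushed to 0.0
display_output(255, contrast=0.8)     # contrast down: white emits less light (0.8)
display_output(230, contrast=1.2)     # contrast up: light grey clips to white (1.0)
```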
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

The one-size-fits-all mentality is what brought us Crysis 2's post-processing 3D, needlessly locking out 3D Vision, and is what brought us BF3's blue tint (not to mention Deus Ex's yellow tint and MechWarrior's brown tint) over everything, regardless of the [opposite] opinions of so many customers. I always turn the backlight on my displays up as high as it will go, and I certainly don't get eye strain; in fact I like to feel the hit of bright light when stepping into a bright virtual outdoor area. I like realistic brightness levels that look life-like, where the light rays bounce off everything, filling in the shadows as they do in real life, to enhance immersion, and I run from plasma TVs for related reasons. Many games don't render realistically, like Fable 3, which I tried to play through a bit just last week. In it, I turned the brightness down halfway in some appropriate areas.

BTW, am I the only one who rears back and says "HUH?" when people mention following the intentions of the directors/devs? If I keep having to follow the intentions of the current directors/devs, I may not be an active gamer/movie-goer for much longer. I already have to break too many "rules" to enjoy games the way I wish to, as the believability of games, and thus the immersion they provide, is not keeping up with my age curve, so to speak.

On another front, I recently spent 6 hours preparing to play Crysis 2 for the first time. I had to disable the forced post-processing AA that blurred the entire screen, enable [via hack] some true AA using SGSSAA, and then turn on some sharpening on my TV; the results were absolutely stunning, genuinely amazing (and not intended). In Metro 2033 the walk speed is a flat-out sprint with no acceleration, so I crouch-walk, as I do in many games (HL2, etc.), sometimes to get the best immersion, even though it might not make sense to you. I do many other things like this to fit my personal preference, as I assume many people do. Ever play a BF match giving yourself one life to live, putting your life first? You won't help your team much, but it sure is intense! That's a good example of ignoring the rules to create an experience that I personally desire.

BTW, there's no reason the controls couldn't have a firm detent at the default position, with some bright arrows to indicate the default level.
crespo80
Binocular Vision CONFIRMED!
Posts: 314
Joined: Wed May 23, 2012 6:46 am

Re: Oculus will be on stage at with Engadget at CES

Post by crespo80 »

Anyway, I'd forgotten that the dev Rift's control box already has contrast and brightness controls :P

http://www.roadtovr.com/files/2012/11/o ... or-box.jpg
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

EdZ wrote:
MSat wrote:Maybe it's just me, but I think the controls are a good thing - especially for something that sits so close to your eyes. I'm actually more sensitive to bright lighting than most people I know (I constantly squint in bright sunlight), so having the ability to control the settings is a good thing. Sure, you can sort of adjust it in software, but you'll never get the colors right without control over the backlight. What would be nice is a single "brightness" control that actually affects both brightness and contrast so that each one doesn't have to be set individually. This might not be a big deal for the dev-kit, but would definitely be a nice feature for the consumer version.
Due to a holdover from ancient TV systems (and poor labelling even then), 'brightness' and 'contrast' have nothing whatsoever to do with image brightness or contrast. 'Brightness' determines the level at which the darkest possible input will be emitted (i.e. if you turn 'brightness' up, black areas will become grey, and nothing will be black. If you turn it down, areas that should be grey will be black, and you will lose information in dark areas), and 'contrast' determines how bright the brightest input level will be (i.e. if you turn 'contrast' down, white will emit less light and you will diminish the available dynamic range. If you turn 'contrast' up, light grey areas will be displayed as completely white, and you will lose information in bright areas). Neither actually affects the backlight level, which is a separate setting.
The optimum setup (outside of the TV world, which has yet more crazy holdovers) is to have an input of 0 produce the least amount of light physically possible with your display technology, an input of 255 produce the desired maximum brightness (usually measured as light emitted per unit panel area, but for the Rift that will probably be light emitted per unit solid angle), and every step in between to be a distinct change in output. If you want things darker than this, you can modify them in software equally as well as (and probably better than) you can in hardware. 255 should already be the brightest acceptable emission level, so making things even brighter shouldn't be allowed, to prevent eye strain. If you instead made 255 the brightest possible level emitted by the display, you would then rely on game developers to ensure the safe/comfortable light limit is not breached. Enforcing this limit in software alone not only invites workarounds, but makes it hard to emit a known desired brightness.
I've done a bit more reading on this subject, and what I found leads me to believe that what you said only applies to CRTs, and perhaps video card driver settings. For LCDs, however, the "brightness" control is what the name would seem to imply, that is, the output level of the LCD backlight; likewise, the contrast sets the ratio between minimum and maximum pixel light emission. That leads me to believe that a contrast control is unnecessary, at least on the display side.

One issue that I would like to bring up is that LCDs (at least those for mobile; not sure about computer and TV LCDs) generally aren't full 24-bit, but rather 15 to 18-bit, so that can be a particular problem for content that is overwhelmingly dark or bright, as well as content that uses minor variances in hue (e.g. Hawken). In such cases, it might be beneficial to be able to exaggerate the contrast in order to preserve details that might otherwise get lost. While this limits the overall dynamic range, one solution to preserve some semblance of it may be to use a dynamic color look-up table limited to the LCD's actual bit-depth, so that dark indoor areas might use a particular LUT and a bright outdoor area could use another. Better still, the software could take panel brightness into account as well and apply further correction to the LUTs.
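The dynamic-LUT idea could be sketched like this. The scene ranges are invented for illustration, and the 64 levels assume a 6-bit panel; the point is simply to spend all of the panel's native levels on the range the current scene actually occupies.

```python
def build_lut(scene_min, scene_max, panel_levels=64):
    """Build a 256-entry table mapping 8-bit inputs onto the panel's
    native levels, spending all of them on the scene's actual range."""
    span = max(1, scene_max - scene_min)
    lut = []
    for v in range(256):
        t = max(0.0, min(1.0, (v - scene_min) / span))
        lut.append(round(t * (panel_levels - 1)))
    return lut

# A dark indoor scene living in inputs 0..63 gets all 64 panel levels,
# where naive truncation (v >> 2) would leave it only 16 distinct levels.
dark_lut = build_lut(0, 63)
```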
Flassan
Cross Eyed!
Posts: 151
Joined: Wed Apr 18, 2012 3:27 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Flassan »

The display settings should NOT be subjectively adjusted for personal taste. They should faithfully reflect the way the material is intended to look.
The enclosed nature of the Rift design means they have absolute control, and that could be a major advantage. Normally a producer has no idea how much ambient light the viewer is watching in, its colour temperature, or how the monitor is adjusted. The scientific way to check this is by using television test signals such as http://en.wikipedia.org/wiki/Test_Card_F
For instance, the flesh tones of the girl should be the right hue and saturation, and the little grey dot in the middle of the black rectangle in the greyscale to her left should be adjusted using brightness until it is just visible.
Testcard_F.jpg
The engineering terms for brightness and contrast are Lift and Gain. Lift is the black level and Gain is the signal amplitude, and in a TV studio they are always adjusted with the help of a waveform monitor (a kind of oscilloscope). Of course, game developers should always view their material under operational lighting to check it looks OK as it leaves them :D
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

MSat wrote:I've done a bit more reading on this subject, and from what I found leads me to believe that what you said only applies to CRTs, and perhaps video card driver settings. For LCDs however, the "brightness" control is as the name would seem to imply, that is, output level of the LCD backlight, likewise the contrast sets the ratio between minimum and maximum pixel light emission.
This is unfortunately NOT the case, causing no end of infuriation to anyone trying to set up an LCD or PDP (or even a CRT outside of the broadcast world). Some LCDs may have a separate backlight level, some may just not allow modification of the backlight level, some may link it to 'contrast' (white level) and in the worst case the 'brightness' (actually black level) setting will change both the black level AND the backlight level, making it completely impossible to acceptably set both!
One issue that I would like to bring up is that LCDs (at least those for mobile; not sure about computer and TV LCDs) generally aren't full 24-bit, but rather 15 to 18-bit, so that can be a particular problem for content that is overwhelmingly dark or bright, as well as content that uses minor variances in hue (e.g. Hawken). In such cases, it might be beneficial to be able to exaggerate the contrast in order to preserve details that might otherwise get lost. While this limits the overall dynamic range, one solution to preserve some semblance of it may be to use a dynamic color look-up table limited to the LCD's actual bit-depth, so that dark indoor areas might use a particular LUT and a bright outdoor area could use another.
The 8-bit -> 6-bit conversion is done via dithering and/or modulation in the LCD hardware itself. You won't be able to drive the display from the computer end at a high enough refresh rate to computationally replicate the same or better effect, with the possible exception of a slightly nicer dithering algorithm. In either case, blowing out the gamma (what most people from the photography world think of as contrast) won't net you any more detail from a 6-bit panel; it'll just result in a less accurate output.
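For reference, the hardware-side 8-bit to 6-bit conversion EdZ describes can be approximated with a tiny ordered dither. This is a sketch only; real panel controllers use their own matrices and/or temporal modulation.

```python
BAYER_2X2 = [[0, 2],
             [3, 1]]  # thresholds compared against the two discarded bits

def dither_to_6bit(value, x, y):
    """Quantise an 8-bit value to 6 bits, using the pixel position to
    decide whether to round up, so the average over a 2x2 neighbourhood
    preserves the original value."""
    low, frac = value >> 2, value & 0b11
    if frac > BAYER_2X2[y % 2][x % 2]:
        low = min(63, low + 1)
    return low

# Over a 2x2 block the four 6-bit outputs sum back to the 8-bit input:
# e.g. 129 -> [33, 32, 32, 32], which sums to 129.
```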
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

EdZ wrote:This is unfortunately NOT the case, causing no end of infuriation to anyone trying to set up an LCD or PDP (or even a CRT outside of the broadcast world). Some LCDs may have a separate backlight level, some may just not allow modification of the backlight level, some may link it to 'contrast' (white level) and in the worst case the 'brightness' (actually black level) setting will change both the black level AND the backlight level, making it completely impossible to acceptably set both!
The more I looked into it, the more my assumption was verified. From the various sources I've seen, here are some of the best:

http://www.spearsandmunsil.com/articles ... ntrol.html

"One question we get a lot is, “why does the brightness control need to be calibrated? Isn’t there a standard voltage or code value for black that can be locked in at the factory?” This isn’t a dumb question at all. Modern computer monitors almost never need brightness calibration, and a lot of LCD monitors either don’t have a control labeled “brightness” or have one that controls the backlight brightness, which is a completely different adjustment. Essentially modern computer monitors assume that the video card is going to produce a consistent signal that is exactly to spec. This is a pretty good assumption – in fact video cards do generally produce proper and consistent signals that are exactly (or very nearly exactly) to spec."

And another from http://www.lagom.nl/lcd-test/black.php

"Use the contrast setting, and maybe gamma to improve the display of the darker squares, but watch out for undoing the optimizations in the earlier images. On most LCD monitors, the brightness setting only affects the backlight, but doesn't affect the test images otherwise."

Though I have seen some sources that mention as you said and have backlight controls linked to black-level controls, particularly once the backlight is dimmed beyond a certain point.

At any rate, the Rift could have a strict backlight-only control and do away with any legacy "brightness" and "contrast" controls.
The 8-bit -> 6-bit conversion is done via dithering and/or modulation in the LCD hardware itself. You won't be able to drive the display from the computer end at a high enough refresh rate to computationally replicate the same or better effect, with the possible exception of a slightly nicer dithering algorithm. In either case, blowing out the gamma (what most people from the photography world think of as contrast) won't net you any more detail from a 6-bit panel; it'll just result in a less accurate output.
I understand that dithering is generally handled on the LCD side (at the DVI/HDMI/etc. -> LCD panel interface, to be specific), but that wasn't what I was talking about. Speaking of dithering, though, I think in reality it would be detrimental to overall picture quality considering the relatively low angular resolution of the dev kit (and perhaps even a 1080p unit), which would make any implementation of dithering much more apparent, reducing its effect, and probably do little else than introduce significant noise to the image. What I was suggesting was not using dithering at all, but rather working within the constraints of the panel's native bit-depth by eliminating extensive use of subtle hue variations, which would be impossible to resolve and would make the scene look washed out. The point is to get the most out of the hardware, and since dithering is unlikely to help, why not at least sharpen the image as much as possible?
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

Flassan wrote:The display settings should NOT be subjectively adjusted for personal taste. They should faithfully reflect the way the material is intended to look.
Let's hope game devs who never intended their games to be used with an Oculus Rift have a more respectful attitude toward other people's personal tastes than you do.

Are you working on Windows 8 by chance? :D

That said: what I really hope for is backlight and white-level adjustability. I don't modify brightness much outside of compensating for dimming 3D glasses.
Flassan
Cross Eyed!
Posts: 151
Joined: Wed Apr 18, 2012 3:27 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Flassan »

Oh, I didn't mean to cause offence. Reading it back, I guess it was a bit blunt. Sorry about that.
Just trying to pass on information that some may find useful. Lift and gain are interrelated, so increasing the gain causes the blacks to become grey, which reduces the dynamic range.
I haven't tried Windows 8 programming yet but when I do I'll know my place :D
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

MSat wrote:"One question we get a lot is, “why does the brightness control need to be calibrated? Isn’t there a standard voltage or code value for black that can be locked in at the factory?” This isn’t a dumb question at all. Modern computer monitors almost never need brightness calibration, and a lot of LCD monitors either don’t have a control labeled “brightness” or have one that controls the backlight brightness, which is a completely different adjustment. Essentially modern computer monitors assume that the video card is going to produce a consistent signal that is exactly to spec. This is a pretty good assumption – in fact video cards do generally produce proper and consistent signals that are exactly (or very nearly exactly) to spec."
And from the paragraph previous to that one (emphasis mine):

"The name “brightness” is really not a very good one. Professional video engineers refer to “black level,” which is more descriptive. What you’re really setting is the input level that the display will consider absolute black. If the input is analog (such as component or VGA), then you’re setting a voltage level that the display will consider black. If the input is digital (such as DVI or HDMI), then you’re setting a digital value that will be considered black."


If you check almost any LCD display (or TV), you will find a brightness and contrast setting. It is completely pot luck as to which control, if either, will change the backlight level. You're entirely reliant on what the display controller manufacturer thinks the setting should do, and this decision seems to usually be entirely arbitrary, as does the choice of default values.
PasticheDonkey
Sharp Eyed Eagle!
Posts: 450
Joined: Sun Jan 06, 2013 4:54 am

Re: Oculus will be on stage at with Engadget at CES

Post by PasticheDonkey »

Backlight level is normally under the eco settings.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

EdZ wrote:
MSat wrote:"One question we get a lot is, “why does the brightness control need to be calibrated? Isn’t there a standard voltage or code value for black that can be locked in at the factory?” This isn’t a dumb question at all. Modern computer monitors almost never need brightness calibration, and a lot of LCD monitors either don’t have a control labeled “brightness” or have one that controls the backlight brightness, which is a completely different adjustment. Essentially modern computer monitors assume that the video card is going to produce a consistent signal that is exactly to spec. This is a pretty good assumption – in fact video cards do generally produce proper and consistent signals that are exactly (or very nearly exactly) to spec."
And from the paragraph previous to that one (emphasis mine):

"The name “brightness” is really not a very good one. Professional video engineers refer to “black level,” which is more descriptive. What you’re really setting is the input level that the display will consider absolute black. If the input is analog (such as component or VGA), then you’re setting a voltage level that the display will consider black. If the input is digital (such as DVI or HDMI), then you’re setting a digital value that will be considered black."


If you check almost any LCD display (or TV), you will find a brightness and contrast setting. It is completely pot luck as to which control, if either, will change the backlight level. You're entirely reliant on what the display controller manufacturer thinks the setting should do, and this decision seems to usually be entirely arbitrary, as does the choice of default values.

I certainly don't mean to argue about the definition of "brightness" and "contrast" as it pertains to video displays; as I see it, that's irrelevant to the discussion. In order to avoid further confusion, or a misuse of terms, I'll refer to backlight control as "backlight intensity" or perhaps "backlight luminance", which should make the meaning quite clear. The ratio between the minimum and maximum possible luminance of a pixel for any given backlight intensity could be called the "contrast ratio". All I'm saying is that the Rift should indeed have a "backlight intensity" control, while a "contrast ratio" control might not be necessary.

As I respect your opinion (as well as those of many other members of this forum), I would like to know your thoughts on the possible issues I mentioned with using dithering. Am I on the right track in thinking that, given the low angular resolution, it would not enhance the perceived colors but rather just introduce noise to the image?
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

MSat wrote:As I respect your opinion (as well as those of many other members of this forum), I would like to know your thoughts on the possible issues I mentioned with using dithering. Am I on the right track in thinking that, given the low angular resolution, it would not enhance the perceived colors but rather just introduce noise to the image?
Dithering is a perceptual effect, so it's hard to tell without a Rift in front of you whether the pixel size would be too great to avoid it just looking like chroma noise. A non-static dithering pattern would probably be unnoticeable except in overall very dark scenes. In this regard, on-board dithering from a 24-bit input and software dithering with a native 18-bit input would give similar results.
However, there's also FRC (or 'temporal dithering'), which is harder to replicate effectively in software (the colour change frequency is ideally several times the panel refresh rate). In this case, providing a 24-bit input would produce superior results to providing an 18-bit input.
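As a rough illustration of the two approaches, per colour channel (a simplified sketch, not any panel's actual algorithm; real FRC sequences are considerably more elaborate):

```python
# 2x2 Bayer threshold matrix, normalised to [0, 1)
BAYER2 = [[0.00, 0.50],
          [0.75, 0.25]]

def dither_8_to_6(value, x, y):
    """Spatial ordered dithering of one 8-bit channel value down to 6 bits.

    The threshold at each pixel position decides whether to round the
    discarded fraction up or down, so neighbouring pixels average out
    to approximate the original level (at the cost of a noise pattern).
    """
    level = value / 255.0 * 63.0                 # ideal 6-bit level
    threshold = BAYER2[y % 2][x % 2]
    q = int(level) + (1 if level - int(level) > threshold else 0)
    return min(q, 63)

def frc_8_to_6(value, frame):
    """Temporal (FRC) variant: flip between adjacent 6-bit levels across
    refreshes so the time-average approximates the 8-bit level."""
    level = value / 255.0 * 63.0
    frac = level - int(level)
    # Duty cycle over a 4-frame sequence approximates the fraction.
    return int(level) + (1 if (frame % 4) < round(frac * 4) else 0)

# Mid-grey 128/255 sits at 6-bit level ~31.6: spatial dithering quantises
# some pixels to 31 and their neighbours to 32, averaging out over the tile.
samples = [dither_8_to_6(128, x, y) for y in range(2) for x in range(2)]
```

The spatial version is easy to do in software; the temporal version needs the flipping to happen faster than the output refresh rate, which is why it's best done in the panel electronics.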
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

EdZ wrote:Dithering is a perceptual effect, so it's hard to tell without a Rift in front of you whether the pixel size would be too great to avoid it just looking like chroma noise. A non-static dithering pattern would probably be unnoticeable except in overall very dark scenes. In this regard, on-board dithering from a 24-bit input and software dithering with a native 18-bit input would give similar results.
However, there's also FRC (or 'temporal dithering'), which is harder to replicate effectively in software (the colour change frequency is ideally several times the panel refresh rate). In this case, providing a 24-bit input would produce superior results to providing an 18-bit input.
I took a look at a bunch of panel specs on panelook.com and found plenty of native 24-bit or 18-bit+FRC units, so I guess my point is moot. I had jumped the gun because I was unaware of their existence.
PalmerTech
Golden Eyed Wiseman! (or woman!)
Posts: 1644
Joined: Fri Aug 21, 2009 9:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by PalmerTech »

Locking the controls makes sense for a consumer version, since it can be calibrated perfectly. This is a developer kit, though, so giving people as much control as possible makes a lot of sense.

On top of that, we currently have more pressing things to work on than getting a perfectly calibrated profile for this display. Even if we did, not every computer is the same: people often adjust gamma and brightness in their graphics output settings, and Windows, OSX, and Linux all vary somewhat in output.
User avatar
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

I think it's good to be cautious with calibration. One calibration result I read about had the backlight all the way down. The "Cinema" TV picture preset with the "Warm 2" color profile (as it's sometimes called) is the closest preset to a calibrated TV result, according to many of the 3DTV reviews I've read. In my opinion, not only is that preset brown and ugly, it's too dark: it doesn't look at all realistic in daylight scenery, and it ruins any chance of bright lights having a proper luminous value/feeling amidst a dark background.
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

PalmerTech wrote:Locking the controls makes sense for a consumer version, since it can be calibrated perfectly. This is a developer kit, though, so giving people as much control as possible makes a lot of sense.
Can I suggest including maybe a leaflet (or readme on the documentation CD or whatever) with the recommended settings (brightness etc)? Having some developers working to one setup and some to another would make it a pain in the behind to get everything looking the same for everyone (even if not 'correct' for everyone), and proper display calibration can be confusing and arcane even for the well informed.
The "cinema" TV picture preset setting with the "warm 2" color profile as its sometimes called, is the closest preset to a calibrated TV result according to many of the 3DTV reviews i've read. In my opinion not only is that preset brown and ugly, its too dark and doesn't look at all realistic in daylight scenery and ruins any chance of a proper luminant value/feeling bright lights have amist a dark background.
Try giving your eyes time to adjust. After using the proper settings (generally just the 'Cinema' preset, or 'THX' if available) for a while and then switching back to the 'showroom' defaults, they will appear very washed out, far too bright and far too blue.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Oculus will be on stage at with Engadget at CES

Post by MSat »

EdZ wrote:Can I suggest including maybe a leaflet (or readme on the documentation CD or whatever) with the recommended settings (brightness etc)? Having some developers working to one setup and some to another would make it a pain in the behind to get everything looking the same for everyone (even if not 'correct' for everyone), and proper display calibration can be confusing and arcane even for the well informed.
I wonder if they would be able to implement this with the interface chip they'll be using. Based on the images we've seen of the control box, the levels are set using +/- buttons, so the values would have to be displayed via an OSD. I don't know how customizable this functionality is in the firmware, but I agree, it's a valid concern.
User avatar
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

EdZ wrote:Try giving your eyes time to adjust. After using the proper settings (generally just the 'Cinema' preset, or 'THX' if available) for a while and then switching back to the 'showroom' defaults, they will appear very washed out, far too bright, and far too blue.
I do often let my eyes adjust when changing settings, and I try to keep in check any effect preconceived notions might have. Luckily my 3DTV has a neutral color setting, which I use since it is my computer monitor. When I switched to my 3DTV from my 3007WFP, I noticed the colors were essentially the same with this setting. Again, my main concern is a backlight adjustment being available, not so much brightness or contrast. However, having convenient brightness/contrast adjustments might be nice for extra control over the liquid crystals, and to help dim the display when using the Rift at night just before going to bed. I, for example, still wear sunglasses at night with my 3DTV's backlight all the way down, largely because of its huge FOV from viewing it at ~1 meter. Otherwise, I have little chance of feeling tired near my bedtime.

Here are some quick examples of scenery in games where I think having the backlight at maximum lends the most realistic look and feel. What I usually do is picture in my head a similar setting I've experienced in real life and compare it.
Image
Image

Perhaps compare with: Hiking on a sunny day google search: https://www.google.com/search?q=hiking+ ... 8&bih=1007

For the sake of thoroughness [and what the hell...], I came across a good example of realistic daytime rendering vs. rendering shadows unrealistically dark, which I mentioned before. I find brightness controls can help to a [small] degree in some games.
Realistic:
Image
vs.
Image
Last edited by Libertine on Tue Jan 15, 2013 5:19 pm, edited 1 time in total.
EdZ
Sharp Eyed Eagle!
Posts: 425
Joined: Sat Dec 22, 2007 3:38 am

Re: Oculus will be on stage at with Engadget at CES

Post by EdZ »

Libertine wrote: Luckily my 3dtv has a neutral color setting, which i use since it is my computer monitor.
Ah-hah! You've hit on one of the weirdnesses of video vs. computers: they don't quite use the same colourspace (heck, SD and HD don't!), and they define black and white points at different levels (0-255 for computers, 16-235 for video). With the same settings, what looks correct for a computer will look wrong for video, and vice-versa.
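As a rough sketch of the two level conventions (illustrative only; real drivers also convert between the different colourspace matrices, which this ignores):

```python
def full_to_video(code):
    """Map a full-range (0-255) PC level to a limited-range (16-235)
    video level: black 0 -> 16, white 255 -> 235."""
    return round(16 + code * (235 - 16) / 255)

def video_to_full(code):
    """Map a limited-range video level back to full range, clipping the
    below-black (<16) footroom and above-white (>235) headroom."""
    code = min(max(code, 16), 235)
    return round((code - 16) * 255 / (235 - 16))

# Mixing up the conventions is the classic failure: video black (16)
# shown on a full-range display looks dark grey, while PC black (0)
# interpreted as video gets crushed below black.
grey_black = video_to_full(full_to_video(0))   # round-trips back to 0
```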
Image
On my display, the 'pitch black' shadow is far from it (foliage within the shadow is discernible), but it is noticeably darker than in the photo. However, the photo is overall much lighter than the screenshot, so it is not an apples-to-apples comparison. There are so many other variables (e.g. exposure) that it is not really a valid comparison.
Image
User avatar
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

I didn't actually create the tower comparison. I don't think it's the best example they could have used (I see much darker rendering in some games), but the BF3 shot did provide a decent illustration of the difference from the Crysis 2 shot above it, which I do think looks very representative of reality, though not perfect either. In the BF3 shot, you can tell where the sun is from the shadow of the tower, and I would think the left sides of the gun should realistically be lighter than that.
User avatar
Libertine
Binocular Vision CONFIRMED!
Posts: 204
Joined: Wed Jan 11, 2012 1:06 pm

Re: Oculus will be on stage at with Engadget at CES

Post by Libertine »

Flassan wrote:oh, I didn't mean to cause offence. Reading it back I guess it was a bit blunt. Sorry about that.
Just trying to pass on information that some may find useful. Lift and gain are interrelated so increasing the gain causes the blacks to become grey which reduces the dynamic range.
I haven't tried Windows 8 programming yet but when I do I'll know my place :D
Oops, sorry, my post also contained more bite than I intended. I did find your post informative, thanks.
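For reference, here's a rough sketch of the lift/gain model as I understand it (illustrative only, with levels normalised to 0.0-1.0; real controls interact with gamma as well):

```python
def lift_gain(code, lift=0.0, gain=1.0):
    """Classic video level adjustment: 'lift' offsets the whole signal
    (raising the black level toward grey) and 'gain' scales it (setting
    the white level). Output is clipped to the displayable 0.0-1.0 range.
    """
    out = code * gain + lift
    return min(max(out, 0.0), 1.0)

# Raising lift turns true black into grey, shrinking the dynamic range:
grey = lift_gain(0.0, lift=0.1)        # black is now 0.1
# Gain pushed past white just clips, also compressing the range:
clipped = lift_gain(1.0, gain=1.2)     # clipped at 1.0
```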

EDIT: oops, should have edited my last post, can't delete this though.
Post Reply

Return to “Oculus VR”