AMD Demonstrates "FreeSync", Free G-Sync Alternative, at CES

MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by MSat »

Maybe I've missed something (certainly possible), but I haven't heard any evidence to suggest that they're using something like tweening. The demos they've been using on the Crystal Cove hardware are most likely running at the frame rate necessary to reduce flicker associated with low persistence displays.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

MSat wrote:Maybe I've missed something (certainly possible), but I haven't heard any evidence to suggest that they're using something like tweening. The demos they've been using on the Crystal Cove hardware are most likely running at the frame rate necessary to reduce flicker associated with low persistence displays.
This thread is not about the Crystal Cove, per se, but about using FreeSync or G-Sync to more accurately control or predict when the display content will be visible, and to (perhaps) warp or tween the screen content to match where your eye wants to see it at that time.

If you can predict when a frame's render will complete (which really depends on scene complexity), you could certainly benefit from warping it if it completes too far ahead of the next display update time.
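A rough sketch of that decision in C (every helper here is a hypothetical placeholder, not a real SDK call):

/* Sketch: if a frame finishes well before the next display update,
 * re-warp it toward the pose predicted for the moment it will actually
 * be seen. All helpers are hypothetical placeholders. */
double now       = get_time_seconds();        /* hypothetical clock */
double next_scan = predict_next_update();     /* when the panel will show it */

if (next_scan - now > WARP_COST) {            /* enough slack to warp? */
    pose_t pose = predict_pose_at(next_scan); /* head pose at display time */
    warp_frame(&framebuffer, pose);           /* cheap 2D re-projection */
}
present(&framebuffer);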
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Fredz wrote:
TheHolyChicken wrote:The higher the better (I'd love 120Hz+)
I'm fine with 90Hz; 120Hz or more pushes the horsepower required for rendering a bit too far for no evident benefit IMO.
I absolutely disagree - I couldn't disagree more. I use a 120Hz monitor at home, and the smoothness of motion is plainly evident against standard monitors. It's a much nicer experience, and I can only imagine that this benefit will be even more pronounced in VR. There's also no disadvantage to using a panel at 120Hz (or higher!); those who can't reach 120FPS are no worse off than before, but the experience ceiling is raised for others.
Fredz wrote:
TheHolyChicken wrote:I wasn't very clear in what I wrote; I just meant that g-sync is still the preferred option over vsync, regardless of the FPS we're maintaining. Vsync is never the preferred option.
That's probably true, but I'm still not sure. I have a feeling that a constant refresh rate would be better for comfortable motion perception, but I can't find any reference for that. I guess I'll have to see for myself when I get a variable refresh rate monitor.
People at the G-sync conference - shown a side-by-side comparison of the two - asserted that the G-sync panel was preferable to the regular one. Better perception of smooth motion was exactly what was being demonstrated at the conference: http://www.youtube.com/watch?v=y5iEuMtK1RU
Fredz wrote:
TheHolyChicken wrote:If the FPS is dropping low enough that the display would be uncomfortably flickery, it's probably feasible to just flash up the old image again. The blur is back, but you'd avoid the eyestrain from flicker.
It would be worse if the flashing was done by the display itself, since the second image wouldn't be at the correct position.
Oh I fully agree, definitely. It would be far from ideal - that was more of a "last resort" for cases where framerates dip so low that flicker becomes an issue. I would obviously far prefer that something better was displayed than having to show an old frame (e.g. a tweened frame). This problem is also not specific to any of the different techniques discussed here, though; even on a traditional vsync 60Hz rigid monitor, you will encounter the problem of "I need to show a frame now, but I've not been given a new one - what should I show on the screen?".
Fredz wrote:
TheHolyChicken wrote:and this is where g-sync comes in. It's still really important, because you want that frame to be presented immediately. Without g-sync, your brand new amazing frame still sits around for a couple of milliseconds until the panel decides it's time to switch. The vsync timeline/diagram still applies.
With V-sync you know the duration between refreshes so you can use a timer - or a hardware interrupt if the pre-warping can be done fast enough, like it was done for VGA DDC 3D glasses - and do the pre-warping just before the vertical retrace.

The only difference is that with G-sync it's done at a variable rate and automatically, while with V-sync it's done at a constant rate with some more work, but the frames would be correctly timed in both cases.
If headtracking error really is problematic due to a discrepancy between the estimated and the actual rendering time, you could always instead opt to enforce a preferred framerate in software. This would be set at a value that provides comfortable leeway to ensure the render is done on time, and then you can present the frame at precisely the moment you have been targeting with headtracking prediction. This is obviously similar to what you'd want to do on v-sync; the only difference is that on a g-sync monitor the penalty for missing your render target is almost negligible (while on vsync your FPS nosedives).

Furthermore, you can enforce ANY desired framerate on gsync. Perhaps the ideal safe framerate is 81; maybe it's 102; maybe it'll be 75 for your system. On v-sync you don't have that option, leading to wasted performance (this especially hurts if you're just shy of a breakpoint - e.g. render just shy of 60 FPS and v-sync drops you to 30 on a 60Hz panel, or to 40 on a 120Hz one).
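A rough sketch of the pacing loop I have in mind, in C (all of the render/present/tracker calls are hypothetical placeholders, not any real API):

/* Sketch: software-enforced frame rate on a G-sync display.
 * TARGET_FPS is whatever leaves comfortable render headroom
 * (75, 81, 102...). All helpers are hypothetical placeholders. */
const double budget = 1.0 / TARGET_FPS;
double next_present = get_time_seconds() + budget;

for (;;) {
    pose_t pose = predict_pose_at(next_present); /* predict for display time */
    render_scene(pose);                          /* must fit inside the budget */

    wait_until(next_present);  /* sleep/spin up to the exact target time */
    present_frame();           /* G-sync scans it out immediately */

    next_present += budget;    /* miss the target? the frame is only a
                                  little late, no snap to a fixed refresh */
}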
Sometimes I sits and thinks, and sometimes I just sits.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

TheHolyChicken wrote:I use a 120Hz monitor at home, and the smoothness of motion is plainly evident against standard monitors. It's a much nicer experience, and I can only imagine that this benefit will be even more pronounced in VR.
Yes, I don't dispute that 120Hz LCD monitors are a much better experience than current 60Hz LCD monitors. But I'm not convinced the difference between a 120Hz LCD monitor and a 90Hz low-persistence monitor would be that noticeable.

I've spent most of my gaming time on CRT monitors, and I never felt the urge to raise the frequency to 120Hz when it was possible (except for 3D), as opposed to 72Hz or 85Hz. The frequency in itself was probably not the main advantage; low input lag, low display lag and low persistence were probably what helped the experience the most.

I think the display in the Rift shares more characteristics with a CRT monitor than with a standard LCD monitor.
TheHolyChicken wrote:There's also no disadvantage to using a panel at 120Hz (or higher!); those who can't reach 120FPS are no worse off than before, but the experience ceiling is raised for others.
Not worse with G-sync, true, but much worse with V-sync.
TheHolyChicken wrote:People at the G-sync conference - shown a side-by-side comparison of the two - asserted that the G-sync panel was preferable to the regular one. Better perception of smooth motion was exactly what was being demonstrated at the conference
Sure, but they compared a V-sync monitor running at a non-constant frame rate against a G-sync monitor. When the V-sync monitor was displaying at the correct frame rate there was no difference ("now they should look identical, they're both rendering perfectly").

In the case of the Rift, when the frame rate is below the refresh rate it won't stutter like on standard LCD monitors: late de-warping will maintain smooth motion with V-sync, just like with G-sync. The only difference is that the content itself would be a bit older or newer (which hardly matters, since it isn't perfectly timed in either case), and it doesn't impact motion perception.
TheHolyChicken wrote:Oh I fully agree, definitely. It would be far from ideal - that was more of a "last resort" for cases where framerates dip so low that flicker becomes an issue. I would obviously far prefer that something better was displayed than having to show an old frame (e.g. a tweened frame).
But how would you calculate the tweening? You need two frames to interpolate and you only have one. And you need to know the current sensor position to orient the frame correctly, but the display doesn't have this information. You'd then be forced to have dedicated hardware, like Geekmaster proposed, that can read the sensor values directly, but this sounds a bit overkill to me.
TheHolyChicken wrote:even on a traditional vsync 60Hz rigid monitor, you will encounter the problem of "I need to show a frame now, but I've not been given a new one - what should I show on the screen?".
With a V-sync monitor you would display the previous frame, but you can still do the de-warping on each vertical retrace, so the content will still be presented at the correct orientation. You can't do that with G-sync AFAIK.
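In rough C, the loop I'm describing would look something like this (every helper is a hypothetical placeholder, nothing from a real driver):

/* Sketch: on a fixed-rate (V-sync) display, re-warp the last rendered
 * frame on every retrace so the orientation stays fresh even when the
 * renderer misses a frame. All helpers are hypothetical placeholders. */
frame_t *last_frame = NULL;

for (;;) {
    wait_until_before_vblank(WARP_TIME); /* wake just before the retrace */
    pose_t pose = read_tracker();        /* freshest sensor sample */

    frame_t *f = try_get_new_frame();    /* NULL if the renderer is late */
    if (f) last_frame = f;

    warp_and_present(last_frame, pose);  /* old content, fresh orientation */
}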

EDIT: I forgot to say, another aspect of G-sync that could be problematic (raised by some people on reddit) is non-constant brightness. If you flash pixels at varying time intervals the perceived brightness will not be constant. It remains to be seen whether it'll really be a problem, but if it is, I'm afraid it would be quite complex to correct in hardware or software.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

TheHolyChicken wrote:The "time warp" sounds like the team have come up with a way of "correcting" the frame using more recent headtracking data. [...] Once the frame has been generated, you then "correct" that frame (somehow[???] - I'd love a good explanation!) using the most recent headtracking data.
I think I know how they do it now - I can't believe I missed this part of the article when I read it last September...

From Measuring Input Latency:
RenderingPipeline wrote:We now have an OpenGL/WGL extension that can help us to get the timing right to query the user input (again) right before the vsync: wgl_delay_before_swap - sadly so far this is only supported by NVidia and only on Windows, I’d love to see a glx_delay_before_swap variant for Linux as well.
This extension was created on February 4, 2013, and unsurprisingly John Carmack is listed as a contributor. :P

Overview :
OpenGL wrote:For most interactive applications, the standard rendering loop responding to input events on a frame granularity is sufficient. Some more demanding applications may want to exchange performance for the ability to sample input closer to the final frame swap and adjust rendering accordingly. This extension adds functionality to allow the application to wait until a specified time before a swapbuffers command would be able to execute.
Too bad I'm on Linux though - I would have liked to give this a try... :/
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

Spoke too fast - GLX_NV_delay_before_swap is available :) :
http://www.opengl.org/registry/specs/NV ... e_swap.txt

Let's see if it's implemented in the latest NVIDIA driver...
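
Going by the spec, usage would look roughly like this - only glXDelayBeforeSwapNV, glXGetProcAddress and glXSwapBuffers are real calls here; the tracker and warp helpers are hypothetical placeholders:

#include <GL/glx.h>

/* Hypothetical placeholders, not a real API: */
typedef struct { float x, y, z, w; } pose_t;
extern void   render_scene(void);
extern pose_t read_tracker(void);
extern void   apply_late_warp(pose_t pose);

typedef Bool (*PFNGLXDELAYBEFORESWAPNVPROC)(Display *, GLXDrawable, GLfloat);

void draw_frame(Display *dpy, GLXDrawable drawable)
{
    static PFNGLXDELAYBEFORESWAPNVPROC glXDelayBeforeSwapNV;
    if (!glXDelayBeforeSwapNV)
        glXDelayBeforeSwapNV = (PFNGLXDELAYBEFORESWAPNVPROC)
            glXGetProcAddress((const GLubyte *)"glXDelayBeforeSwapNV");

    render_scene();  /* normal rendering, finished early */

    /* Block until ~2 ms before the swap could execute, then sample
     * the tracker as late as possible and do a cheap warp pass. */
    if (glXDelayBeforeSwapNV(dpy, drawable, 0.002f)) {
        pose_t pose = read_tracker();
        apply_late_warp(pose);
    }
    glXSwapBuffers(dpy, drawable);
}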
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

Not available in the latest driver from January (331.38), unfortunately. But I contacted one of the contributors from NVIDIA and he said it's been implemented internally and is in the release pipeline. Great! :)