AMD Demonstrates "FreeSync", Free G-Sync Alternative, at CES

Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

AMD Demonstrates "FreeSync", Free G-Sync Alternative, at CES

Post by Fredz »

cybereality
3D Angel Eyes (Moderator)
Posts: 11407
Joined: Sat Apr 12, 2008 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by cybereality »

Very interesting.

I think this technology will be a lot more popular if it is not tied to one graphics card vendor.
remosito
Binocular Vision CONFIRMED!
Posts: 251
Joined: Wed Apr 10, 2013 2:06 am

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by remosito »

That VESA extension is a feature mainly used in mobile displays to save battery power, so chances are a potential Rift panel might support it.


Just read an article, though, that mentioned something about triple buffering being needed; I don't know why yet. That would not gel well with VR.

http://techreport.com/news/25867/amd-co ... -sync-tech
In Koduri's assessment, it's possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and triple buffering.
Koduri is the AMD guy doing the FreeSync demo at CES. Not entirely sure if triple buffering applies only to Nvidia's G-Sync or to AMD's FreeSync as well.

Another very interesting quote:
Koduri's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit
Now if G-Sync really relies on triple buffering, then it's pretty much dead in the water for VR HMD purposes.

Would be crazy if AMD just magicked a bunny out of a hat that uses existing standards, is free and doesn't need triple buffering. Together with Mantle's potential for a one-GPU-per-eye Crossfire mode, that would seriously tip the balance for Rifters.

Then again, triple buffering could apply only to the FreeSync implementation and not G-Sync, which would actually make G-Sync worth it.
Starcitizen - Elite:Dangerous - Xing - Gallery: Six Elements - Among the sleep - Theme Park Studio - The Stomping Land - Son of Nor - Obduction - NOWHERE - Kingdom Come : Deliverance - Home Sick - prioVR
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

Now that it seems confirmed that Oculus VR is going for a low-persistence display, I'm not sure these technologies will be of any interest for VR, since they would produce flicker at lower frequencies.

Still interesting for standard gaming though... till next year, when standard gaming will mean VR. :P
cegli
One Eyed Hopeful
Posts: 36
Joined: Thu May 16, 2013 5:35 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by cegli »

Fredz wrote:Now that it seems confirmed that Oculus VR is going for a low-persistence display I'm not sure these technologies will have any interest for VR since it would produce flicker at lower frequencies.

Still interesting for standard gaming though... till next year, when standard gaming will mean VR. :P
Hmm... but wouldn't it be technically possible to still use something like this at a low frame rate? It would have to be some kind of state machine that detects that the frame rate has dropped below "X" FPS and pulses each frame twice.

For example:

If FPS >= 75, flicker once per frame. If FPS < 75, flicker twice per frame (assuming the panel can flash at 150Hz). I wouldn't be surprised if we could have a smooth VR experience at 40 FPS if the display was synced to the GPU! You would just end up with some motion blur as a penalty for the low FPS. Maybe in 2016...
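cegli's state machine above can be sketched in a few lines (a hypothetical illustration; the 75/150Hz figures come from the post, the function name is made up):

```python
def pulses_per_frame(fps, threshold=75.0, max_strobe_hz=150.0):
    """Decide how many times to flash each frame so the effective
    strobe rate stays at or above the flicker threshold."""
    if fps >= threshold:
        return 1  # one flash per frame is already fast enough
    # Flash each frame multiple times, but never faster than the
    # panel's maximum strobe rate.
    pulses = 2
    while fps * pulses < threshold and fps * (pulses + 1) <= max_strobe_hz:
        pulses += 1
    return pulses

# 80 FPS: single flash; 40 FPS: two flashes (80Hz effective strobe)
print(pulses_per_frame(80.0), pulses_per_frame(40.0))  # 1 2
```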
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

Ah yes, that seems possible. I hadn't thought about that; it could be a good compromise.

Since they're flashing the OLED, I guess they may have designed the controller board themselves, so I suppose they could flash it at any frequency; OLEDs can theoretically be driven very fast.
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by zalo »

Palmer mentioned that they can theoretically be driven at kHz rates.

I wonder if there are any uses for those speeds...

Resolution wouldn't matter as much, so they could run it ten times as fast as now, but at a tenth of the resolution...

Maybe the brain responds to subliminal messages at speeds like that?
nateight
Sharp Eyed Eagle!
Posts: 404
Joined: Wed Feb 27, 2013 10:33 pm
Location: Youngstown, OH

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by nateight »

cegli wrote:below "X-FPS" ... pulses each frame [multiple times]
Fredz wrote:they may have designed the controller board themselves, so I suppose they could flash it at any frequency
zalo wrote:Palmer mentioned that they can be theoretically driven at KHz.
Shameless plug of the day - Read my witty comments on Reddit, in which I argue with the ignorant, over things that don't matter, for reasons I never fully understood!
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

G-Sync is supposed to have improved lightboost tech built-in:
http://www.blurbusters.com/confirmed-nv ... t-upgrade/

Does this "G-Sync Alternative" have such a feature?
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Fredz wrote:Now that it seems confirmed that Oculus VR is going for a low-persistence display I'm not sure these technologies will have any interest for VR since it would produce flicker at lower frequencies.

Still interesting for standard gaming though... till next year, when standard gaming will mean VR. :P
The benefit is still fantastic. Yes, at low framerates you will get flickering, but what about at 58 FPS? I'd much rather have true 58 FPS synced with the panel, instead of the framerate tanking down to 30FPS.

Previously, Palmer had said that G-sync would not be appearing in the first consumer Rift. I wonder if that might change (or perhaps FreeSync?), given the CES news that:
"Palmer Luckey says that the company's success has opened new doors. Manufacturers, he says, have started looking at the Rift as more than an untested product, which means they're willing to work with Oculus on displays that aren't just repurposed phone and tablet parts."
(from http://www.theverge.com/2014/1/7/528577 ... -prototype )
Sometimes I sits and thinks, and sometimes I just sits.
nateight
Sharp Eyed Eagle!
Posts: 404
Joined: Wed Feb 27, 2013 10:33 pm
Location: Youngstown, OH

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by nateight »

TheHolyChicken wrote:I'd much rather have true 58 FPS synced with the panel, instead of the framerate tanking down to 30FPS.
Hold on, back up. I'm anything but a rendering chain guy, so I still don't quite grok all this G-Sync business, and I want to be sure I didn't just infer a bunch of stuff that doesn't line up with reality. With an OLED panel, a custom control box, and some very fancy software tricks, what I'm hearing in this thread is that you can have an effective 1000 FPS, locked, regardless of how wimpy your CPU and GPU are (see the EDIT below before you get too excited, though). In layman's terms:

A new frame arrives at the control box. Hooray! This is immediately pushed to the panel, and (because this is an OLED panel with no backlight) immediately switched back off. The kilohertz strobe buffer also receives this frame, and begins cycling this latest frame every 0.001 seconds, simulating a LightBoost effect. The rendering chain is already hard at work producing the next frame, and if the input thread detects that there has been a change in viewing direction, that time warping stuff I don't understand kicks in and alters the end result in a way that avoids a perceptible drop in responsiveness at the expense of possibly including some minor visual artifacts. Whether the warping happens in software or in another dedicated chip in the control box, the resulting frame is served up, gets displayed, and replaces the old frame in the strobe buffer, with the new one now flashing away.

It's probably just because I don't understand what G-Sync actually is or how it works, but this dim picture I have is something more like a Rift-specific competitor to G-Sync, or at the very least something that will benefit from only the GPU-side rendering tricks G-Sync employs. It's still about the highest possible FPS being generated by the computer equating with the best possible performance, but it isn't about syncing the monitor with the GPU, it's about the monitor insisting on always and continually operating at precisely 1000Hz and changing what's being strobed as fast as the frames reach the strobe buffer. 30 FPS from the PC is still going to look unsatisfying (and there might even be some stuff "warping" around), but it won't "feel" uncomfortable because the display is chugging along at 1000Hz no matter what.

Help me out here: How much of that is flawed, misunderstood, magical thinking and/or just plain not how this stuff works or ever could work? :geek:


EDIT: This video suggests the above is just not how this stuff works or ever could. The problem is apparently less about strobing something, and more about strobing the correct thing. If a frame is late, attempting to replace it with the last good frame may actually be a terrible idea, because this frame is rapidly diverging from what your brain expects to be seeing during head motion - you'd wind up with blur at best, and employing your barf bucket at worst.

It's possible that there is some happy medium here, whereby the Rift insists on strobing at a locked 120Hz whether it has to use the just-delivered frame, the last real frame, or the last time warped frame, but it now sounds like strobing at a rate beyond what the PC can actually deliver frames at is not recommended. I'm still not sure, though. :cry:
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

You're almost there, nateight.

The strobing isn't what's important - it's a means to an end. What IS important is displaying the correct image at the correct time, and ONLY the correct image, on the panel. Because we can't have infinite FPS and infinite Hz on the panel, the consequence is that we want to turn the panel off unless we've got our fresh (correct) image to display. In practical terms, that means strobing.
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

What IS important is displaying the correct image at the correct time, and ONLY the correct image, on the panel.
And this is where G-Sync (or similar) comes in. It helps get us the correct image at the correct time. To avoid tearing, we want to have v-sync on. This means the panel will update the image at (usually) 60Hz, which is once every ~16ms:


|---------------|---------------|---------------|---------------|---------------
0ms............16ms............32ms................

However, the computer doesn't produce frames at exactly 16ms intervals. Let's say our computer manages to produce each frame in approximately 13ms (75 FPS). This is ideal, surely? We're getting a "smooth" 60 FPS at all times on the monitor, right? Well, maybe not.....


x = frame ready
x------------x--|------x--------|--x------------|x------------x-|--------x------
  • 1st x = Exact match. Hurrah! This frame was presented at precisely the time we wanted it to be.
  • 2nd x = This frame was finished 3ms before the panel was ready for it. When the panel updates the picture, this means the frame is actually 3ms old. It's what the game world looked like 3ms in the past!
  • 3rd x = This frame was ready 9ms before the panel. When the panel updates the picture, this frame will be 9ms old!
  • 4th x = This frame was ready 13ms before the panel. When we get our "brand new" frame on the screen, it's already a massive 13ms out of date!
  • 5th and 6th x = Yuck. The 5th frame is actually just discarded: we never see it! This is completely wasted processing time and effort. Instead, we see the 6th frame, which is 2ms "old" when we get to see it.
  • 7th x = This frame is 7ms old when we see it.
This was always an issue on monitors (albeit not massively upsetting), but for VR its effects are worse. For one, everything we see has an indeterminate (and variable!) amount of latency added to it. Furthermore, things are rarely presented precisely when/where they should be. We are learning that the vestibulo-ocular reflex is extremely precise, and our eyes are amazing at tracking objects as we move/turn our heads (if we turn our heads, our eyes counter-rotate by precisely the same angle). The issue for VR is that, using the above traditional rendering pipeline, objects are almost never exactly where they should be! I believe this is one of the factors contributing to "blur" when we move our heads.

G-sync eliminates ALL of these problems by presenting the frames to us as soon as they're ready. Frames are never "old"; you see the new frame immediately, and so it is always at the correct time. There is no wasted processing power, because frames are never discarded. You get lower effective latency, smoother visuals, and better effective performance. I want!
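The timeline above can be checked numerically. A hypothetical sketch (names made up) that computes how stale each frame is when a fixed 16ms v-sync grid displays it; with variable refresh the age would always be ~0:

```python
def vsync_age_ms(ready_times, interval=16.0):
    """For each frame-ready time (ms), return its age (ms) at the
    next fixed panel update. With variable refresh (G-Sync style)
    the panel updates immediately, so the age is always ~0."""
    ages = []
    for t in ready_times:
        # next panel update is at the following multiple of the interval
        next_update = ((t // interval) + 1) * interval
        ages.append(next_update - t)
    return ages

# frames finishing between fixed 16ms panel updates:
print(vsync_age_ms([13.0, 25.0, 39.0]))  # [3.0, 7.0, 9.0]
```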
nateight
Sharp Eyed Eagle!
Posts: 404
Joined: Wed Feb 27, 2013 10:33 pm
Location: Youngstown, OH

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by nateight »

TheHolyChicken wrote:x------------x--|------x--------|--x------------|x------------x-|--------x------
Ah, okay! This makes it very clear what G-Sync is and why John Carmack has been saying it's long overdue. The final piece of the puzzle here is that proprietary G-Sync technology needs to be present not just in the user's GPU and drivers, but in the user's monitor as well, which needs specialized hardware to enable this effect. That's why this variable-refresh VBLANK "FreeSync" thing is a big deal: many (but surely not all) displays already have the capability to do this, no Nvidia licensing or "upgrade kits" required. I think I'm also finally starting to understand all this vestibulo-ocular-reflex-related blurring stuff that struck me from my first day with the Rift; e.g., "strobing" is definitely the wrong nomenclature to be using when we can say "low persistence" instead. Brains are weird, but it's always thrilling when we use them to understand them better. It's very encouraging to see that Oculus is on top of all this and trying hard to track down (or create?) a panel that can take advantage of this tech.

Thanks, THC!
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Glad I could help. If I got any of that wrong, anybody, please correct me. I try to understand all this best I can, but I'm far from infallible.
So Oculus..... any chance this is in the pipes? This is your cue to whisper some nod of confirmation :D
Sometimes I sits and thinks, and sometimes I just sits.
remosito
Binocular Vision CONFIRMED!
Posts: 251
Joined: Wed Apr 10, 2013 2:06 am

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by remosito »

freesync will have more input lag than using v-sync alone because VBLANK under CVT rules requires the interval to be set before the frame is drawn thus adding at least one more frame of latency
Can anybody confirm this? Yet another frame buffered on top of v-sync, which is already double-buffered?
lossofmercy
One Eyed Hopeful
Posts: 45
Joined: Thu Oct 18, 2012 2:25 am

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by lossofmercy »

nateight wrote:many (but surely not all) displays already have the capability to do this, no nVidia licensing or "upgrade kits" required.
Not quite. We don't know of any desktop monitors that support this specification, only some mobile devices like the tablets they were running their demo on. What AMD is saying is: "Hey, this already exists in the specs, why don't you guys start making software/hardware that supports it?"

I hope there is a huge drive for this, especially on the higher-end displays where it's going to matter. I hate tearing, and I hate v-sync only a little less. I think it might be moderately important for the Rift, but it's more important when you're dipping down to sub-40 FPS on a regular monitor. That number might change in the Rift; we can't say.
lossofmercy
One Eyed Hopeful
Posts: 45
Joined: Thu Oct 18, 2012 2:25 am

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by lossofmercy »

http://techreport.com/news/25878/nvidia ... -sync-demo
However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

TheHolyChicken wrote:The benefit is still fantastic. Yes, at low framerates you will get flickering, but what about at 58 FPS? I'd much rather have true 58 FPS synced with the panel, instead of the framerate tanking down to 30FPS.
I don't think I'd like to see flicker at 58Hz or 60Hz on an HMD.

That's basically the same problem as with CRTs back in the day: 60Hz was unbearable for most people. I found 72Hz acceptable, but some people can't bear anything lower than 85Hz, and I guess it's even worse with a wide FOV.

Brendan Iribe said they were shooting for 90Hz in his last presentation.
TheHolyChicken wrote:G-sync eliminates ALL of these problems by presenting the frames to us as soon as they're ready. Frames are never "old"; you see the new frame immediately, and so it is always at the correct time. There is no wasted processing power, because frames are never discarded. You get lower effective latency, smoother visuals, and better effective performance. I want!
I don't think that's how it would work with the current implementation of the SDK.

G-sync presents the frame as soon as it's ready but the sensor orientation is read before the start of the rendering, so when the frame is displayed its position is still incorrect since the head has moved.

Considering a 100°/s constant head rotation (corresponding to a smooth pursuit eye movement):
- without G-sync at 60Hz (16.7ms frame): 1.67° error
- with G-sync at 70Hz (14.3ms frame): 1.43° error
- with G-sync at 58Hz (17.2ms frame): 1.72° error
- with G-sync at 50Hz (20ms frame): 2° error
- with G-sync at 40Hz (25ms frame): 2.5° error
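These figures are just rotation rate times frame time; a quick check (hypothetical helper name):

```python
def tracking_error_deg(head_rate_dps, frame_ms):
    """Angular error accumulated over one frame during a constant
    head rotation: degrees/second x seconds."""
    return head_rate_dps * frame_ms / 1000.0

# reproduce the list above for a 100 deg/s rotation:
for hz in (60, 70, 58, 50, 40):
    frame_ms = 1000.0 / hz
    print(f"{hz}Hz ({frame_ms:.1f}ms frame): "
          f"{tracking_error_deg(100.0, frame_ms):.2f} deg error")
```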

With the new SDK implementation, which probably uses the "time warp" technique from John Carmack, I think it would be different. The sensor orientation would be read when the frame is ready, and the rendering would be de-warped to account for this new orientation.

In the previous presentation it's said to take 2ms, so this would give the same result with or without G-sync.
Last edited by Fredz on Thu Jan 09, 2014 2:52 pm, edited 2 times in total.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by MSat »

I think most people would notice the flicker at 75Hz or less.

I'm also not sure how effectively you can utilize dynamic refresh rates when you're using motion prediction algorithms for head tracking since it would require knowledge of how long the scene is going to take to render (and how would you know that ahead of time?).
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Fredz wrote:
TheHolyChicken wrote:The benefit is still fantastic. Yes, at low framerates you will get flickering, but what about at 58 FPS? I'd much rather have true 58 FPS synced with the panel, instead of the framerate tanking down to 30FPS.
I don't think I'd like to see flicker at 58Hz or 60Hz on an HMD.

That's basically the same problem than with CRTs at the times, 60Hz was unbearable for most people. I found 72Hz to be acceptable, but some people can't bear anything lower than 85Hz and I guess it's even worse with a wide FOV.

Brendan Iribe said they were shooting for 90Hz in his last presentation.
You're right that flashing the screen at <60Hz (or maybe even <70 or <80) would almost certainly be uncomfortable, due to flicker, on a low-persistence display. The higher the better (I'd love 120Hz+). I wasn't very clear in what I wrote; I just meant that g-sync is still the preferred option over vsync, regardless of the FPS we're maintaining. Vsync is never the preferred option. If the FPS is dropping low enough that the display would be uncomfortably flickery, it's probably feasible to just flash up the old image again. The blur is back, but you'd avoid the eyestrain from flicker.
Fredz wrote:
TheHolyChicken wrote:G-sync eliminates ALL of these problems by presenting the frames to us as soon as they're ready. Frames are never "old"; you see the new frame immediately, and so it is always at the correct time. There is no wasted processing power, because frames are never discarded. You get lower effective latency, smoother visuals, and better effective performance. I want!
I don't think that's how it would work with the current implementation of the SDK.

G-sync presents the frame as soon as it's ready but the sensor orientation is read before the start of the rendering, so when the frame is displayed its position is still incorrect since the head has moved.
I didn't want to overcomplicate the g-sync explanation, but you're correct; even if you're using g-sync, the FRAME is brand new when it's displayed on the panel, but the headtracking data used to generate the frame is not. Assuming it's taking approx 16ms to render the frame, even with g-sync it means that the frame was using headtracking data from 16ms ago. This is where prediction comes in handy, of course, and the rumoured "time warp".
Fredz wrote:With the new SDK implementation which probably uses the "time warping" technique from John Carmack I think it would be different. The sensor orientation would be read when the frame is ready and the rendering would be de-warped to account for this new orientation.

In the previous presentation it's said to take 2ms, so this would give the same result with or without G-sync.
Well this is where things get really magic.

The "time warp" sounds like the team have come up with a way of "correcting" the frame using more recent headtracking data. Assuming it takes ~16ms to render the frame, this means you use 16ms prediction on the headtracking data used to generate that frame. Once the frame has been generated, you then "correct" that frame (somehow[???] - I'd love a good explanation!) using the most recent headtracking data. The idea being that the up-to-date headtracking data will be a little more accurate than your old headtracking data + prediction. This frame - using brand new headtracking data - is then sent to the panel....

.... and this is where g-sync comes in. It's still really important, because you want that frame to be presented immediately. Without g-sync, your brand new amazing frame still sits around for a couple of milliseconds until the panel decides it's time to switch. The vsync timeline/diagram still applies.
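Nobody here knows exactly how the warp is implemented, but the simplest 1-D version of the idea is a re-projection by the orientation change accumulated during rendering (all names hypothetical, not the actual SDK):

```python
def timewarp_shift_px(yaw_at_render_deg, yaw_now_deg, px_per_degree):
    """Shift the finished frame horizontally to compensate for the
    head rotation that happened while it was being rendered. The
    sign convention (positive = shift image one way) is arbitrary
    here; a real implementation warps in 3D, not a flat shift."""
    delta = yaw_now_deg - yaw_at_render_deg
    return delta * px_per_degree

# head turned 1.5 deg during a ~16ms render, 20 px per degree of FOV:
print(timewarp_shift_px(0.0, 1.5, 20.0))  # 30.0
```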
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

MSat wrote:I'm also not sure how effectively you can utilize dynamic refresh rates when you're using motion prediction algorithms for head tracking since it would require knowledge of how long the scene is going to take to render (and how would you know that ahead of time?).
The time it will take to render the scene is still an unknown if you're using vsync. Vsync doesn't help at all:

Let's imagine it's 16ms until the next panel update. You could use 16ms prediction, render the scene, and then, theoretically, you nailed it. Perfect!
....but what happens if the frame takes 17ms to render instead of 16ms? Oh crap - we've missed our deadline! What do we do now? If we use this frame for the next panel update, it will be 15ms old!

We should render a new frame. But... what prediction value do we use? Do we gamble, use 15ms prediction, and hope that we can render a replacement, better frame in just 15ms? What happens if we miss it again? (Likely.) Maybe we should aim for the sync after that, using 31ms prediction? Eww, that's not ideal either: if we're actually early, this frame's prediction will be all wrong. This is a mess.

Instead of all that, you just use a prediction value based on recent frames. If the recent frames were approx 10ms, then you just use prediction of 10ms; if the recent frames were approx 16ms, then you just use prediction of 16ms. Prediction is never going to be exact, you're just trying to mitigate the error in headtracking data. We then immediately present our new frame thanks to G-sync.
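The "use a prediction value based on recent frames" rule above can be sketched as a running average of render times (a hypothetical sketch, not Oculus code):

```python
from collections import deque

class RenderTimePredictor:
    """Predict the next frame's render time as the average of the
    last few frames, and use that as the head-tracking lookahead."""
    def __init__(self, window=8):
        self.recent = deque(maxlen=window)  # drops oldest samples

    def record(self, frame_ms):
        self.recent.append(frame_ms)

    def prediction_ms(self, default=16.0):
        # before any samples arrive, fall back to a nominal frame time
        if not self.recent:
            return default
        return sum(self.recent) / len(self.recent)

p = RenderTimePredictor()
for t in (10.0, 10.0, 10.0):
    p.record(t)
print(p.prediction_ms())  # 10.0
```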
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

In a recent video, Iribe described the time warp procedure, which sounded a lot like what we previously discussed in the "PTZ Tweening" thread. The idea is to morph the framebuffer so that it more closely matches the current head orientation and position, perhaps at a higher framerate than VR rendering, and certainly at a more accurate position for the time the display is actually visible.

In my experience with deshaking video (using the VirtualDub Deshaker plug-in), having frames visible too long (such as with an LCD display that does not strobe the backlight) could be perceptually okay *IF* those long frames were CENTERED on where the brain expects them, instead of starting at the right position and being held too long. This would require motion prediction, to display a frame at a position corresponding to the CENTER of its perceived path while the frame remains visible and the eye moves to track an object. Yes, there would still be motion blur, but I believe it would not be a source of queasiness if the center of the blurred frame was where the eye expected that visible content to be at that time.

IMHO, we need to expect eye counter-rotation, and take frame hold (visibility) time into account, and create video frames that will be current at the center of the path a visible frame takes as it slides across the retina.

With moving onscreen objects that our eyes may track, we could go further and take video motion vectors into account (like DeShaker uses, or as used in MPEG compression), and use that data while morphing the framebuffer, to compensate for eyes following any arbitrary moving object instead of just the moving screen as a whole. We will be able to optimize that method later, with the addition of eye tracking data.
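geekmaster's "center the frame on its perceived path" idea boils down to predicting the pose at the midpoint of the frame's visibility window rather than at its start (hypothetical sketch, 1-D yaw only):

```python
def predicted_yaw_deg(yaw_now, yaw_rate_dps, hold_ms):
    """Predict head yaw at the CENTER of the interval the frame
    stays visible, so the motion blur is symmetric about the
    expected position instead of lagging entirely behind it."""
    return yaw_now + yaw_rate_dps * (hold_ms / 2.0) / 1000.0

# 100 deg/s rotation, frame held visible for 16ms: aim 0.8 deg ahead
print(predicted_yaw_deg(0.0, 100.0, 16.0))  # 0.8
```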
Last edited by geekmaster on Fri Jan 10, 2014 8:31 am, edited 1 time in total.
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Carmack had tweeted:
John Carmack wrote:"Going from 30 to 20ms motion-to-photons latency is a clear win, but going down to 8 ms only really shows when "rattling" the HMD."
I asked him:
"I'm guessing that to attain 8ms you must be using some form of gsync-type tech? Or are you brute forcing with huge panel hz?"


His reply:
John Carmack wrote:"pacing the raster, updating just ahead of it"
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

John Carmack wrote:"pacing the raster, updating just ahead of it"
Michael Abrash called that "racing the beam", where "beam" is an archaic term from the ancient CRT days, when the beam was an electron beam that excited the screen phosphors. John's terminology may be more appropriate for the modern era.
;)

A simple way to think of it is that you create content for each scan line (or perhaps each pixel on a scan line), that will be where your brain expects it to be during the time the display hardware makes it visible. The beam/raster thing has to do with the fact that some displays (all CRTs and some LCD panels) update the display continuously as the data arrives.

Unless you can display ONLY the current raster scan position (a scan line or slice of scan lines), similar to how a CRT works (which COULD be done with OLED that used appropriate driver hardware), you need to predict where all the visible pixels WILL be when backlight strobing is used (or if all OLED pixels are visible at the same time).
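"Racing the beam" implies every scanline has its own display time. A minimal sketch under stated assumptions (uniform top-to-bottom scan-out, blanking intervals ignored; names made up):

```python
def scanline_time_ms(line, total_lines, refresh_hz):
    """Time after the start of a refresh at which a given scanline
    is scanned out, assuming uniform top-to-bottom scan-out and
    ignoring blanking intervals. To 'race the beam', content for
    this line must be ready just before this moment."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * line / total_lines

# On a 60Hz, 1080-line panel the middle line is scanned ~8.3ms in:
print(round(scanline_time_ms(540, 1080, 60), 1))  # 8.3
```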
User avatar
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

I still don't quite get what he means by "pacing the raster". Pacing? I assume he means that we go through the transforms, and the clipping, and then apply the newest headtracking data at the last possible moment before rasterization. I wish I understood this better....

It looks like they have a panel capable of at least 120Hz then, otherwise they couldn't reach as low as 8ms. That's pretty damn exciting.
Sometimes I sits and thinks, and sometimes I just sits.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by MSat »

TheHolyChicken wrote:
MSat wrote:I'm also not sure how effectively you can utilize dynamic refresh rates when you're using motion prediction algorithms for head tracking since it would require knowledge of how long the scene is going to take to render (and how would you know that ahead of time?).
The time it will take to render the scene is still an unknown if you're using vsync. Vsync doesn't help at all....

But what you do know with vsync (assuming you can manage to stay locked) is when those pixels on the display are going to start lighting up - and that's what you base your head tracking prediction target on (and therefore camera transforms).
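That point can be sketched in a few lines (a hypothetical helper, not SDK code; the refresh rate and scanout delay are assumed example values): with vsync locked, the instant the next frame's photons appear is known in advance, so head-tracking prediction can target exactly that instant.

```python
# Predict head yaw at the moment the next frame's pixels light up,
# using simple linear extrapolation from the current angular rate.

REFRESH_S = 1.0 / 60.0       # locked 60 Hz panel (assumed)
SCANOUT_DELAY_S = 0.005      # assumed delay from vsync to photons

def predicted_yaw(yaw_deg, yaw_rate_dps, now_s, last_vsync_s):
    next_vsync = last_vsync_s + REFRESH_S
    photon_time = next_vsync + SCANOUT_DELAY_S   # known thanks to vsync lock
    return yaw_deg + yaw_rate_dps * (photon_time - now_s)
```

With a variable refresh rate, `photon_time` is no longer known ahead of rendering, which is exactly the tension being discussed in this thread.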
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by MSat »

TheHolyChicken wrote:Carmack had tweeted:
John Carmack wrote:"Going from 30 to 20ms motion-to-photons latency is a clear win, but going down to 8 ms only really shows when "rattling" the HMD."
I asked him:
"I'm guessing that to attain 8ms you must be using some form of gsync-type tech? Or are you brute forcing with huge panel hz?"


His reply:
John Carmack wrote:"pacing the raster, updating just ahead of it"
It's excellent news that an 8ms latency is really only beneficial under extreme conditions! That means a sufficiently good experience can be had well below that 1000Hz FPS "holy grail".
User avatar
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

MSat wrote:
TheHolyChicken wrote:
MSat wrote:I'm also not sure how effectively you can utilize dynamic refresh rates when you're using motion prediction algorithms for head tracking since it would require knowledge of how long the scene is going to take to render (and how would you know that ahead of time?).
The time it will take to render the scene is still an unknown if you're using vsync. Vsync doesn't help at all....
But what you do know with vsync (assuming you can manage to stay locked) is when those pixels on the display are going to start lighting up - and that's what you base your head tracking prediction target on (and therefore camera transforms).
You're right - if you lock your game's timesteps to the panel refresh rate, and you can prepare the frame in time, then you could take advantage of the fact you'll know exactly when the pixels will light up. If you fail to prepare the frame on time, though, v-sync punishes you.

I still believe the g-sync route would be preferable. You sacrifice a tiny amount of headtracking accuracy to avoid vsync's unavoidable compromise of wasted performance headroom versus massive FPS drops. Those FPS drops also cause headtracking inaccuracy, and they additionally cause problems with a low-persistence display: if your FPS drops too low, it has to start showing you old frames to avoid eye-strain-inducing flicker. An old frame is effectively a new frame with a large tracking error.
Sometimes I sits and thinks, and sometimes I just sits.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

TheHolyChicken wrote:... if your FPS drops too low, they're going to have to start giving you old frames in order to avoid eye-strain-inducing flicker. That old frame is sort of equivalent to a new frame with large tracking inaccuracy.
Brendan Iribe mentioned in a video from a few weeks back that they warp the frame buffer to match the head position (much like what I described in the "PTZ Tweening" thread). Essentially, it is warped to simulate looking at game content on a virtual movie screen in front of you in VR space, where your last-moment head movements update the position of the screen (framebuffer) in front of you. This goes a LONG way toward avoiding showing late frames that cause motion sickness, instead causing the frame to just get some warp distortion while actually being where your brain expects it to be...

In the PTZ Tweening thread, I mentioned simulating a higher frame rate than the content update rate. A side effect of a higher frame rate is better low-latency (virtual) head tracking. That works even if you do (or simulate) black frame insertion, using a lower frame rate with better control (or prediction) of WHEN the content becomes visible, allowing you to prepare the best you can, then do warp corrections to the 2D projected content at the last moment... FreeSync and G-Sync can certainly help with this prediction and frame warp correction.
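A minimal PTZ-style tween in the spirit of what's described above (pan only, and the pixels-per-degree figure is a made-up example — the real value depends on panel width and FOV): when the new frame is late, re-present the old framebuffer shifted to match how far the head has turned since that frame was rendered.

```python
# Re-present an old frame, panned horizontally by the yaw delta
# accumulated since it was rendered. Edge pixels are clamped,
# which shows up as slight smearing at the borders.

PIXELS_PER_DEGREE = 12.0   # assumed panel/FOV scaling

def tween_pan(frame, yaw_delta_deg):
    shift = int(round(yaw_delta_deg * PIXELS_PER_DEGREE))
    w = len(frame[0])
    return [[row[min(max(x + shift, 0), w - 1)] for x in range(w)]
            for row in frame]
```

A fuller version would also handle tilt and zoom (the T and Z of PTZ), but the point stands: the substitute frame is cheap to produce compared to a full re-render.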
Last edited by geekmaster on Fri Jan 10, 2014 11:53 am, edited 1 time in total.
User avatar
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by TheHolyChicken »

Hmm, I forgot about that in my previous response. The implications are pretty interesting, if it's possible to simulate a higher framerate. I need to go re-read that tweening thread.
Sometimes I sits and thinks, and sometimes I just sits.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by MSat »

TheHolyChicken wrote: I still believe the g-sync route would be preferable. You sacrifice a tiny amount of headtracking accuracy to avoid vsync's unavoidable compromise of redundant performance vs massive FPS drops. Those FPS drops also result in headtracking inaccuracy, but additionally cause problems with the low-persistence display; if your FPS drops too low, they're going to have to start giving you old frames in order to avoid eye-strain-inducing flicker. That old frame is sort of equivalent to a new frame with large tracking inaccuracy.
I suppose there's no reason you can't arbitrarily select your own upper FPS limit to lock to (for the sake of motion prediction) but take advantage of g-sync whenever the engine fails to reach its target.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

The whole point of tweening/warping is to provide a substitute frame, warped from old content, when new content does not arrive in time. A benefit is that you could render richer content that takes more resources, occasionally (or often) dropping frames (which are then substituted with warped previous framebuffer contents).

I called it "tweening" under the assumption that you could drop every other frame, or even two out of three frames, and still display the framebuffer contents where your brain expects them to be while your head is moving. "Tweening" means creating intermediate frames between key frames, where in this case the key frames are all frames that arrive in time to be used.

We could even allow all frames to arrive late, and "remove" latency by warping the framebuffer every time to compensate for FreeSync or G-Sync delays. It sounds like that is the direction OculusVR is heading, based on Iribe's talk I mentioned previously...
bobv5
Certif-Eyed!
Posts: 529
Joined: Tue Jan 19, 2010 6:38 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by bobv5 »

Fredz wrote:Brendan Iribe said they were shooting for 90Hz in his last presentation.
geekmaster wrote:In a recent video, Iribe described the time warp procedure, which sounded a lot like what we previously discussed in the "PTZ Tweening" thread. The idea is to morph the framebuffer so that it more closely matches the current head orientation and position, perhaps at a higher framerate than VR rendering, and certainly at a more accurate position for the time the display is actually visible.
Anyone have links to these videos?
"If you have a diabolical mind, the first thing that probably came to mind is that it will make an excellent trap: how do you get off a functional omni-directional treadmill?"
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

bobv5 wrote:
Fredz wrote:Brendan Iribe said they were shooting for 90Hz in his last presentation.
geekmaster wrote:In a recent video, Iribe described the time warp procedure, which sounded a lot like what we previously discussed in the "PTZ Tweening" thread. The idea is to morph the framebuffer so that it more closely matches the current head orientation and position, perhaps at a higher framerate than VR rendering, and certainly at a more accurate position for the time the display is actually visible.
Anyone have links to these videos?
time warp (tweening):
http://www.youtube.com/watch?v=tL6hVtEkV9Q&t=10m18s

As you can see, he describes the decoupling of a fast display update from a slower VR environment render, just like I suggested in my thread. But like mounting LEDs on the latest Rift prototype (following Patim Patam's prototype), the render/display update uncoupling I was pushing seems to have been harvested from the forums and made their own, without due credit as far as I can see. Well, he does say "leave it again to game developers to do some tricks" -- he just doesn't say who those "game developers" are...

And their derivative extensions to our openly shared ideas are not being given back to the community, so far, from what I can see. But who knows, maybe they WILL share their "time warp" source code in a future SDK release. I hope so...

EDIT: Brendan also talks about the 90Hz frame updates required, immediately following the "time warp" comments in that video.
Last edited by geekmaster on Fri Jan 10, 2014 2:29 pm, edited 1 time in total.
bobv5
Certif-Eyed!
Posts: 529
Joined: Tue Jan 19, 2010 6:38 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by bobv5 »

Thanks.

I can't comment on Oculus use of tweening, but I was using IR LEDs on my old HMD years ago. Not a new idea.
"If you have a diabolical mind, the first thing that probably came to mind is that it will make an excellent trap: how do you get off a functional omni-directional treadmill?"
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

bobv5 wrote:Thanks.

I can't comment on Oculus use of tweening, but I was using IR LED's on my old HMD years ago. Not a new idea.
Johnny Lee's infamous video used IR LEDs too:
http://www.youtube.com/watch?v=Jd3-eiid-Uw

But what Patim Patam did is spread LEDs over a Rift, and behind it (such as on the headstrap buckle in the Crystal Cove prototype). The point is, comparing PHOTOS of both devices (built-in and prior clip-on) shows there was prior art in these forums.

[image: comparison photos of both devices]

However, I have not seen my "PTZ Tweening" ideas described ANYWHERE but in my thread, and later, in Brendan Iribe's video linked above, and a recent brief tweet from John Carmack about "time warp".

http://www.youtube.com/watch?v=FPzhYhuQXZY
User avatar
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

TheHolyChicken wrote:The higher the better (I'd love 120Hz+)
I'm fine with 90Hz; 120Hz or more pushes the rendering horsepower required a bit too far, for no evident benefit IMO.
TheHolyChicken wrote:I wasn't very clear in what I wrote; I just meant that g-sync is still the preferred option over vsync, regardless of the FPS we're maintaining. Vsync is never the preferred option.
That's probably true but I'm still not sure. Somehow I feel that a constant refresh rate would be better for comfortable motion perception, but I can't find any reference on that. I guess I'll have to see for myself when I get a variable refresh rate monitor.
TheHolyChicken wrote:If the FPS is dropping low enough that the display would be uncomfortably flickery, it's probably feasible to just flash up the old image again. The blur is back, but you'd avoid the eyestrain from flicker.
It would be worse if the flashing was done by the display itself, since the second image wouldn't be at the correct position.
TheHolyChicken wrote:Once the frame has been generated, you then "correct" that frame (somehow[???] - I'd love a good explanation!) using the most recent headtracking data.
I think it's simply done by rendering for a wider FOV, whose size can be derived from the speed of the head movement. Then, just before sending the image to the display, the sensor values available from another thread (running at 1000Hz) are fed to the GPU (via a pinned memory buffer) and a pixel shader de-warps the image to render at the correct orientation.
TheHolyChicken wrote:and this is where g-sync comes in. It's still really important, because you want that frame to be presented immediately. Without g-sync, your brand new amazing frame still sits around for a couple of milliseconds until the panel decides it's time to switch. The vsync timeline/diagram still applies.
With V-sync you know the duration between refreshes, so you can use a timer - or a hardware interrupt, if the pre-warping can be done fast enough, as was done for VGA DDC 3D glasses - and do the pre-warping just before the vertical retrace.

The only difference is that it's done at a variable rate and automatically with G-sync and at a constant rate with V-sync with some more work, but the frames would be correctly timed in both cases.
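The wider-FOV idea above can be sketched in one dimension (all the names and numbers here are illustrative, not from any SDK): render a margin of extra pixels around the visible view, then just before scanout pick the crop window that matches the newest head orientation.

```python
# Render wide, crop late: the visible view is a window into a wider
# rendered strip, chosen at the last moment from fresh sensor data.

VIEW_W, MARGIN = 8, 4        # visible width, extra pixels on each side
PIXELS_PER_DEGREE = 2.0      # assumed panel/FOV scaling

def late_crop(wide_row, yaw_error_deg):
    # wide_row has VIEW_W + 2*MARGIN pixels; a zero yaw error selects
    # the centered crop, and the margin absorbs last-moment rotation.
    offset = MARGIN + int(round(yaw_error_deg * PIXELS_PER_DEGREE))
    offset = min(max(offset, 0), len(wide_row) - VIEW_W)  # clamp to margin
    return wide_row[offset:offset + VIEW_W]
```

The margin size trades fill-rate cost against how much last-moment head rotation can be absorbed without showing unrendered edges, which is why deriving it from head speed (as suggested above) makes sense.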
User avatar
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by Fredz »

geekmaster wrote:my render/display update uncoupling I was pushing seems to be harvested from the forums to become their own, without credit due as far as I can see.
Latency Mitigation Strategies by John Carmack - February 22, 2013
"PTZ Tweening" for low-power low-latency head-tracking by you - February 27, 2013
Priority rendering with a virtual reality address recalculation pipeline [pdf] by Matthew Regan & Ronald Pose - 1994

Lacking a bit of humility, eh? Not the first time, alas...
Last edited by Fredz on Fri Jan 10, 2014 3:41 pm, edited 1 time in total.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: AMD Demonstrates "FreeSync", Free G-Sync Alternative, at

Post by geekmaster »

Fredz wrote:
geekmaster wrote:my render/display update uncoupling I was pushing seems to be harvested from the forums to become their own, without credit due as far as I can see.
Latency Mitigation Strategies by John Carmack - February 22, 2013
"PTZ Tweening" for low-power low-latency head-tracking by you - February 27, 2013
Priority rendering with a virtual reality address recalculation pipeline [pdf] by Matthew Regan & Ronald Pose - 1994

Lacking a bit of humility, eh ? Not the first time alas...
As I said, I was not AWARE of other references to it, and my ideas were based on framerate interpolation and animation "in-betweening". I did not read John's material until much later. However, my thread was read by OculusVR staff, and it certainly expanded on such ideas and brought them to the surface where they could be used by whoever needed them.

Regarding "not the first time", read the following post at the link you provided. I gave foisi motivation to finish his project, as he claimed, which is credit enough for my efforts...

Regarding your 1994 link for "address recalculation", I actually did THAT back in the early 80s, in firmware, to move a window around in a (text) framebuffer. Not for VR use, but highly effective compared to the previous firmware. That experience is what began my PTZ Tweening thread.
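For readers unfamiliar with the trick: address recalculation in miniature (an illustrative sketch, not the original firmware) means that instead of copying the whole framebuffer to scroll or pan, you just change the start offset the display controller reads from.

```python
# Classic hardware-pan trick: the visible window is defined purely by
# a start address into a larger flat framebuffer. "Scrolling" is just
# changing `start` -- no pixel data is ever copied.

def visible_window(framebuffer, start, width, height, pitch):
    # framebuffer is a flat list of cells; pitch is cells per stored row
    return [framebuffer[start + r * pitch : start + r * pitch + width]
            for r in range(height)]
```

Panning one cell to the right is `start + 1`; one row down is `start + pitch` — the same constant-time property that makes it attractive for last-moment VR warping.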

Nothing to be humble (or humiliated) about. I give others credit when I am aware of prior art. I put lots of links in my posts. There is nothing to be ashamed of when promoting oneself, as long as you give valuable information to accompany it. Empty self-promotion is just braggadocio, and I try hard to avoid that.

I did not claim to INVENT these ideas, and in fact I provided lots of external references in my thread.

All I am saying in this thread is that it would be nice to give me credit for my ideas, even though they may be based on the ideas of others from decades past. We all stand on the shoulders of giants (even giants like John Carmack). His time warping method also needs a depth buffer, whereas my tweening method was intended for low-overhead warping (mostly PTZ/pan-tilt-zoom based, a variation of "address recalculation"), but I also mentioned using motion vectors to warp or morph the intermediate "tween" frames.

I must have added something valuable to this idea. What is wrong with sharing credit? Sharing ideas is good, and so is sharing credit, IMHO.
Post Reply

Return to “Oculus VR”