Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post Reply
User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

https://www.youtube.com/watch?v=JLEIJhunaW8

The full video is interesting, but the TL;DW is that the nvidia drivers have been found to add significant overhead/slowdown in CPU-limited scenarios (i.e. with weaker CPUs) compared with AMD cards.

This could be the same bottleneck that 3D Vision experiences; nVidia themselves confirmed it as a bug which was going to be fixed aaaaany day now, before they killed 3DV :roll:

If nvidia fix this in new drivers, it's a shame that we won't benefit in 3DV unless we manage to hack those new drivers to work with 3DV... :o
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

Lysander
Certif-Eyed!
Posts: 542
Joined: Fri May 29, 2020 3:28 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Lysander »

:( still hard to swallow this driver fiasco
Ryzen 5 3600X, RTX2060, 16GB ram, Windows 20H2, nVidia 452.06, SSD, VG278H.

BazzaLB
Two Eyed Hopeful
Posts: 56
Joined: Mon Sep 16, 2019 8:53 am

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by BazzaLB »

RAGEdemon wrote: Thu Mar 11, 2021 1:29 pm https://www.youtube.com/watch?v=JLEIJhunaW8

The full video is interesting, but the TL;DW is that the nvidia drivers have been found to add significant overhead/slowdown in CPU-limited scenarios (i.e. with weaker CPUs) compared with AMD cards.

This could be the same bottleneck that 3D Vision experiences; nVidia themselves confirmed it as a bug which was going to be fixed aaaaany day now, before they killed 3DV :roll:

If nvidia fix this in new drivers, it's a shame that we won't benefit in 3DV unless we manage to hack those new drivers to work with 3DV... :o
No need to worry, they won't fix it.

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Well, I can confirm that although the 3-core bug is still present with my new Ryzen 5600X-powered PC that I just built, it can, however, brute force its way to near 60 FPS (and way over in some cases) in GTAV and RotTR, 2 games that I previously played in the 20 FPS range on a 6600K. Anything over 60 FPS is a waste in 3D anyways, so I'm happy to have finally smashed through this stupid frickin CPU bug. FFXV sadly still gets nowhere near 60 FPS, and unfortunately the fix masterotaku and I put together reduces performance a good amount. Running around in 3D without the fix looks fairly smooth at around 40 FPS, but put the fix on and it dips below 30. Still, that's better than dipping down into the teens like it was on my 7700K@5GHz, but a 100% increase on that low an FPS still amounts to a pretty low number. :/

But yeah, all in all my testing has shown that I've at least doubled my FPS in all bugged CPU-bottlenecked games coming from a 7700K, which ain't too bad. Those waiting for the next gen of CPUs will definitely be in for a treat, but those itching to get something now aren't in too bad of a spot either, I think. Also, to note, I haven't tested this much, but another factor in my performance gains could be that rather than going with 2 sticks of RAM, I went with 4 sticks of single-ranked RAM to take advantage of a known performance benefit that Ryzen chips get from having 4 ranks of memory.

Anyways, just throwing this out there for anyone wondering what type of gains they could be looking at, since I don't know if anyone else around here has upgraded to this gen yet. This is probably the first time that upgrading my CPU architecture has gained me more gaming performance than a GPU upgrade ever has, so I'm quite in disbelief about it, tbh. Very pleasantly surprised, for a change.

User avatar
4everAwake
One Eyed Hopeful
Posts: 25
Joined: Mon Jul 21, 2014 11:04 am

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by 4everAwake »

Thanks for this, DJ-RK. I'm happy to read about your performance jump with your new CPU. Now, I'm seriously looking into upgrading my rig this year.
My rig:

GTX 1070 (driver 452.06), Intel i5-6600k, 16 gigs RAM, Windows 10 (2004)

Lysander
Certif-Eyed!
Posts: 542
Joined: Fri May 29, 2020 3:28 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Lysander »

ooooh, very nice indeed. DJ, if you get a chance, could you test out RE2/3 with your new CPU? I hate the dips in those beautiful games that otherwise run at perfect 60fps on my rig.
Ryzen 5 3600X, RTX2060, 16GB ram, Windows 20H2, nVidia 452.06, SSD, VG278H.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Thank you for the valuable data point DJ-RK, WOW, almost double the performance in a game just from going from 4c8t 7700k@5GHz to a stock 6c12t 5600x! That's amazing!

I wonder if the performance came more from the IPC increase or the core count... :)
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

mistersvin21
One Eyed Hopeful
Posts: 26
Joined: Mon Sep 16, 2019 3:06 am

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by mistersvin21 »

Thank you, DJ-RK! This is fantastic news, I couldn't even think of moving to an AMD CPU until now..
Taking into account that Zen 4 CPUs will offer an IPC jump of 25% - I will definitely wait for them, but still.. I'm shocked :-)

User avatar
Tullebob
Cross Eyed!
Posts: 126
Joined: Wed Dec 02, 2020 5:41 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Tullebob »

Thank you DJ-RK, for your report. Very useful info indeed. I am planning to upgrade to a Ryzen 5900X coupled with fast RAM (3600 or 4000) sometime this year and pair it with my 2080ti, so this sounds promising. I game in 3DV using only 1080p these days. Due to the extreme hardware requirements for 3DV, I overall find 1080p to be the best compromise and the smoothest alternative. I am happy to give up resolution for increased performance, so sorely needed in 3DV. When gaming on a native 1080p display, 1080p looks good I think :)

Having been somewhat distanced from 3DV gaming for some years, I picked it up again for real last year. I must say that overall, my view is that 3DV performance now is better (and more fun) than it has ever been. Ironic that so few people have insight into how good it actually is now. I am now getting good performance out of most of the games that I am interested in. Many games now hold a rock solid 1080p/60fps (RE7, Witcher 3, Mortal Kombat 11, Titanfall 2 etc.). And a whole motherlode of games are now capable of being played at 60 fps most of the time, with some noticeable dips, but still give an overall impression of smoothness and enjoyability (GTA5, Dishonored 2, Halo Master Chief Collection, Hitman 2, Plague Tale, Rise / Shadow of the Tomb Raider etc). Even Star Wars Jedi Fallen Order (the FIX IS JUST CRAZY OUT OF THIS WORLD GOOD BTW!!) now runs at 60 fps in many portions (although with some clear exceptions), being super enjoyable overall. Hell, testing Control and Monster Hunter World the other day, I even saw these titles being GPU bound in certain instances (a rare sight indeed with 3DV :). Also, the Vulkan fixes, like Doom Eternal, are just amazing, with good performance. There are also many cool/novel fixes for 3DV, like the one for GTA4. Just amazing fun to see that old game truck along in good 3DV. With more increases in CPU performance, things will be even better.

What resolution did you run the games at to get 60 fps? Also, should you have the chance, it would be really interesting to hear how the Hitman 2 benchmark behaves on your Ryzen setup (as well as in particular RE2, and to some extent also RE3) :)

Footnote:

I completely understand your desire to also game on other platforms. In my view, one should game on the platform that provides the most fun/best overall experience. With many games that is 3DV in my view. For other games (Cyberpunk, Horizon), a very cool (and very flat) experience can be had in 4K/Ultrawide/G-Sync/HDR/ray tracing etc. And PlayStation exclusives can only be played on PlayStation :) Playing through The Last of Us 2 some months ago on a PS4 Pro, I could not help but think how utterly stunning that game would have been running in 3DV :)
3D Vision rig -> 8700k oc 4.9, asus rog 2080ti, 16 GB DDR4 3200 Mhz ram, Win 7 running driver 425.31 on SSD1, Win 10 running driver 452.06 on SSD2, Win 10 running latest Nvidia driver for Vulkan3dV and 2D gaming on SSD3, for 3d vision display: 1080p on VG248QE (and sometimes 1440p on PG278PR), Valve Index or Optoma UHD42 projector

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

4everAwake wrote: Mon Mar 15, 2021 4:59 pm Thanks for this, DJ-RK. I'm happy to read about your performance jump with your new CPU. Now, I'm seriously looking into upgrading my rig this year.
Looking at your sig I can see you're rocking a 6600K, so yeah, I have direct experience with that CPU and can confidently say you should get huge gains, probably over double your FPS in games like GTAV, as I've experienced!
Lysander wrote: Mon Mar 15, 2021 8:10 pm ooooh, very nice indeed. DJ, if you get a chance, could you test out RE2/3 with your new CPU? I hate the dips in those beautiful games that otherwise run at perfect 60fps on my rig.
Yeah, sure I can fire up at least one of those, I'll probably just load up RE3 since I only played it through once and barely unlocked anything, whereas RE2 I've played through twice now and don't feel there's much left to do there.
RAGEdemon wrote: Tue Mar 16, 2021 2:44 am Thank you for the valuable data point DJ-RK, WOW, almost double the performance in a game just from going from 4c8t 7700k@5GHz to a stock 6c12t 5600x! That's amazing!

I wonder if the performance came more from the IPC increase or the core count... :)
Well, I'm actually not running at stock. I'm currently using Precision Boost Overdrive (PBO) and its Auto-Overclock to add 200MHz to the boost clock, and using its Curve Optimizer to lower the voltage curve (which, in turn, raises the clock speed the CPU runs at any particular voltage). That has me running at 4850MHz on my fastest 2 cores for single-threaded or low CPU intensive tasks, and around 4650MHz across all cores on heavy loads, which nets me about a 10% increase on multithreaded tasks and a 5% increase on single-threaded in various CPU benchmarks.

I'm almost certain the increase is only due to the IPC increase, because using performance graphs I can see that during 3D gameplay only my 2 fastest cores are really working at all (so 4 threads in total). In 2D, the other cores do a little bit of extra work. I'll post up a couple pictures taken from BL3 (after running the benchmark) showing the CPU usage in both cases. My 2 fastest cores are cores 2 and 6 (so the CPU threads 3 & 4 and 11 & 12 in the screenshots).

2D performance:
BL3 CPU performance chart - 2D.jpg
3D performance:
BL3 CPU performance chart - 3D.jpg
*Note: The spike that you can see happening simultaneously on all cores just after halfway across the graph is actually the loading that occurs prior to running the benchmark

Tullebob wrote: Tue Mar 16, 2021 10:36 am Thank you DJ-RK, for your report. Very useful info indeed. I am planning to upgrade to a Ryzen 5900X coupled with fast RAM (3600 or 4000) sometime this year and pair it with my 2080ti, so this sounds promising.
Nice. Yeah, I was originally thinking about holding out for the 5900X, but I was hoping the fewer cores of the 5600X would allow for more thermal headroom for overclocking, and I figured I really didn't need the extra cores. At the very least, the 5600X and 5900X both have only 6 cores running per CCX, whereas the 5800X and 5950X have 2 additional cores enabled per CCX. Sadly my overclocking tuning so far hasn't really gained me as much performance as I had hoped for, and I rather wish I had simply opted for a 5800X instead, so that I'd have 8 cores to match the number of cores in the new consoles.

As for the RAM, again, I highly recommend going with 4 sticks of single-ranked memory. I got the G.Skill Trident Z Neo 3600MHz 4x8GB kit, which is labelled as being specifically optimized for Ryzen (not sure if that's just marketing fluff though). Paid a lot more for that than the 2x16GB kit I originally planned to go with, but if you look around the internet you'll find that others have tested and shown that going from 2 ranks of memory to 4 ranks can increase performance by 10%. You might be able to get the same benefit from 2 sticks of dual-ranked memory, but I heard that can be flakey, so I just went with 4 sticks instead, which limited my options greatly. Apparently you can get the same results by buying 2 matching sets of single-ranked 2x8GB sticks instead of one 4x8GB kit, which greatly increases your options.
What resolution did you run the games at to get 60 fps?
I ran at both 1440P and 1080P, and the results were the same in those games I tested previously, so still slightly CPU bottlenecked at 1440P, but only just, so that seems to be the sweet spot. If I, say, run at a resolution scale of 125% @ 1440P, I definitely take a performance hit and become GPU bound.


Also, should you have the chance, it would be really interesting to hear how the Hitman 2 benchmark behaves on your Ryzen setup (as well as in particular RE2, and to some extent also RE3) :)
Hmmm, ok. I have Hitman 2 but haven't loaded it up yet, so I didn't even know it had a benchmark, but nice to know it does. I'll add that to my list of games with benchmarks, which sadly isn't very long, so any new additions are always welcome. On that note, I wish GTAV's benchmark gave some sort of results screen at the end.

Footnote:

I completely understand your desire to also game on other platforms. In my view, one should game on the platform that provides the most fun/best overall experience. With many games that is 3DV in my view. For other games (Cyberpunk, Horizon), a very cool (and very flat) experience can be had in 4K/Ultrawide/G-Sync/HDR/ray tracing etc. And PlayStation exclusives can only be played on PlayStation :) Playing through The Last of Us 2 some months ago on a PS4 Pro, I could not help but think how utterly stunning that game would have been running in 3DV :)
Absolutely, my friend! I, too, was once like everyone else and felt I could never go back to 2D gaming, but over the years there have been tons of nice features we've had to forego (on top of the lost performance), and so it's no longer a matter of "going back to 2D," now it's also going forward in a lot of ways. I honestly didn't think that raytracing would be that big of a difference, but when it comes to raytraced reflections I'm convinced it adds a lot more depth that you can feel when you're actually playing the game. I picked up Godfall on PS5, which (to my knowledge) uses raytracing for its reflections, and when I'm playing the game I can literally FEEL something different about them; it adds, I dunno, an additional level of smoothness you can feel as you are swinging the camera around. I let my gf play it while I watched her, and I couldn't see/feel it the same way I do when I play because I wasn't in control. The next time I played it, I instantly had that "ah hah" feeling again about it. Don't even get me started on Spiderman!

In short I would urge 3D only people: don't sleep on checking out raytracing in 4k + HDR in 2D just because you "can't ever go back to playing pancake." Trust me, you will be missing out on just as much cool new tech as you would be losing from giving up playing in 3D, so it's an even wash if you ask me at this point.


Ok, just to add another data point: Been doing a bit of testing using BL3's benchmark, and my performance gains there are "only" about 25%. My highest FPS on my old rig was only ever about 45-47 FPS and I can now get 56-58 FPS there. That is, of course, with lowered graphics settings and resolution. With my actual preferred settings I've gone from 32-35 FPS to 52, but obviously my jump from a 1080Ti to a 2080Ti helps in that regard a bit as well.
You do not have the required permissions to view the files attached to this post.

User avatar
Tullebob
Cross Eyed!
Posts: 126
Joined: Wed Dec 02, 2020 5:41 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Tullebob »

Wise words there indeed :) To get the optimal gaming experience, it is necessary to embrace both past and future (hey, I still have my fully functioning Commodore 64 around, and one of the best games ever of course is Breath of the Wild for Switch :)

Very interesting what you say about "feeling" raytracing and depth. I must admit that I have not yet bothered to fully play a game that has raytracing. But I will keep an eye out for what you say. Am currently enjoying the hell out of Fallen Order with your fix enabled. Just spectacular stuff, both gameplay-wise and in 3D. Also, after a while tinkering with settings, managed to get good performance (with some exceptions). Game seems to respond best to driver 425.31, 1080p resolution, in-game vsync off, NVCP vsync on, RivaTuner 60 fps limit, opening options in Origin before launching, and in-game refresh set equal to monitor refresh. What an experience!

The reason I asked for the Hitman 2 benchmark is that certain scenes in it are CPU limited as hell on my OC'd 8700k. Will be super interesting to hear how your rig tackles it. The 3D fix for Hitman 2 is also amazing. I have not played the actual game, but I suspect that real gameplay is not nearly as bothered by the CPU limitation as the benchmark is.

I suspect that RE3 is not as well optimized for PC as the RE2 remake. RE3 does not get the best results even in 2D, so I am mostly interested in RE2, as I have not yet done the Chris campaign in the remake version. Will hold off until I get a new CPU if your findings indicate that the game responds well to the Ryzen.
3D Vision rig -> 8700k oc 4.9, asus rog 2080ti, 16 GB DDR4 3200 Mhz ram, Win 7 running driver 425.31 on SSD1, Win 10 running driver 452.06 on SSD2, Win 10 running latest Nvidia driver for Vulkan3dV and 2D gaming on SSD3, for 3d vision display: 1080p on VG248QE (and sometimes 1440p on PG278PR), Valve Index or Optoma UHD42 projector

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Great detailed post DJ-RK!
Ok, just to add another data point: Been doing a bit of testing using BL3's benchmark, and my performance gains there are "only" about 25%.
If I might humbly suggest: I believe you will get higher fps if you uncap/de-vsync from 60fps in 3D and 120fps in 2D in your screenshots. It is possible, either through the options or via nVidia Inspector, to disable vsync and to disable the fps cap in the visuals menu in-game while having 3DV on.

As a data point comparison with uncapped fps at 800p at medium settings, I am getting highest = 72, lowest = 29; however, it mostly hovers between 40-55, but that is subjective.

I am really excited about your results mate! There have been lengthy discussions regarding CPU buying in the old nVidia forums and here on the MTBS3D boards. It has been difficult because there has never been a standard benchmark which we can rely on to compare.

I wonder if we have an opportunity here to compare benchmarks from a few systems to get to the bottom of what 3D Vision responds to best? Of course yours will be the top system, but if it turns out that our older systems get significantly less performance in unrestricted apples-to-apples comparisons, then I think a lot of us will be upgrading sooner rather than later, if not overnight! :)

My system is similar to yours: 2080 Ti; 4 sticks of 3900MHz memory; 8c16t 9900K; and I shall be more than happy to test. Maybe we can share config files etc. I can't do 1080p but I can do 720p or 800p, which ought to take the GPU advantage out of the equation.

Maybe you might kindly suggest a good benchmark we can all use that makes good use of multithreading to show off CPU 3DV prowess? (Or not, if we want to focus on single core performance!)

I am not sure BL3 is great, because it only uses 1.5 cores on my system too, in both 2D and 3D. Maybe SOTTR, or even better, some 3DMark benchmark like Time Spy that we could hack into working with 3D - it wouldn't need to be 3DV-fixed for benchmarking purposes, as long as we can get 3DV to engage...

Only if you might have some spare time and curiosity of course - no pressure! :)
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

3DNovice
Certif-Eyed!
Posts: 561
Joined: Thu Mar 29, 2012 4:49 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by 3DNovice »

DarbeeVision gives extra pop to 2D/3D games and movies.

My LG CX has a similar feature, I can't remember off the top of my head what they call it.

Perhaps ReShade or Nvidia's filter have a similar setting. I know I personally contacted a few people at Nvidia and posted in the FreeStyle forums requesting that they add it.

RAGEdemon said it was some type of photography effect originally used by photographers.

Anyhow, as long as you're not some kind of purist that believes altering the image in any way is a sin because it's not the way it was intended to be seen, you'll like the added effect.
Settings differ depending on the display being used, but 100% is way too over the top, think "A Scanner Darkly"
Here's a 100% example video https://www.youtube.com/watch?v=Tdzw0V-cQKc
https://www.youtube.com/watch?v=pilYrpB2uaA normal use

DarbeeVision works very well with projectors.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Update: I've used a tool called CapFrameX which does a pretty good job of capturing and displaying FPS on a graph. It also allows you to capture CPU usage, but it doesn't allow you to compare it, from what I can tell.

Here are the results of the BL3 benchmark run at 720p (no GPU bottleneck), MEDIUM settings, VSync OFF, frame cap Unlimited.

Orange = 3DV Disabled in Control Panel
Green = 3DV Enabled in Control Panel, toggled OFF
Blue = 3DV Enabled in Control Panel, toggled ON
System specs in sig.
Image

I've attached my system's benchmark files.
bl3 benchmark capture.zip

DJ-RK, if you have a moment mate, would you kindly capture the benchmark on the above settings and upload the .json file? I'll compare and try to superimpose the CPU usage onto the FPS comparison graphs... this should be fascinating :)
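For anyone who wants to poke at the capture files by hand, here is a rough Python sketch for pulling an average FPS out of one. I'm assuming the .json contains a "MsBetweenPresents" frame-time array somewhere (CapFrameX sits on top of PresentMon data); the exact nesting may differ, so treat the key and the file name as placeholders:

Code: Select all

import json

def find_frametimes(node, key="MsBetweenPresents"):
    """Recursively search the parsed JSON for a frame-time (ms) array."""
    if isinstance(node, dict):
        if key in node:
            return node[key]
        children = node.values()
    elif isinstance(node, list):
        children = node
    else:
        return None
    for child in children:
        found = find_frametimes(child, key)
        if found is not None:
            return found
    return None

with open("capture.json") as f:          # placeholder file name
    frametimes = find_frametimes(json.load(f))

if frametimes:
    # average FPS = 1000 / mean frame time in milliseconds
    avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
    print(f"{len(frametimes)} frames, average {avg_fps:.1f} FPS")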

Bonus graphs:
3DV DISABLED vs CPU
Average = 167 fps
Image


3DV Toggled OFF vs CPU
Average = 133 fps
Image


3DV Toggled ON vs CPU
Average = 47 fps
Image

On an Intel 9900K, one can clearly see the CPU usage drop a bit between 3DV Disabled and Enabled-Toggled-Off, and then drop significantly between Toggled-Off and Toggled-On. GPU usage was always hovering around 50%, so it was never a bottleneck.

Sadly, there is a:
72% reduction in FPS going from 3DV Disabled to Toggled ON (167 → 47 average).
65% reduction in FPS going from 3DV Toggled OFF to Toggled ON (133 → 47 average).
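To spell out the arithmetic behind those two figures, here's a quick Python sketch; the FPS values are the averages from the runs above, and the helper function is just illustrative:

Code: Select all

disabled_fps    = 167   # 3DV Disabled in Control Panel
toggled_off_fps = 133   # 3DV Enabled, toggled OFF
toggled_on_fps  = 47    # 3DV Enabled, toggled ON

def reduction(before, after):
    """Percentage drop in FPS going from 'before' to 'after'."""
    return (before - after) / before * 100

print(f"Disabled -> ON:    {reduction(disabled_fps, toggled_on_fps):.0f}% loss")    # ~72%
print(f"Toggled OFF -> ON: {reduction(toggled_off_fps, toggled_on_fps):.0f}% loss") # ~65%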

I wonder how a Ryzen 5-series with its monstrous IPC will look :)
You do not have the required permissions to view the files attached to this post.
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

User avatar
Tullebob
Cross Eyed!
Posts: 126
Joined: Wed Dec 02, 2020 5:41 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Tullebob »

This will be very interesting to see! (While the patterns displayed in the first graph certainly were not unexpected, it is still staggering to see a visual representation of the extent to which 3DV performance is impeded by the CPU issue)
3D Vision rig -> 8700k oc 4.9, asus rog 2080ti, 16 GB DDR4 3200 Mhz ram, Win 7 running driver 425.31 on SSD1, Win 10 running driver 452.06 on SSD2, Win 10 running latest Nvidia driver for Vulkan3dV and 2D gaming on SSD3, for 3d vision display: 1080p on VG248QE (and sometimes 1440p on PG278PR), Valve Index or Optoma UHD42 projector

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

RAGEdemon wrote: Tue Mar 16, 2021 6:11 pm If I might humbly suggest: I believe you will get higher fps if you uncap/de-vsync from 60fps in 3D and 120fps in 2D in your screenshots. It is possible, either through the options or via nVidia Inspector, to disable vsync and to disable the fps cap in the visuals menu in-game while having 3DV on.
Yeah, the driver keeps sneaking back to defaulting with vsync on, which keeps annoying me, but for that particular test it wasn't influential on the results, so I didn't care to rectify it then. I'll most certainly be keeping it off for future tests.
I wonder if we have an opportunity here to compare benchmarks from a few systems to get to the bottom of what 3D Vision responds to best?
Absolutely, I'd be quite happy to participate in some standardized testing and comparison, and would most appreciate your participation and input (as well as any others that want to get on board with it). I'm currently focusing on trying out various performance tweaks to get the optimal setup for my new rig, and my primary objective is to maximize my 3D performance, so once I'm settled I'll be running all these tests anyways, may as well see if any of it can be of any use more than squeaking out a couple extra % points for myself!

Maybe you might kindly suggest a good benchmark we can all use that makes good use of multithreading to show off CPU 3DV prowess? (Or not, if we want to focus on single core performance!)

I am not sure BL3 is great, because it only uses 1.5 cores on my system too, in both 2D and 3D. Maybe SOTTR, or even better, some 3DMark benchmark like Time Spy that we could hack into working with 3D - it wouldn't need to be 3DV-fixed for benchmarking purposes, as long as we can get 3DV to engage...
Well, as I mentioned before, I'm not super keenly aware of too many benchmarks in games, at least those that matter. Here are the ones I know of:


BL3 - Overall one of the newest games I have that includes one, and a UE4 game to boot (though a heavily modified one), so it has the benefit of giving a performance benchmark on one of the most commonly used engines to date. If we could find another UE4 game with a built-in benchmark, that might be a bit more ideal.

GTAV - Is good because it was the first game where I think we really noticed the 3-core bug, though not a hugely common engine. My main gripe is that it doesn't give you any actual result at the end, just a live FPS counter.

Tomb Raider / RotTR / SotTR - I've only played up to RotTR, so I haven't even loaded up SotTR to know that it had a benchmark. RotTR was another game where I was hugely plagued by CPU performance issues, so I know it to be a reliable source, but I'm willing to switch to SotTR or run both.

Metro Last Light (non-redux) - My old go-to DX11 benchmark. Not sure how it holds up, but it's an option. Haven't played Exodus to know if it has a benchmark built in, but I do own it, so I could install and use that if it has the option.

Final Fantasy XV - I actually have never tried using the benchmark as I heard it is NOT reflective of the actual performance you get in the game. There's also the matter of the fix killing a lot of performance, so although this has the potential of being the heaviest load it makes a relatively poor candidate, I think.

Hitman 2 - Was just mentioned earlier in this thread as having a benchmark. It's recent enough to be relevant.

Far Cry / Crysis games - I'm not sure if they have built-in benchmarks, but I believe I own them all.

Batman Arkham games - Don't think any of these are actually still relevant.

I haven't ever actually tried to get any 3DMark tests to work in 3D. Has anyone ever tried and achieved this? I guess, was that what you were referring to when saying "as long as we can get 3dv to engage?" Have you managed to get that far, but not actually display in 3D?

I can see you've already run your tests for BL3 in your newer post, so I'll happily oblige and run the same tests and report the results back. Care to also do the same for GTAV and your choice of TR game? I think those 3 make overall good candidates for a baseline, as well as any others. I haven't followed the threads where the TR games' DX11 vs DX12 modes have been discussed. Has anyone actually gone and done those tests to know which performs better? If not, we could possibly do that here. If so, then we can simply agree to use one option over the other.

Alright, I think that covers most things that I can respond to at this point. I'm going to return to my tweaking for now and will hopefully be back later this week with some results to compare.

3DNovice
Certif-Eyed!
Posts: 561
Joined: Thu Mar 29, 2012 4:49 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by 3DNovice »

DJ-RK wrote: Mon Mar 15, 2021 3:31 pm Also, to note, I haven't tested this much, but another factor in my performance gains could be that rather than going with 2 sticks of RAM, I went with 4 sticks of single-ranked RAM to take advantage of a known performance benefit that Ryzen chips get from having 4 ranks of memory.
Sorry if I missed it, but what GPU model do you have?

I just completed my new build last night with a 5800X and an RTX 2080 Ti, using Windows 10 1909 atm.

I also read about the 4-stick boost, but first I wanted to try tighter timings with DDR4 3600 C14 and see if it would work with my mobo; it's not listed as supported, so fingers crossed.

I do not really have any new games to test other than Tomb Raider, though I do want to try Jedi Fallen Order to see what gains I get there.

But I may need to get more fans before I push it very hard; I opted for a Noctua NH-U12A to cool it.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

@3DNovice He has a 2080 Ti. I think this is what most of us will end up with - perhaps having a later generation card for 2D as dual boot.

@DJ-RK, your detailed insights are appreciated mate.

I have been trying for the last ~6 hours but can't get SOTTR or ROTTR to engage 3D. I have played them in 3D before, but it looks like things changed along the way somewhere. They would have been interesting for comparing DX12 3DV with...

- ROTTR actually performed a tad worse in DX12 compared to DX11 if I recall correctly - I do not recall being able to engage 3DV in DX12 back then.
- SOTTR performed much better - StarMan reported that DX11 gave him 39-44 fps while DX12 gave him 57-60fps. I played DX11 back then but can't engage either now.

I did manage a GTA5 graph in 3D. The 2D graph is useless, as the game's max FPS is 187.5, which the benchmark constantly hit in 2D.

GTA5: 720p default settings except VSync off, 3DV on. Advanced graphics settings - everything disabled.
2D = Green - useless - always hitting the 187.5fps wall.
3D = Orange - much more useful; however, the large frame-rate spikes are due to high fps between scenes, when nothing is being rendered.
Image

GPU vs CPU:
Image

Benchmark files:

https://filebin.net/mns5t10l1dgvkn7p



So it looks like we might need to find another game to benchmark if you feel 2 isn't enough. I'm happy to try Metro, or maybe Hitman 2, which might be a bit more modern... :)
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

Lysander
Certif-Eyed!
Posts: 542
Joined: Fri May 29, 2020 3:28 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Lysander »

Something I struggle to understand is situations where neither my CPU cores nor my GPU show 100% usage (in fact, far from it: GPU around 60-70% and a CPU core around 50%) yet the FPS is low - how does this happen? Is another component bottlenecking the instruction flow?
Ryzen 5 3600X, RTX2060, 16GB ram, Windows 20H2, nVidia 452.06, SSD, VG278H.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Game CPU workloads are massively serial, in that it is very difficult to split the work into more than one thread. Most games still only use one or two threads; e.g. Unity is still a mostly single-threaded engine.

So say a game uses 2 threads, such as Borderlands 3. An 8-core CPU will then show usage on only 2 of its 8 cores, and therefore total CPU utilisation will be 2/8 * 100 = 25%, even though the CPU is absolutely 'maxed out'. The threads usually jump around from core to core as Windows attempts to keep them all equally cool. I think there is some new tech which finds the fastest cores and runs the threads on those, but my CPU doesn't have that.

If the CPU is maxed out, then it can't supply the GPU with enough frames to process, so the GPU will starve and show <100% utilisation, assuming you are not already hitting the FPS cap of the game, e.g. 60fps.

The 3D Vision "3 core bug" is simply a bad 3D Vision overhead within the driver that overloads the main game thread, preventing it from spawning more threads as efficiently. E.g. if a game uses 100% of one core for the main thread which then is meant to generate 4 more threads, and the 3DV driver overloads the main thread by 20%, then the game will perform not only 20% worse but also many less threads will be spawned for the other cores to process making 3D performance abysmal.

At this point the GPU gets starved of frames, and the whole thing manifests itself as low CPU usage and low GPU usage. Generally, games will not use more than 3 cores because of this badly optimised code, hence we colloquially call it "the 3-core bug", which most people understand; but it depends on how well the game is optimised for multithreading in the first place. Badly optimised games will be badly affected and use even fewer than 3 cores. Well optimised games will use more than 3 cores and will be less affected.

Helifax proved that it's just bad nVidia coding - back in the day when 3D Vision launched, they did not expect CPUs to have more than 2 cores, as 4-core CPUs were just coming out. Helifax's VKvision driver has virtually no CPU overhead - I tested this a while back. nvidia's driver was never improved beyond its initial production circa 2008, which was already heavily based on ELSA's 3D driver from the late '90s. Unfortunately, the truth is that it's ancient, inefficient code originally designed to run on a single-core CPU as a hack... ¯\_(ツ)_/¯

Nowadays we have things like single pass stereo that is used for VR, which doesn't have these old bottlenecks. In VR, the limitation is almost purely the GPU horsepower...
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

Lysander
Certif-Eyed!
Posts: 542
Joined: Fri May 29, 2020 3:28 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Lysander »

Thanks for that analysis - it makes sense, except for one part - even if the game is single-threaded, why am I not seeing 100% CPU utilisation on any core? I still don't fully understand that. If there's capacity to process things quickly, why isn't it utilised by that single core? An explanation I thought of is that maybe the thread scheduler doesn't send anything to the thread until it's done processing whatever load it got previously, and that's why it's not maxed out. Kinda like a batch of instructions instead of a queue. Let's say it has capacity to process 100 units but is sent only 50, and it can only get the next 50 once it's finished processing the first 50 - as opposed to the scheduler sending it the 2nd 50 while it's still processing the 1st 50. I'd like to understand why more frames can't be rendered when both the CPU and GPU have capacity. And clearly it's not memory maxed out or something like that.
Ryzen 5 3600X, RTX2060, 16GB ram, Windows 20H2, nVidia 452.06, SSD, VG278H.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

That's an interesting observation and may well be the case - I don't know.

If you disable hyperthreading so virtual cores are disabled to give a truer picture of core usage:

On your 6c CPU, assuming the GPU is <70% and there is no vsync or fps cap, are you saying that you are seeing less than (1/6 * 100 ≈ 16%) total usage in games?

If it's showing more than 16% usage with HT OFF, then the game thread is maxing at least one core all the time - the thread is just hopping from one core to another constantly, so it will never actually show 100% usage on any single core in Windows. It will look like a few cores are hovering around some percentage, but adding them up should come to roughly 100% of a single core, i.e. about 16% total CPU - this is a ballpark, as Windows CPU usage reporting is not accurate.

If it's significantly less than 16% usage then the bottleneck might be elsewhere as you say, e.g. memory or PCIe bus, or even CPU cache.
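If you want to log this instead of eyeballing Task Manager, here's a minimal Python sketch using psutil (pip install psutil); the sampling window is arbitrary. If the hopping-thread explanation holds, individual cores will bounce around without pegging 100%, but the sum should stay near one core's worth:

Code: Select all

import psutil

for _ in range(10):                            # ~10 one-second samples
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy_cores = sum(per_core) / 100           # total load in "cores' worth"
    print([round(p) for p in per_core], f"~{busy_cores:.1f} cores busy")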
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Sorry to be so tardy with providing my results; I got wrapped up in testing some new betas of a performance tweaking tool for Ryzen CPUs called CTR, and was hoping it would be the key to squeaking out a tiny bit of extra performance and breaking the 4850MHz cap that is imposed by AMD on the chip, so I've spent nearly the past 2 weeks working with the 2 latest betas, 2 different BIOS versions, tons of different settings, etc. Sadly, after all this time I've not been able to achieve stability on anything higher than what I had before, so I've just gone back to my simple and stable overclock through PBO instead. *shrugs* At least I was able to put that to rest and get on with the testing.

I'm going to be lazy and just upload the zip file with the results of all my tests from BL3 and GTAV because I haven't quite figured out how to edit the captures to make them all line up properly in comparisons and such (or whether that's even possible), so if you don't mind I'll leave the detailed comparison and analysis to you, my friend, as you're much more experienced and capable in this regard I believe.

In short, though, from what I can tell just by eyeballing my charts against yours, they look quite similar, except that I think I'm getting about 10-20% higher FPS than you, which is probably about what we should expect and I believe is in line with other benchmarks of these CPUs. Nothing too shocking here, though I'll admit I was secretly hoping to see even more, somehow. (don't we all? :p)

One surprise for me, though, is that with BL3 I don't seem to take a performance penalty when running in 2D with 3D enabled in the control panel like you do... although, was yours with the 3D fix installed? Because my "2D with 3D on in CP" test did not have my 3D fix enabled, so if you did have the 3D fix installed for your version of that test that may explain why you take a bigger hit. On that note, I DID do 2 separate 3D tests, one with the fix installed and one without, and there's about a 10-15% performance delta between those, hence why I asked whether you had the fix installed or not for your 2D test.

I had a bit of fun looking at the analysis tab to do some comparisons on the captures as well, mainly just looking at the max and avg CPU loads. When I compare my BL3 "3D On, no fix" and "3D On, fix installed" results, the CPU usage goes down even further with the fix installed... so it seems that 3DMigoto only compounds the CPU bottleneck further. Not that it should surprise anyone, but it's still a bit shocking and saddening to see the actual numbers and charts showing it.

Anyways, I'll let you work your magic and it'll be interesting to see some better comparison and stats from it all, though probably won't be anything earth-shattering from the results.

Captures.zip

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Thanks for the data DJ-RK; your eyes are superb - the difference is indeed pretty much exactly 20%!

To the best of my recollection, my tests were done with the 3D fix installed, because I remembered you saying you had it installed in your OP. Your observation about on vs. off in the Control Panel is brilliant! Wow, so with a better CPU, there is no difference! I have always had a difference on my systems through the last 2 decades...

Graphs:
Borderlands3 3D - AMD 5600x Maxed vs. Intel 9900k Maxed:
TL;DR - 20% improvement
Image

BL3 2D/3D:
5600x 3D OFF ControlPanel OFF
5600x 3D OFF ControlPanel ON
9900k 3D OFF ControlPanel OFF
9900k 3D OFF ControlPanel ON
5600x 3D ON
9900k 3D ON
Image


GTA5 3D - AMD 5600x Maxed vs. Intel 9900k Maxed:
Image
Note: Green and Orange are swapped in the final graph because there is a bug in the capture software which doesn't properly let you change colours.

Complex Analysis: New AMD Zen3/5xxx series kicks ass :)
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

russellk
Cross Eyed!
Posts: 173
Joined: Sun Jan 24, 2010 2:09 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by russellk »

RAGEdemon wrote: Thu Mar 18, 2021 12:14 pm That's an interesting observation and may well be the case - I don't know.

If you disable hyperthreading so virtual cores are disabled to give a truer picture of core usage:

On your 6c CPU, assuming the GPU is <70% and there is no vsync or fps cap, are you saying that you are seeing less than (1/6 * 100 ≈ 16%) total usage in games?

If it's showing more than 16% usage with HT OFF, then the game thread is maxing at least one core all the time - the thread is just hopping from one core to another constantly, so it will never actually show 100% usage on any single core in Windows. It will look like a few cores are hovering around some percentage, but adding them up should come to roughly 100% of a single core, i.e. about 16% total CPU - this is a ballpark, as Windows CPU usage reporting is not accurate.

If it's significantly less than 16% usage then the bottleneck might be elsewhere as you say, e.g. memory or PCIe bus, or even CPU cache.
@Ragedaemon did you see that Hardware Unboxed have published the 2nd part of the analysis?

Now, I only skimmed through the video and comments, but I think they're suggesting it's because Nvidia do GPU scheduling in the driver, which adds CPU load.
This is what causes issues at high frame rates where CPU load can be high.
Like I say, I didn't watch the whole thing so the answer may be in the video, but it just made me wonder how effective the hardware scheduling options are in the modern versions of Windows 10, and it also made me wonder if this is why Helifax saw excellent improvements with HW scheduling turned on.
Win 10 1903 (Via 3dfix manager - Non DCH), 9700K, Gigabyte 2080Ti OC, BenQ XL2420T 3d surround, LG 3d OLED, 4k Projector (3dtvplay), WMR Odyssey+, WIn 19 1809 (425.31), Win 7 Pro x64

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Hi Russell mate,

Yea, I posted about it here: viewtopic.php?p=177843#p177843

Direct link to part 2 of CPU bottleneck video for anyone interested: https://www.youtube.com/watch?v=G03fzsYUNDU

I wondered about the same - I had the hardware scheduling driver installed and enabled, but it didn't seem to make any difference (451 I think?). I have not done any tests with the above capture method however. If anyone could test it out, it would also be a great data point :)

Right now, I have gone into system policy and blocked specific hardware IDs from driver updates, so Windows doesn't allow anything/anyone to ever update my graphics driver from the current 446.14. Although it's pre-hardware-scheduling, it's the latest driver that works well with 3DV and doesn't have the VR hiccups that a lot of people are complaining about. I don't want to mess with registry/system policies again as they stand - however, if someone tests and shows some kind of intriguing difference with HW scheduling, I shall do just that :)

VR hiccups with later drivers, blamed in some part on HW scheduling:
https://www.nvidia.com/en-us/geforce/fo ... -nvidia-d/

I have also wondered about Smart Access Memory/Resizable BAR, if that tech ever trickles down to 2xxx series cards. I guess we wait and see ¯\_(ツ)_/¯
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Alright, here's a bonus one to add, since we've started discussing hardware scheduling. Here is my comparison with Hardware-accelerated GPU Scheduling turned on vs off in the Windows graphics settings (not sure if it's supposed to be turned on anywhere else in the Nvidia driver; as far as I'm aware, not). This is in BL3, both tests with 3D on and the fix installed. I'm also on 452.06, and have been for all my tests so far.
BL3 3D On - GPU HW scheduling on vs off.png
Pretty sure I don't need to say which is which, because they are both pretty much identical, but green is with it turned on and orange off. I thought I saw a bit of an improvement on my 7700K before, but that was purely anecdotal and I never recorded and compared as we are doing now.

This is fun, anything else that we want to compare? :D
You do not have the required permissions to view the files attached to this post.

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Ok, last two, I think.

I wanted to do my own test using only 2 of my sticks of RAM vs 4 sticks (or, more specifically, 2 ranks vs 4 ranks of memory, since you should be able to achieve the same using only 2 sticks of dual-ranked memory, and 4 sticks is not the only option), to see if I'm getting the supposed 10-15% increase from that.

[Removed images and analysis due to inaccuracy and redundancy to new images below]

Edit: Oh crap, I just checked and noticed that my XMP memory profile didn't get loaded, so the memory speed was only at the default 2133MHz. Guess I gotta make a couple more charts for myself after all.

Edit 2:
Ok, so retested with XMP properly set, so some new charts to put up (and I've taken down the old). Here's the new legend:

Green - 4 ranks of memory, XMP on
Orange - 2 ranks of memory, XMP off (2133MHz)
Blue - 2 ranks of memory, XMP on (3600MHz)


BL3 - All 3 up for comparision:
BL3 3D On - 2 x RAM vs 4 x RAM vs 2 RAM w XMP.png
BL3 - The correct comparison of just 2 ranks vs 4 ranks:
BL3 3D On - 2 x RAM vs 4 x RAM XMP on.png
BL3 3D On - 2 x RAM vs 4 x RAM XMP on - bar chart.png

GTAV, same order as above:
GTAV 3D On - 2 x RAM vs 4 x RAM vs 2 RAM w XMP.png
GTAV 3D On - 2 x RAM vs 4 x RAM XMP On.png
GTAV 3D On - 2 x RAM vs 4 x RAM XMP On - bar chart.png

So yeah, when I first took a look at the regular line chart I was a bit conflicted, because it seemed like the difference was a bit of a wash since in some cases the blue is higher than the green, but looking at the bar chart for some of the stats shows there's still a fairly consistent 10% increase in the lowest frames in both games. Not quite as resounding as the 10-15% across the board that I thought I was getting before, but it's still a bit of a win, at least.

What this does show, though, is that memory speed/timings play a significant part in this. It might not necessarily be the memory speed itself, though, but rather that Ryzen's "Infinity Fabric" interconnect clock is known to perform best when tied in a 1:1 ratio with the memory clock speed (before the DDR 2x multiplication). I've wanted to overclock my memory to 3800MHz so I can max out my IF clock at 1900MHz for quite some time, but I didn't manage to get it stable on the single attempt I've made at doing so. At this point I'm thinking I'm just going to settle where I'm at, though. As always, I've spent WAY too much time chasing minuscule performance gains through overclocking, and I think it's time for me to accept what I've got. It's clearly decent enough where it is, so I should be happy...
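For reference, the 1:1 arithmetic spelled out in a quick sketch (the helper name is just illustrative): DDR transfers twice per clock, so the real memory clock (MCLK) is half the DDR rating, and the 1:1 sweet spot means FCLK = MCLK.

Code: Select all

def fclk_for_1to1(ddr_rating_mt_s: int) -> float:
    """FCLK (MHz) needed for a 1:1 ratio at a given DDR4 rating (MT/s)."""
    return ddr_rating_mt_s / 2

for ddr in (3600, 3800, 4000):
    print(f"DDR4-{ddr}: MCLK = FCLK = {fclk_for_1to1(ddr):.0f} MHz for 1:1")
# DDR4-3600 -> 1800 MHz; DDR4-3800 -> 1900 MHz (the target mentioned above)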

...

Right?
You do not have the required permissions to view the files attached to this post.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Great graphs and insight mate - thank you!

Are you on Samsung B-die? It ought to get you to ~3900MHz with fairly tight timings at ~1.5v (and above), generally. I have no clue about OCing a Ryzen unfortunately, but it seems like you have done a stellar job with your silicon lottery - you have the most powerful 3DV system in the community by a mile! :woot
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Sadly not Samsung B-dies; the kit I've got is composed of Hynix C-dies. Still capable, apparently, but I don't think quite as much as the Sammy's. I tried using a tool called DRAM Calculator for Ryzen (made by the same dude who made the CTR tool I was testing earlier), which helps determine what should be usable timings at a certain memory clock speed, but again, no luck there as of yet. Knowing myself, I'll probably give it another go in the near future, though. Gotta get those extra frames! Y'know? ... All 3 of them.

russellk
Cross Eyed!
Posts: 173
Joined: Sun Jan 24, 2010 2:09 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by russellk »

DJ-RK wrote: Mon Mar 29, 2021 10:59 am Sadly not Samsung B-dies; the kit I've got is composed of Hynix C-dies. Still capable, apparently, but I don't think quite as much as the Sammy's. I tried using a tool called DRAM Calculator for Ryzen (made by the same dude who made the CTR tool I was testing earlier), which helps determine what should be usable timings at a certain memory clock speed, but again, no luck there as of yet. Knowing myself, I'll probably give it another go in the near future, though. Gotta get those extra frames! Y'know? ... All 3 of them.
Thanks for posting the graphs and data mate, I'm actually quite fascinated by the RAM differences.

I bought some fast B-die for my 6700k a while ago, and I used the Ryzen DRAM calculator to try and help me overclock it. Although the calculator itself is largely not applicable to Intel, you can import info from Thaiphoon Burner and it will give you recommended sub-timings etc.
The ASRock timing configurator (intel) will allow you to monitor the actual timings from within Windows.

From what I understand, B-die is the best, but only because certain timings scale with voltage, so you can dial down the latency and sub timings if you're a big 'tweaker/overclocker'.
Since then, Micron E-die and others came along though, so you're probably not missing out on too much unless you want to spend the rest of your life running stability tests! B-die carries a large price premium...
Win 10 1903 (Via 3dfix manager - Non DCH), 9700K, Gigabyte 2080Ti OC, BenQ XL2420T 3d surround, LG 3d OLED, 4k Projector (3dtvplay), WMR Odyssey+, WIn 19 1809 (425.31), Win 7 Pro x64

User avatar
DJ-RK
Binocular Vision CONFIRMED!
Posts: 336
Joined: Thu Sep 19, 2019 8:13 pm
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by DJ-RK »

Lysander wrote: Mon Mar 15, 2021 8:10 pm ooooh, very nice indeed. DJ, if you get a chance, could you test out RE2/3 with your new CPU? I hate the dips in those beautiful games that otherwise run at perfect 60fps on my rig.
As per a recent follow-up request from Lysander, I have another graph to present. This is taken from a session of Resident Evil 3 (remake, obv).

I don't think the game has a benchmark, so I just ran the FPS capture during the first 10 minutes of actual gameplay, starting from the beginning of the scene where Nemesis first comes crashing through the wall, and ending right after the scene where you drive him over the edge of the parking lot and he eats a missile served courtesy of our man Carlos, and you escape into the subway. This segment involved a couple of open'ish areas with NPCs and skirmishes with zombies coming after you, not to mention it's also heavily scripted (so more strain on the CPU), so hopefully it serves decently enough to give an idea of what to expect throughout most of the game.

My settings are running at 1440P, everything maxed (except lens flares, distortion, etc). Heck, I even turned the autoconvergence shader on halfway through. Sorry I didn't go the more scientific route and drop down to 1080P or compromise on any settings to show what the maximum FPS this monster CPU can achieve, but instead this can demonstrate how my "optimal" 3D system performs at this relatively high load.

Anyways, I'll stop talking and let the pictures do a little talking of their own.
RE3 1440P Max.png
RE3 1440P Max - bar chart.png
Summary: Over 60 FPS most of the time, with some drops into the mid-50s and 40s. 60 FPS locked is most likely possible at 1080P. I'm happy enough with these results that I don't feel a rush to answer that question, though. I certainly don't think I need to answer that question with another chart... do I? :P I hope not, I'm starting to feel like a bit of a showoff and a :ugeek: at this point, but I swear this one's on Lysander! :lol:
You do not have the required permissions to view the files attached to this post.

User avatar
RAGEdemon
Certif-Eyed!
Posts: 538
Joined: Thu Mar 01, 2007 1:34 pm
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by RAGEdemon »

Thanks DJ-RK, the entire board is of course envious of your monster setup - I, for one, love salivating over your graph results! :lol:
Windows 10 64-Bit | 9900K @ 5.1GHz | 2080 Ti OC | 32GB 3920MHz CL16 RAM | Optane PCIe SSD | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2

Lysander
Certif-Eyed!
Posts: 542
Joined: Fri May 29, 2020 3:28 pm

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Lysander »

Thanks so much DJ for this valuable information. These numbers look great. I think I've found my next step on the upgrade path, since clearly some higher powers do not want me to go the video card route, what with the non-working adapters, unavailable LCDs and sky-high 2080Ti prices. From these charts and internet comparisons, it seems this CPU would give me that extra push I seem to need in some non-GPU-bound games, especially at 1080p. I'll do some more poking around and check AMD's roadmap to see what's around the corner, but otherwise I might be getting this CPU shortly.

Don't bother with more charts, this is good enough for me and again - thank you (and sure, I'll take the blame here :D )
Ryzen 5 3600X, RTX2060, 16GB ram, Windows 20H2, nVidia 452.06, SSD, VG278H.

User avatar
WickedScav
One Eyed Hopeful
Posts: 26
Joined: Sun Sep 29, 2019 11:59 am
Which stereoscopic 3D solution do you primarily use?: S-3D desktop monitor

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by WickedScav »

On my 5900x with a 2080 (not a Ti), I get 60fps consistently at 1440p. However, I've only tried two not-that-demanding games (Deliver Us the Moon & Beyond a Steel Sky), so performance could be worse with SotTR or similar titles, but so far the brute-force power of this CPU makes playing 3DV games a blast :)

According to HWiNFO, max CPU usage in both games is 25%, which would also correspond to the aforementioned "3-core limitation".

User avatar
Guig2000
Binocular Vision CONFIRMED!
Posts: 232
Joined: Wed Nov 25, 2009 9:47 am
Which stereoscopic 3D solution do you primarily use?: S-3D Projector Setup
Location: Bordeaux, France

Re: Tech Reviewer finds nvidia CPU bottleneck even without 3D Vision...

Post by Guig2000 »

russellk wrote: Sun Mar 28, 2021 6:08 am
@Ragedaemon did you see that Hardware Unboxed have published the 2nd part of the analysis?

Now, I only skimmed through the video and comments, but I think they're suggesting it's because Nvidia do GPU scheduling in the driver, which adds CPU load.
This is what causes issues at high frame rates where CPU load can be high.
Like I say, I didn't watch the whole thing so the answer may be in the video, but it just made me wonder how effective the hardware scheduling options are in the modern versions of Windows 10, and it also made me wonder if this is why Helifax saw excellent improvements with HW scheduling turned on.
Exactly, nvidia don't have a hardware scheduler in their GPUs, contrary to AMD.
That's why this CPU bottleneck on small or old CPUs is:
1°) very different from the 3D Vision driver issue
2°) and appears only on DX12 titles.

When running DX9/10/11 games on a weak CPU, the situation is reversed: the CPU bottleneck is worse when using an AMD GPU (which has long been known for performance issues there) than an nvidia GPU. AFAIK, this issue on AMD has always been considered driver overhead by tech reviewers, but now some think it could be related somewhat to the hardware scheduler.
Image

Post Reply

Return to “NVIDIA GeForce 3D Vision Driver Forums”