Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

floph
Cross Eyed!
Posts: 117
Joined: Sat May 30, 2020 4:52 pm

Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by floph »

Hi,

I just got a used RTX 2080Ti and I would like to upgrade my CPU. Currently I have an i7 6700K with 64 GB of DDR4 2400 MHz RAM.
According to this website, the i7 6700K bottlenecks the 2080Ti:
https://pc-builds.com/cyri/Core_i7-6700 ... eList=(3i)

I am thinking about AMD Ryzen 7 3700X and here are the results:
https://pc-builds.com/cyri/Ryzen_7_3700 ... eList=(3i)

But I don't know whether AMD CPUs are buggy when it comes to 3DVision; I've read mixed opinions suggesting that AMD is not as reliable as Intel in this regard.

Could you share your experience with AMD CPUs in 3DVision?
Is it a good idea to use an AMD CPU for 3DVision instead of Intel?
I am also curious what the cheapest CPU that does not bottleneck the 2080Ti would be.

I plan on using DSR to play at 4K, downsampled to 1080p.
I use the bottleneck calculator linked above as a guide, but it is meant for 2D gaming, so I would appreciate any hands-on experience you could share.

Any recommendation is much appreciated.

Thank you !
RAGEdemon
Diamond Eyed Freakazoid!
Posts: 740
Joined: Thu Mar 01, 2007 1:34 pm

Re: Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by RAGEdemon »

viewtopic.php?f=105&t=24922 <--- This answers most of your questions.

Right now the best CPU is the AMD Ryzen 5000 series due to its excellent single-core performance.

3D Vision can't take good advantage of more than 4 CPU cores. Your 6700K is very good - overclock it to 4.8GHz on all cores. Your memory at 2400MHz is rubbish; optimally it should be 3600MHz, or at least 3200.

Resolution is independent of your CPU performance - resolution is all about GPU performance.

3D Vision doubles the performance requirement on the GPU: if a game in 2D requires the GPU to render 1 million pixels for 60fps, 3D Vision will require 2 million.

DSR is an utter waste unless you specifically use the 4x mode (anything less gives bad image quality), which multiplies the horizontal and vertical resolutions by exactly 2. 4x DSR is excellent, combined with 0% smoothing and 100% sharpening.

Quick calculation:
Your 1920x1080 DSR 4x = 1920*2 x 1080*2 = 3840 x 2160 2D = 8,294,400 pixels.

3D Vision = 8,294,400 x 2 = 16,588,800 = roughly equivalent to 6000x3000 2D resolution.
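
If you want to run the numbers for your own resolution, here is the same arithmetic as a short Python snippet - nothing 3D Vision specific, just pixel counting under the model above (4x DSR doubles each axis, 3D renders every frame twice):

Code: Select all

# Pixel-throughput estimate for DSR 4x + 3D Vision.
# Model: 4x DSR doubles both axes; stereo 3D renders each frame twice.

def pixels_per_frame(width, height, dsr_4x=False, stereo=False):
    """Pixels the GPU must render per displayed frame."""
    if dsr_4x:
        width, height = width * 2, height * 2  # 4x DSR = 2x each axis
    count = width * height
    if stereo:
        count *= 2                             # one render per eye
    return count

base = pixels_per_frame(1920, 1080)                            #  2,073,600
dsr  = pixels_per_frame(1920, 1080, dsr_4x=True)               #  8,294,400
both = pixels_per_frame(1920, 1080, dsr_4x=True, stereo=True)  # 16,588,800

print(f"1080p 2D:           {base:,}")
print(f"1080p DSR 4x:       {dsr:,}")
print(f"1080p DSR 4x in 3D: {both:,} ({both // base}x the base load)")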

There is no way that a 2080Ti could ever manage to run a 'modern' game at 6k x 3k resolution in 2D at 60fps. Maybe some low-poly indie games? I don't know.

The best I have managed is barely 2560x1600 3D @ 60fps, which is the equivalent of 4K 2D @ 60fps.

This is the main reason I am sticking with an 800p projector, so I can DSR to 2560x1600 for 3DV.

Maybe 2 years from now we might have a GPU with double the performance, but it likely won't support the 3DVision driver.
Windows 11 64-Bit | 12900K @ 5.3GHz | 2080 Ti OC | 32GB 3900MHz CL16 RAM | Optane PCIe SSD RAID-0 | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2
Lysander
Terrif-eying the Ladies!
Posts: 941
Joined: Fri May 29, 2020 3:28 pm

Re: Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by Lysander »

Got a 3600X in October and no issues with 3dv at all. This is on an X570 board with a 2060. I play at 1080p, and I can max out most games (though not the latest ones) at 60fps in 3D.

AMD seems to be king now, unless maybe you go to the very top tier.
Ryzen 5 5600X, RTX2080Ti, 16GB ram, Windows 20H2, nVidia 452.06, SSD, Dell S2716DG.
floph
Cross Eyed!
Posts: 117
Joined: Sat May 30, 2020 4:52 pm

Re: Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by floph »

RAGEdemon wrote: Mon Dec 21, 2020 9:46 am viewtopic.php?f=105&t=24922 <--- This answers most of your questions.
[...]
Thank you for the detailed explanations.

My projector is SBS and I usually play at 1080p @ 24Hz. Thanks to a built-in feature called Frame Interpolation, the motion is buttery smooth - it feels like 60Hz and it saves me a lot of GPU power.

In VR, though, there is no such feature to interpolate frames for 3D virtual screens, at least none that I know of, so for that I plan to add a second, more powerful GPU, maybe a 3080 or 3080 Ti, to be as smooth as possible. If Virtual Desktop had a Frame Interpolation feature, that would be just perfect. In VR I will also use SuperDepth3D and HelixVision / VK3DVision, so I will need all the GPU horsepower I can get. I haven't played any 3D game in VR yet, so I am pretty excited.

I tried DSR on different occasions. The most impressive visuals I saw with DSR 4x were in Red Dead Redemption with SuperDepth3D, but the fps was below 10 - unplayable, even with Frame Interpolation ON.
I wonder if DLSS 2.1 Quality mode is equivalent to DSR 4x in supported games. If so, it is probably faster than DSR. I just saw that DLSS 2.1 adds VR support. It will be very interesting to see how it works on the HP Reverb G2 or Pimax 8KX.
RAGEdemon
Diamond Eyed Freakazoid!
Posts: 740
Joined: Thu Mar 01, 2007 1:34 pm

Re: Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by RAGEdemon »

I'm glad frame interpolation works so well for you. I have been a backer of the Smooth Video Project (SVP) since the beginning, and am fond of the technology, using it in all my videos.

It might interest you to know that SteamVR and WMR both have native frame interpolation - they call it "Motion Smoothing" - and it works wonders.
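
For WMR headsets it is controlled through the driver's settings file. From memory - so treat the path, section, and key names below as assumptions, and verify against the file the Windows Mixed Reality for SteamVR driver actually installs - the toggle looks something like this:

Code: Select all

// ...\Steam\steamapps\common\MixedRealityVRDriver\resources\settings\default.vrsettings
// Path, section, and key names from memory - verify against your install.
{
    "driver_Holographic_Experimental": {
        // "auto" lets the driver engage motion reprojection (Motion Smoothing)
        // whenever the game can't hold the headset's native refresh rate
        "motionReprojectionMode": "auto"
    }
}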

Unfortunately, Motion Smoothing does not work with VorpX combined with the Reverb G2. I have made Ralf aware of the issue (the VorpX menu actually has a specific option for Motion Smoothing, which presumably works with native SteamVR headsets such as the Vive and Index, but not with WMR), and he might look at a fix if/when he gets a G2 down the line - he is not fond of the technology himself.

I don't know if it works with HelixVision and I have my doubts. I hope it does.
Windows 11 64-Bit | 12900K @ 5.3GHz | 2080 Ti OC | 32GB 3900MHz CL16 RAM | Optane PCIe SSD RAID-0 | Sound Blaster ZxR | 2x 2000W ButtKicker LFE | nVidia 3D Vision | 3D Projector @ DSR 1600p | HP Reverb G2
3DNovice
Petrif-Eyed
Posts: 2398
Joined: Thu Mar 29, 2012 4:49 pm

Re: Optimal CPU for RTX 2080Ti on 3DVision games - AMD or Intel

Post by 3DNovice »

floph wrote: Mon Dec 21, 2020 3:33 pm My projector is SBS and I usually play at 1080p @ 24Hz.
For DX11 games, you can edit 3Dmigoto's d3dx.ini to get 1080p@60 Side by Side or Over/Under output on your projector, instead of the 1080p@24Hz limitation of 3DTV Play.
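
I'm going from memory, so treat the section and key names below as assumptions rather than gospel - the commented d3dx.ini that ships with 3Dmigoto (it drives DarkStarSword's 3dvision2sbs custom shader) is the authoritative reference. The edit is roughly:

Code: Select all

; d3dx.ini (3Dmigoto) - sketch only; names vary between versions,
; so follow the comments in your own d3dx.ini.

[Present]
; run the SBS/TAB output shader on every presented frame
run = CustomShader3DVision2SBS

[Constants]
; output mode used by that shader - one value selects Side by Side,
; another Over/Under; check your d3dx.ini comments for the exact mapping
x7 = 1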


You can also do as this user suggests
viewtopic.php?f=181&t=25191#top

Doing either should help quite a bit with game mechanics; 24Hz is too low to decently play platformers/shooters/hack&slash/etc.