DevilMaster wrote: Matthew wrote: You know why a Direct3D port of Doom 3 does not exist? Because OpenGL is better.
If OpenGL is better, why isn't there such a thing as a stereoscopic driver for modern systems with OpenGL support?
Because DirectX is more widely used, it is considered a higher priority.
It takes a lot of work to write a stereo driver, and separate versions have to be made for each version of DirectX. There aren't even any modern stereo drivers for DirectX 7, which is too bad, since many major classic games use it.
DevilMaster wrote:In case you retort that stereoscopic drivers support quad buffer-based OpenGL stereoscopy, I'll rephrase the question as: "Direct3D-based stereoscopic drivers don't care about quad-buffered output. Why isn't there a driver that does the same thing with OpenGL?"
Quad buffering is used by Direct3D as well as OpenGL, and it is the method of displaying the stereoscopic image. It simply means having separate buffers for the left-eye and right-eye images (four buffers total, since each eye gets a front and a back buffer).
Quad buffering is always used with modern shutter glasses systems. Translating a game that wasn't programmed for stereoscopic 3D is a separate matter; once translated, the result is still displayed using quad buffering.
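For anyone curious what this looks like at the API level, here's a minimal sketch of a quad-buffered render loop in C, using freeglut to request the stereo visual. drawEye() and the separation value are hypothetical stand-ins for a real scene and camera setup:

[code]
#include <GL/glut.h>

/* Hypothetical scene-drawing function: shifts the camera
   horizontally for one eye and draws the scene. */
static void drawEye(float eyeOffset)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(-eyeOffset, 0.0f, 0.0f);
    /* ... draw the scene here ... */
}

static void display(void)
{
    const float sep = 0.03f;  /* half the interocular separation (made-up value) */

    glDrawBuffer(GL_BACK_LEFT);   /* render into the left back buffer */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawEye(-sep);

    glDrawBuffer(GL_BACK_RIGHT);  /* render into the right back buffer */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawEye(+sep);

    glutSwapBuffers();  /* swaps both the left and right front/back pairs */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_STEREO requests a quad-buffered visual; window creation
       fails if the card/driver doesn't expose one. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("quad-buffered stereo sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/code]

The driver then presents the left and right images in sync with the shutter glasses; the application just draws each eye into its own back buffer.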
As for why there isn't a driver that translates OpenGL games into stereoscopic 3D... Well actually, there ARE such drivers. I've used one myself (the eDimensional driver). It's just that the producers of modern stereo drivers have chosen not to support OpenGL anymore.
To get a sense of why they chose that, read the article I linked to in my previous post.
Writing an OpenGL stereo wrapper is actually very easy (a lot easier than with Direct3D). I could do it myself, but I have other priorities right now.
DevilMaster wrote: Matthew wrote: Why not just modify the engine to have native stereoscopic rendering support with OpenGL quad-buffered output?
Because in that case, it would only work with video cards that support quad-buffered output in the first place.
Are you aware that this means all modern cards by nVidia and ATI?
A little over two years ago, nVidia and ATI unlocked support for OpenGL quad buffering on their consumer cards.
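You can verify this at runtime, too: GL_STEREO has been a standard query since OpenGL 1.0. A minimal sketch, assuming a current GL context:

[code]
#include <GL/gl.h>

/* Returns 1 if the current context is quad-buffered, 0 otherwise. */
static int hasQuadBuffering(void)
{
    GLboolean stereo = GL_FALSE;
    glGetBooleanv(GL_STEREO, &stereo);
    return stereo == GL_TRUE;
}
[/code]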
DirectX only supports native stereoscopic output (the equivalent of quad buffering) as of DirectX 11.1 and later.
But OpenGL has always supported it, ever since OpenGL 1.0, which was released in 1992.
This means that adding native stereoscopic rendering support to legacy games is much, much easier with OpenGL than with DirectX.
With OpenGL, since it's always been part of the API, you can just use the now-enabled functionality.
With DirectX, since versions before DirectX 11.1 don't support it, you have to either port the engine to DirectX 11.1 or later, or rely on vendor-specific extensions.
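On Windows, for example, requesting a quad-buffered OpenGL context comes down to one flag when picking the pixel format. A rough sketch, assuming a valid device context (hdc) and with most error handling omitted:

[code]
#include <windows.h>

/* Returns nonzero if a quad-buffered (stereo) pixel format was set. */
static int setStereoPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | PFD_DOUBLEBUFFER | PFD_STEREO;  /* PFD_STEREO requests quad buffering */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
        return 0;

    /* ChoosePixelFormat may silently drop PFD_STEREO, so check what we actually got. */
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    return (pfd.dwFlags & PFD_STEREO) != 0;
}
[/code]

After creating the GL context with wglCreateContext(), it's worth double-checking with glGetBooleanv(GL_STEREO, ...) as shown above, since the driver may have handed back a non-stereo format.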
These stereo drivers are not a substitute for native stereoscopic rendering support. Except with very old games, they tend to have a lot of graphics glitches.
I think one of the problems with nVidia 3D Vision is that nVidia has been too focused on "automatic mode" support (translating games into 3D that weren't programmed for it). They should be focusing on encouraging developers to include native stereoscopic rendering support in games.
"Automatic mode" support is mainly useful with legacy games. I think nVidia should have focused on maintaining support with legacy games, instead of dropping that (3D Vision only supports DirectX 8 and later) and focusing on modern games.
Modern games just don't work well in 3D unless they're programmed to render that way. The reason it worked with legacy games is that back then, GPUs used a fixed-function pipeline. Modern games use advanced effects and programmable shaders, which can't be properly translated into 3D at the API level.