I think Dmitry's point is that you could replace the word 'AMD' in this article with any other brand name, since AMD is just providing generic PC hardware.
That's true. AMD/ATI claims to provide the best gaming experience in the world, yet all they can show at a stereoscopy promotion event are some third-party solutions for video decoding? AMD is going to lose their hard-won customers because of this attitude.
The problem is that, so far, very few game developers bother to test their visuals with "automatic mode" stereo drivers, let alone implement a full-featured stereoscopic rendering engine.
This is an issue of content, not of the technical ability to generate and display stereo 3D images.
AMD considers it up to the developers to implement a stereo 3D engine in their games. Which, unfortunately, is also the best way to do it, and the ultimate goal that we gamers should support.
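For concreteness, engine-level stereo mostly comes down to rendering the scene twice from two horizontally offset cameras with asymmetric ("off-axis") frusta. A minimal sketch of the projection math - the function name, parameters and sign conventions here are illustrative, not taken from any real engine:

```python
import math

# Engine-level stereo sketch: two horizontally offset cameras with
# asymmetric (off-axis) frusta. All names and values are illustrative.

def stereo_frusta(fov_deg, aspect, near, convergence, eye_sep):
    """Return (left_eye, right_eye) frustum bounds (l, r, b, t) at the near plane."""
    top = near * math.tan(math.radians(fov_deg) / 2)
    bottom = -top
    half_w = top * aspect
    # Shift each eye's frustum horizontally so the two view volumes
    # converge at distance 'convergence' (the zero-parallax plane).
    shift = (eye_sep / 2) * near / convergence
    left_eye  = (-half_w + shift, half_w + shift, bottom, top)
    right_eye = (-half_w - shift, half_w - shift, bottom, top)
    return left_eye, right_eye

left, right = stereo_frusta(fov_deg=60, aspect=16/9, near=0.1,
                            convergence=5.0, eye_sep=0.065)
print(left, right)
```

The point being: the math itself is trivial; what costs real development budget is making every shader, post-process effect and HUD element behave correctly when rendered twice.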
It's rather an issue of development budgets. Avatar is a big game project based on a blockbuster movie - a stereoscopic 3D movie, mind you - by a highly regarded film director, which became a hit before even being released, so it is budgeted accordingly and can afford reinventing the wheel. And Invincible Tiger is just another console game which had no choice but to reinvent the wheel, since there is no one else to provide the stereo support.
I'm not convinced that implementing stereoscopy at the engine level is the best possible way. It does give the developers full creative control, but it's not future-proof: any new output device has to be explicitly supported by the application, which means older applications will never be updated to support newer hardware. I still remember the days of DOS gaming with a Gravis UltraSound card, and I wouldn't want to return to a programming model where applications are left on their own with either multiple proprietary APIs for each piece of hardware, or very generic middleware APIs which do not expose every unique feature of the device.
A plausible goal is to have a standard driver model for encoding stereo output to the display device, and the ultimate goal is to implement stereo features at the core level of the graphics API and in the display driver interface. That would be the most compatible and future-proof solution.
They'd better follow the trend and support already existing hardware rather than creating their own... which is exactly what they are doing... AMD does not want to invest in mono-3D to stereo-3D game conversion drivers to compete with Nvidia, but prefers to support 3rd-party developers (like iZ3D and DDD).
Sorry, I don't see them following anything in a useful manner; the way AMD is marketing their "support" for iZ3D is unconvincing at best.
I do not urge AMD to enter the LCD monitor business or the shutter glasses business, but they could at least team up with 3rd parties and endorse their 120 Hz solutions, like Nvidia is doing with Samsung and ViewSonic (and Zalman).
What about providing a standard VESA stereo connector on an external bracket, or arranging with some monitor maker to include a standard VESA stereo connector in their 120 Hz monitors, as is already done in 120 Hz 3DTVs? AMD does not have to promote its own shutter glasses; unlike Nvidia, it has no incentive to withhold support for 3rd-party emitters and glasses, and many 3rd-party emitters do feature the VESA stereo connector.
It does require slightly more effort than just showing some pre-rendered stereoscopic content, which anyone else is capable of showing as well.
So far Nvidia is the only 3D glasses manufacturer to use this technology (a "dumb" 120 Hz monitor, with sync handled by the computer). All the others use a direct sync link between the display and the shutter glasses. So what you are saying here is "I want to use the Nvidia GeForce 3D Vision glasses on an ATI GPU".
That's because there were no mass-produced 120 Hz digital displays until recently, and many 3rd-party glasses were designed mostly with CRT monitors in mind, which is why they use the analog VGA signal for sync. Once there are more 120 Hz Full HD LCD displays using either dual-link DVI or DisplayPort, I'd imagine 3rd parties will soon follow suit by introducing DVI, DisplayPort or USB emitters. Either that, or AMD could just endorse the VESA standard stereo connector, as I said above.
What AMD will do for sure is support HDMI 1.4... and provide a standard way for developers to access the HDMI 1.4 features through the display driver.
There are no new features in HDMI version 1.4 which are relevant to PC users, and most HDMI features are just higher-level data-link and transport protocols etc., all implemented in software/firmware.
Come on, how relevant is in-cable Ethernet which requires you to replace your $50-per-piece HDMI cables and your $200 video card? What a joke: a $10 100BASE-TX hub and a pair of $2 UTP cables would do the same. HDMI 1.3 at least doubled the bandwidth of the physical layer.
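To put rough numbers on why the HDMI 1.3 bandwidth bump matters for stereo while 1.4's Ethernet doesn't: frame-sequential stereo at Full HD means driving the display at 120 Hz. A back-of-the-envelope check, using the nominal TMDS data rates (10.2 Gbit/s for HDMI 1.3 at 340 MHz, 4.95 Gbit/s for the earlier 165 MHz links) and counting only active pixels - real links also carry blanking intervals, so the actual requirement is somewhat higher:

```python
# Does 1080p at 120 Hz (frame-sequential stereo) fit in the HDMI links?
# Assumptions: 24 bits per pixel, active-pixel data rate only
# (blanking intervals excluded, so the real requirement is higher).
active_gbps = 1920 * 1080 * 120 * 24 / 1e9   # ~5.97 Gbit/s
hdmi_old_gbps = 4.95    # 165 MHz TMDS link (pre-1.3)
hdmi_13_gbps = 10.2     # 340 MHz TMDS link, doubled in HDMI 1.3

print(f"1080p @ 120 Hz needs ~{active_gbps:.2f} Gbit/s of pixel data")
print("fits pre-1.3 link:", active_gbps < hdmi_old_gbps)   # False
print("fits HDMI 1.3 link:", active_gbps < hdmi_13_gbps)   # True
```

So the physical-layer doubling in 1.3 is exactly what makes 120 Hz Full HD stereo possible over HDMI at all, while the 1.4 Ethernet channel adds nothing a cheap hub couldn't do.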