Next Xbox coming late 2013, DisplayPort and Stereo 3D, $299
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
http://www.winsupersite.com/article/pau ... box-143461
http://www.winsupersite.com/article/xbo ... iew-143462

So, the next, more powerful Xbox surfaced in 2012, just as I expected back in 2010. The leaked presentation is dated August 2010, so while Microsoft was in complete denial at the time, publicly insisting that the Xbox 360 would last until 2015 on the strength of its "revolutionary" Kinect controller, it had already put its Xbox hardware team to work and drawn up the hardware requirements for a next-gen Xbox that would finally let it catch up with the PlayStation 3. What a surprise (cough cough).

Quote:
...Microsoft intends to bring the next Xbox to market in holiday 2013. It will cost $299 for a unit that includes Kinect v2 (see below), the same price you’d pay today for a 4G Xbox 360 with Kinect. Microsoft describes this console as a “$2000 PC in a $300 box.”

...The next Xbox console will feature a (slot-loaded?) Blu-Ray drive... native 3D and flexible video output and compositing.

...the next Xbox will be “six times” as powerful as the current console... Its hardware underpinnings are codenamed “Yukon” and include 2 “ARM/x86” (whatever that means) processor cores running at 2 GHz, a 500 MHz 48ALU graphics processor, 4 GB of RAM, mass flash-based storage, USB 3.0, HDMI, DisplayPort, PCI-E and SATA controllers, and Gigabit Ethernet (with Wake On LAN) and 802.11n networking. A 3.2 GHz PowerPC chipset will be included for backwards compatibility with Xbox 360 games.

The next Xbox will be backed by a new, NextGen interface with Natural User Interface (e.g. Kinect-based) capabilities built-in...

Kinect v2 will be “an incremental improvement” over Kinect V1 (and, presumably, the 1.5 version sold to PC developers today) and will include improved voice recognition, better 3D play space recognition with support for closer, wider, and deeper living room areas, will support the tracking of four players concurrently (up from two) with seated or standing players, will feature dedicated hardware processing (instead of requiring the Xbox’s hardware), and will include a better HD RGB camera.

Kinect v2 will also support “props,” accessories you can hold with your hands to make virtual experiences—batting in a baseball game, playing golf, and so on—more realistic. “Feeling is believing,” it says. “Feel the crack of the bat, the kick of the rifle or the shake of the wheel as you speed through the turn.”


Mon Jun 25, 2012 2:04 pm
Petrif-Eyed

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2220
Location: Menlo Park, CA
Definitely curious about that Kinect 2. Hopefully they can significantly reduce latency with the on-board processing.


Mon Jun 25, 2012 4:17 pm
Certif-Eyable!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 990
brantlew wrote:
Definitely curious about that Kinect 2. Hopefully they can significantly reduce latency with the on-board processing.

I hope they can get the kind of latency and accuracy of this:
http://leapmotion.com/
The Kinect is good at the moment for things like dance games, which rely on very general posture recognition. If they can get the sort of response and accuracy of the Leap Motion, combined with 3D, you could have true object interaction. That is where I would get excited.


Mon Jun 25, 2012 4:23 pm
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
Personally, I question the accuracy/authenticity of the document, considering a lot could have changed in the past 2 years (even assuming it's real).

_________________
check my blog - cybereality.com


Mon Jun 25, 2012 10:14 pm
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
If the projected late-2013 release is real, the hardware specs should have been finalized no later than the end of 2010; the actual silicon should have been designed, targeted to a specific manufacturing node, and taped out by mid-2012; and volume production should start by mid-2013.


Tue Jun 26, 2012 1:03 am
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
True, assuming it's actually coming next year.

_________________
check my blog - cybereality.com


Tue Jun 26, 2012 8:49 pm
Two Eyed Hopeful

Joined: Mon Jun 13, 2011 9:09 pm
Posts: 77
And assuming the document is real and assuming they are able to stick to their plan. I wouldn't trust that $299 figure at all - there's no way they can predict inflation and exchange rates three years ahead of time.

Honestly, it's hardly worth worrying about. We'll find out the stats and the pricing before we can buy one.

I must say that I'll be very surprised if it doesn't have s3D support, though.

_________________
"Every jumbled pile of person has a thinking part that wonders what the part that isn't thinking isn't thinking of."
- They Might Be Giants


Tue Jun 26, 2012 9:41 pm
Certif-Eyable!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 990
As Cyber says, it would be hard to tell the accuracy of the document; however (assuming the document isn't a fake), I think it's encouraging to see the direction they are going with the kind of play space and some sort of HMD (maybe AR) integration into the ecosystem. I would still like to see more work done in the VR direction, but it seems that is still a ways off.


Tue Jun 26, 2012 10:05 pm
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
Zloth wrote:
I wouldn't trust that $299 figure at all - there's no way they can predict inflation and exchange rates three years ahead of time
Huh? Game console pricing has nothing to do with inflation, exchange rates, or even the actual cost of the hardware for that matter, and never has.


Wed Jun 27, 2012 1:27 pm
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
Yep, consoles usually use a "loss leader" strategy where they price it below cost for several years in order to gain market share. The one exception to this is Nintendo, which has always made a profit on their consoles (but mainly because they didn't have the most cutting-edge hardware). So it's reasonable to think Microsoft could set an arbitrary price for the console which they thought would help them maintain their position.

@android78: Yes, it is good to think that MS is interested in stereo 3D and VR/AR experiences. Without that I doubt the "next-gen" will be the jump people are looking for.

_________________
check my blog - cybereality.com


Wed Jun 27, 2012 8:22 pm
Certif-Eyed!

Joined: Fri May 11, 2007 10:13 am
Posts: 521
I think the late 2013 date is realistic.
But I'm not sure about those specs. Xbox and PlayStation are normally on the same level of performance.
If you now consider that one of the Unreal programmers said Sony should rethink their hardware specs or UE4 will have trouble running on the PS4, the next gen isn't going to have much bang.



Thu Jun 28, 2012 1:09 am
Certif-Eyable!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 990
cirk2 wrote:
I think the late 2013 date is realistic.
But I'm not sure about those specs. Xbox and PlayStation are normally on the same level of performance.
If you now consider that one of the Unreal programmers said Sony should rethink their hardware specs or UE4 will have trouble running on the PS4, the next gen isn't going to have much bang.

To be honest, I don't think the next generation will have much 'bang' if the only update is better hardware for better graphics/physics. The companies need to focus on a new experience, which I think they are starting to do with things like Kinect. For the bang they really need, they should be waiting and creating a new, revolutionary experience (AR/VR or something), but they won't, because that is too big a risk for them. They are likely to gradually introduce new concepts into their ecosystem and just get the 'oh, that's cool' rather than the 'WOW' they need.
So, in my opinion, the 2013 date is realistic, but seems a little pointless unless they have an HMD that they have been working on in secret to ship with it. :lol:


Thu Jun 28, 2012 3:10 pm
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
@cirk2: UE4 was running on one GTX 680 I believe (at least the Samaritan demo was). If the new Xbox is at that level it will be pretty decent. But, as John Carmack said, current-gen console games mainly run at 30 FPS and at sub-720p resolutions. Just taking those same games and pushing them to full 1080p @ 60 FPS already uses up much of that extra power, not to mention running 3D on top of that.
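Carmack's point is easy to quantify; a quick back-of-the-envelope sketch, assuming a 1280×720 @ 30 FPS baseline for current-gen titles:

```python
# Pixel throughput required at each rendering target (pixels shaded per second).
def pixels_per_second(width, height, fps):
    return width * height * fps

current_gen = pixels_per_second(1280, 720, 30)   # typical current console target
next_target = pixels_per_second(1920, 1080, 60)  # full 1080p at 60 FPS

print(next_target / current_gen)      # 4.5x the shading work
print(2 * next_target / current_gen)  # 9.0x once stereo 3D doubles it
```

So a console "six times as powerful" barely covers the resolution and frame-rate bump alone, before any improvement in the graphics themselves.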

_________________
check my blog - cybereality.com


Thu Jun 28, 2012 8:09 pm
Certif-Eyed!

Joined: Fri May 11, 2007 10:13 am
Posts: 521
Sorry, but just an upgrade from 720p 30fps to 1080p 60fps with the same 7-year-old graphics is no selling point.
Modern Warfare 3 will look like it is 7 years old even if it runs at 1080p and 60 fps.

If the next console generation (which is at least a year from release) is beaten easily by current performance or high-end PC systems, they'll have to drastically shift their marketing.

And we PC gamers will be stuck with the same old graphics for, by then, over a decade...



Fri Jun 29, 2012 12:12 am
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
cybereality wrote:
@cirk2: UE4 was running on one GTX 680 I believe (at least the Samaritan demo was). If the new Xbox is at that level it will be pretty decent.
cirk2 wrote:
Sorry, but just an upgrade from 720p 30fps to 1080p 60fps with the same 7-year-old graphics is no selling point.

As I said in an earlier thread, Moore's Law of 1965 still holds. Both Microsoft and Sony are waiting for general availability of the 22 nm manufacturing node because it gives them the flexibility to either significantly reduce their manufacturing costs or greatly improve performance, even compared to today's cards. Most foundries will be offering a 22 nm node by autumn 2013.

It was widely speculated that both the PS3 and Xbox 360 cost about $700-800 to manufacture, with the processors accounting for more than half of that. Assuming that Microsoft is willing to take similar initial losses, I set out to compare the Xenos GPU with other high-end cards released by AMD. Note how each year's high-end $400-500 card is easily beaten within 1.5-2 years by a $250-300 mid-range card whose GPU is built on a better manufacturing process, and in 3-4 years by a $100-150 low-end card.

It would only be natural to use tomorrow's $250 hardware to offer today's $500 high-end graphics, which should be good enough for most console gamers up to the year 2018, and then reduce manufacturing costs through process node shrinks.
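The node-shrink economics here can be put in first-order numbers; a rough sketch, assuming a straight optical shrink (this ignores pads, analog blocks, memory interfaces, and yield, so treat it as an upper bound on the savings):

```python
# First-order die-shrink estimate: a straight optical shrink scales die area
# with the square of the feature size. Ignores pads, analog blocks, memory
# interfaces and yield, so the real-world saving is smaller.
def area_ratio(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

print(round(area_ratio(28, 22), 2))  # 0.62 -> a roughly 38% smaller die
print(round(area_ratio(28, 20), 2))  # 0.51 -> roughly half the die
```

Since die cost tracks area (and yield improves with smaller dies), halving the area is what makes a "$500 GPU in a $300 console" plausible a node later.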


If I were responsible for GPU specs at AMD and Microsoft, the Xbox 3 GPU would be similar to a 22 nm shrink of next year's HD8870, performing just a bit short of an HD7950; I would call it HD9770. That would allow over 100 fps in any of today's games and would handle that UE4 demo at 1080p60; more importantly, game developers would target higher-end Direct3D 11 graphics right from the start.

A 28 nm HD8770 would probably do fine too, but it would be less appealing for hardcore gamers and not much better than the Nintendo Wii U (which is similar to HD4770 I believe).


PS. The original presentation is available at http://forum.beyond3d.com/showthread.php?t=62038


Quote:
2006 high-end performance bin
2005-11: Xenos (R500) - 90 nm, 232 million gates, ~160 mm2 die (GPU) - 500 MHz core 48:16:8 config - 700 MHz 128 bit GDDR3 memory, 22.4 GByte/s bandwidth (256 GByte/s to eDRAM)
2006-01: X1900XT (R580) - 90 nm, 384 million, 352 mm2 - 625 MHz 48:(8):16:16 - 1450 MHz 256 bit GDDR3, 46.4 GByte/s


2007 high-end performance bin
2007-05: HD2900XT (R600) - 80 nm, 600 million, 420 mm2 - 743 MHz 320:16:16 - 825 MHz 512 bit GDDR3, 105.6 GByte/s - 475 GFLOPS
2007-11: HD3870 (RV670) - 55 nm, 667 million, 192 mm2 - 775 MHz 320:16:16 - 900 MHz 256 bit GDDR3, 53 GByte/s - 427 GFLOPS
2008-09: HD4670 (RV730) - 55 nm, 514 million, 146 mm2 - 750 MHz 320:32:8 - 1000 MHz 128 bit GDDR3, 32 GByte/s - 480 GFLOPS
2010-01: HD5670 - 40 nm, 627 million, 104 mm2 - 775 MHz 400:20:8 - 1000 MHz 128 bit GDDR5, 64 GByte/s - 620 GFLOPS
2011-04: HD6670 - 40 nm, 716 million, 118 mm2 - 800 MHz 480:24:8 - 1000 MHz 128 bit GDDR5, 64 GByte/s - 768 GFLOPS
2012-01: HD7670 (same as HD6670)
2013-Q1: HD8670 - 28 nm, 900 million, 85 mm2 - 384:24:16 - 128 bit GDDR5


2008 high-end performance bin
2008-06: HD4870 (RV770) - 55 nm, 956 million, 256 mm2 - 750 MHz 800:40:16 - 900 MHz 256 bit GDDR5, 115.2 GByte/s - 1200 GFLOPS
2009-04: HD4770 (RV740) - 40 nm, 826 million, 137 mm2 - 750 MHz 640:32:16 - 800 MHz 128 bit GDDR5, 57.6 GByte/s - 960 GFLOPS
2009-10: HD5770 - 40 nm, 1040 million, 166 mm2 - 850 MHz 800:40:16 - 1200 MHz 128 bit GDDR5, 76.8 GByte/s - 1360 GFLOPS
2011-04: HD6770 (same as HD5770)
2012-02: HD7770 - 28 nm, 1500 million, 123 mm2 - 1000 MHz 640:40:16 - 1125 MHz 128 bit GDDR5, 72 GByte/s - 1280 GFLOPS
2013-Q1: HD8770 - 28 nm, 2000 million, 160 mm2 - 896:48:16 - 192 bit GDDR5


2009 high-end performance bin
2009-09: HD5870 - 40 nm, 2154 million, 334 mm2 - 850 MHz 1600:80:32 - 1200 MHz 256 bit GDDR5, 153.6 GByte/s - 2720 GFLOPS
2010-10: HD6870 - 40 nm, 1700 million, 255 mm2 - 900 MHz 1120:56:32 - 1050 MHz 256 bit GDDR5, 134.4 GByte/s - 2016 GFLOPS
2010-12: HD6970 - 40 nm, 2640 million, 389 mm2 - 880 MHz 1536:96:32 - 1375 MHz 256 bit GDDR5, 176 GByte/s - 2703 GFLOPS
2012-03: HD7870 - 28 nm, 2800 million, 212 mm2 - 1000 MHz 1280:80:32 - 1200 MHz 256 bit GDDR5, 153.6 GByte/s - 2560 GFLOPS
2013-Q1: HD8870 - 28 nm, 3200 million, 220 mm2 - 1536:96:32 - 256 bit GDDR5

(hypothetical) 2014 : HD9770 - 22 nm, 3000+ million???, 160 mm2??? - 1536:96:48???


2012 high-end performance bin
2012-01: HD7970 - 28 nm, 4313 million, 352 mm2 - 925 MHz 2048:128:32 - 1375 MHz 384 bit GDDR5, 264 GByte/s - 3789 GFLOPS
2013-Q1: HD8970 - 28 nm, 5100 million, 410 mm2 - 2560:160:48 - 384 bit GDDR5

(hypothetical) 2014 : HD9870 - 22 nm, 5000+ million???, 220 mm2??? - 2048:128:48???


2014 high-end performance bin
(hypothetical) 2014 : HD9970 - 22 nm, 7000+ million???, 350 mm2??? - 3072:192:64???
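The GFLOPS figures in the list above follow directly from the shader configuration: peak single-precision throughput for these parts is ALUs × 2 FLOPs per clock (one fused multiply-add) × core clock. A quick sanity check against a few entries:

```python
# Peak single-precision throughput for these AMD parts:
# ALUs * 2 FLOPs per clock (one fused multiply-add) * core clock in MHz.
def peak_gflops(alus, core_mhz):
    return alus * 2 * core_mhz / 1000.0

print(peak_gflops(2048, 925))   # HD7970: 3788.8, the listed ~3789 GFLOPS
print(peak_gflops(1280, 1000))  # HD7870: 2560.0
print(peak_gflops(1600, 850))   # HD5870: 2720.0
print(peak_gflops(320, 743))    # HD2900XT: 475.52, the listed 475 GFLOPS
```

This is a theoretical peak, not delivered performance; memory bandwidth and ROP throughput (the other columns in the list) usually decide how much of it a game actually sees.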


Sat Jun 30, 2012 2:05 pm
Certif-Eyed!

Joined: Fri May 11, 2007 10:13 am
Posts: 521
In general I agree with you, Dmitry.

Current speculation is that the next consoles won't be as subsidized as the current ones, and since price is one of the main battlegrounds, that means cheaper hardware.

The release of the next consoles is very likely the end of 2013, and like you said, the tape-out for the chips should be mid-2012. So the possible "mother" chips the console parts derive from are narrowed down to refresh versions of current chips plus about 4 months of development (GK110 taped out at the end of February, according to SemiAccurate).

As I read the signs, I expect the graphics to be performance-level cards (GTX 760 or the ATI equivalent) at maximum.
When I look back at a benchmark comparison of the GTX 560 vs the GTX 580 ( http://www.computerbase.de/artikel/graf ... gtx-560/4/ ), the 560 has about 70% of the performance of a 580. The link at the bottom of your list expects the performance improvement to be 30-40%, which leaves a GTX 760 at roughly the same performance as the current GTX 680.

But the CPU side won't be as meaty as the current gen. The Cell in the PS3 had over 200 GFLOPS, while any desktop available at the time was far from that performance.

Now, continuing under the assumption that the companies don't want to subsidize and sell at no profit:
The PS3 came out at $600. In our current speculation we have to devote $250 to graphics, leaving $350 for the rest. If we assume $150 for everything besides the CPU, we're left with $200 for the CPU, which equals a good i7-3000. That gives a quite decent system, comparable with my current PC setup.

But the story goes that it will sell for $300. Assume they go cheap on the other components, paying $80; that leaves $220 for CPU and GPU together. So there can be neither a performance-range card nor a good CPU in this setup, and we end up far below the performance of the $600 system.

I know I calculated with more-or-less consumer prices, but the general direction is clear: either they scrap the $300 price target, or next year's high-to-mid-range PCs will beat the consoles performance-wise.

I fear the second case is the one that will come true, leaving us stuck at the UE4 or BF3 level of graphics and features (physics etc.) for the next 6-7 years.
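The budget arithmetic above can be tallied explicitly; a minimal sketch using cirk2's rough retail-price stand-ins (the $250/$150/$80 figures are his assumptions, not real BOM data):

```python
# Budget split under the two scenarios, using rough retail-price stand-ins.
def cpu_budget(total, gpu, everything_else):
    return total - gpu - everything_else

# $600 console: a $250 GPU and $150 for everything else leaves $200 for the CPU.
print(cpu_budget(600, 250, 150))  # 200

# $300 console: even with cheap ($80) other components, the CPU *and* GPU
# together have to fit into what remains.
print(300 - 80)  # 220
```

The point of the comparison: $220 for CPU plus GPU combined is barely more than the $200 the $600 scenario allots to the CPU alone.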



Sat Jun 30, 2012 5:43 pm
Certif-Eyable!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 1154
Location: Montpellier, France
cybereality wrote:
@cirk2: UE4 was running on one GTX 680 I believe (at least the Samaritan demo was).

If I remember correctly, Samaritan was running on 2x GTX 580 in SLI; this year's Elemental demo was running on a single GTX 680.

_________________
Passive 3D forever !
DIY polarised dual-projector setup :
2x Epson EH-TW3500 (2D 1080p)
Xtrem Screen Daylight 2.0, for polarized 3D
3D Vision gaming with signal converter : VNS Geobox 501


Sun Jul 01, 2012 12:37 am
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
cirk2 wrote:
The Release of next consoles is very likely end of 2013, and like you said the tape out for the chips should be mid 2012. So the possible "mother" chips for the console deviates are narrowed down to the refresh versions of current chips+ about 4 Month of development (GK110 taped out end of February according to semiaccurate).


You are assuming that high-end 2014 Q1 GPUs use the 28 nm node and are being taped out at this node; they aren't.


The 20-22 nm node has been in testing since 2009; Intel already started volume 22 nm production in March 2012, and volume production on a 20 nm node will be available from both TSMC and GlobalFoundries (the former AMD fabs) in 2013. So AMD should be taping out next-gen 20 nm GPU chips by 2012 Q2-Q3 and preparing them for volume production in 2013 Q3.

http://www.eetimes.com/electronics-news ... s-at-20-nm
http://www.brightsideofnews.com/news/20 ... -node.aspx


It's probably not about manufacturing costs this time, which should be about the same as for 28 nm, but rather about efficiency, i.e. performance per watt.

Quote:
The link at the bottom of your list expects the performance improvement to be 30-40%, which leaves a GTX 760 at roughly the same performance as the current GTX 680.

Either they scrap the $300 price target, or next year's high-to-mid-range PCs will beat the consoles performance-wise.

I fear the second case is the one that will come true, leaving us stuck at the UE4 or BF3 level of graphics and features (physics etc.) for the next 6-7 years


I've lived through a dozen tech demos that were supposed to highlight the features of the next-gen consoles, and these demos have NEVER been in any way indicative of the real-world hardware in the end.


From the pre-rendered Two to Tango for the Xbox in 1998, to Killzone 2 for the PS3 in 2005, the graphics have always been scaled way, WAY back when shown on the real thing. Even recent "real-time" tech demos like UE3 Samaritan in 2011 and Agni's Philosophy in 2012 are presented on very powerful PC hardware which probably outperforms the next-gen consoles by a wide margin.

The general public has readily accepted this graphics-quality tradeoff (quite a significant one from my hardcore-gamer point of view) once shown the magical $300 price figure (which, again, is heavily subsidized by the manufacturer at the start of sales).


I would love to prove myself wrong and witness the next Xbox with a HD8870-like 20 nm GPU part that packs a HD7970/GTX680 level of performance in a console-friendly 160 mm2 die that drains 100 W of power; given 4 Gbytes of RAM, this would handle these "real-time" tech demos just fine. I've just been misled too many times before.


FYI, AMD is the supplier for both the PS4 and Xbox 3; the latest rumours have it that Xbox 3 developer kits were equipped with a HD77x0 (1 TFLOPS), and that the PS4 is targeting a HD78x0 level of performance (2 TFLOPS)... http://forum.beyond3d.com/showpost.php? ... ount=12817


Sun Jul 01, 2012 1:07 am
Certif-Eyed!

Joined: Fri May 11, 2007 10:13 am
Posts: 521
I see we share the pessimistic view of future consoles.

Even if they tape out at 22/20 nm, the hardware performance and features are already set, so the better manufacturing will most likely be used for a smaller die.

I used GeForce model numbers because I'm not fluent with ATI models.

The FLOPS of the dev kits are news to me... but 2 TFLOPS isn't that much, especially not next year.

What I personally fear the most is that we will now have one big jump in graphics (Xbox 360 / DX9 to Xbox 720 / DX11), and then next to no development for the next 6 years. Excluding awesome PC-oriented companies like CD Projekt RED.



Sun Jul 01, 2012 2:19 am
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
To be honest, BF3 graphics (on PC) will be good enough for a while. The current generation of GPUs has not even remotely been tapped out, since most developers are just porting from consoles. Truth be told, how "good" the graphics look has as much to do with the art as it does with the technology. There is still a lot to explore even given the current state of the art. For example, look at some of those Nvidia tech demos from even 5 years ago. They still look better than any game released on any platform.

So I think if the consoles can reach current-gen PC levels (even mid-range) that will be more than enough to keep developers busy. It's not like developers are "held back" by current consoles, in the sense that they cannot realize a particular idea or vision. Any possible type of game can be created, to a great degree of fidelity, right now. So I think the real innovation will come from more original stories and experiences, and more immersive technologies (3D, motion tracking, haptics, etc.). Of course the graphics can get better, but there are diminishing returns compared to other avenues.

_________________
check my blog - cybereality.com


Mon Jul 02, 2012 9:38 pm
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
So here we are: http://www.youtube.com/watch?v=RiNGZMx2vhY

The PlayStation 4 is coming late 2013 and uses a custom AMD APU with an 8-core x64 Jaguar CPU, a HD7850-like GPU with 1152 shader cores and 1.84 TFLOPS of compute power, and 8 GBytes of GDDR5 memory. That's pretty much today's high-end gaming PC spec.

I still wish they'd used a more powerful GPU rated at 3.0-3.5 TFLOPS; unfortunately, 20 nm Radeon HD parts slipped to 2014 :(

Microsoft is expected to unveil the next Xbox in April 2013.
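The quoted peak figure (1.84 TFLOPS, not GFLOPS) can be sanity-checked from the shader count; a minimal sketch, assuming the commonly reported 800 MHz GPU clock (the clock was not stated in the reveal itself):

```python
# Peak FP32 throughput: shader ALUs * 2 FLOPs (one fused multiply-add) * clock.
# The 800 MHz GPU clock is the commonly reported figure, not from the reveal.
def peak_tflops(alus, core_mhz):
    return alus * 2 * core_mhz / 1_000_000

print(peak_tflops(1152, 800))  # 1.8432, i.e. the quoted 1.84 TFLOPS
```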


Fri Feb 22, 2013 1:37 pm
One Eyed Hopeful

Joined: Mon Dec 15, 2008 7:03 pm
Posts: 38
cybereality wrote:
Yep, consoles usually use a "loss leader" strategy where they price it below cost for several years in order to gain market share. The one exception to this is Nintendo, which has always made a profit on their consoles (but mainly because they didn't have the most cutting-edge hardware). So it's reasonable to think Microsoft could set an arbitrary price for the console which they thought would help them maintain their position.

@android78: Yes, it is good to think that MS is interested in stereo 3D and VR/AR experiences. Without that I doubt the "next-gen" will be the jump people are looking for.

Maybe you were too early with the Nintendo statement.
They have since made severe losses, multiple hundreds of millions.

_________________
http://www.3dreal.tk
http://www.stereopan.org
Under reconstruction
3DStereo-Media
Aero-Marspanoramas
Innovative Online-Display


Sat Apr 20, 2013 12:41 am
Diamond Eyed Freakazoid!

Joined: Tue Jan 08, 2008 2:25 am
Posts: 776
Location: Moscow, Russia
DmitryKo wrote:
It would only be natural to use tomorrow's $250 hardware to offer today's $500 high-end graphics, which should be good enough for most console gamers up to the year 2018 and then reduce manufacturing costs through process node shrinks.

If I were responsible for GPU specs at AMD and Microsoft, the Xbox 3 GPU would be similar to a 22 nm shrink of the next year's HD8870 and would be performing just a bit short of HD7950; I would call it HD9770. That would allow over 100 fps in any today's game and would handle that UE4 demo in 1080p60; what's more important, game developers would target higher-end Direct3D 11 graphics right from the start.


Quote:

2007 high-end performance bin
2011-04: HD6670 - 40 nm, 716 million, 118 mm2 - 800 MHz 480:24:8 - 1000 MHz 128 bit GDDR5, 64 GByte/s - 768 GFLOPS
2012-01: HD7670 (same as HD6670)
2013-Q1: HD8670 - 28 nm, 900 million, 85 mm2 - 384:24:16 - 128 bit GDDR5


2008 high-end performance bin
2012-02: HD7770 - 28 nm, 1500 million, 123 mm2 - 1000 MHz 640:40:16 - 1125 MHz 128 bit GDDR5, 72 GByte/s - 1280 GFLOPS
2013-Q1: HD8770 - 28 nm, 2000 million, 160 mm2 - 896:48:16 - 192 bit GDDR5


2009 high-end performance bin
2012-03: HD7870 - 28 nm, 2800 million, 212 mm2 - 1000 MHz 1280:80:32 - 1200 MHz 256 bit GDDR5, 153.6 GByte/s - 2560 GFLOPS
2013-Q1: HD8870 - 28 nm, 3200 million, 220 mm2 - 1536:96:32 - 256 bit GDDR5
(hypothetical) 2014 : HD9770 - 22 nm, 3000+ million???, 160 mm2??? - 1536:96:48???


2012 high-end performance bin
2012-01: HD7970 - 28 nm, 4313 million, 352 mm2 - 925 MHz 2048:128:32 - 1375 MHz 384 bit GDDR5, 264 GByte/s - 3789 GFLOPS
2013-Q1: HD8970 - 28 nm, 5100 million, 410 mm2 - 2560:160:48 - 384 bit GDDR5
(hypothetical) 2014 : HD9870 - 22 nm, 5000+ million???, 220 mm2??? - 2048:128:48???


2014 high-end performance bin
(hypothetical) 2014 : HD9970 - 22 nm, 7000+ million???, 350 mm2??? - 3072:192:64???

1.5 years have passed, and I admit I have been proven completely wrong, as TSMC has been more than a year late with the 20 nm transition, forcing AMD to delay newer GCN2 chips until Q4 2013 and even recycle older 28 nm GCN1 chips in the high-mid range, which they never did before...

So in reality my proposed "20 nm shrink" of the HD7970 never happened, and neither did my proposed two-fold reduction of die size with the associated cost reductions. Hmmm....
Guess I make a terrible product planner.

I would still like to see a full-blown 2500 GFLOPS R9 270X part in the Xbox One instead of a weak 1.3 TFLOPS custom CPU/GPU combo...



2007 high-end performance bin <= 900 GFLOPS
2012-02: HD 7750 (Cape Verde Pro) - 28 nm, 1500 million, 123 mm2 - 900 MHz 512:32:16 - 1125 MHz 128 bit GDDR5, 72 GByte/s - 921 GFLOPS
2013-03: HD 8730 (Cape Verde LE) - 800 MHz 384:24:8 - 614 GFLOPS

2013-10: R7 250 X (Oland XT) - 28 nm, 1040 million, 90 mm2 - 1000 MHz 384:24:8 - 1125 MHz 128 bit GDDR5, 72 GByte/s - 768 GFLOPS


2008 high-end performance bin ~ 1200 GFLOPS
2012-02: HD 7770 GHz edition (Cape Verde XT) - 28 nm, 1500 million, 123 mm2 - 1000 MHz 640:40:16 - 1125 MHz 128 bit GDDR5, 72 GByte/s - 1280 GFLOPS
2013-Q1: HD 7790 (Bonaire XT) == HD 8760 (OEM) == HD 7770 GHZ edition

2013-10: R7 260 X (Bonaire XTX) - 28 nm, 2080 million, 160 mm2 - 1100 MHz 896:56:16 - 1625 MHz 128 bit GDDR5, 104 GByte/s - 1971 GFLOPS


2009 high-end performance bin ~ 2500 GFLOPS
2012-03: HD 7870 GHz Edition (Pitcairn XT) - 28 nm, 2800 million, 212 mm2 - 1000 MHz 1280:80:32 - 1200 MHz 256 bit GDDR5, 153.6 GByte/s - 2560 GFLOPS
2013-Q1: HD 8860 (OEM) == HD 7870 GHz Edition
2013-10: R9 270X (Curaçao XT) == HD 8860 (OEM) + 1400 MHz GDDR5, 179.2 GByte/s


2012 high-end performance bin ~ 4000 GFLOPS
2012-01: HD 7970 (Tahiti XT)- 28 nm, 4313 million, 352 mm2 - 925 MHz 2048:128:32 - 1375 MHz 384 bit GDDR5 , 264 GByte/s - 3789 GFLOPS
2012-06: HD 7970 GHz edition (Tahiti XT2) - 1000 MHz core, 1500 MHz GDDR5, 288 GByte/s - 4096 GFLOPS
2013-Q1 HD 8970 == HD 7970 GHz edition
2013-10: R9 280X == HD 8970


NEW 2014 high-end performance bin ~ 6000 GFLOPS
2013-10: R9 290X (Hawaii XT) - 28 nm, 6200 million, 480 mm2 - 1000 MHz 2816:176:64 - 1250 MHz 512 bit GDDR5, 320 GByte/s - 5632 GFLOPS


Tue Jan 14, 2014 5:35 pm
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 11394
It is funny looking back at the old predictions.

_________________
check my blog - cybereality.com


Tue Jan 14, 2014 9:52 pm
One Eyed Hopeful

Joined: Tue Jan 10, 2012 4:21 pm
Posts: 17
cybereality wrote:
It is funny looking back at the old predictions.

LOL stop it, it's a little bit depressing! Hehehehe

Back to the dev kit. ;-)


Wed Jan 15, 2014 1:13 am