It's also worth noting that Microsoft has lost about $3B on Xbox so far, and Sony about $5B in the same timeframe. The only one of the three companies that made a profit over that period is Nintendo. It seems two was company, but three is a crowd.
Is that on just the hardware, or on the hardware and software licensing fees?
If this industry is so unprofitable, then why be in it?
It's their entire divisions, so yes, software + hardware.
I guess it's because there's only room for one of them. Plus they both might spend more than they normally would. A duel to the death!
Any sources on that? Those amounts are nothing to just shrug off, even for such large companies.
Yes, you're optimizing for a processor and its particulars: how the out-of-order engine works, how load/store works, how it decodes, what kind of throughput you get with different operations, etc.
If Jaguar is turtle slow, then Xenon was sloth/koala slow when it was released (both sleep 20 hours a day).
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.
Now the PC software architecture DirectX has been kind of bent into shape to try to accommodate more and more of the batch calls in a sneaky kind of way. There are the multi-threaded display lists, which come up in DirectX 11 that helps, but unsurprisingly it only gives you a factor of two at the very best, from what we've seen. And we also support instancing, which means that if you're going to draw a crate, you can actually draw ten crates just as fast as far as DirectX is concerned.
But it's still very hard to throw tremendous variety into a PC game. If you want each of your draw calls to be a bit different, then you can't get over about 2-3,000 draw calls typically - and certainly a maximum amount of 5,000. Games developers definitely have a need for that. Console games often use 10-20,000 draw calls per frame, and that's an easier way to let the artist's vision shine through.
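The quoted numbers are consistent with a simple CPU-overhead model: each draw call costs a roughly fixed amount of submission time, so the frame budget divided by the per-call cost caps how many calls you can issue. A minimal sketch, where the per-call overheads (5 µs through a thick PC driver stack, 1 µs with near-direct console hardware access) are illustrative assumptions, not measured DirectX or console driver figures:

```python
# Back-of-envelope model of why draw-call count is CPU-bound.
# Per-call overheads below are assumed for illustration only.

FRAME_MS_60FPS = 1000 / 60  # ~16.7 ms frame budget at 60 fps

def max_draw_calls(frame_ms, overhead_us_per_call):
    """Draw calls that fit if the whole frame's CPU time went
    to draw-call submission overhead (an upper bound)."""
    return int(frame_ms * 1000 / overhead_us_per_call)

pc = max_draw_calls(FRAME_MS_60FPS, 5)       # assumed PC driver overhead
console = max_draw_calls(FRAME_MS_60FPS, 1)  # assumed console overhead
print(pc, console)  # ~3,333 vs ~16,666 calls per frame
```

With those assumed overheads the model lands in the same ballpark as the article: a few thousand calls on PC versus 10-20,000 on console.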
I didn't believe this 8-core Jaguar talk, but your argument made me think twice.
One Jaguar core is 3.5 mm², so 28 mm² for the 8 CPU cores (Hans de Vries' numbers). If wafer prices are $2,500 today (IDC's TSMC 28nm Q4 numbers), and about 2,200 8-core Jaguar blocks fit per wafer, that's about $1 for the CPU. Add some discount - if you google for the coupon code - and I'd say less than $1 production cost for an 8-core CPU is fairly cheap. Using Jaguar for this, while everyone and his brother is using it for high-end tablets (the rest is mostly A7), looks like a decent cost-effective solution. Or rather the definition of cost-effective.
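The post's arithmetic can be checked with the standard dies-per-wafer estimate. All inputs are the post's own assumptions (Hans de Vries' die size, the quoted wafer price), and like the post, the sketch ignores yield loss, L2 cache, uncore, test, and packaging:

```python
# Rough cost-per-die math for 8 Jaguar cores on a 300 mm wafer.
# Inputs are the post's assumptions; no yield or uncore area included.
import math

DIE_AREA_MM2 = 8 * 3.5       # 8 cores at 3.5 mm^2 each = 28 mm^2
WAFER_PRICE_USD = 2500       # quoted TSMC 28nm wafer price
WAFER_DIAMETER_MM = 300

# Gross dies per wafer: area ratio minus an edge-loss correction term
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
dies_per_wafer = int(wafer_area / DIE_AREA_MM2
                     - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))
cost_per_die = WAFER_PRICE_USD / dies_per_wafer
print(dies_per_wafer, round(cost_per_die, 2))  # ~2,400 dies, ~$1.04 each
```

This lands near the post's "about 2,200 per wafer, about $1 each"; real yields and the extra silicon the die actually needs would push the cost up, but not by much at this die size.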
Quite a bit more profitable than using an Intel 3570, wouldn't you think?
Seriously. If they can get this to work, it's a far better start than the fat, expensive, power-hungry solutions they shipped last time.
I was thinking about cost too. How much does a 7950M cost? It seems like it would be pretty expensive to put into a console.
A 1.6 GHz CPU? That's a laugh. And an AMD 1.6 GHz at that, which is like a 1 GHz Intel CPU.
Your math is somewhat flawed, since one Jaguar core is measured without the L2 cache (which cannot be excluded), the FCH (southbridge), and the northbridge. Even if we assume AMD is making a custom Jaguar-based part just for MS and Sony (Sony 100%), and if we assume it won't have an iGPU on board (as it would be pointless to have it and not use it), then the math should be something like this: present quad-core Kabini die - iGPU die area + (4 x 3.1 mm² + 2 MB of L2 @ 28nm).
Are we supposed to get excited about an AMD CPU powering the next-gen consoles? They haven't been competitive against Intel in that department for years.
From what I'm hearing, this round of consoles is going to be a giant dud, and it will hardly advance PC gaming. They're starting at the lower end of PC specs. At least the last consoles (360, PS3) were near the top of PC specs when they were introduced.
A dud, I agree. Am I the only one who feels this next gen is premature even after 7 years? I don't feel like the games have exactly maxed out the current hardware.
It's not a 7970M, it's a downclocked HD 7850. Huge price difference.

Since they're the same GPU, the 7970M comparison is more about looking at what power consumption might be like in a power-optimized part. I don't think anyone is under the impression that AMD will get high-end mobile GPU premiums on a console part.
http://www.anandtech.com/show/6670/dragging-core2duo-into-2013-time-for-an-upgrade
That was nice to see. An 8-core Jaguar should be decently competitive as long as it's clocked high enough.
It should obliterate what the consoles have now, unless those cores are better than what they appear to be.
I think the key phrase is active install base, not units shipped. Many consoles have died, been thrown out, or been hidden away over the last 7 years. From 2005 to 2007 the RROD rate for the Xbox 360 was around 33%, for example, and a 2009 survey put it as high as 54.2%.
It's really neither a 7970M nor a desktop 7850: the 7970M is a downclocked desktop 7870 with 1280 cores (20 CUs) @ 850 MHz, the desktop 7850 is 1024 cores (16 CUs) @ 860 MHz, and this console part is 1152 cores (18 CUs) @ 800 MHz according to the link.
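Those three configurations can be put on a common scale with the usual theoretical-throughput formula for GCN parts (2 FLOPs per core per clock from fused multiply-add); the core counts and clocks are the ones quoted above:

```python
# Theoretical single-precision throughput of the three GCN configs.
# GCN issues 2 FLOPs (one FMA) per shader core per clock.

def gflops(cores, mhz):
    return cores * 2 * mhz / 1000  # cores x 2 FLOPs x clock (MHz) -> GFLOPS

parts = {
    "7970M (20 CUs)":        gflops(1280, 850),  # 2176.0 GFLOPS
    "desktop 7850 (16 CUs)": gflops(1024, 860),  # 1761.3 GFLOPS
    "console part (18 CUs)": gflops(1152, 800),  # 1843.2 GFLOPS
}
for name, gf in parts.items():
    print(f"{name}: {gf:.1f} GFLOPS")
```

On paper the console part lands between the two retail cards: a bit above the desktop 7850 and roughly 15% below the 7970M.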
Here's an interesting point from somebody optimizing code for their Bobcat chip. Efficiency on Jaguar is even higher.
http://forum.beyond3d.com/showpost.php?p=1692500&postcount=17957