Sony reveals PS4's CPU clock (1.6GHz) - 6 cores available to developers


CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
This is the best possible situation, especially for AMD, because now we have software engineers designing game engines that are better threaded.

Who cares about single-threaded performance being slower than current-gen PC parts, as long as it is faster than the previous gen and tasks can now be multithreaded across more than 2, and sometimes more than 4, threads?

How is this benefiting AMD? They are so far behind that even though they have 8-core CPUs on the market, they are still inferior to Intel in gaming. The only advantage would be cost per cycle in full-throttle encoding tasks. That won't help AMD one iota when it comes to the needs of gamers.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Let's put it this way: console sales are very price-sensitive, especially when they are gifts; every dollar counts.

If the PS4 were 20 dollars more expensive to build due to a bigger CPU, GPU, or cooler, yet kept the same price, Sony would be out 120 million dollars in losses or foregone profit, since they have sold 6 million consoles since the start of March.

The GPU is more important than the CPU, yet even the PS4 runs its GPU at 800 MHz instead of 925 or 1000 MHz (every card in the R9 series runs its GPU at 925 or 1000 MHz). That is a free 25% of GPU performance!!! Doing so makes no sense unless the console has heat, noise, or yield problems. My bet is on heat, since you would probably have to use higher voltages to hit 1000 MHz instead of 800 MHz.
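
As a quick sanity check of both figures (a minimal Python sketch; the $20 cost delta and 6 million unit count are the post's own assumptions):

    # Back-of-the-envelope check of the post's numbers (assumed, not official).
    extra_cost_per_unit = 20                 # dollars of extra build cost
    units_sold = 6_000_000                   # consoles sold since launch
    print(extra_cost_per_unit * units_sold)  # 120000000 -> the $120M figure

    # The "free 25%": desktop clock vs. the shipping 800 MHz on the same silicon.
    print((1000 - 800) / 800)                # 0.25 -> +25% potential GPU throughput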
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,581
14
81
The problem was that Sony needed to cut costs; that's why they asked AMD for its APU solution. They needed performance per watt; that's why they fused two 4-core Jaguar units (~110mm²) into a single processor. Pitcairn is ~220mm² (I know they built the GPU on the GCN 1.1 architecture) and the next GPU chip up is ~350mm², so they could not find a way to put a bigger GPU into the PS4 project.
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
The problem was that Sony needed to cut costs; that's why they asked AMD for its APU solution. They needed performance per watt; that's why they fused two 4-core Jaguar units (~110mm²) into a single processor. Pitcairn is ~220mm² (I know they built the GPU on the GCN 1.1 architecture) and the next GPU chip up is ~350mm², so they could not find a way to put a bigger GPU into the PS4 project.

The problem is not just die size; it's heat.

Top-end GPUs draw twice the power that they did when the 360 launched, and have to dissipate twice as much heat. Trying to cram that much heat into a console would require a very expensive cooling solution, and is just begging for a repeat of the RROD disaster. Just look at how overengineered the Xbox One's heatsink is if you need an indication of how seriously they have taken heat this time.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Trying to cram that much heat into a console would require a very expensive cooling solution, and is just begging for a repeat of the RROD disaster. Just look at how overengineered the Xbox One's heatsink is if you need an indication of how seriously they have taken heat this time.

I remember this, NTMBK. The first 360s also had a horrible design, with the CPU and GPU located close together and a horrible PSU. Last generation had the highest console failure rate ever, but that is the bad part of having strong graphics to compete with high-end PCs. Despite some graphical tricks, last-gen console IQ was always acceptable compared to what most gaming PCs can produce.


The HD 7970 GHz/GTX 770 have one year left as 1600p-capable gaming cards. By 2015 they should no longer be able to play the latest games above 1080p at high quality. Last-gen consoles rarely managed 1080p image quality in their games, but for this generation, playing games below that quality is much less acceptable.


The 360/PS3 graphics decision has proven to be right after many years. Building the 360 with a strong graphics card made it last the eight years it lasted, in a gaming world where graphics mean so much. With GPUs continuing to deliver 80-100% more performance at every architectural jump, the PS4/Xbox One GPUs can become obsolete much faster than in other generations.
 

jpiniero

Lifer
Oct 1, 2010
14,842
5,457
136
I think the thing that annoys me the most about these consoles is that they are wasting quite a bit of their resources on the OS and frivolous stuff. Far more than any of the previous consoles, I'm sure.
 

Makaveli

Diamond Member
Feb 8, 2002
4,761
1,160
136
The 360/PS3 graphics decision has proven to be right after many years. Building the 360 with a strong graphics card made it last the eight years it lasted, in a gaming world where graphics mean so much. With GPUs continuing to deliver 80-100% more performance at every architectural jump, the PS4/Xbox One GPUs can become obsolete much faster than in other generations.

The only thing I have to say about this is that both of those last-gen consoles barely had any games at 1080p; most were 720p or lower. The generation lasted 8 years, but you can't really say either of those GPUs was strong; if they were, games would have been made at 1080p. I think both of the last-gen consoles could have been replaced 5 years in instead of limping to the finish line in the 8th year. And I think this year's consoles will have more staying power than the last.
 

jpiniero

Lifer
Oct 1, 2010
14,842
5,457
136
Well, even the highest-end gaming PC bought or built right when the 360 launched wouldn't be able to play recent titles at any resolution. That's not going to be the case with this era.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Piledriver and Jaguar have similar IPC; Piledriver clocks higher, but it needs more energy and die space to achieve that. The extra 20 to 30 watts of TDP is a big deal when your goal is to make something a little larger than a cereal box.

Source?

Because I'm seeing here that at best Jaguar gets 90% of K10's IPC, much less 100% of Piledriver's: http://codedivine.org/2013/05/25/amd-jaguar-vs-amd-llano-k10-at-same-clocks/. I don't know how credible that source is, so if you've got a better one I'm genuinely curious to see it.

The power point is a fine one to make, but seeing how well Kaveri scales down, I wish they had taken a 45W (or less) Kaveri with 2 or 3 modules over an 8-core Jaguar. As pokey as AMD can be about getting a core design out, I bet they could have done it with Kaveri, given the amount of money they'd have been getting from MS and Sony to develop it, had those companies chosen to do so.
 
Last edited:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Source?

Because I'm seeing here that at best Jaguar gets 90% of K10's IPC, much less 100% of Piledriver's: http://codedivine.org/2013/05/25/amd-jaguar-vs-amd-llano-k10-at-same-clocks/. I don't know how credible that source is, so if you've got a better one I'm genuinely curious to see it.

The power point is a fine one to make, but seeing how well Kaveri scales down, I wish they had taken a 45W (or less) Kaveri with 2 or 3 modules over an 8-core Jaguar. As pokey as AMD can be about getting a core design out, I bet they could have done it with Kaveri, given the amount of money they'd have been getting from MS and Sony to develop it, had those companies chosen to do so.

Comparing Cinebench scores between the two, perhaps?
 

Rezist

Senior member
Jun 20, 2009
726
0
71
The only thing I have to say about this is that both of those last-gen consoles barely had any games at 1080p; most were 720p or lower. The generation lasted 8 years, but you can't really say either of those GPUs was strong; if they were, games would have been made at 1080p. I think both of the last-gen consoles could have been replaced 5 years in instead of limping to the finish line in the 8th year. And I think this year's consoles will have more staying power than the last.

A lot of AAA games were 640p or something in between. This gen is only disappointing because I have little faith they can really "max out" 1080p. There will still be a lot of 720p, even on the PS4 side, once the details start getting cranked up.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Sony and Microsoft preferred cost and noise over absolute performance.

If you had Sony's and Microsoft's priorities, their decision made sense.

http://www.anandtech.com/show/7528/the-xbox-one-mini-review-hardware-analysis/5

The Xbone and PS4 consume 120-140 W under gaming load for the entire system.

That is not much power when you consider that a 7870 running BF4 consumes a similar amount of power and is cooled by a reference two-slot cooler. Such a cooler is neither expensive nor difficult to manufacture.

With the significantly larger volume of the PS4/Xbone, they should be no trouble to cool.
 
Last edited:

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Well, even the highest-end gaming PC bought or built right when the 360 launched wouldn't be able to play recent titles at any resolution. That's not going to be the case with this era.

That's not right. A high-end system from 2005-2006 would be able to play recent X360/PS3 games at the same 1024x600 resolution, no AA, and low-medium detail settings as the console.
 

jpiniero

Lifer
Oct 1, 2010
14,842
5,457
136
That's not right. A high-end system from 2005-2006 would be able to play recent X360/PS3 games at the same 1024x600 resolution, no AA, and low-medium detail settings as the console.

Remember, the 360 predates Conroe by 6 months, so you are talking about an Athlon FX with something like a 7800 GT. With SLI the GPU might be enough for a recent title, but the CPU would just not be fast enough.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
The only thing I have to say about this is that both of those last-gen consoles barely had any games at 1080p; most were 720p or lower. The generation lasted 8 years, but you can't really say either of those GPUs was strong; if they were, games would have been made at 1080p. I think both of the last-gen consoles could have been replaced 5 years in instead of limping to the finish line in the 8th year. And I think this year's consoles will have more staying power than the last.

The IQ difference is usually relatively small given the amount of resources consumed. See Crysis 3 on PS3 and on PC and draw your own conclusions.


Cinebench.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,689
1,224
136
Size:
http://i3.minus.com/jbutkcGiiJhs6i.jpg
+ http://i.imgur.com/hOphMuH.jpg
+ http://i.imgur.com/7OTEzxt.jpg

Jaguar gives you more for less area than Bulldozer/Piledriver/Steamroller and Llano

Jaguar = 16h
BD/PD/SR = 15h
Llano = 12h

Jaguar (4 cores): ~26.2 mm²
Bulldozer/Piledriver (2 cores): ~30.9 mm²
Steamroller (2 cores): ~29.47 mm²
Dual Core Llano (2 cores): ~32 mm²

Performance per core:

All clocks equal;
1 Jaguar Core ≈ 1 BD/PD/SR Core ≈ 0.66 Llano Core
^-- if you want to keep it simple with realistic capabilities.

8 Jaguar cores @ 1.6/1.75 GHz ≈ 8 BD/PD/SR cores @ 1.6/1.75 GHz.
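
Putting the area and performance claims together (a minimal Python sketch; it only restates the figures above and takes the equal-clock core equivalence at face value):

    # Cores per mm^2 from the die sizes listed above. Under the post's
    # assumption that 1 Jaguar core ~ 1 BD/PD/SR core at equal clocks,
    # this doubles as a rough proxy for performance per area.
    jaguar = 4 / 26.2           # ~0.153 cores/mm^2
    piledriver = 2 / 30.9       # ~0.065 cores/mm^2
    print(jaguar / piledriver)  # ~2.36 -> more than twice the cores per mm^2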

---
You probably want to point out that 16h is low-power focused and aimed at casual users. Not gamers, not overclockers, not enthusiasts, but normal people who have no idea what x86/x86-64 is.
 
Last edited:

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Source?

Because I'm seeing here that at best Jaguar gets 90% of K10's IPC, much less 100% of Piledriver's: http://codedivine.org/2013/05/25/amd-jaguar-vs-amd-llano-k10-at-same-clocks/. I don't know how credible that source is, so if you've got a better one I'm genuinely curious to see it.

The power point is a fine one to make, but seeing how well Kaveri scales down, I wish they had taken a 45W (or less) Kaveri with 2 or 3 modules over an 8-core Jaguar. As pokey as AMD can be about getting a core design out, I bet they could have done it with Kaveri, given the amount of money they'd have been getting from MS and Sony to develop it, had those companies chosen to do so.

These CPUs are picked because they are AMD's best Kabini and Richland parts, and their GHz numbers make the calculations easier.

AMD A6-5200, quad core at 2.0 GHz, no turbo on the CPU
http://www.notebookcheck.net/AMD-A-Series-A6-5200-Notebook-Processor.92895.0.html
AMD A10-5750M, 2-module "quad core" at 2.5 GHz, can turbo up to 3.5 GHz
http://www.notebookcheck.net/AMD-A-Series-A10-5750M-Notebook-Processor.92882.0.html

2.5 GHz is a 25% clock speed advantage over 2.0 GHz; 3.5 GHz is a 75% clock speed advantage over 2.0 GHz. Now, Richland was designed for high frequency targets and Kabini for low frequency targets; that said, you would want to keep the voltage low on both, since you are prioritizing power, and thus you may not want to run Kabini at the max speed it could reach.

3DMark 06 - CPU:
2659.5 vs 3056 (14.9% advantage Richland)
Cinebench R11.5, CPU Single 64-bit:
0.5 vs 0.8 (60% advantage Richland)
Cinebench R11.5, CPU Multi 64-bit:
1.9 vs 2.3 (21.0% advantage Richland)
WinRAR:
1151.5 vs 2690 (133.6% advantage Richland)
x264 HD Benchmark 4.0, Pass 1:
51.7 vs 78.6 (52.0% advantage Richland)
x264 HD Benchmark 4.0, Pass 2:
11.4 vs 15.2 (33.3% advantage Richland)

So even though Richland has a vast GHz advantage, it is not more efficient IPC-wise in most tests, with the exception of WinRAR. An 8-core Kabini has a die space and thermal advantage over a 2-module Richland, and if you can parallelize your code enough, it may even have a performance advantage.
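
Normalizing those scores by clock makes the point concrete (a sketch; which clock Richland actually sustains under load is unknown, so the choices below are my assumptions):

    # Clock-normalized ("IPC") ratio: Richland score per GHz divided by
    # Kabini score per GHz. Values below 1.0 mean Kabini does more per clock.
    def ipc_ratio(kabini_score, richland_score, richland_ghz, kabini_ghz=2.0):
        return (richland_score / richland_ghz) / (kabini_score / kabini_ghz)

    # Cinebench single-thread, assuming full 3.5 GHz turbo for Richland:
    print(ipc_ratio(0.5, 0.8, 3.5))      # ~0.91 -> Richland per-clock loses
    # WinRAR, assuming the 2.5 GHz base clock under multithreaded load:
    print(ipc_ratio(1151.5, 2690, 2.5))  # ~1.87 -> the one clear exception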
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
http://www.anandtech.com/show/7528/the-xbox-one-mini-review-hardware-analysis/5

The Xbone and PS4 consume 120-140 W under gaming load for the entire system.

That is not much power when you consider that a 7870 running BF4 consumes a similar amount of power and is cooled by a reference two-slot cooler. Such a cooler is neither expensive nor difficult to manufacture.

With the significantly larger volume of the PS4/Xbone, they should be no trouble to cool.

Both Microsoft and Sony want to keep the power consumption in that 120 W range you observed; they do not want a chip with 150 or 180 W power consumption (25% or 50% higher). They did so because they want to keep RMAs low, use cheaper power bricks, keep the size small, keep the noise low, etc.

Remember, these devices often sit in places with very bad airflow, such as an entertainment center: a wood or glass box with no airflow. Microsoft and Sony want to make money on these consoles; they do not want high RMAs.

If heat/power consumption were not a concern, they could get free performance by running the GPU at the same clock speeds they run the desktop chips at!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Here is a Jaguar vs. big-cores analysis:
http://www.extremetech.com/computin...o-take-a-page-from-intel-and-dump-steamroller
CPU Efficiency:
The simplest way to measure the efficiency of the two chips is to divide their respective benchmark scores in a given application by (CPU frequency * core count). This normalizes both variables and gives us a measure of intrinsic core performance. The next step was to turn each of these clock-and-core normalized figures into a percentage. In a test like Cinebench, a score of less than 100% indicates that Kabini is less efficient than its big-core rival, while a score of greater than 100% means Kabini is more efficient.


So it is there... somewhere between Kaveri and Richland IPC. I think it has better power consumption and better die area efficiency, and it was made to be easily ported from fab to fab.
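
In code, the normalization ExtremeTech describes is just this (a sketch with made-up placeholder scores, not the article's data):

    # score / (GHz * cores), expressed as a percentage of the big-core result.
    # Over 100% means the small core (Kabini) does more per clock per core.
    def efficiency_pct(small_score, small_ghz, small_cores,
                       big_score, big_ghz, big_cores):
        small = small_score / (small_ghz * small_cores)
        big = big_score / (big_ghz * big_cores)
        return 100 * small / big

    # Hypothetical inputs: Kabini scores 2.0 with 4 cores at 2.0 GHz,
    # the big core scores 3.0 with 4 cores at 4.0 GHz.
    print(efficiency_pct(2.0, 2.0, 4, 3.0, 4.0, 4))  # ~133.3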

Having lots of small cores in consoles helps AMD keep up with Intel. I know a lot of you do not agree with this statement and try to argue that the Intel i7 has 8 threads as well, so the FX-8350 gains nothing over the i7.
But that is flawed logic. AMD is not going after the i7. Look at games like ARMA, StarCraft, etc., that use 1-2 threads. All FX CPUs land in the same spot; the 4-, 6-, and 8-core parts perform the same. While this is bad in itself, what is worse for AMD is their weak single-core performance, which means all their FX-series CPUs lose to a 2-core Intel Haswell. Who would buy an 8-core FX CPU to get worse performance than a half-as-expensive 2-core Intel?
But here come the heavily multi-threaded workloads forced by the consoles' design, where the 8-core FX CPU steps up to i5 level. /OT
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
You are twisting my words, and this is exactly the reason I brought up the Geometry Wars exception that you conveniently deleted. Sure, if you cut out enough other detail (polygon count, etc.) you can render a crappy source at high resolution and probably a low frame rate. That won't make it a good-looking 4K game.

That isn't what "at all, period" means. You're twisting your own words.

GTA V is a good-looking game.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Both Microsoft and Sony want to keep the power consumption in that 120 W range you observed; they do not want a chip with 150 or 180 W power consumption (25% or 50% higher). They did so because they want to keep RMAs low, use cheaper power bricks, keep the size small, keep the noise low, etc.

Remember, these devices often sit in places with very bad airflow, such as an entertainment center: a wood or glass box with no airflow. Microsoft and Sony want to make money on these consoles; they do not want high RMAs.

If heat/power consumption were not a concern, they could get free performance by running the GPU at the same clock speeds they run the desktop chips at!

I agree for those reasons, not because cooling 120-140W in a box the size of a PS4 is inherently hard.

The low clock speed is most likely because the chips are large (high chance of defects) and nothing can be done with defective chips. They very likely run the chips at a lower frequency, as otherwise they would have to throw out a larger number of them.
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
Building the 360 with a strong graphics card made it last the eight years it lasted, in a gaming world where graphics mean so much.

Not sure how many of the original batch of 360s actually survived to the end of the 8 years...
 