Level of GPU performance to match Xbox 360 and PS3

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
What level of GPU performance would be enough to play a game on a PC such that it is no worse in performance and graphics quality than if it were played on an Xbox 360 or PS3?

If you wanted to play at "console quality" (performance + visuals, Xbox 360/PS3) on a PC (I'm not sure what consoles target, something like 720p @ 30fps?), what kind of minimum hardware specs would you need?

I was asked this question a few weeks back, and off the top of my head I said a Radeon 7770 / GTX 650 is probably all that's needed, based on a 720p@30fps requirement. I'm not sure, though, whether I underestimated or overestimated the capability of consoles (I don't own any myself), so I'm throwing this out there for the collective ATF borg mega-brain to answer.

Thanks.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Even an 8800 has plenty of power to play most games at 720p with all details at medium at much higher than 30 fps. It really is too complex of a question to ask. There is no way to know what the apples to apples settings are on any given game. I recently was toying around with my 8800 class cards to see what they could do and Tomb Raider was easily playable at 1920x1080 with settings set higher than I expected. I was playing Batman AA not too long ago on my X1900 XTX with awesome results.

I'm willing to bet that an 8800 would be able to compete against the consoles in 95% of the games out and still provide higher visual quality and fluidity.

The real trouble is that consoles use gamepads, which make a game feel much smoother than flicking a mouse. 30fps with a controller feels way smoother than with a high-response mouse; a fast mouse movement can make 60 fps feel slow.


**edit** Skip beat me to it
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
it's hard to tell, but for most games I think a 6670 would be enough,
consoles are highly optimized, but the PS3 GPU is slower than a 7900GS.

the trouble is, consoles and PCs are not running exactly the same games: compare BF3, whose PC multiplayer is made for 2x the number of players, or Crysis 3, optimized for high-end DX11 cards on the PC and requiring a lot more processing power.

anyway, this might help you
http://www.eurogamer.net/articles/df-hardware-introducing-the-digital-foundry-pc

last year they compared a Sandy Bridge Pentium + 6770 (a renamed 5770 from 2009) against the consoles.
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
You're gonna need 3 Titan 6GB to match the power of an XBOX 360 or Playstation3, Am I right or am I right?
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
That's a tough one, because the raw GPU specs/strength of the 360/PS3 don't translate one-to-one into the graphics fidelity actually realized on screen, the way raw PC GPU specs might suggest.

It'd be best if someone could do a side by side comparison of graphics quality or get a game dev response to what level graphics settings on a pc port equal the quality on the console version.

My guess is that you'd need somewhere between a GTX 260 Core 216 and a GTX 460 256-bit level GPU in your PC to match the 360/PS3. For an AMD equivalent, I'd say between a 5770 and a 5850.

Games these days are better optimized on the consoles, so PC GPUs have needed to grow in strength to match what improved coding has squeezed out of the consoles.

GTA V, for example, looks and runs much, much better than first-gen games.
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
Well, technically the PS3 RSX Reality Synthesizer GPU is based on the G70 DirectX 9 class GPU and is slightly slower than a 7800 GTX.

The Xbox 360's Xenos (C1/R500) is based on the ATI Radeon X1800 (R520) DirectX 9 class GPU but had some features that were later built into the R600 series.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
amd a10 trinity/richland will handle 720p30 easily for the majority of games out there.

just wait for kaveri for 900p/1080p30

check sig below for gameplay playlist.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
According to Anandtech's own Intel Iris Pro review, the HD 4000 at 1150 MHz is rated at a hair under 300 GFLOPS. Even with memory bandwidth limitations, I'd put it well above the GeForce 6800, at least on par with the 7800 or 7900. More modern desktop cards like the GT 520 and 620 and the Radeon 6450/7450 are in the same general performance range as the HD 4000.
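For anyone curious where these GFLOPS figures come from, the back-of-the-envelope math can be sketched like this (the per-clock FLOP counts below are assumptions based on commonly quoted spec sheets, not measurements):

```python
# Rough theoretical throughput: units * FLOPs-per-unit-per-clock * clock.
# Per-clock figures here are assumptions from commonly quoted specs.
def gflops(units, flops_per_unit_per_clock, clock_mhz):
    return units * flops_per_unit_per_clock * clock_mhz / 1000.0

# Intel HD 4000: 16 EUs, each 2x SIMD-4 with FMA -> 16 FLOPs/clock (assumed)
hd4000 = gflops(16, 16, 1150)  # ~294 GFLOPS, i.e. "a hair under 300"

# Xbox 360 Xenos: 48 ALUs at 500 MHz, widely quoted as 240 GFLOPS
xenos = gflops(48, 10, 500)    # 240 GFLOPS
```

Theoretical numbers only, of course; actual game performance depends heavily on memory bandwidth and drivers, as the posts here keep pointing out.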

As for what GPU I would call "guaranteed console quality" in a readily available card, I'd say the 6670 GDDR5. Of course, it has plenty of extra power for more FPS, AA, AF, and/or higher resolutions. In general, however, I recommend a Radeon 7750 simply because it's the best-performing GPU out there that doesn't need an external power connector, while also being a great value. I keep threatening my girlfriend that I'm going to buy one for her desktop lol. She's not much of a gamer (she has a 360 and enjoys playing Left 4 Dead 2 on my PC), but I want to completely evangelize her
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Any 28 nm chip from Nvidia or AMD is going to out-muscle the GPUs in the PS3 and the 360. You might only run into issues if you go with a DDR3 7750 or GT 640. Back when I had a 5770/Q6600 combo I was running console ports at 1080p with a little antialiasing thrown in (in particular, I played the heck out of Mass Effect 2 and 3, which have no added effects on the PC version aside from being able to dial up the anisotropic filtering and enable driver AA). Heck, when I was running a 9800 GT/E6600 combo I could play console ports smoothly at 1080p. Even that is overkill for matching console performance at 720p.

So I wouldn't even go as far as an 8800 GT just for the purpose of matching console performance. You could probably bring things up to par with a console with a low-tier 40 nm GPU, like a Radeon HD 6670 or 5670. I gave my brothers a 5670 a while back (they've since upgraded to my 5770 after I got my current 7870). I didn't use it much, but they were playing games at 1440x900 without much of a hitch.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
It needs to be a higher performance level than the raw capability of the console GPU, because consoles get a net bonus on performance for having customized code and some console ports are horrendously coded.

I think you overestimated quite a bit with the 7770; I think you could get there with less. However, if you're a shrewd shopper, a 7770 is about as cheap as it gets. Some computers might need a 7750 if they have severe PSU constraints, and at that point it doesn't really matter that you could do better with a lesser card. Once you get under the ~$70-ish range, lower-tier cards become kind of a moot point: you end up taking a 30-50% performance hit to save $10.

Keep the resolution low and you can do a lot with the low end gaming cards.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Red Hawk, I mentioned what I did because numerous "good enough" graphics cards like the 5670 GDDR5 are no longer available. Also, like Concillian mentioned, it's important to consider the optimization bonus console versions of multiplatforms may have. Fortunately there are exceptions to that rule (mostly "PC-first titles" like Far Cry 2, F1 2010, L4D2).

In short, I like to consider the "500-50" rule when comparing PC graphics to the consoles. It says that a graphics card with double the Xbox 360's GPU GFLOPS (240 x 2, so the 500 GFLOPS range) and double the 360's GPU-to-main-memory bandwidth (~22 GB/s, so about 50 GB/s or more) is all but guaranteed to perform just as well at the same or similar visual settings as the PS3 or 360 version of a game, while still leaving room for improvement. Luckily, meeting the 500-50 rule is pretty cheap, and again, as Concillian mentioned in his second paragraph, going under $70 is pretty stupid. Hopefully Oland is the lowest-end GPU in the new lineup (bye bye Caicos). In 128-bit GDDR5 form, Oland runs rings around the 360 and PS3.
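As a sketch, the "500-50" rule boils down to a simple pair of thresholds (the example card numbers below are assumptions from public spec sheets):

```python
# Xbox 360 Xenos baseline, as discussed in this thread (assumed figures).
XENOS_GFLOPS = 240.0   # widely quoted theoretical throughput
XENOS_BW_GBS = 22.0    # approx. GPU-to-main-memory bandwidth

def meets_500_50(card_gflops, card_bw_gbs):
    """True if a card has roughly 2x the 360's GFLOPS (~500 range)
    and roughly 2x its memory bandwidth (~50 GB/s range)."""
    return (card_gflops >= 2 * XENOS_GFLOPS
            and card_bw_gbs >= 2 * XENOS_BW_GBS)

# Radeon 6670 GDDR5 (assumed ~768 GFLOPS, 64 GB/s): passes
# Intel HD 4000 (assumed ~294 GFLOPS, ~25.6 GB/s shared DDR3): fails
```

So a cheap card like the 6670 GDDR5 clears the bar comfortably, while bandwidth-starved integrated graphics of that era did not, which matches the thread's consensus.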
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Wow, awesome thread!!!
it's hard to tell, but for most games I think a 6670 would be enough,
consoles are highly optimized, but the PS3 GPU is slower than a 7900GS.

the trouble is, consoles and PCs are not running exactly the same games: compare BF3, whose PC multiplayer is made for 2x the number of players, or Crysis 3, optimized for high-end DX11 cards on the PC and requiring a lot more processing power.

anyway, this might help you
http://www.eurogamer.net/articles/df-hardware-introducing-the-digital-foundry-pc

last year they compared a Sandy Bridge Pentium + 6770 (a renamed 5770 from 2009) against the consoles.
Thanks for the link - great find!!

Beyond3D has a thread showing all resolutions/kinds of AA used for games on the consoles:
http://forum.beyond3d.com/showthread.php?t=46241

Skyrim is probably the best-looking console game thus far, looking perhaps 5-10x better than Oblivion did on the same consoles. Amazing how far programming efficiency can go... yet Skyrim still runs better on an 8800 GTX than on the consoles (keep in mind that an 8800 GTX has much more than 2x the GPU muscle).

HOWEVER - the PS360 games have become such an eye-sore for me that I just can no longer deal with them on my 65" screen. On smaller screens, I guess I could tolerate them, but on a screen that big, I feel like I'm losing my own eyesight - with everything being so fuzzy and low quality compared to what I've become accustomed to on the PC. The difference is absolutely staggering.
 

BoFox

Senior member
May 10, 2008
689
0
0
Well, technically the PS3 RSX Reality Synthesizer GPU is based on the G70 DirectX 9 class GPU and is slightly slower than a 7800 GTX.

The Xbox 360's Xenos (C1/R500) is based on the ATI Radeon X1800 (R520) DirectX 9 class GPU but had some features that were later built into the R600 series.
LOL - you probably feel so "at home" with these consoles, especially with your NOSTALGIC rig in your signature!!!

amd a10 trinity/richland will handle 720p30 easily for the majority of games out there.

just wait for kaveri for 900p/1080p30

check sig below for gameplay playlist.
Wow!!! Your youtube videos.. you're a huge help for those with APU's!!

Red Hawk, I mentioned what I did because numerous "good enough" graphics cards like the 5670 GDDR5 are no longer available. Also, like Concillian mentioned, it's important to consider the optimization bonus console versions of multiplatforms may have. Fortunately there are exceptions to that rule (mostly "PC-first titles" like Far Cry 2, F1 2010, L4D2).

In short, I like to consider the "500-50" rule when comparing PC graphics to the consoles. It says that a graphics card with double the Xbox 360's GPU GFLOPS (240 x 2, so the 500 GFLOPS range) and double the 360's GPU-to-main-memory bandwidth (~22 GB/s, so about 50 GB/s or more) is all but guaranteed to perform just as well at the same or similar visual settings as the PS3 or 360 version of a game, while still leaving room for improvement. Luckily, meeting the 500-50 rule is pretty cheap, and again, as Concillian mentioned in his second paragraph, going under $70 is pretty stupid. Hopefully Oland is the lowest-end GPU in the new lineup (bye bye Caicos). In 128-bit GDDR5 form, Oland runs rings around the 360 and PS3.
Interesting!! Yeah, going under $70 is pointless!
 

tential

Diamond Member
May 13, 2008
7,355
642
121
HOWEVER - the PS360 games have become such an eye-sore for me that I just can no longer deal with them on my 65" screen. On smaller screens, I guess I could tolerate them, but on a screen that big, I feel like I'm losing my own eyesight - with everything being so fuzzy and low quality compared to what I've become accustomed to on the PC. The difference is absolutely staggering.

So true lol. I am loving Tomb Raider, Crysis 3, etc. on my 70 inch TV. I have to say, with Crysis 3, I'm just blown away. I didn't even change it from default settings yet....

Edit: To answer the question in the OP, I think APUs are the way to go for emulating the console experience. In fact, I've been saying that APUs in 4 years' time will be a HUGE threat to the console market, with SteamOS and Steam Boxes coming out. Now that it's finally here, I think this is where AMD can REALLY shine. Putting together a $400 system that can compete with an Xbone/PS4, without the shortages, would be great. I don't think they'll really do anything special this generation though. They'll just be there as an underdeveloped option; similar to how Steam was a joke when it first came out, this will be like that too.
Next gen, though, I think that with AMD's advances in their APU tech it'll definitely be a HUGE contender. Hell, it may also compete very well with the mid/late adopters. Not everyone adopts early, just enthusiasts like you'd find on forums.

I'd go with an AMD APU. Considering next gen games may also be running at 720p (for some anyway), you might not miss much at all.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
APUs have a way to go. For gaming today they are fine if you're aiming for current console capabilities. Versus the Xbox One and PS4, they are grossly outclassed in graphics performance and more importantly in memory bandwidth. HSA, hUMA, and DDR4 won't be enough if dual channel remains the norm. That [partially] means dedicated graphics certainly have a future because they not only possess the GPU capabilities necessary but the bandwidth to use it effectively and efficiently.

The AMD-style APU possibly has a better future as maybe, and I say maybe, the go-to central processor of choice if GPGPU takes off on consoles, particularly the PS4.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
APUs have a way to go. For gaming today they are fine if you're aiming for current console capabilities. Versus the Xbox One and PS4, they are grossly outclassed in graphics performance and more importantly in memory bandwidth. HSA, hUMA, and DDR4 won't be enough if dual channel remains the norm. That [partially] means dedicated graphics certainly have a future because they not only possess the GPU capabilities necessary but the bandwidth to use it effectively and efficiently.

The AMD-style APU possibly has a better future as maybe, and I say maybe, the go-to central processor of choice if GPGPU takes off on consoles, particularly the PS4.

Yup, and also they fit into nice little neat cases pretty nicely as well.
http://www.engadget.com/2013/09/12/gigabyte-brix-gaming-pc/
Something like this but with an APU instead.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Whoa that thing is pretty sweet! Now if only it was a bit bigger and had a GDDR5 Cape Verde!
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Just evaluating the AMD A6-6400K, and it can play lots of games at 720p at 30+ fps. Game settings vary from Low to High.
The A10 Trinity/Richland will play all current games at 720p at 30+ fps with better graphics than the Xbox 360 and PS3.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Red Hawk, I mentioned what I did because numerous "good enough" graphics cards like the 5670 GDDR5 are no longer available. Also, like Concillian mentioned, it's important to consider the optimization bonus console versions of multiplatforms may have. Fortunately there are exceptions to that rule (mostly "PC-first titles" like Far Cry 2, F1 2010, L4D2).

Meh, there's little difference between the 5670 and the 6670 anyways. I just meant to say that mid-low tier 40 nm chips like Redwood and Turks are enough to match consoles (not absolute lowest tier though; Cedar and Caicos don't cut it.)

Edit: Here's a benchmark for Crysis 2 at High settings (probably better than on consoles) at 1280x1024, better than console resolutions:

[benchmark chart: Crysis 2, 1280x1024, High settings]
The 6670 and 5670 are comfortably above the 30 FPS minimum mark. I wouldn't trust the 6570 result though; apparently review outlets were given a GDDR5 version of the 6570 to test, while virtually every GPU partner actually made 6570s with DDR3 (and why not? The GDDR5 6670 already filled a spot in the GPU lineup; DDR3 memory let them keep 6570 prices lower and more evenly filled the space between the 6670 and the Caicos-based 6450. More frustrating was when they released 6670s with DDR3). The 5570 is a more accurate picture; it is limited by GDDR3 memory and drops below a 30 FPS average.

APUs have a way to go. For gaming today they are fine if you're aiming for current console capabilities. Versus the Xbox One and PS4, they are grossly outclassed in graphics performance and more importantly in memory bandwidth. HSA, hUMA, and DDR4 won't be enough if dual channel remains the norm. That [partially] means dedicated graphics certainly have a future because they not only possess the GPU capabilities necessary but the bandwidth to use it effectively and efficiently.

The AMD-style APU possibly has a better future as maybe, and I say maybe, the go-to central processor of choice if GPGPU takes off on consoles, particularly the PS4.

If Intel stays the course with its high-bandwidth Crystalwell eDRAM chip and its Iris Pro graphics, it could probably overtake at least the Xbox One in two or three generations. With the jump down to 14 nm to increase resources on the die, an architectural update, DDR4 system memory, and good drivers, you'll probably get much the same experience as the Xbox One (with a faster CPU to boot).

Intel has just as much potential in GPGPU as AMD does. If AMD wants to maintain the lead in processor die graphics it needs to come up with an answer to Crystalwell.
 
Last edited:

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Wow, thanks for all the detailed replies along with your trains of thought about the issue and how you came to your conclusion, guys. Honestly, I thought I'd get about 4-5 replies and then matter closed. Very cool of you guys :thumbsup:

Spec-wise, we know the 360 and PS3 are unremarkable and several generations old (as nenforcer posted, thanks). But of course, in actual delivered performance the experience isn't an apples-to-apples comparison between console hardware and PC hardware on specs alone: console code runs closer to the metal and is optimized for the specific hardware, while PCs need an abstraction layer to handle the wildly varying configs of users (as noted by SPBHM, Attic, Concillian, NUSNA_Moebius, etc).

skipsneeky2, lavaheadache: I actually did also have 8800 GT-class GPUs in mind (although I was hovering more in 8800 GTX+ / Radeon 4770 territory). I just wasn't sure how these old cards could compete, given the performance of newer games on these old, soon-to-be-replaced consoles, the 360 and PS3. Skyrim, for instance, is better looking than Oblivion on the same console, so that has got to come from more efficient development, because the consoles themselves certainly didn't get any more powerful while waiting for Skyrim to arrive.

However, BoFox (thanks for chiming in, Bo!) magically knew what I was thinking and answered the exact concern in my mind before I even got a chance to post about it. So, indeed, it looks like 8800 GTX+ class cards play Skyrim as well as or better than the consoles.

AtenRa, tential: I actually did bench Skyrim on an APU (the weakest mobile Trinity APU, with only dual-channel 1333MHz RAM) and got playable enough results at conservative settings (no AA, low shadows, etc). While it was playable enough, I wasn't sure how comparable it was to console quality and performance. I guess it would make sense that the highest-end parts could be at least console quality.

All the talk of APUs is great (I've got nothing against them), although I'm really more interested in Intel's APUs*. A much more powerful and power-efficient CPU + decent graphics is a great match if all you need is casual console-quality gaming every once in a while when taking a break from work (I'm not sure when even their i3 offerings will provide at least console-quality performance, if it hasn't happened yet, but that would be very welcome). I guess here I have a similar opinion to Red Hawk above (thanks for posting :thumbsup:). I wouldn't mind if AMD keeps increasing their mainstream and low-end APU performance, though, and I do hope they come up with their own Crystalwell-like tech to mitigate memory bandwidth issues.

*I know they don't call them APUs, but it's like the "GPU" term: AMD didn't initially call their products "GPUs" (they coined "VPU" or something that didn't take off), but "GPU" became the de facto term for that piece of tech, and now AMD uses "GPU" even on their own website, despite it being coined by NV. It's kind of like that for Intel's CPU+IGP products. The APU term is widespread now through AMD's use of it, everybody knows what you mean by it, and eventually it will probably be applied even to Intel's "CPU with integrated graphics" products, officially or not, when enthusiasts talk about them.


@FalseChristian: 3 Titans, sir? Maybe if you had said 1, I might have believed you, but 3 is just 2 Titans too many to match even the herculean power of a 360/PS3
 
Last edited:

Brunnis

Senior member
Nov 15, 2004
506
71
91
The thing to remember is that console versions of games usually employ a host of subtle and not so subtle tricks to fit the comparatively slow performance of the hardware, such as:

- Limited FOV
- Very low resolution, by PC standards. Often below 1280x720 and sometimes considerably so.
- Low performance. It's pretty much the norm, with FPS frequently dipping down into the low 20s.
- Very scaled back details. Sometimes lower details than even the lowest settings on the PC version.

If you take the PC version of a game and scale it back according to the above, you'll get surprisingly far with some very dated hardware. An 8800 GT will, in my experience, run almost any game considerably better (as in a locked 60 FPS) than the 360 or PS3.
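To put rough numbers on the resolution point above: pixel count alone gives a big multiplier (the sub-720p render resolution below is just an illustrative example, not a specific game's figure):

```python
def pixel_load_ratio(w1, h1, w2, h2):
    """Relative per-frame fill/shading load from resolution alone."""
    return (w1 * h1) / (w2 * h2)

# A PC gamer at 1080p vs a console title rendering at 1024x600 (illustrative):
ratio = pixel_load_ratio(1920, 1080, 1024, 600)  # ~3.4x the pixels per frame

# And the frame-time budget at a console-style 30 fps target:
frame_budget_ms = 1000 / 30  # ~33.3 ms per frame
```

So a card pushing 1080p at 60 fps is doing several times the raw pixel work of a console at sub-720p/30, before even counting higher detail settings, which is why dated PC hardware gets "surprisingly far" once you scale settings back.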

I actually did some tests years ago with an old X1950XT 256MB and an 8800GT 512MB. I don't have the details any longer, but I do remember that the X1950XT easily ran Gears of War better than my 360 (it looked better and hovered between 45 and 60 FPS in most cases).

Obviously, there's no need to go for hardware this slow anymore. Most people on the PC side are not prepared to run games as scaled back as the console versions anyway...
 
Last edited: