Radeon 7900 Reviews

Page 8 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Will update this list as more come along.

ArsTechnica:
(Ryzen 5800X3D, Asus ROG Crosshair VIII Dark Hero, 64GB DDR4-3200, Windows ???)
https://arstechnica.com/gadgets/202...0-gpus-are-great-4k-gaming-gpus-with-caveats/

Gamers Nexus:
https://www.youtube.com/watch?v=We71eXwKODw

Guru3D:
(Ryzen 5950X, ASUS X570 Crosshair VIII HERO, 32 GB (4x 8GB) DDR4 3600 MHz, Windows 10)
https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,1.html

Hardware Canucks
(Ryzen 7700X, Asus X670E ROG Crosshair hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=t3XPNr506Dc

Hardware Unboxed:
(Ryzen 5800X3D, MSI MPG X570S Carbon Max WiFi, 32GB DDR4-3200, Windows 11)
https://www.youtube.com/watch?v=4UFiG7CwpHk

Igor's Lab:
(Ryzen 7950X, MSI MEG X670E Ace, 32GB DDR5-6000)
https://www.igorslab.de/en/amd-rade...giant-step-ahead-and-a-smaller-step-sideways/

Jay's Two Cents:
https://www.youtube.com/watch?v=Yq6Yp2Zxnkk

KitGuruTech:
(Intel 12900K, MSI MAG Z690 Unified, 32GB DDR5)
https://www.youtube.com/watch?v=qThrADqleD0

Linus Tech Tips:
https://www.youtube.com/watch?v=TBJ-vo6Ri9c

Paul's Hardware:
(Ryzen 7950X, Asus X670E ROG Crosshair Hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=q10pefkW2qg

PC Mag:
(Intel 12900K, Asus ROG Maximus Z690 Hero, 32GB 5600MHz, Windows 11)
https://www.pcmag.com/reviews/amd-radeon-rx-7900-xtx

Tech Power Up:
(Intel 13900K, ASUS Z790 Maximus Hero, 2x 16 GB DDR5-6000 MHz, Windows 10)
AMD: https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/
ASUS: https://www.techpowerup.com/review/asus-radeon-rx-7900-xtx-tuf-oc/
XFX: https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/

Tech Spot:
(Ryzen 5800X3D, MSI MPG X570S, 32GB of dual-rank, dual-channel DDR4-3200 CL14, Windows ???)
https://www.techspot.com/review/2588-amd-radeon-7900-xtx/

TechTesters:
(Intel 13900K, ASUS ROG Maximus Z790 HERO, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=3uQh4GkPopQ
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
6,111
10,482
136
Even if it is more CPU dependent in MP, wouldn't you just expect the cards to tie in performance, not see the "lesser" card severely beating the much faster card?

AMD’s drivers with modern APIs have less CPU overhead than Nvidia’s. If you look at the chart, the Nvidia GPUs have trouble scaling at 1440p like you would expect when more CPU bottlenecked whereas the Radeon cards scale roughly like you would expect relative to their GPU power.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I'm having a debate on a local forum about the difference between the RX 7900 XTX and RTX 4080 in ray tracing performance.

Some people were arguing that the 8-9 RT games TechPowerUp tested are too small a sample, that the 16% difference they found is not accurate, and that the gap should be larger in favor of the RTX 4080.

So another user posted a link to BabelTech's review of the RX 7900 XTX, where they tested 19 games with RT enabled. I took the time to compile the results in Excel, and here they are (if I have made any errors, please correct me).
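For anyone wanting to reproduce this kind of compilation without Excel, a minimal sketch in Python follows. The game names and FPS numbers are invented placeholders, not BabelTech's actual data; the point is just how to average per-game ratios.

```python
from math import prod

# Hypothetical per-game average FPS at 4K with RT enabled.
# These numbers are illustrative only, not from any real review.
results = {
    "Game A": {"7900XTX": 62.0, "4080": 71.0},
    "Game B": {"7900XTX": 45.0, "4080": 55.0},
    "Game C": {"7900XTX": 80.0, "4080": 88.0},
}

# Per-game ratio of RTX 4080 performance to RX 7900 XTX performance.
ratios = [g["4080"] / g["7900XTX"] for g in results.values()]

# The geometric mean is the usual way to average performance ratios,
# since it treats a 2x win and a 0.5x loss symmetrically.
geomean = prod(ratios) ** (1 / len(ratios))
print(f"RTX 4080 is {100 * (geomean - 1):.1f}% faster on average")
```

Averaging the raw FPS values instead would let high-framerate games dominate the result, which is one reason different reviews can arrive at different headline percentages from similar data.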







 

Abwx

Lifer
Apr 2, 2011
11,535
4,323
136
AMD’s drivers with modern APIs have less CPU overhead than Nvidia’s. If you look at the chart, the Nvidia GPUs have trouble scaling at 1440p like you would expect when more CPU bottlenecked whereas the Radeon cards scale roughly like you would expect relative to their GPU power.


The drivers seem not to be finalized; according to NBC, who did a short review, AMD only provided a beta driver at the time...

 
Reactions: lightmanek

Hitman928

Diamond Member
Apr 15, 2012
6,111
10,482
136
The drivers seem not to be finalized; according to NBC, who did a short review, AMD only provided a beta driver at the time...


Aren’t all non-WHQL drivers from AMD listed as beta?
 
Reactions: Kaluan and Stuka87

Abwx

Lifer
Apr 2, 2011
11,535
4,323
136
Aren’t all non-WHQL drivers from AMD listed as beta?


Dunno, but it seems they have some work on the table; if you look at the benches, the scores are all over the place and not homogeneous at all.

An update was released today, and if you look at the addressed issues they are still at an early stage of the rework; so far they are sorting out early bugs, and improving performance will apparently come later.

 

Hitman928

Diamond Member
Apr 15, 2012
6,111
10,482
136
Dunno, but it seems they have some work on the table; if you look at the benches, the scores are all over the place and not homogeneous at all.

An update was released today, and if you look at the addressed issues they are still at an early stage of the rework; so far they are sorting out early bugs, and improving performance will apparently come later.


For sure they need some time, and it's good that they seem to be quickly acknowledging and (hopefully) fixing some of the release-day bugs (e.g. high idle and video playback power consumption). But I believe AMD only releases a couple of WHQL drivers over the course of a year and everything in between gets marked as beta, so RDNA3's initial driver was always going to be a "beta" release no matter how ready it was. Someone can correct me if I'm wrong about this.
 

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
Even if it is more CPU dependent in MP, wouldn't you just expect the cards to tie in performance, not see the "lesser" card severely beating the much faster card?
- SP vs MP
- built-in vs custom in-game benchmarking
- choice of CPUs/platforms used
- certain effects being turned on/off affecting one architecture more than the other (anyone remember the diminishing-returns 64x tessellation in "Gameworks" games that crippled Radeon performance?)
- SAM/ReBAR on or off

etc

Take your pick, or mix and match; all of these are factors, many of which reviewers don't notice or disclose in their testing and methodology.

It's not the first game where a slight variance is observed. Either way, it's a fact that Radeon GPUs traditionally perform very well in CoD games. Trolls should deal with it and find another hill to get triggered on and hijack AMD threads about. Even the OP who brought it up called it quits 4 pages ago. Good lord.

I'm having a debate on a local forum about the difference between the RX 7900 XTX and RTX 4080 in ray tracing performance.

Some people were arguing that the 8-9 RT games TechPowerUp tested are too small a sample, that the 16% difference they found is not accurate, and that the gap should be larger in favor of the RTX 4080.

So another user posted a link to BabelTech's review of the RX 7900 XTX, where they tested 19 games with RT enabled. I took the time to compile the results in Excel, and here they are (if I have made any errors, please correct me).








I call it the "Cyberpunk Effect", people see AMD doing bad at RT in it and just assume Radeon runs equally trash everywhere else lol

And it's not entirely by chance. nVidia paid big bucks to push this mediocre game as THE benchmark for RT. They also tried with Quake II, Minecraft and currently Portal. But those hardly have the same pull.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No one has argued the 7900xtx is faster than the 4090. No one. There are some corner cases where it might be, but I don't think anyone has argued it is faster outside of a couple of these corner cases.

I never said that anyone was saying the 7900xtx is faster than the 4090, because it clearly isn't. I'm talking specifically about COD MW2.

And the dude said, "COD:MW II loves Radeon. Every reviewer got comparable results in this game. Call of Duty games in general run better on Radeon."

I made a mistake when I thought that @amenx was the one that posted that graph at 1440p, but it was @biostud.

I don't know why you keep running interference for this dude, but his words clearly state that he is claiming COD MW2 runs better/faster on Radeon....and he's correct, at lower settings and resolutions due to AMD's DX12 driver being more efficient.

But overall it's not faster than the RTX 4090 under heavy GPU loads.

So if someone is looking to game at all competitively on MW2 multiplayer, this is a great test case to see. If it's not relevant, ignore it and move on. There are plenty of corner cases that are not relevant at all to me on both sides that I just plainly ignore. At most I'll say I don't think it's relevant for x, y, and z and that's it. If others find it useful, then great. No need to go further than that.

I never said it was irrelevant, I only questioned why it was in a GPU hardware review that's supposed to center on GPU performance.
 
Last edited:

lightmanek

Senior member
Feb 19, 2017
413
874
136
I think gauging raw performance is important when testing a new GPU. Don't you want to find out how it behaves when the settings are dialed up and it's stressed to the max?

Not only from a performance standpoint either, but from a build standpoint: what kind of temperatures and power draw can you expect under full load during gaming?

Instead, we have certain reviewers presenting benchmarks that aren't using ultra settings even when the option exists. And for CPU reviews, HWUB doesn't even test games under CPU limited scenarios for the most part.

I can't believe we're even debating this. Computerbase.de tested all of these GPUs at maxed settings with RT on and off, and also with DLSS and FSR on and off.

That should be the standard, and none of this half assed crap.



Yeah I remember that setting, and the developer stated it was for "future GPUs." I don't think it took a decade to become playable either. Doesn't mean that testing it with contemporary GPUs isn't warranted though, and reviewers did test with that setting as I recall but just for laughs.

But in any case, these reviews are supposed to show us what the GPUs are capable of. How can you expose their capabilities when they're not being pushed to the max?

But I do get all that data, just spread across various sites and YT channels. I'm happy with the data I've seen over the last 3 days and already know that sitting on my 6800XT will be a better choice than grabbing a new 7900XTX.
It feels like, for the first time in a long time, I will be sitting longer on my CPU waiting for Zen 4 X3D as well as on my GPU, either waiting for a price drop on the 7900XTX or 4080, or alternatively for a 'fixed' refresh 7950XTX or 4080Ti before upgrading.

For my needs, the 7900XTX would fit well where it is now if not for power use. I don't mind high power draw when cranking things to 11, but I equally enjoy it when the card can sip power in less demanding vsync-limited titles. RDNA1/2 was great at that, and the new RTX 40xx series is impressive there too; NAVI31, not so much. I guess this is a first-iteration trade-off of the chiplet architecture and won't be helped by driver updates, since from what I've seen so far the cards are not able to lower memory clocks even in light loads, but I hope I'm wrong and things get better with time.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I call it the "Cyberpunk Effect", people see AMD doing bad at RT in it and just assume Radeon runs equally trash everywhere else lol

And it's not entirely by chance. nVidia paid big bucks to push this mediocre game as THE benchmark for RT. They also tried with Quake II, Minecraft and currently Portal. But those hardly have the same pull.


Funny thing is, the review says the 7900XTX has the same RT performance as the RTX 3080, but according to the review's own numbers the 7900XTX is 39% faster at 4K in RT games.
 
Reactions: scineram

Hitman928

Diamond Member
Apr 15, 2012
6,111
10,482
136
I never said that anyone was saying the 7900xtx is faster than the 4090, because it clearly isn't. I'm talking specifically about COD MW2.

And the dude said, "COD:MW II loves Radeon. Every reviewer got comparable results in this game. Call of Duty games in general run better on Radeon."

I made a mistake when I thought that @amenx was the one that posted that graph at 1440p, but it was @biostud.

I don't know why you keep running interference for this dude, but his words clearly state that he is claiming COD MW2 runs better/faster on Radeon

"The dude" you are quoting wasn't @biostud , it was a different poster later in the thread. It's hard to follow your posts when you reference posters instead of using the quoting system, and then conflate multiple posters.

....and he's correct, at lower settings and resolutions due to AMD's DX12 driver being more efficient.

So what are you even arguing anymore?

But overall it's not faster than the RTX 4090 under heavy GPU loads.

Again, no one made a counter-argument to this. One person simply posted a graph that they found unusual because the 7900xtx was so far in the lead, and your and @amenx's response was that it was crap, nonsensical, a flawed benchmark. Now you're saying it's probably legit, it's just irrelevant to you.

I never said it was irrelevant, I only questioned why it was in a GPU hardware review that's supposed to center on GPU performance.

So you never said it's irrelevant, you're just questioning its relevance?

If you don't find the test useful, choose to ignore it. The majority of HWUB's followers specifically wanted this type of test, so obviously many people out there find it useful. It's not the reviewer's fault or the audience's fault if Nvidia's drivers are holding back their card in this scenario. Just like it's not the reviewer's fault or the audience's fault that AMD's drivers aren't properly allowing the RDNA3 cards to idle under certain scenarios. That doesn't mean we just don't report RDNA3 idle power numbers because their drivers have a bug that makes the card behave the way it does.
 
Last edited:

GodisanAtheist

Diamond Member
Nov 16, 2006
7,146
7,640
136
For sure they need some time, and it's good that they seem to be quickly acknowledging and (hopefully) fixing some of the release-day bugs (e.g. high idle and video playback power consumption). But I believe AMD only releases a couple of WHQL drivers over the course of a year and everything in between gets marked as beta, so RDNA3's initial driver was always going to be a "beta" release no matter how ready it was. Someone can correct me if I'm wrong about this.

- I mean, if there was ever a time to do an out of cycle WHQL driver release... you'd think launching your next gen flagship card would be it
 
Reactions: blckgrffn

Hitman928

Diamond Member
Apr 15, 2012
6,111
10,482
136
- I mean, if there was ever a time to do an out of cycle WHQL driver release... you'd think launching your next gen flagship card would be it

Does anyone care? Not trying to be antagonistic at all, it's an honest question. Having WHQL certification doesn't really mean much, as far as I can tell, for a DIYer or even an end user of a prebuilt. The only thing I can see it being good for is letting an OEM/system integrator cover their butts. Then, when something goes wrong, they can at least give lip service and say they were using certified drivers.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,097
460
126
This meme is really getting old; HWUB retested the 6800XT vs the RTX 3080 10GB earlier this year... and the overall positions are basically identical to what they were at launch (the 6800XT ahead by a bit at low resolutions, the 3080 by a bit at high resolutions).
Buying an AMD card with the expectation that performance improvements will change its competitive position down the line is likely to just disappoint.
You mean this didn't just happen 7 months ago?

Wow, I swear it did... I mean, I am pretty sure it did. At least according to all these sites it did:


I mean, I could keep going and going... But the main point is exactly what I said: AMD's drivers for new cards at release suck. In general they also have serious problems, some so bad that they can find 90% performance boosts once someone realizes what they screwed up in the driver code and optimizations... You don't see 90% improvements without having royally messed up initially.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
Does anyone care? Not trying to be antagonistic at all, it's an honest question. Having WHQL certification doesn't really mean much, as far as I can tell, for a DIYer or even an end user of a prebuilt. The only thing I can see it being good for is letting an OEM/system integrator cover their butts. Then, when something goes wrong, they can at least give lip service and say they were using certified drivers.

I’ve used Radeon drivers long enough that I only stay on the recommended drivers - that means the WHQL. The last one was in what, April or May? There has been enough stuff added that I went ahead and jumped on the latest in our household. Minecraft is a huge title for us and it felt silly leaving that massive uplift on the table.

About memory usage - I wish the card could power down memory at idle, or even with Chill enabled (ideally configurable), as most of the time 4 GB would suffice, and you'd think with the chiplets they could idle 5 out of 6 of them completely. I know this would really harm performance, but many titles just don't need the juice - and neither should two or more monitors!
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
You mean this didn't just happen 7 months ago?

Wow, I swear it did... I mean, I am pretty sure it did. At least according to all these sites it did:


I mean I could keep going and going.... But the main point is exactly what I said. AMD's drivers for new cards at release suck. In general they also have serious problems, some so bad that they can find 90% performance boosts once someone realizes what they screwed up in the driver code and optimizations.... You don't see 90% improvements without having royally messed up initially.
So the fact that AMD did a rewrite of their garbage OpenGL drivers after so many years is now evidence in favor of FineWine?
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,097
460
126
So the fact that AMD did a rewrite of their garbage OpenGL drivers after so many years is now evidence in favor of FineWine?
I never argued in favor of saying buy this/that AMD card because the cards age better. My argument has been and always will be that AMD has serious driver issues which take them months if not years to fix for all their cards, especially new ones at release. Once they fix those issues, performance finally hits what it should have been at release.

I have said this many times, in many threads. I think a lot of it has to do with AMD's different approach to drivers compared to Nvidia's (building a specific driver for each release vs. building a unified driver for all cards). I think many issues get lost between the various teams doing the development work, where the team supporting the current-gen cards doesn't tell the team working on the next-gen card about various issues/fixes, because they all think the code is too different to be easily adapted. Having a general framework that is uniform across all the cards helps keep many of those things from slipping, wherein a problem might get fixed for one card but not another, or an optimization might happen for one but not another. It also helps prevent the team working on the latest card from re-creating past issues that were fixed on other cards but re-introduced in the next-gen card.
 
Last edited:
Reactions: lopri

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
Funny thing is, the review says the 7900XTX has the same RT performance as the RTX 3080, but according to the review's own numbers the 7900XTX is 39% faster at 4K in RT games.
Well if someone is basing all their RT assumptions purely off of 3DMark's DXR/RT feature test then that would be the case.
But we know real world RT performance standings are more like those from Port Royal or Speed Way graphics scores.

Either way, relying on synthetics too much is a bad idea. Personally, I look for case by case data.
 
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,777
5,540
136
My two cents: shouldn't the AMD drivers be fairly streamlined since RDNA1 and RDNA2? They are now on RDNA3. How much could have changed in the architecture?
There are big changes in RDNA3, with the chiplets talking to each other.

It is the bleeding edge again.

I would suggest waiting at least a month to let others do the bleeding before purchasing.
 

CP5670

Diamond Member
Jun 24, 2004
5,535
613
126
The XTX is a strong card overall, if you can actually get it at its original price. I would like to see more reviews of the AIB models with higher power limits. It seems like the card has a lot of OC headroom for improved performance.
 

Gideon

Golden Member
Nov 27, 2007
1,765
4,114
136
My two cents: shouldn't the AMD drivers be fairly streamlined since RDNA1 and RDNA2? They are now on RDNA3. How much could have changed in the architecture?
A lot. The CUs are now dual-issue (VLIW2), meaning that to get anything out of the theoretical 2x increase in FP32 performance you need some driver magic. Without it, it's just the small ~20% increase in CUs (80 → 96) and whatever the extra clock speed gives you (which isn't much compared to the 6950XT).
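To make the "driver magic" concrete: the shader compiler has to find pairs of independent FP32 operations to co-issue, and code with back-to-back dependencies gets no benefit. A toy sketch of that packing problem (the op format and register names are invented for illustration, not actual ISA):

```python
# Toy model of dual-issue packing: two FP32 ops can share one issue
# slot only if the second doesn't read the first op's result.
# Op format: (dest, src1, src2) - register names are made up.

def can_pair(op_a, op_b):
    """True if op_b does not depend on op_a's destination register."""
    return op_a[0] not in op_b[1:]

def pack(ops):
    """Greedily pack adjacent independent ops into dual-issue slots."""
    packed, i = [], 0
    while i < len(ops):
        if i + 1 < len(ops) and can_pair(ops[i], ops[i + 1]):
            packed.append((ops[i], ops[i + 1]))  # dual-issued
            i += 2
        else:
            packed.append((ops[i],))             # single-issued
            i += 1
    return packed

independent = [("v0", "v4", "v5"), ("v1", "v6", "v7"),
               ("v2", "v8", "v9"), ("v3", "v10", "v11")]
dependent   = [("v0", "v4", "v5"), ("v1", "v0", "v6"),
               ("v2", "v1", "v7"), ("v3", "v2", "v8")]

print(len(pack(independent)))  # 2 slots: full 2x throughput
print(len(pack(dependent)))    # 4 slots: no benefit at all
```

The real compiler also juggles register-file port limits and instruction encoding constraints, so even independent ops can't always be paired; that gap between theory and practice is why the headline FP32 number doesn't show up in games.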
 
Reactions: lightmanek

biostud

Lifer
Feb 27, 2003
18,654
5,384
136
A lot. The CUs are now dual-issue (VLIW2), meaning that to get anything out of the theoretical 2x increase in FP32 performance you need some driver magic. Without it, it's just the small ~20% increase in CUs (80 → 96) and whatever the extra clock speed gives you (which isn't much compared to the 6950XT).

So, a bit like the Bulldozer success with doubled integer cores?
 

Gideon

Golden Member
Nov 27, 2007
1,765
4,114
136
The XTX is a strong card overall, if you can actually get it at its original price. I would like to see more reviews of the AIB models with higher power limits. It seems like the card has a lot of OC headroom for improved performance.

Yeah, but in Europe the price/perf ratio truly sucks at the moment.

On mindfactory.de the cheapest RTX 4080 is 1359€, while there are no XTX available and the XT is 1129€ and up. Considering the cheapest RX 6800XT is 689€ and only ~20-25% slower than the 7900 XT, the pricing is absolutely outrageous. The card should cost 800-850€ to at least be on par with the rest on price/performance.
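A quick sanity check of that claim, using the prices quoted above and assuming the 6800XT sits ~22% behind the 7900 XT (the midpoint of the quoted 20-25% range; the exact gap varies by review):

```python
# Price/perf comparison. Performance is normalized to the 7900 XT = 1.0,
# with the 6800XT assumed ~22% slower (an assumption, see above).
cards = {
    "RX 7900 XT": {"price_eur": 1129, "perf": 1.00},
    "RX 6800 XT": {"price_eur": 689,  "perf": 0.78},
}

for name, c in cards.items():
    print(f"{name}: {c['price_eur'] / c['perf']:.0f} EUR per unit of perf")

# Price at which the 7900 XT would match the 6800XT on price/perf:
break_even = cards["RX 6800 XT"]["price_eur"] / cards["RX 6800 XT"]["perf"]
print(f"7900 XT break-even price: ~{break_even:.0f} EUR")
```

Under these assumptions the 7900 XT would need to drop to roughly the high-800€ range just to match the 6800XT on cost per frame, which is in the same ballpark as the 800-850€ figure above.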
 