Discussion RDNA4 + CDNA3 Architectures Thread


DisEnchantment

Golden Member
Mar 3, 2017
1,773
6,746
136
With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
Usually AMD takes around three quarters to get support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, that is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in LLVM review chains (before getting merged to github), but I am not going to link AMD employees.

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of not having a host CPU capable of PCIe 5 available in the very near future, so it might have been pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it.

This is nuts, MI100/200/300 cadence is impressive.



Previous thread on CDNA2 and RDNA3 here

 
Last edited:

gaav87

Senior member
Apr 27, 2024
649
1,267
96
Can anyone explain to me how I managed to hit 418W TBP when the max is 374W?
Also, I can't replicate this power draw; I hit 401 fps at 378W in FurMark... I also beat a 5080 in FurMark Vulkan by 1 fps, but a win is a win.
Also, why is my max 374W when I constantly do 378W?
TPU also reports my Taichi as 404W!?
I'm 100% on the performance BIOS. Maybe I'm on some old BIOS? It's from 2024-12-03. Yes, my card's BIOS was 4 months old when I bought it...
Also, why is GPU-Z reporting 3100 MHz when at a 0 MHz offset I'm at 3300 MHz, a +300 offset gives me 3600 MHz, and a -500 offset is 2800 MHz?
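For what it's worth, the offset numbers in the post are self-consistent: clock-offset sliders are simply additive on top of the card's boost target (the 3300 MHz figure above), while a monitoring tool may be showing a different, averaged sensor. A trivial sketch of the arithmetic, using only the numbers quoted in the post:

```python
def effective_clock_mhz(base_mhz: int, offset_mhz: int) -> int:
    """Clock-offset sliders are additive: target = base + offset."""
    return base_mhz + offset_mhz

# Numbers taken from the post: 3300 MHz at a 0 MHz offset
base = 3300
print(effective_clock_mhz(base, +300))  # 3600
print(effective_clock_mhz(base, -500))  # 2800
```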
 

Josh128

Senior member
Oct 14, 2022
705
1,228
106
AMD is stalling gaming progress! AMD GPUs are slowing down innovation! Gotta get rid of them!

Absolutely nothing to do with Nintendo consoles based on underpowered Nvidia chips forming the turd baseline for game devs.
That's 100% Nintendo's fault, not Nvidia's, lol. But then again, they have it right.
 

Mopetar

Diamond Member
Jan 31, 2011
8,298
7,302
136
Can anyone explain to me how I managed to hit 418W TBP when the max is 374W?

How are you measuring the power draw? Software can always be inaccurate, but there's also the case that the hardware is capable of more than advertised. If you get an OC card it might be overbuilt because they expect customers to push it and if it cuts out at the printed specs it would get trashed in reviews.

I don't have any reason to think this is the case, but be careful you're not degrading your silicon. Intel CPUs could be pushed well beyond their rated TDP, but there was often a cost for flying so close to the sun. Impressive results all the same, though.
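On Linux, one way to sanity-check what the monitoring tools report is to read the amdgpu driver's own board-power sensor straight from hwmon. A minimal sketch, assuming a reasonably recent amdgpu driver (older kernels expose `power1_average`, newer ones `power1_input`, both in microwatts):

```python
from pathlib import Path

def microwatts_to_watts(uw: int) -> float:
    """hwmon power files report microwatts; convert to watts."""
    return uw / 1_000_000

def amdgpu_board_power():
    """Yield (hwmon_node, watts) for every amdgpu power sensor found.

    Either sensor can legitimately differ from what a vendor tool or
    GPU-Z calls "TBP", so treat this as a cross-check, not ground truth.
    """
    for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
        try:
            if (hwmon / "name").read_text().strip() != "amdgpu":
                continue
            for sensor in ("power1_average", "power1_input"):
                f = hwmon / sensor
                if f.exists():
                    yield hwmon.name, microwatts_to_watts(int(f.read_text()))
                    break
        except (OSError, ValueError):
            continue  # unreadable or oddly formatted node; skip it

if __name__ == "__main__":
    for node, watts in amdgpu_board_power():
        print(f"{node}: {watts:.1f} W")
```

On a machine without an AMD GPU the scan simply yields nothing; the conversion itself is the only part you can rely on everywhere.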
 

poke01

Diamond Member
Mar 8, 2022
3,381
4,625
106
That's 100% Nintendo's fault, not Nvidia's, lol. But then again, they have it right.
True, I actually like their games. Anyway, games nowadays are scalable across platforms.

It just goes to show that creativity beats shoving eye candy into games.

If Nintendo had gone with AMD, they would have picked a low-power SoC too. It's not Nintendo's thing to go all out on performance.
 

poke01

Diamond Member
Mar 8, 2022
3,381
4,625
106
I would also argue that Nintendo pushes the industry forward, in gameplay and games, more than NV, Sony and MS do. Those companies are meh in terms of creative design when it comes to their ethos.
 
Jul 27, 2020
23,462
16,510
146
If Nintendo had gone with AMD, they would have picked a low-power SoC too. It's not Nintendo's thing to go all out on performance.
Dumb Nintendo executives probably blamed the Wii U's sales numbers on going with AMD.

Or Jensen got down on one knee and begged them to take his SoC because he couldn't fathom why Nvidia isn't synonymous with console gaming.

I think BOTH of those things happened.
 
Reactions: blackangus

Josh128

Senior member
Oct 14, 2022
705
1,228
106
Dumb Nintendo executives probably blamed the Wii U's sales numbers on going with AMD.

Or Jensen got down on one knee and begged them to take his SoC because he couldn't fathom why Nvidia isn't synonymous with console gaming.

I think BOTH of those things happened.

AMD didn't have a viable low-power solution at the time comparable to the Maxwell + ARM SoC used in the Switch.
 

Saylick

Diamond Member
Sep 10, 2012
3,866
8,969
136
No, they just had to make the Switch work, and the X1 was the best off-the-shelf option.
Yeah, Jensen kind of lucked into winning the contract. I like to believe he had a pile of cheapo X1 SOCs that no one wanted anymore after Nvidia’s mobile SOC ambitions failed and Nintendo being the cheapskates they are said, “Sure, we can use them if you’re willing to sell them for cheap.”
 

Josh128

Senior member
Oct 14, 2022
705
1,228
106
Yeah, Jensen kind of lucked into winning the contract. I like to believe he had a pile of cheapo X1 SOCs that no one wanted anymore after Nvidia’s mobile SOC ambitions failed and Nintendo being the cheapskates they are said, “Sure, we can use them if you’re willing to sell them for cheap.”
Without a doubt, the heads of Nintendo are very shrewd businessmen. I'm sure AMD also made a pitch, having powered the previous three Nintendo consoles (four if you count that the ArtX engineers were some of the same SGI guys who worked on the N64). AMD simply didn't have the ultra-low-power tablet hardware that Nvidia did at the time. The choice worked out quite well for them.
 

DaaQ

Golden Member
Dec 8, 2018
1,767
1,252
136
It used to be that a top-end model would be around $80-100 over MSRP, or around a 10-15% markup. [Excluding rare, special XOC models like K|NGP|N or HOF]

Now we are seeing $200, $300, $500, even over $1000 markups, which is in the range of 30%, 40%, 50%+ in some cases for normal, air-cooled cards.

Please, explain to me why this is ok?
OK, I am writing 10 pages early, bear with me.

It is not OK, is my answer. BUT stupid anticipated tariffs are making people jump.
If I had known the 9070 XT would be as good as it is, I would not have bought a 7900 XTX late last November. But I got it under a grand, brand new. Sapphire Nitro.
The prices have increased by $100, and that's why retailers are stupid. People have no issue punching their grandmother in the throat and stealing her Social Security for GPU money. They could have jacked the prices up $300 across the board and they'd still get scalped for double that cost, at least. Also, how can I make sure our resident autist extraordinaire knows I didn't mean any disrespect to him? If he's that smart, then that makes him kind of scary, sort of in a Mr. Robot kind of way.
I hope this refers to someone else. I am not sure ATM.
People aren't willing to pay as much as you seem to think. Most scalpers' listings are around $1000-1200 and aren't selling.

This eBay listing ended without hitting reserve, at $180 over retailer pricing ($760).

View attachment 119835

Heck, Newegg is selling cards in combos with a $120 power supply and they aren't selling quickly at all. People aren't super willing to pay even $120 extra, despite getting something for it they can use or resell to cover most of that additional cost.
When I caught my 9800X3D during the 2nd or 3rd week, the Newegg backorder came a week early; I saw all of the bundles they made while the single chip was sold out. In hindsight, I should have bought the bundle and sold off the unwanted parts. It would have saved me money but not trouble/time. I had no problem paying the $479 retail for the 9800X3D. Heck, back in the day I spent $1k on a 6600K, but those were extenuating circumstances (I had the money to burn).

I did see that very early on a LOT of chips went into bundle deals, but I wanted to see what X870E brought, and I needed at minimum a PCIe x1 slot for a sound card.
Wow, another plus for AMD. They're so undesirable that they cannot even be scalped in this market!

0 mindshare doing the Lord's work right now
Reminded me of the series "The Death Gate Cycle", which I just bought and revisited on audiobook. Dragonlance authors, btw.

Sorted by sold items and highest price first. If you scroll through many, you'll see how the sold price generally falls slowly as the release date gets farther away. The panicked gamer has done generational damage, yet again, with their absolutely insane, unhinged, lunatic buying behavior. By purchasing at these prices, the zombified, brain dead gamer has sent an enormous bat signal style message in the skies above to the world that they crave, love, and enjoy higher GPU prices now and in the future.
It feels like you still think of the panicked gamer as someone who has not yet graduated high school. Most are pulling down pretty decent disposable income, possibly from scalping all the way back in the 290X days. As they got older, maybe they started scalping larger items, like 2020 Ford Broncos, to name one.
These numbers are a drop in the ocean relative to the market size: hundreds of thousands of mainstream buyers considering Radeon for upgrades (I would put it in the millions for those buying Nvidia). The numbers would swell to millions over a year for AMD too, anyway.
With the 9070 series' competitiveness against Nvidia's worst slip-up in recent memory, this situation was largely avoidable had AMD wanted. They had a LOT of time to gauge market sentiment. And AMD doesn't really have much to stand on with the 9060 series, with its much-nerfed cards.

We can see from earlier posts that most people are not buying at even +$100 to $150 inflated prices. Lack of sales will force hands anyway, and the NV supply situation will be much better in the medium term too.
Problem is, Nvidia's AIB partners inflated prices. Might be tariff fears, or just gross greedflation.
Take the 7900 XTX and update it to a modern RDNA5 main chiplet: at least 600 mm², 384-bit GDDR7 plus at least 256 MB of cache; let it be a 600W card.
In jest: we can't have an AMD card pulling that much power; heck, the 9070 XTs doing over 300 is a cardinal sin, even though a 4090/5090 @ 600W is A-OK.

I just upgraded my XTX VBIOS to the Aqua XOC 550W one. At least ASRock had the stones to put it out. It does compete with a 4090 in benchmarks. Well, you have to look on OCN for that; I had to double-check which tab I was on.
Meh. The frequency and power are already juiced enough on the XT... the price is too, now. They should just produce as many of those as possible and not waste time on this.
I just can't figure you out. You complain about AMD power usage, but lower-tier Nvidia is good? Is that what I am missing? You are a mid-tier board seeker and can't stand that the 9070 XT was raised by AIBs to what, 320W, while the 5070/Ti is supposedly something like 220W?

What about the missing ROPs?
 

DaaQ

Golden Member
Dec 8, 2018
1,767
1,252
136
Can anyone explain to me how I managed to hit 418W TBP when the max is 374W?
Also, I can't replicate this power draw; I hit 401 fps at 378W in FurMark... I also beat a 5080 in FurMark Vulkan by 1 fps, but a win is a win.
Also, why is my max 374W when I constantly do 378W?
TPU also reports my Taichi as 404W!?
I'm 100% on the performance BIOS. Maybe I'm on some old BIOS? It's from 2024-12-03. Yes, my card's BIOS was 4 months old when I bought it...
Also, why is GPU-Z reporting 3100 MHz when at a 0 MHz offset I'm at 3300 MHz, a +300 offset gives me 3600 MHz, and a -500 offset is 2800 MHz?

View attachment 120677

View attachment 120673


View attachment 120671
View attachment 120672
Could be software, or could be that ASRock magic, AKA the XOC VBIOS. See above.
 

Josh128

Senior member
Oct 14, 2022
705
1,228
106
I just can't figure you out. You complain about AMD power usage, but lower-tier Nvidia is good? Is that what I am missing? You are a mid-tier board seeker and can't stand that the 9070 XT was raised by AIBs to what, 320W, while the 5070/Ti is supposedly something like 220W?

What about the missing ROPs?

5070Ti


9070XT


It's easy, I just call it like it is. The 9070 XT Taichi averages 364W in gaming per TPU, while losing in almost everything to the Galax 5070 Ti, which averages 279W in gaming per TPU. The 9070 XT is not a bad GPU, but it is already being pushed out of its perf/power curve. A 400W XTX version which draws 430W would be kind of ridiculous, IMO.
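Using just the two TPU gaming averages quoted above, and assuming roughly equal frame rates between the two cards, the efficiency gap works out like this (a back-of-the-envelope sketch, not a benchmark):

```python
def extra_power_pct(card_w: float, rival_w: float) -> float:
    """Percent more board power the first card draws than the second."""
    return (card_w / rival_w - 1.0) * 100.0

# TPU gaming averages quoted above
taichi_w = 364.0   # ASRock 9070 XT Taichi
galax_w = 279.0    # Galax 5070 Ti
print(f"{extra_power_pct(taichi_w, galax_w):.0f}% more power")  # ~30%
```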
 
Last edited:
Reactions: DaaQ