Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,703
6,405
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
These types of wishful thinking fanboy-friendly rumors are almost never true. For example, I remember people back in the day essentially betting their lives on Wii main ASIC having special hardware not present in the Gamecube.

Yeah, other than the Wifi and USB stuff, that wasn't the case. lol
Wait... what? The PS4 Pro had rapid packed math, which wasn't part of the IP originally used; it only came with Vega. It's called semi-custom for a reason. Nobody's talking about a secret thermonuclear weapon in every shader engine, just feature(s) that may not be present in the soon-to-be-released consumer RDNA2 GPUs - there's no wishful thinking there. It may or may not happen, but the statement 'these fanboy-friendly rumors are almost never true' is just wrong in this particular case.
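
For anyone who hasn't run into the term: rapid packed math just means the ALU treats each 32-bit register lane as two FP16 values and operates on both at once, doubling FP16 throughput. A rough numpy sketch of the packing idea follows - purely illustrative, the values are made up and this has nothing to do with how you'd actually program the GPU:

[CODE]
import numpy as np

# Two half-precision values occupy one 32-bit register lane.
a = np.float16(1.5)
b = np.float16(-0.25)

# Pack both FP16 bit patterns into a single 32-bit word
# (low half = a, high half = b) - roughly how an RPM-capable
# ALU views one register lane before executing both halves at once.
packed = int(a.view(np.uint16)) | (int(b.view(np.uint16)) << 16)

# Unpack and confirm both values survive the round trip.
lo = np.uint16(packed & 0xFFFF).view(np.float16)
hi = np.uint16(packed >> 16).view(np.float16)
print(hex(packed), lo, hi)  # one 32-bit word holding 1.5 and -0.25
[/CODE]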
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
It doesn't set a precedent to point at one unrelated hardware feature in a different generation of likewise unrelated hardware and then somehow extrapolate that into an internet rumor about two different generations of hardware being true.

Likewise I'm not wrong when I said that these rumors really are almost never true. They really are almost never true! This basic sort of rumor has been around for ages, and they're a dime a dozen whenever a new console generation is about to come out, surely you've noticed that by now.

For example, some nutty guy who went under the handle of Chaphack (and a whole bunch of similar variations, because he kept getting banned) at Beyond3D, who was a huge MS fanboy, claimed before what later became known as Xbox 360 was revealed that it would have a "Tejas" Pentium4 CPU clocked at 10GHz. As we now know, that wasn't quite the case. lol

The basic principle behind these rumors and how they proliferate are easy enough to understand; people really want their preferred console to be special in some way. Hence all the dumb stuff we heard about the Wii, and the Wii U too, and many other consoles besides. Hell, console manufacturers sometimes engage in such rumormongery themselves, like when Sony BS-ingly claimed that exporting PS2s to Iraq had been banned by the UN, because the consoles were allegedly so powerful they could be converted into control systems for cruise missiles! lol

And not long ago I heard some loose talk about MS special sauce they'd held back about the Xbox SeX, for example. But we've now had a Hot Chips presentation about the machine and there doesn't seem to be any such sauce in there, or they'd most likely have mentioned it. Because why wouldn't they? A Hot Chips presentation is meant to brag about the machine and its capabilities; why leave stuff out on purpose, thus making it look weaker? And the thing goes on sale in like two or three months anyhow. The time for keeping hardware secrets for this coming gen of consoles is essentially up.

So one oddball feature from PS Pro doesn't give this rumor any special credibility. It's just yet one more unsubstantiated claim with nothing in reality to back it up.
You're just contradicting yourself. Why wouldn't it give this rumor any credibility? Why? I mean, it actually happened, and in contrast, what credibility does your post give to your argument, when you're talking about literally idiotic delusions like 10 GHz CPUs that were spread by deranged individuals? This rumor, true or not, represents zero game-changing capabilities, but it can be different between the two consoles and if true, it will most likely be faster in some cases, slower in others.

I'll give you a good example: it's like saying that nothing should be rumored about Zen 3 and the manufacturing process used, because 7 years ago NostaSeronx made some strange claims about manufacturing processes.
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
I also have a particular beef with Nintendo over inflating PR of hardware specs - they clearly implied that WiiU CPU was POWER7 based by using the Watson AI computer in early language about it to generate hype.

As we know, it turned out to be little more than an even higher-clocked, triple-core version of the GC/Wii CPU, which uses a much earlier and less performant PPC core - needless to say I wasn't impressed at the time, and Nintendo have long since dropped beneath my radar as far as interesting HW internals are concerned.

When did Nintendo say anything like that? All I remember them saying was "IBM" and "eDRAM".
 

soresu

Platinum Member
Dec 19, 2014
2,968
2,192
136
When did Nintendo say anything like that? All I remember them saying was "IBM" and "eDRAM".
They used the Watson computer in a press release or interview that was talking about the WiiU CPU.

At the time people presumed it to mean POWER7 which the Watson computer was using - turns out all that they meant was the same old and battered PowerPC uArch from GC and Wii.

The speculation going around was clear as day though - they had to know what people thought even if that was not their initial meaning. The fact that they left clarification until much closer to release tells me they were fine with the false hype. Given the WiiU CPU was actually weaker than the XB360's, that isn't a surprise, but it was very underhanded of them.
 

Mopetar

Diamond Member
Jan 31, 2011
8,010
6,454
136
Saying RDNA3 tech without saying anything specific isn't saying much at all. If it's some minor tweak that wasn't quite ready for RDNA2 and got cut, it probably isn't a big deal. If it were a significant upgrade or a major game-changer, it wouldn't be a completely unsubstantiated rumor.

I also don't put too much stock into either company's claims of heavy customization. AMD is perfectly happy to let them claim that but it's mostly PR fluff for marketing purposes. If Microsoft truly did have some amazing custom tech they wouldn't shut up about it. Notice how much Sony has talked about the custom SSD tech they made for the PS5. I think it was easily more than a third of one of their early hardware presentations.

I wouldn't be surprised if they both end up having some of the same "custom" changes in both of their implementations. Both companies are targeting the same things and would make some similar observations about shortcomings in RDNA2.
 

soresu

Platinum Member
Dec 19, 2014
2,968
2,192
136
I also don't put too much stock into either company's claims of heavy customization. AMD is perfectly happy to let them claim that but it's mostly PR fluff for marketing purposes.
I'm not so sure about that.

From a PR and customer trust perspective it seems a risky thing to take that tack when AMD's official deep dive on RDNA2 will render MS or Sony's words on the consoles as untrustworthy.

Given MS already had a seriously bad first impression with XB1 I would think that they would be careful not to rock that particular boat again if at all possible.

I know plenty of gamers wouldn't give a hoot if their favourite console vendor were found to be lying one way or another, but some would - and the kind that would are likely to be more vocal about it than others.
 

Panino Manino

Senior member
Jan 28, 2017
847
1,061
136
Wait... what? The PS4 Pro had rapid packed math, which wasn't part of the IP originally used; it only came with Vega. It's called semi-custom for a reason. Nobody's talking about a secret thermonuclear weapon in every shader engine, just feature(s) that may not be present in the soon-to-be-released consumer RDNA2 GPUs - there's no wishful thinking there. It may or may not happen, but the statement 'these fanboy-friendly rumors are almost never true' is just wrong in this particular case.
I don't believe this will happen again; how can anyone believe this?
Making RDNA must have been a lot of work on a tight budget. They had to refine the architecture at the same time as adding important new things that NEED TO WORK, because the consoles will be stuck with RDNA2 for years, and at the same time the GPU division was split to work on CDNA in parallel. How is it even possible that they would have the time and hands to finish anything planned for RDNA3 and backport it to the PS5's GPU? It would be nice, but it isn't doable.
They used the Watson computer in a press release or interview that was talking about the WiiU CPU.

At the time people presumed it to mean POWER7 which the Watson computer was using - turns out all that they meant was the same old and battered PowerPC uArch from GC and Wii.

The speculation going around was clear as day though - they had to know what people thought even if that was not their initial meaning. The fact that they left clarification until much closer to release tells me they were fine with the false hype. Given the WiiU CPU was actually weaker than the XB360's, that isn't a surprise, but it was very underhanded of them.

They also said that the WiiU would be 19 times faster than the previous gen... or something like that.
 

soresu

Platinum Member
Dec 19, 2014
2,968
2,192
136
How is it even possible that they would have the time and hands to finish anything planned for RDNA3 and backport it to the PS5's GPU? It would be nice, but it isn't doable.
As I said in an earlier post, it could well be a feature designed for the PS5 in the first place, as I believe was the case for early RPM in the PS4 Pro GPU, a year before Vega on the PC.*

RPM being designed for PS4 Pro makes sense because it wasn't in the Scorpio/XB1X GPU at all, despite coming a year later and almost 3 months after Vega.

*The rumour was extremely vague anyway; I haven't a clue where it is supposed to have come from.
They also said that the WiiU would be 19 times faster than the previous gen... or something like that.
Oof that would certainly be a stretch.

I think the GPU was definitely a serious step above Wii and even PS3 and XB360, though that really isn't saying much after 6-7 years.

If it had come out a couple of years earlier it might have even been moderately successful, but as it is they got greedy and stretched an already old HW design far too long with the Wii.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
How is it even possible that they would have the time and hands to finish anything planned for RDNA3 and backport it to the PS5's GPU? It would be nice, but it isn't doable.
I'm really scratching my head reading your post. Why on Earth wouldn't it be doable? A lot of GPU IP blocks are interchangeable, and often there are 2-3 or even more different 'mixtures' of IP being produced at any given time by the same company, depending on what the target market is. Not always, but it is fairly common. There is nothing magical or technically impossible about it. It's also fairly common that different development efforts have different design times, and one or more of them can end up going into a product earlier than commercially planned. This is literally what semi-custom means, I can't stress that enough.
 

Panino Manino

Senior member
Jan 28, 2017
847
1,061
136
I thought about something that made me remember something someone from AMD said.
They said that they would only deliver RT support when they could guarantee it would be universal, that top to bottom all GPUs would come with RT hardware, right? This is what they did with RDNA2 and what we're seeing in the SeX's GPU. Because it's part of the basic CU, no matter how small the GPU, even the x50 tier will have this RT hardware.


As I said in an earlier post, it could well be a feature designed for the PS5 in the first place, as I believe to have been the case for early RPM in the PS4 Pro GPU, a year before Vega on the PC.*

RPM being designed for PS4 Pro makes sense because it wasn't in the Scorpio/XB1X GPU at all, despite coming a year later and almost 3 months after Vega.

Then why wouldn't RDNA2 also have it if this was being worked so long ago?
(and the XB1X doesn't have RPM because, contrary to Sony, Microsoft chose to use the same uarch as before instead of updating it)

I'm really scratching my head reading your post. Why on Earth wouldn't it be doable? A lot of GPU IP blocks are interchangeable, and often there are 2-3 or even more different 'mixtures' of IP being produced at any given time by the same company, depending on what the target market is. Not always, but it is fairly common. There is nothing magical or technically impossible about it. It's also fairly common that different development efforts have different design times, and one or more of them can end up going into a product earlier than commercially planned. This is literally what semi-custom means, I can't stress that enough.

Already explained: it's too much simultaneous work.
Lately I tend to brace for the worst, not hope for the better, regarding AMD's future.
 

soresu

Platinum Member
Dec 19, 2014
2,968
2,192
136
Then why wouldn't RDNA2 also have it if this was being worked so long ago?
Again like I said - PS4 Pro had RPM on a GPU uArch that was basically otherwise Polaris.

RPM on PC did not come out until Vega in August the following year.

Are you sure that XB1X is just shrunk Sea Islands uArch?

I was under the impression it was closer to Vega minus RPM - though that might have been idle speculation based upon the timing coming nearly 3 months after Vega's release.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Already explained: it's too much simultaneous work.
Lately I tend to brace for the worst, not hope for the better, regarding AMD's future.
What are you on about? If the development is done on an IP that a customer wants, it's not even AMD's work anymore, but the customer's. I'm not sure you understand exactly how IHVs plan and build a GPU. Usually a large part of the work is already done and dusted and there are others that hold up the launch. Take Intel for example. You can be 100% sure that Willow Cove was ready ages ago, then 10nm happened.
 

Panino Manino

Senior member
Jan 28, 2017
847
1,061
136
What are you on about? If the development is done on an IP that a customer wants, it's not even AMD's work anymore, but the customer's. I'm not sure you understand exactly how IHVs plan and build a GPU. Usually a large part of the work is already done and dusted and there are others that hold up the launch. Take Intel for example. You can be 100% sure that Willow Cove was ready ages ago, then 10nm happened.
I dare hope, but not dream.

Possible, yes, it is.
But I just don't believe that AMD was able to do so much.
On the contrary I fear that the PS5 may come with less, not more.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
They have made no mention of mesh-shaders at all, so I wouldn't rule that out either (though it sure would suck if these were not a common feature next-gen)
Weren't there references that PS5 may be based on RDNA1 and Series X on RDNA2 generation wise? Only the latter supports mesh shaders.
 

Konan

Senior member
Jul 28, 2017
360
291
106
I was wondering how AMD got the 50% perf/watt without any massive changes, as shown by the Xbox SX...

Then this slide, "color compression everywhere"... RDNA1 doesn't do a lot of compression if we follow those arrows... I mean, it's even a possible hint for RDNA3 gains

https://images.app.goo.gl/Eh11YrBbfxMtBAJYA

re: Corporate slides aren’t to be trusted from anyone in the industry imo.
 
Reactions: Tlh97 and psolord

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
Weren't there references that PS5 may be based on RDNA1 and Series X on RDNA2 generation wise? Only the latter supports mesh shaders.
Gods... Cerny confirmed MONTHS ago now that PS5 is RDNA2. Go back and watch his presentation if you need a refresher.

And it'll support the standard RDNA2 features like mesh shaders, even though Sony hasn't explicitly mentioned this feature or that feature and those other features over there; it goes with the territory.

Btw, didn't the Epic PS5 graphics demo use mesh shaders? *shrug* Maybe I need a refresher too! lol
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,714
3,937
136
[QUOTE="FaaR, post: 40259175, member: 228685"

Btw, didn't the Epic PS5 graphics demo use mesh shaders? *shrug* Maybe I need a reminder too! lol
[/QUOTE]

Good catch. BTW Nanite does have a vertex shader path as well. But you're absolutely correct it was indeed using primitive shaders on PS5:
On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders.

And raytracing is also obviously confirmed. So it should have the entire RDNA2 feature set
 
Reactions: Tlh97 and FaaR

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
[QUOTE="FaaR, post: 40259175, member: 228685"

Btw, didn't the Epic PS5 graphics demo use mesh shaders? *shrug* Maybe I need a reminder too! lol

Good catch. BTW Nanite does have a vertex shader path as well. But you're absolutely correct it was indeed using primitive shaders on PS5:


And raytracing is also obviously confirmed. So it should have the entire RDNA2 feature set
[/QUOTE]
at least
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
Gods... Cerny confirmed MONTHS ago now that PS5 is RDNA2.
I was thinking of this list:
Navi10Lite - gfx1000 (PS5)
Navi14Lite - gfx1001 (Lockhart?)
Navi10 - gfx1010 (5700XT/5700/5600XT)
Navi12 - gfx1011 (Unknown, but 40CUs and HBM2)
Navi14 - gfx1012 (5500XT/5500M/5300M)
Navi21Lite - gfx1020 (Xbox Series X)
Navi21 - gfx1030 (Rumour: ~500mm^2)
Navi22 - gfx1031 (Rumour: ~250mm^2)
Navi23 - gfx1032 (?)
VanGogh - gfx1033
VanGoghLite - gfx1040
In any case, as I wrote before, the extent of the customization still needs to be cleared up. The gfx10xy numbering is likely only reflective of when work on it started, not of what feature set it supports in the end.
 

Gideon

Golden Member
Nov 27, 2007
1,714
3,937
136
In reply to the discussion in the Ampere thread about AMD's runaway performance expectations.

I took the performance summary data from techpowerup's 5700 XT review (at 4K) in order to project RDNA2's performance based on known and likely specs:

Current GPUs vs projected Big Navi and Ampere:
GPU | Perf vs 5700 XT | TDP | Mem-bus width | Memory bandwidth
GeForce 2060 | 83% | 160W * | 192 bit | 336.0 GB/s
Radeon 5700 | 88% | 180W | 256 bit | 448.0 GB/s
GeForce 2070 | 100% | 175W | 256 bit | 448.0 GB/s
Radeon 5700 XT | 100% | 225W | 256 bit | 448.0 GB/s
GeForce 2070 Super | 114% | 175W * | 256 bit | 448.0 GB/s
GeForce 2080 Ti | 156% | 250W * | 352 bit | 616.0 GB/s
-- Everything below is rampant speculation --
est. Radeon 6800 XT (roughly 5700 clocks/IPC) | ~165% (@ around 93% scaling) | ~300W? | 512 bit? ** | 896 GB/s? **
est. Radeon 6800 XT (@2.0 GHz, 5700 as baseline) | ~190% (same scaling) | ~300W? | 512 bit? ** | 896 GB/s? **
est. Radeon 6800 XT (@2.0 GHz, 5700 XT as baseline) | ~203% | ??? | 512 bit? ** | 896 GB/s? **
est. GeForce 3090 (low bar) | ~202%? (130% of 2080 Ti ***) | 350W? | 352 bit (GDDR6X) | 1008.0 GB/s
est. hypothetical 3090 (still nothing exceptional) | ~234%? (150% of 2080 Ti ***) | 350W? | 352 bit (GDDR6X) | 1008.0 GB/s

* Yes, AMD and Nvidia TDPs are a bit apples to oranges, but they are similar enough for basic comparisons. Overall the vanilla 2070 has a very similar layout (number of shaders, ROPs, TMUs etc.) and nearly identical TDP to the 5700. Likewise the 2070 Super is technically very similar to the 5700 XT, yet Nvidia performs 10-15% better.

** I'm still a little skeptical of the 512 bit mem-bus rumors for GDDR6 (though 16GB of VRAM heavily points in that direction) considering the die size. Otherwise I would have guessed 384 - 448 bit GDDR6. Given the leaks, though, let's assume this is true. From Linux driver leaks we know that the initial versions will be GDDR6, with HBM2(E?) Pro versions coming early next year.
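
For reference, those bandwidth figures fall straight out of bus width times per-pin data rate; 14 Gbps per pin is an assumption here, simply because that's what the 5700 XT ships with (a quick sanity-check snippet, nothing official):

[CODE]
def gddr6_bandwidth_gbs(bus_width_bits, gbps_per_pin=14):
    # Peak bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
    return bus_width_bits * gbps_per_pin / 8

for width in (256, 384, 448, 512):
    print(f"{width}-bit @ 14 Gbps: {gddr6_bandwidth_gbs(width):.0f} GB/s")
# 256-bit -> 448 GB/s (5700 XT), 384-bit -> 672, 448-bit -> 784, 512-bit -> 896 (rumoured Navi 21)
[/CODE]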

*** This looks like a really low estimate considering it's 1.3x the performance with 1.4x the TDP on a full node shrink. It's an absolute worst case AMD could possibly have projected when starting the design of Navi 21, with 150-170% of the 2080 Ti being the more pessimistic/probable result (especially if Nvidia were still to use 7nm).

A couple of points from the Table:
  1. The RTX 2080 Ti is pretty close to 2x an RTX 2060 in terms of resources (shader cores, memory bandwidth, etc.); it's not exactly 2x, but it's close, and it even has similar clock speeds. Therefore it's a good reference for a hypothetical 80CU, 512-bit memory-bus RDNA2 chip. Yet it doesn't quite double the RTX 2060's performance; it gets a decent ~93% scaling, which I'll use as a rough reference for AMD's scaling. An interesting point to note is that it doesn't have anywhere near 2x the RTX 2060's TDP.
  2. Even if AMD manages to double the 5700 XT's performance (no easy task @ 300W), it still wouldn't reach the most probable RTX 3090 performance levels (140-150% of 2080 Ti performance).
    • It would be more likely if it could reach 2.1-2.2 GHz, but I'm really skeptical of that. I can see them hitting those clocks with mid-range ~40-50CU cards but not with a huge flagship die.
    • I'm also skeptical of AMD significantly improving IPC. Microsoft claimed similar IPC gains (Polaris to RDNA2) to what AMD claimed in the 5700 XT release slides for RDNA1 (Vega to RDNA1). The IPC gain shouldn't be more than a couple of percent.

Anyway, these are just some random thoughts based on late-night napkin math. Feel free to disagree/argue or shoot it all down.
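
If anyone wants to poke at the arithmetic, here is the napkin math written out. The clock figures are my assumptions (roughly 5700/5700 XT boost clocks and a hypothetical 2.0 GHz Navi 21), and the scaling factor is just the 2060 -> 2080 Ti ratio from point 1 above:

[CODE]
# Relative 4K performance, normalised to the 5700 XT = 1.00 (from the table above).
perf = {"2060": 0.83, "5700": 0.88, "5700xt": 1.00, "2080ti": 1.56}

# Scaling observed when roughly doubling resources (2060 -> 2080 Ti): ~0.94,
# i.e. the ~93% figure used above.
scaling = perf["2080ti"] / (2 * perf["2060"])

# Clock assumptions in GHz - guesses on my part, not confirmed specs.
clk_5700, clk_5700xt, clk_navi21 = 1.7, 1.8, 2.0

# An 80 CU Navi 21 modelled as "2x a 40 CU part" with imperfect scaling.
est_same_clocks = 2 * perf["5700"]   * scaling                               # ~1.65x
est_2ghz_5700   = 2 * perf["5700"]   * scaling * (clk_navi21 / clk_5700)     # ~1.9x
est_2ghz_xt     = 2 * perf["5700xt"] * scaling * (clk_navi21 / clk_5700xt)   # ~2.1x

# Ampere reference points: 130% and 150% of a 2080 Ti.
est_3090_low, est_3090_high = 1.3 * perf["2080ti"], 1.5 * perf["2080ti"]     # ~2.0x / ~2.3x

for name, v in [("Navi 21, 5700 clocks", est_same_clocks),
                ("Navi 21 @ 2.0 GHz (5700 base)", est_2ghz_5700),
                ("Navi 21 @ 2.0 GHz (5700 XT base)", est_2ghz_xt),
                ("3090 low bar (1.3x 2080 Ti)", est_3090_low),
                ("3090 plausible (1.5x 2080 Ti)", est_3090_high)]:
    print(f"{name}: ~{v * 100:.0f}% of a 5700 XT")
[/CODE]

The exact percentages wobble a bit depending on which clock baseline you plug in, but they land in the same ballpark as the table above.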

TL;DR:

Anything more than 2x the Radeon 5700 (let alone the Radeon 5700 XT) is exceedingly unlikely. On the other hand, AMD may only be able to pull off 150-160% of the 5700 XT's performance (that is what Coretex predicted, after all).
 
Last edited: