Question Speculation: RDNA2 + CDNA Architectures thread

Page 177

uzzi38

Platinum Member
Oct 16, 2019
2,702
6,405
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

JujuFish

Lifer
Feb 3, 2005
11,031
752
136
Also thinking about this for the first time. But I have a G-Sync-only monitor, so I'm not 100% convinced yet.
Same. I'm debating whether I just want to deal with giving up gsync until some time in the future when I do a monitor upgrade (either to an ultrawide or to 4K, not sure which way I'd want to go).
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
Same. I'm debating whether I just want to deal with giving up gsync until some time in the future when I do a monitor upgrade (either to an ultrawide or to 4K, not sure which way I'd want to go).
I have the 32- or 34-inch Alienware ultrawide 1440p monitor right now. I love it for games and productivity.

My only option would be to go to an LG CX 48, but that's not a monitor, and I don't see myself going smaller than what I have now. That LG is huge and would probably take up a lot of space on the desk. It feels really unnecessary in my room when I also have a projector setup to watch stuff at night. It just doesn't feel right, despite one of my friends getting one to use as a monitor.

Now I've got some upgrade itch, but there's too much uncertainty. I do want to upgrade my AIO cooler, maybe the CPU, and the GPU for sure.

I guess I'm waiting to see if a 3080 Ti comes out, as I am not too comfortable with a 3080. It would be nice to get started on putting new components in my rig right now. It seems like every time I try to do an upgrade, it's never a good time.
 

ModEl4

Member
Oct 14, 2019
71
33
61
You still don't get it?

If your calculations are correct, a 40 CU RDNA2 GPU having 87% of the performance of an RTX 3070 would lose IPC compared to the RX 6800XT and 6900XT.

And that is flawed logic, mainly because your calculations do not account for scaling.

A 40 CU GPU clocked at 2.2 GHz will not have 50% of the performance of an 80 CU 2.2 GHz RDNA2 GPU.

Why? Two things. First, scaling: the 80 CU die may only have 90% CU scaling compared to the 40 CU die, which alone effectively pushes the performance target higher. Secondly, we don't know how the 192-bit bus affects performance.

Using basic armchair logic, for the 40 CU die to be exactly 50% of the 80 CU die, it would have to have a 128-bit bus. But it doesn't. It has a 192-bit bus.

So no. Just taking into account those two factors, a 40 CU GPU clocked at 2.5 GHz (which it won't be) will not be "only" 87% of an RTX 3070.

Even RedGamingTech's info says that it will be about the same as an RTX 3070 in performance.

So I would trust a guy who leaked the Infinity Cache name, its size, etc. way before anyone thought it was real.
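The scaling argument above can be put in rough numbers. A minimal sketch, where the 90% scaling efficiency is the post's hypothetical figure, not a measured value:

```python
# Back-of-envelope CU scaling sketch. All numbers are illustrative:
# the 90% efficiency is the hypothetical figure from the post.

def small_die_fraction(small_cus, big_cus, big_scaling_eff):
    """Fraction of the big die's performance the small die should hit,
    if the big die only scales at `big_scaling_eff` of ideal."""
    ideal_fraction = small_cus / big_cus      # 40/80 = 0.50
    return ideal_fraction / big_scaling_eff   # scaling loss favors the small die

frac = small_die_fraction(40, 80, 0.90)
print(f"40 CU part ~ {frac:.1%} of the 80 CU part at equal clocks")
# roughly 55.6% rather than 50%, before even considering the wider 192-bit bus
```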
I'm not getting it? Child, you don't even seem to know basic maths. You claim a 2.5GHz 40CU Navi22 with +20% IPC vs RDNA1 will be 15% faster than a 2080Ti, and then you accuse me of RDNA2 IPC loss?
Regarding scaling and memory, everything is accounted for in my prediction. I guess you just assume I calculated with the notion that 6800XT = 3080 in 4K on an Intel testbed, but you didn't ask me (I could have calculated with the notion that 6800XT = -0.5% or -5% or whatever; you didn't ask before you started discrediting my calculations).
Regarding scaling: although it looks real if you only see the fps results, GPUs are near-perfect scalers (99%). The reason a design does not scale 100% can be perfectly explained if you analyze each individual aspect of the GPU design (or, in high-end designs, there is the factor of being system limited, which is not GPU related, or it could be the game engine, etc.). People claiming general rules like 90% scaling when a design doesn't seem to scale perfectly do so because they don't have the knowledge and tools to analyze which fundamental GPU components influence the results and to what degree; trying to rationalize the results, they oversimplify and generalize something that is in contrast with the basic nature of rasterization. (I'm expecting comments like "tell that to the 3090", lol.)
You said that you trust RedGamingTech, which is fine; I have my own predictions/calculations. Remember, I didn't engage with a post you made; you replied to a post I made, which is fine of course, except it is strange that you seem hell-bent on trying to discredit my calculations. I guess maintaining a "6700XT is close to 3070" rhetoric is very important to you. I don't care...
Why does it matter if the 6700XT is 2.6GHz or something above 2.5GHz? I just made my calculations based on the 2.5GHz rumor; if AMD can hit higher clocks, you adjust the results accordingly.
And about the +2.5GHz frequency, using the 😉 emoji like you know something more than the rest of us, lol. Child, are you the only one in the world who saw the +30% clock in AMD's presentation?
You don't even seem to realize that my 6700XT prediction (87% of a 3070 in 4K) is potentially very close to a 3070. Add SAM, add Rage, and you get different results; lol, if you calculate with a 2.58GHz frequency and say SAM is 5% and Rage is 2%, you can potentially achieve 96.5% of a 3070...
Although I really don't know how you can save your failing 32CU prediction (performance between 2080 and 2070S): to hit the 2080, the clocks have to be above 2.5GHz and you have to add SAM & Rage; my calculation is based on the 2.3GHz rumor, like I said in my original post. I wonder what difference a cut-down Navi22 is going to have from your 2080 prediction... (I expect a response like "2080Ti -> 2080 difference" 😉, lol)
Anyway, seeing your activity in the forum, you certainly seem invested in the AT community. My intention is not to discredit anything you write, so I wish you the best regarding your predictions, even if that means mine will fail, because to me maintaining a certain status on the AT forum is meaningless; I have other things to worry about. I hadn't even posted on AT since 2010. I really don't care that much; I prefer sitting back and enjoying the show!
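The 96.5% figure in the post is just compounded multipliers; a quick sketch, where every percentage is the post's hypothetical, not a measurement:

```python
# Compounding the post's hypothetical gains on top of its 87%-of-3070 baseline.
baseline = 0.87            # predicted 6700XT vs 3070 at 4K, per the post
clock_scale = 2.58 / 2.50  # if clocks land at 2.58GHz instead of 2.5GHz
sam_gain = 1.05            # assumed +5% from Smart Access Memory
rage_gain = 1.02           # assumed +2% from Rage Mode

result = baseline * clock_scale * sam_gain * rage_gain
print(f"~{result:.1%} of RTX 3070")  # ~96.2%, close to the post's 96.5%
```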
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Also thinking about this for the first time. But I have a G-Sync-only monitor, so I'm not 100% convinced yet.

My monitor is the only issue for me right now too. I really like it, but it has Gsync only, which is a huge negative all of a sudden, lol. It has good uniformity and I like the subtle curve of this one. Uniformity can vary wildly between individual models, so buying another monitor is a huge gamble. I might end up with whites that look yellow on one side and all dark and dingy in certain areas, etc. Mine has nice clean whites and the quality is good. I can't count on that with a new monitor, but I want an all AMD build with a new Ryzen and 6800 XT, so I'm afraid I need to either live with this Gsync monitor for a while or just bite the bullet and ditch it for a Freesync/Gsync compatible model. LG has some decent 3440x1440 options.
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
I'm not getting it? Child, you don't even seem to know basic maths. You claim a 2.5GHz 40CU Navi22 with +20% IPC vs RDNA1 will be 15% faster than a 2080Ti, and then you accuse me of RDNA2 IPC loss?
First of all, you've lost your mind.

Secondly:

A 20% IPC increase on a 40 CU RDNA2 GPU means it gets exactly the same performance as a 48 CU RDNA1 GPU.

That RDNA1 GPU is clocked at 1887 MHz. 2.5 GHz is 33% above this.

Which puts this GPU 15% above RTX 2080 Ti performance levels.

That's the end of that stupid discussion about a mathematical calculation, ergo, speculation. I didn't claim that it WILL perform like this. I guess some people take analysis and mathematical reasoning as stating one's opinions. That's fair enough.

I said months ago, based on information from my source, that the 40 CU die should be around 10% above the RTX 2080 Super, the 60 CU die around 10-20% faster than the RTX 2080 Ti, and the 80 CU die around 40-50% faster than the RTX 2080 Ti. The last two turned out pretty darn accurate.

And looking at benchmarks of the RX 6800 and 6900XT, I have to say my source could've been extremely wrong about the 40 CU die. It may genuinely turn out as fast as an RTX 3070, within 5% of its performance.
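The arithmetic in the post can be reconstructed as follows. The +20% IPC and 2.5GHz clock are the rumor figures from the post; the 2080 Ti vs 5700 XT ratio is my own rough placeholder, not a source figure:

```python
# Back-of-envelope reconstruction of the post's calculation.
# The 1.38x 2080 Ti lead over the 5700 XT at 4K is an assumed placeholder.

ipc_gain = 1.20              # rumored RDNA2 IPC uplift over RDNA1
clock_ratio = 2500 / 1887    # 2.5GHz vs the 5700 XT's ~1887MHz game clock
vs_5700xt = ipc_gain * clock_ratio   # ~1.59x a 5700 XT

ti_vs_5700xt = 1.38          # placeholder: 2080 Ti vs 5700 XT at 4K
vs_2080ti = vs_5700xt / ti_vs_5700xt
print(f"~{vs_2080ti:.0%} of a 2080 Ti")  # lands near the post's +15% claim
```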
 

ModEl4

Member
Oct 14, 2019
71
33
61
First of all, you've lost your mind.

Secondly:

A 20% IPC increase on a 40 CU RDNA2 GPU means it gets exactly the same performance as a 48 CU RDNA1 GPU.

That RDNA1 GPU is clocked at 1887 MHz. 2.5 GHz is 33% above this.

Which puts this GPU 15% above RTX 2080 Ti performance levels.

That's the end of that stupid discussion about a mathematical calculation, ergo, speculation. I didn't claim that it WILL perform like this. I guess some people take analysis and mathematical reasoning as stating one's opinions. That's fair enough.

I said months ago, based on information from my source, that the 40 CU die should be around 10% above the RTX 2080 Super, the 60 CU die around 10-20% faster than the RTX 2080 Ti, and the 80 CU die around 40-50% faster than the RTX 2080 Ti. The last two turned out pretty darn accurate.

And looking at benchmarks of the RX 6800 and 6900XT, I have to say my source could've been extremely wrong about the 40 CU die. It may genuinely turn out as fast as an RTX 3070, within 5% of its performance.
OK, I'm going to end this right here wishing you again the best regarding your predictions!
 

JujuFish

Lifer
Feb 3, 2005
11,031
752
136
I have the 32- or 34-inch Alienware ultrawide 1440p monitor right now. I love it for games and productivity.

My only option would be to go to an LG CX 48, but that's not a monitor, and I don't see myself going smaller than what I have now. That LG is huge and would probably take up a lot of space on the desk. It feels really unnecessary in my room when I also have a projector setup to watch stuff at night. It just doesn't feel right, despite one of my friends getting one to use as a monitor.

Now I've got some upgrade itch, but there's too much uncertainty. I do want to upgrade my AIO cooler, maybe the CPU, and the GPU for sure.

I guess I'm waiting to see if a 3080 Ti comes out, as I am not too comfortable with a 3080. It would be nice to get started on putting new components in my rig right now. It seems like every time I try to do an upgrade, it's never a good time.
You've got an ultrawide, so obviously you just need to go wider
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
My monitor is the only issue for me right now too. I really like it, but it has Gsync only, which is a huge negative all of a sudden, lol. It has good uniformity and I like the subtle curve of this one. Uniformity can vary wildly between individual models, so buying another monitor is a huge gamble. I might end up with whites that look yellow on one side and all dark and dingy in certain areas, etc. Mine has nice clean whites and the quality is good. I can't count on that with a new monitor, but I want an all AMD build with a new Ryzen and 6800 XT, so I'm afraid I need to either live with this Gsync monitor for a while or just bite the bullet and ditch it for a Freesync/Gsync compatible model. LG has some decent 3440x1440 options.
My biggest issue would be getting rid of my current monitor. Selling it on OfferUp or those other apps is a pain, as I've found with other electronics.

I guess I'm not too sure what the consequences of an all-AMD build and keeping this monitor would be. Lots of tearing? Not as smooth? Because everything is silky smooth now.

Then there is the worry about AMD's drivers, although I never had an issue when I had their top-of-the-line 7-series card years ago, before I got the 1080 Ti.
You've got an ultrawide, so obviously you just need to go wider
Good God, man! I use an IKEA Bekant desk, which is about 60 inches long by 30 deep, and I have my flight-gear peripherals and other stuff on the desk as well as decent speakers. This monitor would hog up all that space!

What we need is a 34- or 36-inch 4K with all the bells and whistles (ultrawide would be nice, but I don't think it's an option for 4K).

Something like the LG CX 48 but not as freaking big, and an actual monitor, not a TV. I have an LG C8 TV and absolutely love it.
 

soresu

Platinum Member
Dec 19, 2014
2,950
2,167
136
I bet you now that the whole Infinity Cache thing is helping out a huge deal.
No doubt at all.

Everything I've read from academic papers on the subject of HW RT acceleration (alas with my admittedly poor comprehension of the specifics) is that memory access and bandwidth are key factors in determining the efficiency of an implementation.

If Infinity Cache's main aim is negating memory bandwidth issues then it should work perhaps even better for RT workloads than raster ones.

Perhaps this might negate any need for HBM in APUs, though obviously, for the sake of total platform simplicity in laptops and NUCs, HBM may still have a use, if only to reduce the total PCB footprint and simplify the cooling system.
 

Elfear

Diamond Member
May 30, 2004
7,114
690
126
My biggest issue would be getting rid of my current monitor. Selling it on OfferUp or those other apps is a pain, as I've found with other electronics.

I guess I'm not too sure what the consequences of an all-AMD build and keeping this monitor would be. Lots of tearing? Not as smooth? Because everything is silky smooth now.

Then there is the worry about AMD's drivers, although I never had an issue when I had their top-of-the-line 7-series card years ago, before I got the 1080 Ti. Good God, man! I use an IKEA Bekant desk, which is about 60 inches long by 30 deep, and I have my flight-gear peripherals and other stuff on the desk as well as decent speakers. This monitor would hog up all that space!

What we need is a 34- or 36-inch 4K with all the bells and whistles (ultrawide would be nice, but I don't think it's an option for 4K).

Something like the LG CX 48 but not as freaking big, and an actual monitor, not a TV. I have an LG C8 TV and absolutely love it.

I've been looking for the perfect monitor for the last couple of years, and it seems like there is always a compromise. I finally decided to bite the bullet and get the 48" CX. I'll mount it to the wall right behind my desk so I can sit back a few more inches. Right now I have a 40" 4K monitor stacked on a 35" UW. Both monitors are ~36" away, which works great for me. I think the 48" CX will look great from 40-46" away.

Also, Costco will have a great deal on a 34" Acer UW for Cyber Monday ($350). It checks all the boxes for a UW (144Hz, HDR 400, FreeSync Premium Pro, etc.) except for IPS. It's a new monitor, so there aren't many reviews yet (if any), but generally there is a high likelihood of G-Sync compatibility since it has FreeSync Premium Pro.
 

amenx

Diamond Member
Dec 17, 2004
4,005
2,275
136
I use an IKEA Bekant desk, which is about 60 inches long by 30 deep, and I have my flight-gear peripherals and other stuff on the desk as well as decent speakers. This monitor would hog up all that space!

What we need is a 34- or 36-inch 4K with all the bells and whistles (ultrawide would be nice, but I don't think it's an option for 4K).

Something like the LG CX 48 but not as freaking big, and an actual monitor, not a TV. I have an LG C8 TV and absolutely love it.
Large displays (40" and above) are best wall mounted above a desk. Too impractical to put them on the desk.
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
Large displays (40" and above) are best wall mounted above a desk. Too impractical to put them on the desk.
Yeah, I would agree, except my desk is in front of a very large window. Perhaps mounts that clamp to the desk would work if I ever decide to go 40" or bigger, although I would like to stay under 40" and still go to 4K at some point.
 

zinfamous

No Lifer
Jul 12, 2006
110,802
29,553
146
Large displays (40" and above) are best wall mounted above a desk. Too impractical to put them on the desk.

Yeah, I would agree, except my desk is in front of a very large window. Perhaps mounts that clamp to the desk would work if I ever decide to go 40" or bigger, although I would like to stay under 40" and still go to 4K at some point.


I think this is the best solution that I have seen:


Though I think in that thread, in an earlier post maybe, another member had the same idea but with the display mounted lower, which I think is preferable because that's where you want the center of the image to be. Giant displays like that need to sit a bit lower.

Anyway, I'm also considering going the LG OLED route next year, maybe. I'm not very happy with the super-value Monoprice VA UW FreeSync 100Hz 34" 3440 panel I picked up two years ago. (This was being discussed quite a bit when it came out, because of the price and listed specs... but I think I'm the only one who picked it up, because I am stupid like that, lol. Anyway, FreeSync doesn't really work, the range being super low, and I can't get it to overclock very well, or whatever that's called.)

But if I'm going such a route... that means a 6800XT, plus I will need to drop in at least a 5600X, I think, in this 470 board. I don't want to build something entirely new. I already need to upgrade the memory (16GB PC3000) and storage (at least one more TB via a second NVMe drive, and probably 2 more TB with a SATA SSD) at a minimum. So... this is all already sounding pricey and stupid, so it will probably just stay a fantasy.

Things I actually need to spend money on:

--roof
--house re-pipe
--new dedicated outlets (this is a ticking time bomb of a problem, lol)

and uh, new wheels and tires for the car! Yes!

I'm already planning to buy a PS5 and the new Surface Laptop with AMD whenever that becomes real. ... I kinda want to replace my 10-year-old Polk sound system with a Nakamichi 7.2 or 9.2. This sounds like sacrilege to me, but they are apparently extremely good these days (soundbars, plus the Nakamichi gets two dedicated subs), and I'd really like to downsize in "stuff." Getting rid of the floorstanding fronts, the center, and the receiver (the second Denon in this system... I swear, the PSUs or whatever in these devices have always crapped out on me; this one is also popping on and off at times... of course, it could be the non-dedicated power issues I am having, lol), plus the wall-hanging speakers and the cables to the rear that I still haven't properly dealt with, would be a huge mental benefit in my mind. ... And I kinda also sorta want to go the UST laser route, but I'm not ready to spend that money; or I might just grab the 65" (or 75") LG CX instead (or at least this generation when the next is out, at a discount... hopefully).

Already too much money for things I don't need
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
No doubt at all.

Everything I've read from academic papers on the subject of HW RT acceleration (alas with my admittedly poor comprehension of the specifics) is that memory access and bandwidth are key factors in determining the efficiency of an implementation.

If Infinity Cache's main aim is negating memory bandwidth issues then it should work perhaps even better for RT workloads than raster ones.

IC works for both Raster + RT.

The BVH structure is built on the CPU/driver side and gets sent to GPU VRAM. You've seen how, with RTX on, GPU VRAM usage spikes up by like 1-2GB? Yeah, that's the acceleration structure for that entire scene.

But you don't want to process RT out of VRAM, because there are lots of back-and-forth calculations all the time as rays traverse the structure. VRAM latency is way too high and would kill performance.

What devs do is break the BVH up into small chunks, called treelets, small enough to live in on-chip cache. These get broken down further by the workgroup distributors or GigaThread engine, which send them to the SMs or CUs. There they reside in the SM/CU shared memory pool, for fast access by the "RT cores" and FP32 ALUs.

128MB is a ton bigger than the 4MB L2 of typical GPUs. You can fit many more treelets on chip, so you can get many more concurrent RT workloads done; even if you are slower per RT core (and I'm not assuming RDNA2 is inferior in its RA), more of them can be utilized at any one point in time.
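To put rough numbers on the cache argument above: a minimal sketch, where the 48KB treelet size is a made-up illustrative figure (real treelet sizes depend on the engine and BVH builder):

```python
# Rough capacity comparison between a typical GPU L2 and Infinity Cache.
# The 48KB treelet size is a hypothetical figure for illustration only.

KB, MB = 1024, 1024 * 1024
treelet_bytes = 48 * KB

typical_l2 = 4 * MB          # common last-level cache on earlier GPUs
infinity_cache = 128 * MB    # RDNA2's on-die cache

for name, size in [("4MB L2", typical_l2), ("128MB IC", infinity_cache)]:
    print(f"{name}: ~{size // treelet_bytes} treelets resident at once")
```

The exact counts don't matter; the point is the roughly 32x difference in how much of the acceleration structure can stay on-chip at once.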
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
Smart Access Memory isn't new or exclusive to Zen3+500 chipsets... And it's an open spec.

Source:
That's great information! I hope other players will pick up and support that tech as well. It's also intriguing that AMD's GPU driver under Linux has apparently supported this for some time already, and without the limitations SAM under Windows now has. I guess some reviewer should run benchmarks under Linux testing how different system configurations (Zen 2 vs 3, PCIe 3 vs 4, etc.) affect performance when using ">4GB MMIO".
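One low-effort check on Linux is reading the GPU's BAR sizes out of `lspci -vv` output: a small BAR0 (typically 256MB) means the full-VRAM mapping isn't in effect. A minimal parser sketch; the sample line below is illustrative of the usual `lspci` format, not taken from a specific card:

```python
import re

def bar_sizes_mb(lspci_vv_output: str) -> list[int]:
    """Extract memory BAR sizes (in MB) from `lspci -vv` style output."""
    sizes = []
    for m in re.finditer(r"Region \d+: Memory at \S+ .*\[size=(\d+)([MG])\]",
                         lspci_vv_output):
        n, unit = int(m.group(1)), m.group(2)
        sizes.append(n * 1024 if unit == "G" else n)
    return sizes

sample = "Region 0: Memory at e0000000 (64-bit, prefetchable) [size=256M]"
print(bar_sizes_mb(sample))  # a 256MB BAR0: resizable BAR not in effect
```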
 

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
It would be very low of AMD to try to pass SAM off as a Ryzen 5000/PCIe 4.0 feature when it is a driver/OS feature that works on any modern CPU. Also, resizable BAR is a WDDM 2 feature; Windows has supported it in the driver model for a long time.

I really want to think it is something else at a lower level, lower than the OS.
 

andermans

Member
Sep 11, 2020
151
153
76
That's great information! I hope other players will pick up and support that tech as well. It's also intriguing that AMD's GPU driver under Linux has apparently supported this for some time already, and without the limitations SAM under Windows now has. I guess some reviewer should run benchmarks under Linux testing how different system configurations (Zen 2 vs 3, PCIe 3 vs 4, etc.) affect performance when using ">4GB MMIO".

I think that for D3D 9/10/11 there might also be some driver work needed to actually make good use of the capability. For D3D12/Vulkan, however, that is up to the game (though I wouldn't be surprised if there is a wide set of patterns in how games do allocations that already results in reasonable use in the wild).
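For D3D12/Vulkan, "up to the game" means the app's own memory-type selection decides whether the resizable-BAR heap gets used. A simplified model of that selection logic over hypothetical memory-type data (not real API calls; the sizes and flag names only mimic a Vulkan-style layout):

```python
# Simplified model of how an explicit-API (Vulkan-style) allocator picks a
# memory type. The table is hypothetical, not queried from a real device:
# with a small BAR, the DEVICE_LOCAL|HOST_VISIBLE heap is only 256MB, so
# larger CPU-visible allocations fall back to system RAM instead of VRAM.

memory_types = [
    {"flags": {"DEVICE_LOCAL"},                  "heap_mb": 16384},
    {"flags": {"DEVICE_LOCAL", "HOST_VISIBLE"},  "heap_mb": 256},    # the BAR
    {"flags": {"HOST_VISIBLE", "HOST_COHERENT"}, "heap_mb": 32768},  # system RAM
]

def pick(required_flags, size_mb):
    """Return the index of the first memory type satisfying the request."""
    for i, mt in enumerate(memory_types):
        if required_flags <= mt["flags"] and size_mb <= mt["heap_mb"]:
            return i
    return None

# A 512MB upload-heavy buffer can't fit the 256MB BAR heap here, so the
# app takes a fallback path; a 128MB one lands in the fast BAR heap.
print(pick({"DEVICE_LOCAL", "HOST_VISIBLE"}, 512))  # None -> fallback path
print(pick({"DEVICE_LOCAL", "HOST_VISIBLE"}, 128))  # 1 -> the BAR heap
```

With resizable BAR the second heap grows to cover all of VRAM, so the same selection logic starts succeeding for large buffers without any game changes, which matches the "patterns in the wild" point above.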
 

lightmanek

Senior member
Feb 19, 2017
401
810
136
I would not be so confident that SAM is just a resized BAR window over the bog-standard PCIe protocol. My gut feeling is that once both an RDNA2 GPU and a Zen3 CPU are detected and the motherboard BIOS supports SAM, it engages IF protocols over PCIe.

We have to wait a bit more for a proper RDNA2 deep dive. But first, Zen3 has to launch in 5 days.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
It would be very low of AMD to try to pass SAM off as a Ryzen 5000/PCIe 4.0 feature when it is a driver/OS feature that works on any modern CPU. Also, resizable BAR is a WDDM 2 feature; Windows has supported it in the driver model for a long time.

I really want to think it is something else at a lower level, lower than the OS.
You spend half a grand or more on a graphics card every two years, and people have the nerve to complain about getting a motherboard? You're just like the rest of the crowd... wanting AMD to put pressure on Nvidia to lower prices while having no intention of buying AMD graphics.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
I would not be so confident that SAM is just a resized BAR window over the bog-standard PCIe protocol. My gut feeling is that once both an RDNA2 GPU and a Zen3 CPU are detected and the motherboard BIOS supports SAM, it engages IF protocols over PCIe.
That's my guess as well. I just consider it good news that at least part of this tech appears to be OS- and vendor-agnostic (how much, we'll have to see; as I wrote, I hope somebody takes a closer look at, and benchmarks, the Linux drivers, which appear to have been a testbed for AMD on this). Making good or even inventive use of already existing tech should motivate more market players to do the same (as also happened with other tech like eDP's Adaptive Sync).
 