biostud (Lifer, joined Feb 27, 2003)
At least I don't have to regret buying a 6800XT in January
> What's waiting 8 months when you could have got a GPU that's 3% faster?

I mean, it's got the new shiny display bits, and a couple of years down the road, if you are looking at used cards, these are going to be more desirable than an older card (obviously?), but yeah... clearly engineering challenges were encountered.
> At least I don't have to regret buying a 6800XT in January

Same! For both me and my daughter!
> The reviews are pretty bad. Basically RDNA 3 is enhanced 7nm that they call 6nm and Nvidia is running the real 5nm stuff that is enhanced that they call 4nm. The performance between the 6800xt and 7800xt is negligible at best and the power efficiency is due to RDNA 3 being on slightly more efficient 7nm silicon.

N32 is mixed 6nm (memory controller dies) and 5nm (compute die). Nvidia doesn't call their customized 5nm "4nm" but "4N", and notably this is not TSMC's N4. In either case, no idea what you're talking about; they both have the "real 5nm stuff". Basically, RDNA 3 is not enhanced 7nm except for N33.
> N32 is mixed 6nm (memory controller dies) and 5nm (compute die). Nvidia doesn't call their customized 5nm "4nm" but "4N" and notably this is not TSMC's N4. In either case, no idea what you're talking about they both have the "real 5nm stuff". Basically RDNA3 is not enhanced 7nm except for N33.

You are arguing semantics here. What AMD is doing with their silicon is very confusing. It should be the same silicon that AMD uses in Zen 4, but it's not. Some of their GPUs are based on N6, which is what I was referring to as enhanced 7nm silicon, which some call 6nm.
> You are arguing semantics here. What AMD is doing with their silicon is very confusing. It should be the same silicon that AMD uses in Zen 4 but it's not. Some of their GPU's are based on N6 which is what I was referring to as 7nm silicon enhanced which some refer to as 6nm.

No, I'm not arguing semantics. N32 benefits from "being on the most advanced TSMC silicon". Unfortunately, AMD bungled it. But it's not the process's fault, just their implementation.
My point is that Nvidia is benefiting from being on the most advanced TSMC silicon short of 3nm, which doesn't release until the iPhone 15 later this month or next month. Apple referred to the N4 TSMC silicon as 4nm when they released the iPhone 14 last year.
> I could be wrong but the 7800xt and 7700xt appear to be on the N6 silicon. The 7900xtx is on the same 5nm TSMC silicon as the Zen 4 CPU's.

You are wrong. N32 follows the same arrangement as N31, but with 4 MCDs and a smaller GCD. But it's pushed further out of its efficiency curve because it only has 60 CUs.
Apple introduced the A16 Bionic chip with the iPhone 14 Pro and iPhone 14 Pro Max last year. Apple claims that it is a 4nm chip because it uses TSMC's "N4" process, but in reality it is made with an enhanced version of TSMC's 5nm N5 and N5P processes.
You can clearly see the power savings the 40 series cards have over the RDNA 3 cards. The efficiency gains from the RDNA 2 6800xt to the RDNA 3 7800xt are not very substantial. I could be wrong, but the 7800xt and 7700xt appear to be on the N6 silicon. The 7900xtx is on the same 5nm TSMC silicon as the Zen 4 CPUs.
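The efficiency point above comes down to simple arithmetic: efficiency is performance per watt, so a card that is only slightly faster while drawing similar power yields only a small efficiency gain. A minimal sketch of that comparison, using made-up fps and wattage numbers purely for illustration (not measured figures for any of these cards):

```python
# Hypothetical illustration: efficiency = performance / power.
# The fps and watt values below are invented, not benchmark data.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

old = perf_per_watt(100, 300)  # stand-in for a last-gen card
new = perf_per_watt(105, 290)  # stand-in for a slightly faster successor
gain = (new / old - 1) * 100
print(f"efficiency gain: {gain:.1f}%")  # a few percent, i.e. "not very substantial"
```

With a 5% performance uplift and a ~3% power reduction, the combined efficiency gain is still single-digit, which is why a full node shrink would normally be expected to deliver more.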
> At least I don't have to regret buying a 6800XT in January

I sold my 6800XT to a guy in April for $600 CAD; he is laughing right now.
> At least I don't have to regret buying a 6800XT in January

Yeah, nothing I have seen since makes me regret buying a 6700 XT last November.
Ok, performance lands about where expected, a bit slower than a 6800xt. Power use is sadly a bit high; yes, better than a 6800 XT, but not by much for a new node.
But here they are not listed yet, so no price yet, but given some mentions of pricing below the US dollar price, I'm optimistic it might actually be cheaper than I expected.
> I hate Nvidia but they are damn good at what they do.

I don't hate any company, but I hate the lack of competition...
Forget the Ada series; if a year ago anyone on this forum had even dared to speculate that any RDNA 3 SKU would have worse efficiency than an Nvidia Ampere SKU, they would have gotten laughed at/banned, yet here we are...