That power consumption would be real bad. May as well just do Ampere rebrands if that was even remotely accurate.
Do they really have 7 different chips? That's never happened before.
What 7 chips? Do you mean die or product?
Also, knowing the performance numbers now for the lower-end chips, which might take up to a year to release, seems a bit suspicious.
Color me skeptical about this.
Ok, I see, the two different versions of 104 and 106 are just bins then basically?
There are almost always cut-down versions of chips that don't meet the full specs. A 4060 Ti, for instance, is most likely a cut-down 4070.
I imagine they validate multiple different cuts for every die, and make the decision on what actual cut to use for a given product later.
They've probably already decided. They can calculate the defect rates and such.
Only the 4090 is locked in presumably. All the others could change.
They already know what percentage of AD102s they need to bin down to a 4080 Ti if they cut the 4090 a certain way, and how large the gap would be to a nearly full AD102, which will go into the 4090 Ti. Then a nearly full AD103 goes into the 4080, with a moderate cut for the 4070; again, they can calculate the percentages that go into each bin.
Latest rumors have the 4080 being a 10% AD103 cut and the 4070 being a mostly full AD104. That could change of course.
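The "they can calculate the defect rates" point above can be sketched with a textbook Poisson yield model: the fraction of defect-free dies falls off exponentially with die area, and everything else is a cut-down candidate. The defect density and die area below are illustrative assumptions for the sake of the example, not confirmed Nvidia or TSMC figures.

```python
import math

def die_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Poisson yield model: fraction of dies expected to have zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Assumed numbers only: ~0.1 defects/cm^2 for a mature node, and a large
# flagship-class die of ~6.0 cm^2. Neither is a confirmed spec.
full = die_yield(0.1, 6.0)
cut_down_candidates = 1 - full  # dies with at least one defect, candidates for lower bins
print(f"full dies: {full:.1%}, cut-down candidates: {cut_down_candidates:.1%}")
```

With numbers like these, roughly half the wafer's dies have at least one defect somewhere, which is exactly why planning several validated cut levels per die makes sense.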
That would work if the 4080 Ti is a full AD103, rather than an AD102 that was attacked by sharks with lasers.

Sharknado 7 sounds epic!
OK, this is a fun rumor. nVidia might have a 4080 16 GB and a 4080 12 GB, the latter being the full AD104 SKU rumored earlier. Which, yes, would have fewer shaders than the 3080.
As to why they won't call it the 4070 Ti, perhaps they are concerned about what people think of the 4080 16 GB's price... so making a cheaper model still called 4080 makes everything okay.
So this rumor is saying it's going to be like the GTX 1060 3GB vs 6GB all over again? Where the 3GB had fewer shaders than the 6GB?
It would not surprise me if they wanted to say, "Look, we're still offering a xx80 class GPU at the low, low price of $699," until you find out it's the cut-down AD104 12 GB version, and the true xx80 AD103 16 GB version is $999.
Was thinking the same. A new low in deception. Desperation setting in?
Hm, maybe it is legit after all... Basically the Ampere cooler but with a bigger fan, revised fan blades, and probably a thicker heat sink as well.

Another day, another rumor. This time, it's about the cooler itself. New font and 3 slots, apparently. The fan has a new blade design as well. Take with a grain of salt; the fan looks like it clips through the frame, so it could be shopped.
[Attached images: the rumored cooler, and the Ampere cooler for comparison.]