What makes people NOT believe that RX 5500 will catch up to GTX 1660 Super?
That can also be due to AIB cooler and VRM choices. Some choose to overbuild, some not. I agree that NV is largely sand-bagging when they set their base clocks, because their cards almost always stay boosted, to some extent.

Going back to Navi 14 / GTX 166x.
The specs are comparable, but again I think the sustained clocks will be the difference.
I would love to be wrong though
Ok, I'll bite.

My speculation:
Radeon RX 5300 (1408 SP, 1448 MHz, 4 GB part) ~ Radeon RX 570
Radeon RX 5500 (1408 SP, 1670 MHz, 4 GB part) ~ Radeon RX 590
Radeon RX 5500 XT (1408 SP, 1717 MHz, 8 GB part) ~ GTX 1660
I do not think the 1408 SP parts will catch the GTX 1660 Super. A full-die Navi 14 (RX 5500 XTX?) with 1536 SP would catch the 1660 Super, and such a part would be even faster than the GTX 1660 Ti.
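That full-die scenario can be roughed out from the numbers above. A minimal sketch, assuming throughput scales linearly with shader count at equal clocks (an optimistic upper bound that real games rarely reach):

```python
# Rough relative-throughput estimate: shader count * clock.
# Linear scaling is an optimistic assumption; real-game gains are smaller.
def relative_throughput(sps, clock_mhz):
    return sps * clock_mhz

rx5500_xt   = relative_throughput(1408, 1717)  # cut-down Navi 14 (RX 5500 XT)
full_navi14 = relative_throughput(1536, 1717)  # hypothetical full die at the same clock

uplift = full_navi14 / rx5500_xt - 1
print(f"Full-die uplift at equal clocks: {uplift:.1%}")  # prints "9.1%"
```

So a full Navi 14 buys at most ~9% over the 1408 SP parts before any clock changes, which is roughly the kind of gap being argued about here.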
The fact they (AMD) compared it with the 1650, not the 1660, and definitely not the 1660 Ti. If it was performing at that level they would be comparing it to the 1660.
Does it perform at the GTX 1650 level, then?
Ok, I'll bite.
My speculation:
Radeon RX 5300 (1408 SP, 1448 MHz, 4 GB part) ~ between RX 560 and RX 570
Radeon RX 5500 (1408 SP, 1670 MHz, 4 GB part) ~ Radeon RX 580
Radeon RX 5500 XT (1536 SP, 1717 MHz, 8 GB part) ~ Radeon RX 590
I expect to see the 5500 a week before Thanksgiving. I don’t know if we’ll see the 5500 XT this year or next, same with the 5300 cards. I’m still expecting a future RX 5600 (Navi 12?) to fill the $250 price point, but I do not expect to see it until 2020.
I hope I'm wrong and you're right. We'll see.
I think you are too low in your estimates, but time will prove someone right... which might be neither of us
And I'm very happy to say I was quite surprised. AMD came back strong, much to my delight. But you're now missing the point. The 5700 XT beat the RTX 2070, not the RTX 2070 Super. And in line with that same idea, the RX 5500 will beat the GTX 1650 but not the 1650 Super, and the RX 5500 XT (if it ever gets here) will beat the GTX 1660 but not the GTX 1660 Super.

The same discussion before was that there was no way the RX 5700 XT could be faster than the RTX 2070. Then it was that it could not be more efficient than the RTX 2070.
Oh how that aged well.
Let me quote exactly what I believe will happen:
This is my opinion on what will happen with the RX 5500:

RX 5500 4 GB: $150 - GTX 1660 performance.
RX 5500 XT 8 GB: $199 - GTX 1660 Super/GTX 1660 Ti performance.
Honestly I wouldn’t put any weight on those results. Laptops are so variable; each one changes the power and speed, and they have different cooling. Even if the mobile 5500 got its butt kicked up and down the hall, I wouldn’t assume the retail 5500 would perform similarly.
So it appears that the GTX 1660 Ti Max-Q is on average 12-14% faster than the RX 5500M, based on the MSI Alpha tests.
And we still have to remember that the RX 5500M is hampered by its CPU, while the GTX 1660 Ti Max-Q was not (Notebookcheck has had Nvidia GPU laptops paired with Intel CPUs). That is not bad.
The most interesting result for me is Overwatch: the GTX 1660 Ti Max-Q is just 6% faster.
A lot of games show extremely bad results for the RX 5500M (slower than a GTX 1650), which in the real world should be complete BS, but knowing the state of AMD's drivers for this GPU, it is quite possible.
If you're going to extrapolate from laptop results, you need to take into account how much further clocks can go from their mobile variants.
And what data have you to judge the actual clock speeds of the RX 5500M in that specific notebook?

You just have to look at the specs and you will see your comparison is off by quite a bit.
The Max-Q variants of the chips have a lower TDP than the regular laptop variants, and if we look at the clocks of the Max-Q we will see they are depressed much further than the RX 5500M's: 1130-1335 MHz for the Max-Q versus 1327-1645 MHz for the 5500M. This is reflected in the laptops' power consumption; the 1660 Ti Max-Q laptop consumed 30% less power than the 5500M. The closer match is the regular 1660 Ti mobile, which has clocks in the 1455-1590 MHz range.
The final nail in the coffin is the TGP of the chips: the GTX 1660 Ti mobile is rated at 80 watts, the 5500M at 85 watts, and the Max-Q at 60 watts.
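Those TGP ratings line up with the measured power gap mentioned above; a quick sanity check using only the numbers in this post:

```python
# TGP ratings quoted above, in watts.
tgp_max_q = 60    # GTX 1660 Ti Max-Q
tgp_5500m = 85    # RX 5500M

saving = 1 - tgp_max_q / tgp_5500m
print(f"Max-Q is rated {saving:.0%} below the RX 5500M")  # prints "29%"
```

A ~29% lower rating matches the ~30% lower wall-power draw observed in the Max-Q laptop, so the two figures are telling the same story.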
There are a lot of lines of analysis here. With that 9880 Intel chip at 35 W cTDP and 45 W TDP, and if it still operates on USB-C power (100 W max), then this makes the power draw of the 5500M interesting to say the least, at 24 CUs.

RX 5300M - 4 GB GDDR6, 20 CUs.
RX 5500M - 4/8 GB GDDR6, 24 CUs.
Both models are in new MacBook Pro 16.
The GPUs also have 189 GB/s of bandwidth available, 3 GB/s less than the Vega 20 Pro had.
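The 189 GB/s figure also lets you back out the effective memory data rate, assuming Navi 14's 128-bit GDDR6 bus (the bus width is my assumption, not stated in the post):

```python
# bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
bus_width_bits = 128   # Navi 14's GDDR6 bus (assumption)
bandwidth_gb_s = 189   # from the post

data_rate_gbps = bandwidth_gb_s * 8 / bus_width_bits
print(f"Effective data rate: {data_rate_gbps:.2f} Gbps per pin")  # prints "11.81"
```

That works out to just under the common 12 Gbps GDDR6 bin, which would suggest slightly downclocked memory in these configurations.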
About 5 TFLOPS of gaming power from a GCN perspective then; not bad for 50 W.

1536 ALUs, 1.3 GHz, 189 GB/s of bandwidth, 4 TFLOPS of compute power.
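The 4 TFLOPS in that spec follows from the standard peak-FP32 formula (each ALU retires one FMA, i.e. 2 FLOPs, per clock):

```python
# Peak FP32 = ALUs * 2 FLOPs per clock (FMA) * clock in GHz / 1000, giving TFLOPS.
alus = 1536
clock_ghz = 1.3

tflops = alus * 2 * clock_ghz / 1000
print(f"Peak FP32: {tflops:.2f} TFLOPS")  # prints "3.99"
```

So the quoted ~4 TFLOPS is simply the theoretical peak at 1.3 GHz; higher sustained clocks in the desktop parts would push it up proportionally.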