VGA and DVI are dead; manufacturers need to stop putting them on new displays and video cards. This!!!
PS. Thanks for this thread, Sweepr.
Edit: For the 1050 to truly "replace" the GTX950, it would have to support VGA monitors. Does it?
Kinda disappointed about the clocks. With this architecture these should be capable of like 1.8GHz at decent power consumption, should they not? That could possibly justify a 6-pin version.
Why are the boost clocks of GP107 300+ MHz lower than those of GP106 and GP104? Is GP107 manufactured on Samsung 14LPP? Heck, if that's true then we can clearly see that TSMC 16FF+ is more like a half node ahead of 14LPP in terms of transistor performance. If so, it would be interesting to see Vega built on TSMC 16FF+, which I think is likely to be the case, as TSMC has superior transistor performance and much better yields, especially for large-die GPUs.
Can we not turn this into a Vega discussion? Thread's about the GTX 1050
Anyway, even on topic: what's the reason for these much lower boost clocks? I'm thinking the 14LPP process is the factor at play here.
I know there were rumors that nVidia were going to do GP107 and GP108 at SS. But I don't think that's the case. Pascal Refresh OTOH...
There is no Pascal refresh. Also 14LPP is an inferior process, so why would a hypothetical (and non-existent) Pascal refresh be manufactured on that process over 16FF+?
I would guess power consumption, quite frankly. They need to get it comfortably sub-75W so that it not only doesn't require a PCIe connector but, out of the box, has overclocking headroom for enthusiasts/geeks to mess around with.
The lower power consumption should also help for thin and light gaming capable notebooks.
That said, I hope NVIDIA sticks with 16FF+ on this one, it's clearly the superior process to 14LPP. But, who knows. We'll find out soon-ish.
I'm sure they want to release something new to consumers next year, and it's not going to be Volta. Samsung's process is denser, so they would be able to fit more cores in the same die area.
SS's process is worse, but it's not as bad as you think. It's GloFo's bastardization of Samsung's node that's bad.
Doesn't matter, this is just an example where shading performance doesn't translate to overall gaming performance. And you don't have data to prove that shading (or bandwidth) will be the bottleneck regarding 1050 vs 1060, as you carefully stated here:
You're just assuming the former is the case, and doing some pessimistic performance predictions based on it. I think it will be closer to a 960 than you expect.
It would <profanity redacted> if it was actually released at the 950 price point; that card was massively overpriced at $160 for the worthless 2GB version and $190 for the 4GB version. Something like this needs to be in the $100 to $110 price range to compete with the RX 460.
Why would nvidia lower the price point of the 1050, making it cheaper than the 950, which was a pretty successful card? As for the RX 460, it's just about able to compete with the current 950, so it's hardly like AMD is going to be putting much pricing pressure on the significantly faster 1050. Sure, we'd all like the 1050 to be cheaper, but I've got to be realistic: without sufficient competition I expect the price to go up slightly.
Yeah, if the 1050 outperforms RX 460, it's going to be priced higher. Period.
Profanity is not allowed in the technical forums.
Markfw900
That's because the RX 470 is the competitor of the 1050... not the 460. And the competitor of the 460 is supposed to be the GT 1040, and I see nVIDIA releasing it along with the 1030 as the bottom of the barrel. For sure it will beat the 460 easily. Well, that was not a very hard thing to do, I guess... Let's just see how much nvidia tax will be imposed.
Can't you use a DVI to VGA adapter? And besides, what is so great about VGA anyway? Anyways, I'll be the first to tell you that I'm not really a big NVidia fan, but I'm going to wait to see how this 1050 turns out. I was considering an RX 460 4GB Gigabyte dual-fan model, but they canned the VGA output capability, apparently.
I'm expecting $150 for 2GB and $180 for 4GB and GTX 960 performance.
Problem is, at $180 it sits way too close to the RX 470 4GB and the 1060 3GB for way less performance.