cbn
Lifer
- Mar 27, 2009
Edit: Would be funny to see the Asus Voltage Mod version, claiming an extra 50% of power! Just think about that for a bit..
I think that could be an awesome part if coupled to a full cover waterblock.
Also, I don't think anyone else mentioned this: 90C idle with dual displays.
Wasn't that 50% faster? And I believe it referred to the length of time it took to fry up a bacon strip.
Just imagine the extra power consumption, though. The GTX 480 already uses 110-120 watts more than a 5870 under load, so increasing the voltage and overclocking could push it to insane levels.
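To put rough numbers on that: dynamic power scales roughly with frequency × voltage². Every figure below is an assumption for illustration (stock board power, clocks, and voltages are placeholders, not measured values); only the "110-120 W more than a 5870" delta comes from the thread.

```python
# Back-of-envelope: dynamic power ~ frequency * voltage^2.
# All numbers are illustrative assumptions, not measurements.
stock_power_w = 320      # assumed GTX 480 board power under load
stock_voltage = 1.00     # assumed stock core voltage (V)
stock_clock = 700        # assumed core clock (MHz)

modded_voltage = 1.15    # hypothetical voltage-mod setting
modded_clock = 800       # hypothetical overclock

scale = (modded_clock / stock_clock) * (modded_voltage / stock_voltage) ** 2
projected_w = stock_power_w * scale
print(f"projected load power: {projected_w:.0f} W")
```

Even a modest voltage bump plus overclock lands well past where the stock cooler is comfortable, which is why a full-cover waterblock keeps coming up.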
When the 4870s were being blown out for under $150 AR about a year ago, that was the best value in a video card since the Voodoo3/TNT2 days.
Yeah, and I don't like the idea of having to overclock my video card to get a decent performance boost over what I currently have.
It wasn't exactly as I thought before; it's the GTX 2x0 vs. HD 4x00 series all over again. It was close then too, but this time the performance difference is even smaller, with ATi's HD 5850 being slightly faster overall than the GTX 470 and the GTX 480 being slightly faster overall than the HD 5870. Will an HD 5870 2GB change that?
If you look at the HardOCP review, the impression is even worse than the already bad one from Anand's review, because HardOCP raised the image quality to the maximum possible on each card, and what nVidia is offering now is fairly unimpressive.
You didn't think the DX11 scores were impressive? Since you're in the industry, isn't it logical to assume that as people get more familiar with the UDA and an improving ISA, the performance will get even better?
I got in trouble on hardforum for trying to tell some members there that the UDA isn't just drivers; it's a continually improving API that provides a sort of ISA for the nVidia video card. Since they've adopted a processing model that's backwards compatible, where older units decode non-native instructions into older but appropriately native ones (incurring a penalty, since what's an atomic operation on a newer architecture may not be one on the older ones), it will get better.
I don't think that anyone can really justify the GTX 480 unless it's at MSRP, but since I was debating scrapping my WC setup that uses a PA120.3 for 2 GTX 280 H2OCs, I'm going to stay the course.
I feel a watercooled setup will provide the performance (albeit more expensive) while giving me the flexibility I want. I want to start toying with the SDK a bit more, and it's good to know that as the drivers get better we're going to see near-linear performance scaling among their offerings.
The GTX 480 scales linearly with chip frequency: an 18 percent increase results in an equally large performance benefit. http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=16
I have a GTX 260 black Edition and I think it's a little bit faster than your 4850.
Linear performance scaling never happens unless there's an extreme bottleneck. Those numbers seem fishy to me, because the HD 5870 had a much higher overclock and didn't yield the same scaling results. So far, PCGH's is the worst review of Fermi I've seen.
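The claim is easy to sanity-check: scaling efficiency is just the percentage performance gain divided by the percentage clock gain, and 1.0 means perfectly linear. The clocks and fps figures below are hypothetical stand-ins (PCGH only reported the 18% figure, not these exact numbers):

```python
# Scaling efficiency: 1.0 = perfectly linear with clock speed.
# Clock and fps values are hypothetical, chosen to match an 18% bump.
def scaling_efficiency(base_clock, oc_clock, base_fps, oc_fps):
    clock_gain = oc_clock / base_clock - 1
    perf_gain = oc_fps / base_fps - 1
    return perf_gain / clock_gain

# An 18% clock increase yielding 18% more fps would back PCGH's claim:
eff = scaling_efficiency(700, 826, 50.0, 59.0)
print(f"scaling efficiency: {eff:.2f}")
```

Run the same check on a card that gains, say, 10% fps from an 18% overclock and you get ~0.56, which is the more typical real-world result when something other than the core clock is the bottleneck.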
Wow, that is ridiculous. That has to be some kind of driver bug, right? I mean, 90C at idle just because of a 2nd display? Can you imagine the heat & power to run nVidia's version of Eyefinity? D:
5850 here I come.
It's a bug. Nvidia admitted it existed to complaining customers, but not in public AFAIK. They claimed more than a year ago that it was on the bug list, but they never fixed it.
No Nvidia card can get into 2D power-saving mode when a second screen is active in Windows. It's not half as bad as that bug where the fan always spins at 40% and you have to manually monitor temps and adjust fan speed to stop the card from catching fire...
^ What he said. It's not *impossible* to get linear performance scaling, but it sure is hard to do in the real world. You can try, but the amount of optimization required defeats the purpose of a good API.
Also, I don't think anyone else mentioned this: 90C idle with dual displays.
That's a bit of a killer, although at least NV aren't pushing multiple displays, unlike ATI.
If NV were trying to say "woo look at our display outputs and you can game with 3 monitors" then it would be quite amusing.
That's a deal-breaker for me. +80W just because there's an extra screen attached? Hell no.
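For anyone wondering what that extra draw actually costs: here's a rough yearly estimate. The 80 W figure is from the thread; the on-time and electricity rate are assumptions you'd swap for your own.

```python
# Yearly cost of ~80 W of extra idle draw.
# Hours per day and price per kWh are assumptions, not facts from the thread.
extra_watts = 80
hours_per_day = 8        # assumed desktop on-time
rate_per_kwh = 0.12      # assumed electricity price (USD)

kwh_per_year = extra_watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/year, ~${kwh_per_year * rate_per_kwh:.0f}/year")
```

Not ruinous on its own, but it's pure waste at idle, and the heat dumped into the case (and room) is arguably the bigger problem.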
In general the numbers are what was rumored. It's equal to or faster than a GTX 295, which is fine in terms of performance, I guess. But the card is 6 months late, and waiting for it was not worth it imo. You could have had roughly the same speeds from AMD for the past half a year (HD 5870). Not to mention an HD 5970 totally destroys it. Yeah yeah, blah blah blah, two GPUs, blah blah blah; that didn't stop some green folks from saying a GTX 295 is better than an HD 5870 because it was still the fastest.
All in all, we can expect high prices on graphics cards in the near future, which is a bummer.
I hate it that they launch on a Friday. Why don't they do it on a Monday?
Has anyone considered that the ambient temps during reviews are in the low 20s (°C)? Temps here in Africa usually hover around 35-40°C, so it could become a serious problem.
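A crude way to reason about it: at a fixed fan speed, GPU temperature tracks ambient plus a roughly constant delta. The 90 C idle figure is from the thread; the ~22 C review-room ambient is an assumption.

```python
# Crude model: GPU temp ~ ambient + constant delta at a fixed fan speed.
# 90 C idle is from the thread; 22 C review-room ambient is assumed.
review_ambient_c = 22
reported_idle_c = 90
delta_c = reported_idle_c - review_ambient_c   # ~68 C over ambient

for local_ambient_c in (35, 40):
    print(f"{local_ambient_c} C room -> ~{local_ambient_c + delta_c} C idle")
```

That puts the projected idle temp over 100 C in a hot room, before the card even starts doing anything. In practice the fan would ramp up and claw some of that back, but it shows why ambient matters so much here.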