happy medium
Lifer
- Jun 8, 2003
- 14,387
- 480
- 126
Yessir.
Why would it not be lol.
OK, just for the sake of asking: how many volts? Can it reach 1000 core?
Way to miss the point that was stated quite clearly in a bunch of posts. I own two EVGA 580s superclocked, want a photo?
Some of us care that faulty reasoning is being employed to try to cast a card in a positive light - that's it. Plenty of the Nvidia-friendly posts seem to highlight Nvidia cards and overclocking while failing to realize that AMD cards also overclock. Is that news to you, or something?
I didn't wanna mess around with the voltage to find the lowest stable one, so I shot it at 1.3V. 1000 core is not my max OC, though; I just used it for the sake of comparison :thumbsup:.
Is there a program out there that you can use to see how much more wattage you use? If you find one, let me know. I saw a review with a 6950 at 1010 core at 1.3V using 300 watts. I can't find it now; Castiel and I were talking about it.
I noticed there are some still in denial that the GTX 460 FTW was a retail card.
At that resolution, with these cards, I'd be using 8xAA or perhaps SSAA.
@1920x1080 the GTX 580 is 17% faster than a GTX 570 on average.
Is 17% "way faster"? With video cards so close in performance today, that is damn close to way faster.
SSAA in most games, maybe even 4x in some. It's actually quite fun dragging out your older games when you get a new video card, seeing ridiculous eye candy on 16x10 is a blast!
For a few more dollars ($13?) you can just buy a Gigabyte SOC guaranteed at 1000 core, and it runs much cooler and quieter and uses much less power.
I can get it up to 1500 easy.
I just didn't because it doesn't have that big of an impact, BRB testing.
Stop spreading misinformation. Most data will point you to the opposite conclusion: the 6950 uses less power.
Hmm...I thought the memory not clocking high was the reason some people were killing their 6950s when unlocked. Err...maybe it was the memory timings.
The 6950s running at 1250mem are at the rated speed of the chips (5Gbps), aren't they? Also remember GDDR5 has some sort of error correction that underclocks itself even if you think you're running much higher.
It was the memory timings and the voltage that killed the memory in 6950s flashed to the 6970 BIOS.
Find me a factory-overclocked 6950 at 1000 core that uses 195 watts and I'll stop spreading the so-called misinformation.
Here is a GTX 560 at 1000 core using 195 watts under load and 25 watts idle.
http://www.guru3d.com/article/gigaby...i-soc-review/7
@happy_medium, a 6950 will also perform better when it's clocked at 1000 core, though, right?
You have to be joking. I just showed you a graph of both a 560 and a 6950 and how much power each uses per clock speed and voltage increase.
Also, it would be pointless to compare any results I find from another site to the ones you are looking at from Guru3D. Guru3D uses a testing methodology that they do not release to the public, as they state right there on that very page. Since different workloads stress video cards differently, comparing results for two different cards across websites is meaningless.
Also, that is Guru3D's estimated TDP, not an actual measured value for the card itself. They base that number on power consumption measured at the wall, and those readings are affected by the efficiency curve of the power supply.
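The point about wall-socket estimates can be sketched with some made-up numbers. Everything below (wattages and PSU efficiencies) is an assumption for illustration, not Guru3D's actual figures or methodology, which they don't publish:

```python
# Hypothetical wall readings and PSU efficiencies, purely for illustration.

def estimated_card_power(wall_load_w, wall_idle_w, psu_efficiency):
    """Estimate a GPU's power draw from at-the-wall measurements.

    The wattmeter sees AC power including PSU conversion losses, so the
    DC-side delta is the wall delta scaled by the PSU's efficiency.
    """
    return (wall_load_w - wall_idle_w) * psu_efficiency

# Identical 230 W wall delta, two plausible efficiency assumptions:
print(round(estimated_card_power(380, 150, 0.82), 1))  # 188.6
print(round(estimated_card_power(380, 150, 0.90), 1))  # 207.0
```

The same wall reading yields an ~18 W spread depending on the assumed efficiency curve, which is why one site's "estimated TDP" isn't directly comparable to another's.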
So to recap...
1. 195W is an ESTIMATE - an educated guess.
2. They use a different testing methodology than other sites.
The only way to get a valid comparison is for Guru3D to test their own 6950 under the same conditions they test other cards. They actually do test the 6950, but it's at stock, and it uses less power than even the stock 560. So I'll reiterate... stop spreading misinformation (or rather, uninformation? unknownformation?).
You have absolutely no ground to stand on. That doesn't make you look good. And... stealth edits. Read them.
No, I stand by what I said.
Just from the Guru3D link you gave, the stock 6950 is using ~40W less than the 560 SOC (and those are "estimates", by the way). Considering a 1GHz-core 6950 is faster than a 1GHz 560, judging by power consumption alone is hard to do. You would have to equalize the performance, then check the power consumption, which I don't think anybody has done.
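"Equalize the performance, then check the power" amounts to comparing performance per watt. A minimal sketch, with entirely invented FPS and wattage numbers (nobody in this thread measured these):

```python
# All numbers below are invented placeholders, not measurements.

def perf_per_watt(avg_fps, card_power_w):
    """Frames per second delivered per watt of card power."""
    return avg_fps / card_power_w

# A card drawing more watts can still be more efficient if it's faster:
card_a = perf_per_watt(60.0, 195.0)  # slower card, lower power draw
card_b = perf_per_watt(68.0, 210.0)  # faster card, higher power draw
print(card_a < card_b)  # True
```

The point is only that raw wattage at a given clock doesn't settle the argument; the faster card can come out ahead once you normalize by performance.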
Here's some more numbers:
http://www.tweaktown.com/reviews/3796/gigabyte_geforce_gtx_560_ti_1gb_soc_video_card/index18.html
The SOC and a 6950 at 970 use almost the same power, but at that speed the 6950 would probably be faster anyway.
That's a good link. Now that's a 9-watt difference from the 6950 at 970 core to the GTX 560's 1000 core. Add another notch of voltage and up the clocks 30 more on the core for the 6950 and voilà, you have more than 9 watts EASY.
Thanks for making my point. I rest my case.
Edit: we were not talking about which card was faster. At least I wasn't; I really don't know. My argument started at post #112.
My point was you can max both the 6950's and the GTX 560's clocks and the GTX 560 will come out ahead in every category: price, performance, noise, temps.
This has to be a joke... they OCed the GTX 560 so much yet left the 6970 practically stock?
While the comparison between a stock HD6970 and an overclocked GTX 560 does seem unfair,
2. The GTX 560 is way overclocked compared to the 6970, which has a mere 10MHz. (Oops!)
No doubt the 560 is a good chip, so there is no need for the results to be swayed like this.
Comparing a stock card to one that is overclocked 200MHz is pretty stupid.
Sorry for my English.
At a res that no one buying a 6970 would play at? No OC on the 6970. They cherry-picked a couple of benches. It's crap.
ASUS Radeon HD 6870 DirectCU 915/1050 @ 1000/1120 & 1680*1050
2010-11-10 1746 - metro2033
Frames: 7285 - Time: 221103ms - Avg: 32.948 - Min: 18 - Max: 69
Comparing chart:
2011-01-29 10:08:22 - metro2033
Frames: 7200 - Time: 228679ms - Avg: 31.485 - Min: 15 - Max: 65