I've been looking closely at what he's been saying for two months or so. It seems he knew the details that leaked recently a long time ago.
I think sushiwarrior said he had a friend who worked at AMD, IIRC.
Bring it on, Sushi. What else you got for us??
Because non-boost GPU clocks don't matter since the card boosts out of the box. So why would I care to link non-boosted clocks? It's just as pointless as linking non-boost GPU clocks for 680/770/780 cards. No one cares. What matters is the minimum boost clock. For gamers the spec that actually matters is the in-game boost clock, kinda like a GTX 770 operating at 1136-1241MHz out of the box, or a reference 780 hitting 990MHz.
http://www.anandtech.com/show/7392/the-geforce-gtx-770-roundup-evga-gigabyte-and-msi-compared/7
Did you remove the GTX 680's boost clock of 1058MHz too? What's next, are you going to tell us to ignore the R9 290X's boost clock of 1GHz, ignore Intel CPUs' turbo clocks out of the box and only focus on the non-boost clock? Ya ok.....that is not logical unless you are insinuating false advertising by AMD/Sapphire.
You made a claim that R9 280X will be slower than 7970GE and I showed you that in Sapphire's stack 2 of 3 cards will have faster clocks.
Kinda like after-market 760/770 cards often selling for $0-20 premiums, or kinda like after-market 7950/7970 cards that happen to have rebates negating the price increase over reference models, which implies there will be R9 280X cards with 7970GE speeds for $300. Kinda like the Gigabyte GTX 670 Windforce offering boosted clocks, an after-market cooler and upgraded VRMs for $0 extra, available for sale on launch day.
And your point is?
R9 280X ~ 7970GE implies that for $50 more than a 760 2GB you get a card that's 24% faster at 1080p and 32% faster at 1600p. Game over, 760:
http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-760-im-test/4/
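To spell out the value math being claimed above, here's a back-of-the-envelope sketch. It uses the $50 premium and the 24%/32% figures from the post; the ~$249 GTX 760 price is an assumption, not something stated in this thread:

```python
# Rough price/performance comparison using the figures claimed in this thread:
# GTX 760 2GB assumed at ~$249, R9 280X at the rumored $299 (i.e. ~$50 more),
# with the 280X claimed to be ~24% faster at 1080p and ~32% faster at 1600p.
gtx760_price = 249.0
r9_280x_price = 299.0

price_delta = (r9_280x_price - gtx760_price) / gtx760_price
print(f"Price premium: {price_delta:.0%}")  # ~20% more money

for res, perf_gain in [("1080p", 0.24), ("1600p", 0.32)]:
    # Relative performance per dollar vs the GTX 760 (>1.0 means better value).
    perf_per_dollar = (1.0 + perf_gain) / (1.0 + price_delta)
    print(f"{res}: {perf_gain:.0%} faster, {perf_per_dollar:.2f}x the perf/$")
```

Under those assumptions the 280X comes out ahead on perf/$ at both resolutions, more so at 1600p.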
People on our forum have pointed out, for 18 months in a row, the HD7970/GE's market pricing that undercut the 680/770, and yet NV users would say those prices don't matter since we are cherry-picking the lowest SKUs with MIRs.
Now the R9 280X's official MSRP will be $299, undercutting the 770 by $100-150 straight up. That means within 2 months R9 280X cards with rebates will be hitting $279-289, making the 760-770 completely irrelevant starting next week unless NV drops prices.
A little off topic, but does anyone know when the new Ruby demo will be available for download?
Not sure why you try to over-dramatise it. And the boost speed is not certain; if it was, it wouldn't be called boost speed. So I don't get why you try to compare the two. A regular R9 280X will not perform like a HD7970GE, it will be slower, like a non-GE. And that ruins your GTX 760 comparison as well. Also, the 280X is not $100-150 below the GTX 770, it's $100. It's not that hard to calculate: 299 vs 399.
Making more bold claims? Slower than 7970GE. Tagging this for tomorrow.
What happens tomorrow?
Regular R9 280X: 870MHz base clock, 1070MHz boost. HD7970GE: 1000MHz base clock, 1050MHz boost.
I can't stand this base clock/boost clock nonsense from nV and AMD. Assuming the specs are the same between the 2, wouldn't the higher boost clock lead to higher performance?
Yes, yes it will. So I'm not sure what exactly the point of arguing over the base clock is. With Kepler, the base clock is pretty much a worthless data point that everyone ignores, and the 280X is the same in that respect. The boost clock is what will be used in 3D games that need the horsepower. Flash games played in a browser window within Chrome might use the base clock. And...........who cares. I think the answer would be no one.
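For a rough sense of scale, the boost numbers quoted earlier (1070MHz for the 280X vs 1050MHz for the 7970GE) work out to about a 2% clock advantage, assuming both cards actually hold their advertised boost and everything else is equal:

```python
# Naive clock-ratio comparison using the boost figures quoted in this thread.
# Real-world scaling won't be perfectly linear; this just bounds the
# clock-only difference if both cards sustain their advertised boost.
r9_280x_boost = 1070   # MHz, rumored
hd7970_ge_boost = 1050  # MHz

uplift = r9_280x_boost / hd7970_ge_boost - 1
print(f"Boost clock advantage: {uplift:.1%}")  # ~1.9% in the 280X's favour
```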
Yes and no. The boost clock is not a guaranteed clock; if it was, they would have used the base clock instead. When the load is light enough, boost works great. When it doesn't work so great, you get something like this as the extreme case:
That chart has no bearing on base clock and the clockspeeds used in games. It doesn't measure clockspeed at all. If you have a Kepler card and you use software to dynamically watch clockspeeds (such as MSI Afterburner), you know what it does. I don't know if you're arguing for arguing's sake, because you're wrong. Only the boost clock matters. When you surpass TDP or temperature thresholds, your Kepler card will dynamically adjust downwards in 13MHz increments. This is what typically happens: let's say you have a GTX 780 with an advertised 1018MHz boost. Your actual boost in 3D games will probably be 1100MHz or higher. If your temperature passes 80C and you pass the TDP limit, you will go down to, let's say, 1083MHz. And it will stay there unless your card has a catastrophic failure where the fans just go kaput. You will always be above the advertised boost. I suspect you know this if you have a Kepler card.
The only way for you to go down to the base clock is if the fans fail on your card. Other than that you WILL always be at your advertised boost speed or higher, assuming the 3D application in question requires the horsepower. If you're playing a game from 2006 or a flash-based game that uses 1% of the available GPU power, then you may be at the base clock. But for games that need it, you will always be at the boost clockspeed or higher.
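As a toy illustration of the behaviour being described, here's a sketch of the 13MHz step-down idea under an assumed 80C / TDP limit. This is not NVIDIA's actual GPU Boost control loop, just a simple model of "shed one bin at a time until you're back under the limits, but never below the advertised boost":

```python
# Toy model of the step-down behaviour described above: start at the observed
# in-game boost and drop in 13MHz bins while temperature or power is over limit.
# Purely illustrative; the per-bin power/temp savings below are made up.
def throttled_clock(in_game_boost_mhz, advertised_boost_mhz,
                    temp_c, power_pct_of_tdp,
                    temp_limit_c=80, step_mhz=13):
    clock = in_game_boost_mhz
    while ((temp_c > temp_limit_c or power_pct_of_tdp > 100)
           and clock - step_mhz >= advertised_boost_mhz):
        clock -= step_mhz            # shed one boost bin
        power_pct_of_tdp -= 2        # pretend each bin saves ~2% of TDP
        temp_c -= 0.5                # ...and shaves off half a degree
    return clock

# GTX 780 example from the post: 1018MHz advertised boost, ~1100MHz actual.
print(throttled_clock(1100, 1018, temp_c=83, power_pct_of_tdp=104))
```

Even in this crude model the card settles a few bins lower but still above the advertised boost, which is the point being made.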
That's a 7950; we all know the 7950 was limited by PowerTune, which is why it bounces up and down like that. The 7970 was not.
Like this?
Thank you Intel, AMD, and nVidia for introducing this boost clock nonsense in CPUs and GPUs to muddy the waters.
At least in my case it provides me with no advantages. It just seems like it will start boosting up in games and scenes that don't require the horsepower anyway.
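If anyone wants to see what their card actually does rather than argue about the sticker, something like this logs the graphics clock, temperature and power draw once a second on an NVIDIA card. The field names are standard nvidia-smi query options, but treat this as a sketch; your driver version may report things slightly differently:

```python
# Poll nvidia-smi once per second and print graphics clock, temperature and
# power draw, so you can watch the card throttle (or not) while a game runs.
import subprocess
import time

QUERY = "clocks.gr,temperature.gpu,power.draw"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    clock_mhz, temp_c, power_w = [v.strip() for v in out.split(",")]
    print(f"{clock_mhz} MHz  {temp_c} C  {power_w} W")
    time.sleep(1)
```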