Radeon HD 7970 SALE!


Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Die size doesn't mean anything when it comes to determining high or low end GAMING CARDS. It is performance only. It performs better than last gen and that's why they are priced as such.
I'm not talking about the cards, genius. My argument addresses die sizes specifically. If every business in the electronics industry based prices off of performance relative to the previous generation, the prices of things would be astronomical.
It's not rocket science like you're trying to make it.
I'm not trying to make it into rocket science -- if you were capable of reading, you'd have seen that I said it was "simple."
The simple subject you're avoiding is that Nvidia and AMD don't care what the die size is as long as the performance increases are enough to sell you a card. It is high end because the GTX 680 beats the GTX 580 and the HD7970 beats the HD6970.
The simple subject you are avoiding is that I don't give a rat's ass about what Nvidia and AMD think. It is an artificial high end because small dies are being put in cards labeled and sold as if they were high-end parts. I'm not stupid enough to pay $500 for a 294mm² die -- not when I have a good idea of what wafer costs are.

I've never heard that metric used when determining if a part is high end or not. That's like saying any GPU under 800MHz is not high end.
Not even remotely -- you clearly don't understand how ICs work. Clock speed can vary wildly depending on architecture, process, and yields.
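To put rough numbers on the wafer-cost point (a back-of-the-envelope sketch; the wafer price and yield below are assumed purely for illustration, since TSMC publishes neither, and the dies-per-wafer formula is just the standard edge-loss approximation):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard edge-loss approximation for gross dies per wafer."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r ** 2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

die_area_mm2 = 294   # GK104 die size quoted above
wafer_price = 5000   # assumed 28nm wafer price in USD -- illustrative only
yield_rate = 0.6     # assumed fraction of good dies -- illustrative only

gross = dies_per_wafer(die_area_mm2)
good = int(gross * yield_rate)
print(f"{gross} gross dies, ~{good} good -> ~${wafer_price / good:.0f} of silicon per good die")
# -> 201 gross dies, ~120 good -> ~$42 of silicon per good die
```

Even if the real wafer price were double that assumption, the raw silicon in a 294mm² die would still be a small slice of a $500 retail price, which is the crux of the complaint.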
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Seriously, it's been stated numerous times all over the internet, and several times by me directly to you. A 300-350mm² die is not high end. If you can't comprehend a subject so simple, please refrain from commenting on it.

I know what you are saying. Trust me, I was very vocal when the HD7970 launched at $550, and I still think that GK104 is Kepler's mid-range chip. However, right now GK104 is NV's high-end GPU.

Think about this: let's go with your theory and agree that GK104 is just a mid-range GTX560Ti successor and we got 'ripped off'. Fair enough. But if NV got away with it this round, do you think they'll give us a 50-75% faster GTX780 for $500 in 2013? NV just saw that they can sell a 294mm^2 chip that was just 35% faster than GTX580 and gamers gobbled it up for $500 no problem. They have little to no incentive now to go back to the 500mm^2 strategy in the consumer gaming market and charge $500 for that. Unless HD8000 series is some miracle GPU that brings 50% performance increase, NV is in no hurry to drop Big K and reduce their profit margins when GK104 is flying off the shelves for $400-500.

Interestingly, it seems gamers are even willing to forgo some performance for lower power consumption with Kepler, as we are seeing AMD drop the HD7970 well under $400 just to move them off the shelf despite the HD7970's overclocking capabilities. Maybe gamers got too fed up with the GTX480 and are really focusing more and more on performance/watt and not just absolute performance?

At the same time if what NV has been saying is true (i.e., the fabs are increasing wafer pricing as it becomes harder and harder to get good yields on ever shrinking node processes and new fabs cost even more billions of dollars to build), I am just not sure even NV is going to be eager to launch a 500mm^2 GTX780 for $550 even next year. I really don't know if this generation is an outlier or a new trend. We'll have to see how fast GTX780/HD8970 are to see if we got ripped off this generation for real.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm not talking about the cards, genius. My argument addresses die sizes specifically. If every business in the electronics industry based prices off of performance relative to the previous generation, the prices of things would be astronomical.
I'm not trying to make it into rocket science -- if you were capable of reading, you'd have seen that I said it was "simple."
The simple subject you are avoiding is that I don't give a rat's ass about what Nvidia and AMD think. It is an artificial high end because small dies are being put in cards labeled and sold as if they were high-end parts. I'm not stupid enough to pay $500 for a 294mm² die -- not when I have a good idea of what wafer costs are.

Ok...you know, you can sit here and argue this all you want. The fact remains that the cards we have are the highest-performing GPUs available. That's high end whether you like it or not. Nothing you say will change the fact that the GTX 680 is the fastest card Nvidia has produced and the HD7970 (now the GHz Edition) is the fastest card from AMD.

Nobody buying these cards really cares what you think of the die size. The thing you're talking about here doesn't matter, and that's what I've been trying to say to you and you conveniently ignore. Die size never matters when determining video card pricing, because you can take a card that by your standards would be crap and make it outperform everything before it. That's what was done here, and honestly the people buying the cards don't even care. Wafer costs don't matter; your thought process is flawed. Nvidia and AMD aren't going to magically price their fastest cards ever down at $250. The bleeding edge of technology always has costs, and the current cost is what we have to pay to stay up to date. If you don't want to, that's fine too, but don't crap on everyone because you think you deserve GTX 680 performance for $250.
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
But if NV got away with it this round, do you think they'll give us a 50-75% faster GTX780 for $500?
No, the price of the 780 will likely be over $600, and that's a very conservative estimate. Still, I can't imagine that perf/$ will be much worse than the 680 at launch, as TSMC's production woes will hopefully be much less pronounced. At that die size though, the performance premium would be reasonable. It's understandable to charge extra for a high end part -- the 680 didn't fit that bill, nor did the 7970.

NV just saw that they can sell a 294mm^2 chip just 35% faster than GTX580 and gamers will gobble it up for $500 no problem. They have little to no incentive now to go back to the 500mm^2 strategy and charge $500 for that.
Right. Unlike the majority of consumers out there though, I'm not ignorant when it comes to wafer economics and the GPU market.

Unless HD8000 series is some miracle GPU that brings 50% performance increase, NV is in no hurry to drop Big K and reduce their profit margins when GK104 is flying off the shelves for $400-500.

At the same time if what NV has been saying is true (i.e., the fabs are increasing wafer pricing as it becomes harder and harder to get good yields on ever shrinking node processes and new fabs cost even more billions of dollars to build), I am just not sure even NV is going to be eager to launch a 500mm^2 GTX780 for $550 even next year.
I think $550 is far too generous, really. It depends on whether or not the chip is partially disabled.
I really don't know if this generation is an outlier or a new trend. We'll have to see how fast GTX780/HD8970 are to see if we got ripped off this generation for real.
Well, even if it were a new trend, the fallout would be a one-time issue. They can't continually raise the prices. Still, 20nm and 14nm look concerning, and I'm hoping that some sort of breakthrough helps alleviate the headaches that are sure to come.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm not trying to make it into rocket science -- if you were capable of reading, you'd have seen that I said it was "simple."

Come on now. cmdrdredd was very respectful in replying to your post. Even if you guys disagree, you don't have to be so harsh.

The simple subject you are avoiding is that I don't give a rat's ass about what Nvidia and AMD think. It is an artificial high end because small dies are being put in cards labeled and sold as if they were high-end parts. I'm not stupid enough to pay $500 for a 294mm² die -- not when I have a good idea of what wafer costs are.

That's fair. No one is forcing anyone to pay $500 for a 300mm^2 die if you don't think it's reasonable. However, what if GK110 is a 500mm^2 die with dynamic scheduler and double precision and is only 35-40% faster than GTX680? Would you pay $500 for that?

GTX580 --> GTX680: ~35% for $500
GTX680 --> GTX780: let's assume ~40% for $500

If you take away the die sizes, the performance increase may be similar. So I think if you want to make a stronger point, it's more like you won't pay $500 for any GPU unless you get an 80-100% performance increase. I think that's a much better argument than to argue die sizes vs. price alone. R600 was a huge die GPU (2900XT) and its performance was terrible.

Put it this way: if you have a GTX580 today or similar and you skipped the GTX680, you might wait to get a GTX770/780 or something to get a 70-80% performance increase. Well, the GTX670/680 user can skip the GTX770/780 series, upgrade during Maxwell, and get the same 70-80% increase. To me it's not about die sizes per se but how much of a performance increase I need to justify paying $ for a certain % increase in GPU speed. It's mostly about timing, budget, and when you need more performance. GPUs get faster and/or cheaper over time, so the longer you wait, the more value you'll get for every $ you spend.

My 7970 is 75% or so faster than an HD6970. If I had a GTX580 OCed and I was gaming only, I probably would have skipped this generation.
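As a rough sketch of how the "skip a generation" math compounds (the 40% figure is just the assumption above, not a benchmark):

```python
# Generational gains compound multiplicatively, not additively.
# The percentages are the rough figures from the post above (the 40% is assumed).
gains = [("GTX580 -> GTX680", 0.35), ("GTX680 -> GTX780 (assumed)", 0.40)]

total = 1.0
for step, gain in gains:
    total *= 1 + gain
    print(f"{step}: +{gain:.0%}")

print(f"Skipping the middle step (GTX580 -> GTX780): +{total - 1:.0%}")
# -> Skipping the middle step (GTX580 -> GTX780): +89%
```

By those rough numbers, the generation-skipper sees close to a ~90% jump for the same $500, which is why waiting always looks like the better price/performance play.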
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Ok...you know, you can sit here and argue this all you want. The fact remains that the cards we have are the highest-performing GPUs available. That's high end whether you like it or not. Nothing you say will change the fact that the GTX 680 is the fastest card Nvidia has produced and the HD7970 (now the GHz Edition) is the fastest card from AMD.

Nobody buying these cards really cares what you think of the die size. The thing you're talking about here doesn't matter, and that's what I've been trying to say to you and you conveniently ignore. Die size never matters when determining video card pricing, because you can take a card that by your standards would be crap and make it outperform everything before it. That's what was done here, and honestly the people buying the cards don't even care. Wafer costs don't matter; your thought process is flawed. Nvidia and AMD aren't going to magically price their fastest cards ever down at $250. The bleeding edge of technology always has costs, and the current cost is what we have to pay to stay up to date. If you don't want to, that's fine too, but don't crap on everyone because you think you deserve GTX 680 performance for $250.
You are missing a really simple concept -- voting with your wallet. If someone's stupid enough to pay $100 for a graphite pencil, telling pencil manufacturers that $100 pencils are okay, why shouldn't I speak up about it?

All your argument boils down to is "I'm right, you're wrong!" I don't have time for someone with such a fundamentally flawed understanding of the GPU industry.

Also, although a 680 for $250 would be great, that's absolutely ridiculous right now and shows how little you understand.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You are missing a really simple concept -- voting with your wallet. If someone's stupid enough to pay $100 for a graphite pencil, telling pencil manufacturers that $100 pencils are okay, why shouldn't I speak up about it?

All your argument boils down to is "I'm right, you're wrong!" I don't have time for someone with such a fundamentally flawed understanding of the GPU industry.

Also, although a 680 for $250 would be great, that's absolutely ridiculous right now and shows how little you understand.

It's not an "I'm right and you're wrong" approach. It's the marketing, the profits, and pleasing the shareholders at the meetings that I'm trying to convey.

What I said is pretty much how your thinking translates. By your reasoning, a die like the one used in the 670/680 shouldn't be priced at $500 even when it is the fastest thing produced by Nvidia. They aren't here to make friends; they are here to make money, and I don't care. I'm buying a GPU for gaming, and I want one that does what I want and performs. I have no other options, and I'm certainly not going to suffer with 30fps because I think $400 is too much. I have far more expensive hobbies.

We'll just have to disagree, because while I understand what you're saying, it doesn't carry any weight with the marketing team. They see benchmarks, performance, and profits.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No, the price of the 780 will likely be over $600, and that's a very conservative estimate. Still, I can't imagine that perf/$ will be much worse than the 680 at launch, as TSMC's production woes will hopefully be much less pronounced.

That's exactly my point. If you have a GTX580 or HD6970 and you waited all this time, then you'll argue that the price/performance for GTX780 has improved over the GTX680. Of course it did since you waited almost 2 years to upgrade and GPUs get faster with time. What about the guy who upgraded from HD6950 or GTX560Ti to GTX680/7970 and got 75-100% performance increase this generation?

Your argument is more about wanting a substantial performance increase for $500-600 than about die sizes per se. If the GTX680 were a 500mm^2 GPU that cost $500 and was just 35% faster than the GTX580 (but the extra die size was used exclusively for GPGPU and compute), would you have upgraded to it then? It sounds like you just think the performance increase is too small, and you use the die size to validate the point that GK104 is a mid-range GPU and that NV held back the real flagship. Sure, that's probably true. But at the same time it also means the longer you wait to upgrade, the more price/performance you get. So when do you finally pull the trigger (when the card is 50%, 75%, 100%, 500% faster)?

You realize that when you upgrade to a GTX780 for $550-600, the HD7970 OC / GTX680 owners may just skip it and get the Maxwell/HD9000 series, because to them a 40% increase for $500 may be a poor price/performance upgrade? What you are talking about is upgrading when you think the asking price is worth a certain % increase in performance. This varies from one gamer to the next. This is not a die size argument but more of a Moore's law / technological performance increase per $ spent argument. You think this generation didn't provide a sufficient performance increase over the last, which is fair. Actually, I agree with you that this generation is underwhelming compared to the GTX580. But if the GTX780 is a 500mm^2 GPU and is only 40% faster than a 1.15GHz HD7970/GTX680 OC, it's just as much of a 'fail' by your train of thought as the GTX680/7970 were compared to the 580. To you it might look like an 80-100% faster card for $550, though.

Since I upgraded from an HD6970, I got a ~60-75% increase in the programs I use. I think that's fair considering that after reselling the 6900 card, my upgrade cost was a fraction of the asking price. So even if the GTX780 is 40% faster than the 7970 OC, it'll actually be a smaller increase for me. That's the thing -- whether you think the new generation is worth it also depends on where you are in the GPU upgrade cycle.

Alternatively, you could buy AMD GPUs and bitcoin mine on the side, OR resell your GPUs frequently and incur a smaller cost of ownership through depreciation. There are some small steps that can be taken to minimize the upgrade costs. Buying $500+ GPUs and holding onto them for 4-5 years isn't a good idea though.
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Come on now. cmdrdredd was very respectful in replying to your post. Even if you guys disagree, you don't have to be so harsh.
I don't appreciate arguing with someone that can't spend the time to grasp what I'm saying. I find it quite disrespectful. I definitely need to find better ways of conveying that though.

That's fair. No one is forcing anyone to pay $500 for a 300mm^2 die if you don't think it's reasonable. However, what if GK110 is a 500mm^2 die with dynamic scheduler and double precision and is only 35-40% faster than GTX680? Would you pay $500 for that?

GTX580 --> GTX680: ~35% for $500
GTX680 --> GTX780: let's assume ~40% for $500

If you take away the die sizes, the performance increase may be similar. So I think if you want to make a stronger point, it's more like you won't pay $500 for any GPU unless you get an 80-100% performance increase. I think that's a much better argument than to argue die sizes vs. price alone. R600 was a huge die GPU (2900XT) and its performance was terrible.

Put it this way: if you have a GTX580 today or similar and you skipped the GTX680, you might wait to get a GTX770/780 or something to get a 70-80% performance increase. Well, the GTX670/680 user can skip the GTX770/780 series, upgrade during Maxwell, and get the same 70-80% increase. To me it's not about die sizes per se but how much of a performance increase I need to justify paying $ for a certain % increase in GPU speed. It's mostly about timing, budget, and when you need more performance.

My 7970 is 75% or so faster than an HD6970. If I had a GTX580 OCed and I was gaming only, I probably would have skipped this generation.
You raise a valid point, and no, I probably wouldn't care if the 680 performed as if it were a "100"-class part as opposed to a "104"-class part, if you understand what I mean. Die size by itself is not the issue, but given that GPU performance is governed largely by how many units there are, and that each of those units takes up a certain amount of space, there is a huge correlation between die size and performance. Unlike performance, though, dies actually are a tangible resource -- a wafer is only so large and costs a certain amount. It's easier to point the finger here.

At any rate, regardless of the die size or cost, the performance simply isn't there. A full node shrink shouldn't have resulted in only a 35% performance increase (using your numbers). On AMD's side, of course, it didn't -- the number is a bit larger. When you throw cost into the equation, it's simply been a depressing generation, and I'd bet it's largely due to TSMC.
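For reference, the expectation behind that comes from idealized density scaling on a full node shrink -- a minimal sketch assuming simple area scaling, which real designs never actually hit:

```python
# Idealized transistor-density gain for a full node shrink (40nm -> 28nm).
# Real designs fall well short of this, but it frames the expectation above.
old_node_nm, new_node_nm = 40, 28
density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Ideal density gain: ~{density_gain:.2f}x")  # -> ~2.04x
```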
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,690
2,671
146
Please remember to be civil at all times. Insulting other people's intelligence or being demeaning is not acceptable, and further squabbling in this thread will likely be met with infractions.

Moderator Shmee
 

djnsmith7

Platinum Member
Apr 13, 2004
2,612
1
0
We are all entitled to our opinions. When it comes to a product such as a GPU, we're going to have varying opinions, no doubt.

Die size shouldn't necessarily be the primary point of the argument, though, but rather the results the product is capable of producing. Each person is going to have their own individual set of requirements / parameters, depending on their particular situation.

Take my situation 3 months ago, for example. My requirements: I wanted the fastest AMD product on the market, wanted to max out BF3, and wanted support for 4K resolutions in the near future. Did the 7970 meet these requirements? Absolutely. Was I worried about die size? No. Why? Because die size didn't affect whether or not the product met my requirements.

Are there folks that will consider the 7970 to be a mid-level GPU? Sure. What about the GTX680? At some point, sure.

So, we can debate mid-level vs. high end all...day...long. And feel free, while the rest of us are playing the you-know-what out of BF3 or enjoying an absolutely amazing 70" 4K LED display in the meantime. And trust me, my little rinky dink 7970 TriFire setup will push that beast to its limits.

Now, while you lovebirds are still throwing rocks at each other, I'm going to grab some popcorn & let the show continue.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Come on now. cmdrdredd was very respectful in replying to your post. Even if you guys disagree, you don't have to be so harsh.



That's fair. No one is forcing anyone to pay $500 for a 300mm^2 die if you don't think it's reasonable. However, what if GK110 is a 500mm^2 die with dynamic scheduler and double precision and is only 35-40% faster than GTX680? Would you pay $500 for that?

GTX580 --> GTX680: ~35% for $500
GTX680 --> GTX780: let's assume ~40% for $500

If you take away the die sizes, the performance increase may be similar. So I think if you want to make a stronger point, it's more like you won't pay $500 for any GPU unless you get an 80-100% performance increase. I think that's a much better argument than to argue die sizes vs. price alone. R600 was a huge die GPU (2900XT) and its performance was terrible.

Put it this way: if you have a GTX580 today or similar and you skipped the GTX680, you might wait to get a GTX770/780 or something to get a 70-80% performance increase. Well, the GTX670/680 user can skip the GTX770/780 series, upgrade during Maxwell, and get the same 70-80% increase. To me it's not about die sizes per se but how much of a performance increase I need to justify paying $ for a certain % increase in GPU speed. It's mostly about timing, budget, and when you need more performance. GPUs get faster and/or cheaper over time, so the longer you wait, the more value you'll get for every $ you spend.

My 7970 is 75% or so faster than an HD6970. If I had a GTX580 OCed and I was gaming only, I probably would have skipped this generation.
I assume you are saying that your OCed 7970 is 75% faster than a stock 6970? Even that sounds like a stretch, though, as a stock 7970 is only 30% faster than a stock 6970 at 1920 according to TechPowerUp.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And feel free, while the rest of us are playing the you-know-what out of BF3 or enjoying an absolutely amazing 70" 4K LED display in the meantime.

You have a 70" 4K LED TV in your house?

I assume you are saying that your OCed 7970 is 75% faster than a stock 6970? Even that sounds like a stretch, though, as a stock 7970 is only 30% faster than a stock 6970 at 1920 according to TechPowerUp.

Ya, sorry about that. I meant "my" HD7970 (i.e., @ 1150MHz). Ya, for games @ 1150MHz it's much less, around 50-60%, but in GPGPU tasks it's 70-75% because they take advantage of all the shaders (1150MHz * 2048 SP / (880MHz * 1536 SP) = +74%). My main point was that the price/performance of this generation also depends on how much of a performance increase one is getting, what their upgrade cycle is, what they sold their previous GPU for, etc. For example, if you had a GTX460 and upgraded to a GTX670, you'd think it's a great value at $400. If you had a GTX580, you'd think the GTX670 is a very small upgrade. I think the stock 7970 is about 40-41% faster on average than a 6970 at 1080P:

http://www.hardwarecanucks.com/foru...s/49646-amd-radeon-hd-7970-3gb-review-25.html
http://www.computerbase.de/artikel/grafikkarten/2012/test-amd-radeon-hd-7970-ghz-edition/4/

TPU has it at 100% for the HD7970 GE vs. 65% for the 6970 (so 100/65 ≈ 54% faster).
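For anyone checking the math, here's a quick sketch using the clocks and shader counts quoted above (game performance scales less than raw shader throughput because of memory bandwidth and other bottlenecks):

```python
# Back-of-the-envelope shader throughput: clock (MHz) x shader count.
# Clocks and ALU counts are the ones quoted above; games scale less than this.
def throughput(clock_mhz, shaders):
    return clock_mhz * shaders

hd7970_oc = throughput(1150, 2048)  # HD7970 overclocked to 1150MHz
hd6970 = throughput(880, 1536)      # stock HD6970

print(f"GPGPU-style scaling: +{hd7970_oc / hd6970 - 1:.0%}")  # -> +74%
print(f"TPU relative performance: +{100 / 65 - 1:.0%}")       # -> +54%
```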

Also, for AMD GPUs, because of bitcoin mining on the side, the upgrade costs are skewed. For example, after selling the 6950 and converting some BTC it made on the side, my 7970 upgrade was free, so I didn't really care that the card was > $400. That was pure luck, though, since I had no idea NV cards were so poor at BTC mining back when I got the 6950. So I guess for a GTX560Ti/570 owner, this generation is pretty underwhelming if they wanted a card 50-60% faster for $200-250. The HD7950 is getting close with an overclock, but it's still > $300; AMD has no card in that price range that's a meaningful upgrade over the 560Ti/570 yet.
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Seriously, it's been stated numerous times all over the internet, and several times by me directly to you. A 300-350mm² die is not high end. If you can't comprehend a subject so simple, please refrain from commenting on it.

lol. There goes the good old saying that size doesn't matter. Seriously though, this is one of the biggest loads of poo I've read in a while. Also, what you're saying is that there hasn't been a "high-end" Radeon since the 2900XT?
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I think the show's over... not really much left to be said.

I actually really like the high-end cards this round. We have pretty near equal options performance-wise to choose from, and that isn't something to be upset or disappointed about.
 

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
I guess you are right.

I think GTX280 was $650 and dropped to $500, but NV offered $150 back to early adopters. GTX280's resale value tanked 12 months from launch when AMD launched the HD4890 with 90-95% of the performance for $269 and when NV brought out the cheaper GTX275. Until that point, I am pretty sure GTX280 held steady and you could sell it and recoup most of that $500 value if you timed it right.

In the case of GTX480, it did tank. Still GTX480 @ 580 speeds was the fastest single GPU for almost 2 years (from March 26, 2010 to first week in January, 2012 when HD7970 was for sale in retail). It also had strong tessellation performance and 1.5GB of VRAM which means 2 years later and it's still a good card vs. HD5870 1GB that struggles in DX11 games.

EVGA offered those rebates, and maybe BFG. I don't know of any other manufacturers that offered that rebate, so if you bought a card through someone else you were SOL. I bought a brand new EVGA GTX 280 less than six months after release. That's not some crapshoot with a warranty brand like Visiontek. Yes, resale value on the 280s tanked, fast.

As far as the GTX480 goes, that tanked in resale value too. You've got to remember that just nine months after the GTX 480 was released, Nvidia offered a quieter, cooler, less power-hungry video card with comparable performance for $330.

It's the nature of being an early adopter; there is always something better around the corner. You want bleeding-edge performance? It's going to cost you. At the end of the day, the 7970 is the fastest single GPU on the market. It overclocks further than a GTX680, although performance is close most of the time.

Also, the 5870 released six months before the GTX480. It had better have been faster, especially considering how much more it cost.

HD7970 went from $550 to $370 in just 6 months....That's way worse than GTX480. I am pretty sure in September of 2010 (6 months after launch) that GTX480 was still selling for at least $450. Also, it's doubtful that HD7970 OCed will remain the fastest single-GPU for almost 2 years from January 2012. So I think it's worse than either the GTX280/480 cases.

It's not uncommon to get a 30% overclock out of a 7970. So, that has yet to be seen. Right now you're just making assumptions.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I don't appreciate arguing with someone that can't spend the time to grasp what I'm saying. I find it quite disrespectful. I definitely need to find better ways of conveying that though.

...

At any rate, regardless of the die size or cost, the performance simply isn't there. A full node shrink shouldn't have resulted in only a 35% performance increase (using your numbers). On AMD's side, of course, it didn't -- the number is a bit larger. When you throw cost into the equation, it's simply been a depressing generation, and I'd bet it's largely due to TSMC.

I'm late to this thread, but I can't grasp what you are saying... can you make the point more plainly? I understand there is an issue with die size, and performance... but how are those tied together?

Is it that you believe the die size used should deliver better theoretical performance, and since it doesn't, the card should be priced lower? Is it a viewpoint sort of like the argument that the price of a video card should depend on how much it costs, in resources, to make?

If so, how do you account for the huge price swings? My understanding is that video cards are not priced primarily based on how much they cost to make. Rather, I think the primary price determination is what people are willing to pay for a given level of performance.

I guess my point is that you could conceal the die size from the market, make it a mystery, and yet you'd see the same pricing trends, because pricing is driven by demand for performance. If die size really mattered, wouldn't you at least see it advertised on the box?

I really think I'm missing your point, because I can't figure out the flaw in what I laid out above, so there must be some subtlety that went over my head. I'd appreciate some clarification.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's not uncommon to get a 30% overclock out of a 7970. So, that has yet to be seen. Right now you're just making assumptions.

I think it's pretty fair to say that HD8970 and GTX780 will be faster than HD7970 OCed by even 30%. They will also have overclocking headroom.

Also, GTX480 @ 580 speeds was basically the 2nd fastest GPU after the 580 from March 26, 2010 to ~January 9, 2012 (when first 7970 sold on Newegg). You really think HD7970 OCed will be 2nd fastest GPU from January 9, 2012 to September 2013? Doubtful.

I agree with you that early adopters always pay a premium price, which is nothing new. But even look at the HD7950. It dropped from $449.99 to $280-290 at certain places. The aftermarket 7950s fell from $500 to $310.

HD4850/5850/6950 never dropped so drastically just 6 months after release. Of course they cost less to begin with but this generation, AMD cards lost a ton of value, while NV cards haven't moved a dollar.

Knowing this, when the HD8900 series launches, it's more likely than not that AMD will once again try this predatory launch pricing strategy, but how many people will actually fall for it again after seeing what happened to the 7900 series?
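Putting the price drops quoted above into percentage terms (a quick sketch; $285 is just the midpoint of the $280-290 range mentioned, and the $500/$310 aftermarket figures come from the same post):

```python
# Approximate six-month depreciation on the prices quoted in this thread.
def depreciation(launch, now):
    return (launch - now) / launch

cards = {
    "HD 7970": (550, 370),
    "HD 7950 (reference)": (450, 285),     # midpoint of the $280-290 quoted above
    "HD 7950 (aftermarket)": (500, 310),
}

for name, (launch, now) in cards.items():
    print(f"{name}: ${launch} -> ${now}  (-{depreciation(launch, now):.0%})")
# -> HD 7970: -33%, HD 7950 (reference): -37%, HD 7950 (aftermarket): -38%
```

Compare that with the GTX480, which by the earlier post was still selling for at least $450 six months after launch.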
 

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
I think it's pretty fair to say that HD8970 and GTX780 will be faster than HD7970 OCed by even 30%. They will also have overclocking headroom.

Also, GTX480 @ 580 speeds was basically the 2nd fastest GPU after the 580 from March 26, 2010 to ~January 9, 2012 (when first 7970 sold on Newegg). You really think HD7970 OCed will be 2nd fastest GPU from January 9, 2012 to September 2013? Doubtful.

I agree with you that early adopters always pay a premium price, which is nothing new. But even look at the HD7950. It dropped from $449.99 to $280-290 at certain places. The aftermarket 7950s fell from $500 to $310.

HD4850/5850/6950 never dropped so drastically just 6 months after release. Of course they cost less to begin with but this generation, AMD cards lost a ton of value, while NV cards haven't moved a dollar.

Knowing this, when the HD8900 series launches, it's more likely than not that AMD will once again try this predatory launch pricing strategy, but how many people will actually fall for it again after seeing what happened to the 7900 series?

You're making assumptions that there is no way to prove. I can say this with certainty: the 7970 will have a much longer lifespan as the fastest single GPU on the market than the 480 did. From the looks of things, both Nvidia's and AMD's next lineups are a long way off. Nvidia has only released three models of the same GPU and one laptop GPU. They haven't even come close to fleshing out their current lineup. AMD still hasn't released the dual-GPU version of Tahiti. There are also no rumors of anything on the horizon for either company on the desktop. I don't expect a refresh from either company this year, or probably early next year. When a refresh is released it'll still be stuck on 28nm, and without increasing the die size it's going to be hard to have architectural improvements that large.

You could also debate that the 6970 or even the 6950 was the second or third fastest GPU on the market when you take overclocking into account. They did very well at higher resolutions, and that $300 6950 was hardly any slower than the 6970 clock for clock, and that's if it couldn't unlock.
 
Last edited:

BD231

Lifer
Feb 26, 2001
10,568
138
106
Screw the 7970 and the 680/670; they're simply filler cards meant to raise profit margins through slow progression and milk a sucker's wallet.

I challenge anyone to name some $550 retail launch cards that were sold for $369 within the same year as their release. Nearly $200 off the top in half a year's time -- now that's nuts, especially for anyone who spent $500+ this round.
 