XFX Jumps Off GTX 480 / GTX 470 Ship?

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The interesting thing is that if the GTX 480 had come out with 10% lower GPU clocks and a 256-bit memory bus with faster GDDR5 (it could still have had higher memory bandwidth than the 5870), they would have:

1) Been more likely to harvest more chips at a ~70 MHz lower clock speed, helping lower production cost per chip;
2) Still been as fast as the 5870 in most games, while faster in DX11 games;
3) Run cooler.
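That bandwidth point is easy to sanity-check with the usual formula (peak bandwidth = bus width in bytes × effective transfer rate). The clock figures below come from launch reviews; the 5.6 Gbps part for the narrower-bus scenario is hypothetical:

```python
# Back-of-envelope memory bandwidth check. Treat the exact part speeds
# as assumptions pulled from public launch reviews.
def bandwidth_gbs(bus_bits, effective_mts):
    """Peak bandwidth in GB/s: bus width in bytes times effective MT/s."""
    return bus_bits / 8 * effective_mts / 1000

gtx480 = bandwidth_gbs(384, 3696)   # 384-bit @ ~3.7 Gbps GDDR5 -> ~177.4 GB/s
hd5870 = bandwidth_gbs(256, 4800)   # 256-bit @ 4.8 Gbps GDDR5  -> ~153.6 GB/s
# A hypothetical 256-bit GTX 480 would need chips faster than 4.8 Gbps
# to keep a bandwidth lead over the 5870:
hypo = bandwidth_gbs(256, 5600)     # 256-bit @ 5.6 Gbps -> ~179.2 GB/s

print(gtx480, hd5870, hypo)
```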
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Since we have seen their cards and they appear to be legit, you have to think either Nvidia cut them out of it or they are scared to offer their warranty on these cards.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Hmm, maybe Charlie's rumor about Nvidia tying Fermi to their older products was accurate after all. It could also just be that they are afraid of the heat issue long term and, unlike EVGA, don't want to sell waterblock versions.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
One thing with those cards is that if they are run at 100% load non-stop (like by someone doing folding) or with large overclocks, the cards might have a fairly low life expectancy. If the supply is limited, they might find themselves in the position of having to give their customers next-gen cards for free when their Fermi cards fail.
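The heat-and-longevity worry can be made concrete with the standard Arrhenius acceleration-factor model from reliability engineering. The activation energy and the two temperatures below are illustrative assumptions, not measured figures for these cards:

```python
import math

# Arrhenius acceleration factor: how much faster temperature-driven
# wear-out mechanisms progress at a sustained stress temperature versus
# a typical-use temperature. Ea = 0.7 eV is a common illustrative value.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN_EV * (1 / t_use - 1 / t_stress))

# A card folding 24/7 at ~94 C vs. gaming bursts around ~75 C:
# wear proceeds several times faster at the sustained temperature.
print(acceleration_factor(75, 94))
```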
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
I posted a "what if" about the paper launch and the sudden push-back of actual availability. What if nVidia was tight-lipped even with their board partners? What if board partners were hinging their initial bulk purchases on the NDA lift, reviews, and public opinion? nVidia knew these things were hogs; perhaps board partners didn't?

Are ALL components spec'd, owned, and sold by nVidia to board partners who simply slap on their labels and then sell and support the finished product? Imagine writing that PO. I sure as hell would want to know what I'm buying if I were XFX. Think it was based on public reaction?

The lifetime warranty is big with XFX, but they've offered non-lifetime warranties before in the US market. According to the article, this is ALL markets: NO Fermi at all, for any market. Maybe it's about not spending the cash on a lemon. Not saying Fermi's a lemon, but it's certainly not bringing much to market. This is BIG news. Wow..
 
Last edited:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
After stories like this I won't be surprised if more and more OEMs follow XFX...
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
After stories like this I won't be surprised if more and more OEMs follow XFX...

Sounds shady. How can nVidia dictate what an e-tailer buys? They're buying from board partners, right? Crazy if so.

Still waiting on those 2GB cards, ATI. Opportunity is knocking..
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Sounds shady. How can nVidia dictate what an e-tailer buys? They're buying from board partners, right? Crazy if so.

Still waiting on those 2GB cards, ATI. Opportunity is knocking..

Here I am still hoping for either a cheaper 5970 or a very cheap second 5870 1GB...
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
This is a clear indication that Nvidia may take a loss with the GTX 480/470 niche. It's already expensive manufacturing and marketing these two cards, plus they are selling them below their manufacturing cost, so a loss is prominent...
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
This is a clear indication that Nvidia may take a loss with the GTX 480/470 niche. It's already expensive manufacturing and marketing these two cards, plus they are selling them below their manufacturing cost, so a loss is prominent...

You mean imminent? I'm sure that's the word Charlie will use.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
This is a clear indication that Nvidia may take a loss with the GTX 480/470 niche. It's already expensive manufacturing and marketing these two cards, plus they are selling them below their manufacturing cost, so a loss is prominent...

Well, when you consider the much larger chip, more memory modules, a more expensive PCB (wider memory bus), and beefy cooling, you have to wonder how much more these things cost to make vs. the 58xx cards.
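As a first-order sketch of that cost gap, the standard dies-per-wafer approximation with the widely reported die areas (~529 mm² for GF100, ~334 mm² for Cypress) shows how many candidate dies each design gets per wafer before yield even enters the picture:

```python
import math

# First-order dies-per-wafer estimate: usable wafer area divided by die
# area, minus a standard correction for partial dies lost at the edge.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

gf100   = dies_per_wafer(529)  # candidate GTX 480/470 dies per 300 mm wafer
cypress = dies_per_wafer(334)  # candidate HD 58xx dies per 300 mm wafer
print(gf100, cypress)          # Cypress gets roughly 1.7x the candidates
```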
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Wow, Nvidia is not doing well... it's a little sad, though...

Well, the tides have turned this time. ATI is no longer that old solo company from the 9800 Pro days, when their drivers and cards were plagued with issues and incompatibility; since AMD bought ATI, things have turned around quite positively. There's not much hope now for Nvidia. I mean, AMD's strategy of "bang for the buck" is everywhere, from their quad cores, to the upcoming Thubans (the cheapest six-core desktop processor, to be priced at $199 and compatible with DDR2), to their newly released 12-core Opteron that Anandtech just reviewed: http://it.anandtech.com/IT/showdoc.aspx?i=3784


"Final Words

The beancounters will probably point out that AMD’s strategy of bolting two CPU dies at 346 mm² together is quite costly. But this is the server CPU market, margins are quite a bit higher. Let AMD worry about the issue of margins. If AMD is willing to sell us - IT professionals - two CPUs for the price of one, we will not complain. It means that the fierce competitive market is favoring the customer. The bottom line is: is this twelve-core Opteron a good deal? For users waiting to use it in a workstation we have our doubts. You’ll benefit from the extra cores when rendering complex scenes, but in all other scenarios (quick simple rendering, modeling) the higher clocked and higher IPC Xeon X5600 series is simply the better choice.

Applications based on transactional databases (OLTP and ERP) are also better off with new Xeon. The SAP and our own Oracle Calling Circle benchmark all point in the same direction. Intel has a tangible performance advantage in both benchmarks.

Data mining applications clearly benefit from having “real” instead of “logical” cores. For datamining, we believe the 12-core Opteron is the clear winner. It offers 20% better performance at 20% lower prices, a good deal if you ask us. Intel’s relatively high prices for its six-core are challenged. The increased competition turns this into a buyers market again.

And then there is the most important segment: the virtualization market. We estimate that the new Opteron 6174 is about 20% slower than the Xeon 5670 in virtualized servers with very high VM counts. The difference is a lot smaller in the opposite scenario: a virtualized server with a few very heavy VMs. Here the choice is less clear. At this point, we believe both server CPUs consume about the same power, so that does not help either to make up our minds. It will depend on how the OEMs price their servers. The Opteron 6100 series offers up to 24 DIMMs slots, the Xeon is “limited” to 18. In many cases this allows the server buyer to achieve higher amount of memory with lower costs. You can go for 96 GB of memory with affordable 4 GB DIMMs, while the Intel server is limited to 72 GB there. That is a small bonus for the AMD server.

The HPC market seems to favor AMD once again. AMD holds only a small performance advantage, and this market is very cost sensitive. The lower price will probably convince the HPC people to go for the AMD based servers.

All in all, this is good news for the IT professional that is a hardware enthusiast. Profiling your application and matching it to the right server CPU pays off and that is exactly what set us apart from the average IT professional."


So ATI under AMD has basically also wised up and adopted the same motto, sell affordable/performance/quality products, which is exactly what we are seeing. I think Nvidia is done; you have to look at which company is more powerful right now, which company is bigger and better and has a wider variety of IP and resources, and the answer is AMD.
 
Last edited:
Dec 30, 2004
12,553
2
76
Nvidia is not done. They have a manufacturing problem on their hands, but their design is stellar and will scale very easily to infinity and beyond. They're going to take a hit now, but I see no reason why they cannot return to competitiveness in the future.
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
LOL, it took everything I had not to upgrade my graphics card before Fermi, and now it's still a foggy situation.

At least 5850 prices are coming down a bit; this morning the lowest price I saw was 336 CAD, and now it's down to 295 CAD.

Waiting on the Sapphire Toxic 5850 2GB price...
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
LOL, it took everything I had not to upgrade my graphics card before Fermi, and now it's still a foggy situation.

At least 5850 prices are coming down a bit; this morning the lowest price I saw was 336 CAD, and now it's down to 295 CAD.

Waiting on the Sapphire Toxic 5850 2GB price...

I saw some benchmarks showing that the 2GB version gives zero performance gain in any single-screen situation. So unless you plan on using three screens, it's not really worth the premium.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
Well, supposedly there are only 30,000 cards that will be produced. If true, they would only get a few thousand anyway. Plus, they may not want to honor the double lifetime warranty on a card that may not last 3 years in many people's cases. I haven't seen any BFG cards either.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I thought Nvidia has been unhappy with XFX for a while because of their selling AMD cards. This is probably payback, but the timing looks bad.
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
Nvidia is not done. They have a manufacturing problem on their hands, but their design is stellar and will scale very easily to infinity and beyond. They're going to take a hit now, but I see no reason why they cannot return to competitiveness in the future.

Yep, Nvidia will never be done because there are an infinite number of numbers and thus an infinite number of times they can change the names of a G92 based product.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
I dunno, it makes a lot of sense IF they think there are going to be issues with the first wave of Fermi cards and they don't want all the headaches that could cause.
They let other OEMs take the risk, and then they can swoop in for the next wave of Fermi cards.

They need to have cards in stock to sell AND for warranty issues, and if supply is tight, why bother being 'first'?
 

slayernine

Senior member
Jul 23, 2007
894
0
71
slayernine.com
I saw some benchmarks showing that the 2GB version gives zero performance gain in any single-screen situation. So unless you plan on using three screens, it's not really worth the premium.

If you are going to quote benchmarks, show me an image or a link. I won't believe your made-up benchmarks until I see otherwise. Adding another gigabyte of fast memory will do a lot for a graphics card. That is why nVidia pushes cards with more memory, that and to compensate for using slower memory chips than ATI cards. 2GB will improve performance the most at large 2560x1600 resolutions and will also show decent gains at 1920x1200.

Though admittedly I would rather see faster core clock speeds than more memory.
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
It's true, it doesn't do jack until you reach 30" and add 4x or 8x AA, and even at that point, it's only Crysis and a few others where the extra 1GB makes a difference.

Hexus has a review; check it out.
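For a rough sense of the numbers behind that, here is a simplified estimate of the render-target footprint alone (assuming 32-bit color plus 32-bit depth/stencil, double-buffered, with textures and other allocations excluded). It shows why the framebuffer itself stays modest and the 1GB limit only bites in texture-heavy titles at high resolution with MSAA:

```python
# Simplified render-target memory estimate. Buffer layout is an
# illustrative assumption: 32-bit color + 32-bit depth/stencil per
# sample, double-buffered, no textures or other allocations counted.
def render_targets_mb(width, height, msaa=1, buffers=2):
    bytes_per_sample = 4 + 4  # color + depth/stencil
    per_buffer = width * height * msaa * bytes_per_sample
    return buffers * per_buffer / (1024 ** 2)

for res in [(1920, 1200), (2560, 1600)]:
    for aa in (1, 4, 8):
        print(res, f"{aa}x MSAA:", round(render_targets_mb(*res, msaa=aa)), "MB")
```

Even the worst case here (2560x1600 with 8x MSAA) is around 500 MB of render targets; textures and game data are what actually push past 1GB.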
 

mm2587

Member
Nov 2, 2006
76
0
0
The interesting thing is that if the GTX 480 had come out with 10% lower GPU clocks and a 256-bit memory bus with faster GDDR5 (it could still have had higher memory bandwidth than the 5870), they would have:

1) Been more likely to harvest more chips at a ~70 MHz lower clock speed, helping lower production cost per chip;
2) Still been as fast as the 5870 in most games, while faster in DX11 games;
3) Run cooler.

They couldn't use faster GDDR5 because of their poorly designed bus; it has nothing to do with how wide it is.

@1) True.
@2) Everything I've seen so far shows that in DX11 titles AMD normally closes the gap/wins against the 470/480. DX11 is not the green team's strong point (yet... I'll give them a driver revision before final judgment).
@3) Cooler maybe, but still nowhere close to Cypress.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Well, the tides have turned this time. ATI is no longer that old solo company from the 9800 Pro days, when their drivers and cards were plagued with issues and incompatibility; since AMD bought ATI, things have turned around quite positively. There's not much hope now for Nvidia. I mean, AMD's strategy of "bang for the buck" is everywhere, from their quad cores, to the upcoming Thubans (the cheapest six-core desktop processor, to be priced at $199 and compatible with DDR2), to their newly released 12-core Opteron that Anandtech just reviewed: http://it.anandtech.com/IT/showdoc.aspx?i=3784




So ATI under AMD has basically also wised up and adopted the same motto, sell affordable/performance/quality products, which is exactly what we are seeing. I think Nvidia is done; you have to look at which company is more powerful right now, which company is bigger and better and has a wider variety of IP and resources, and the answer is AMD.

Um, AMD's CPU division is doing all of this because Intel is currently kicking their ass. Intel has an incredible process advantage and looks to continue that dominance for at minimum several more years. AMD's GPU division is in much better shape than the CPU division, but it is by no means dominant, or even the top player. Nvidia has been so strong for so long that they can afford several more disappointments before losing a significant portion of their owner base. IMHO, XFX was probably cut out by Nvidia as punishment for deciding to go "both ways" and will either eventually get some 4xx product as availability increases or at least get some next-gen Nvidia products.
 