From China with Love [G80, R600 and G81 info]


Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: Creig
Nor does it mean they intend to pull out of the high-end market.
If it's of no advantage to them, then yes, they will. You're acting as if staying on top of GPU tech is just going to be another minute task for AMD, something where they can just throw some money at it and have it bring them significant gains. That's not how things work, as I outlined in prior posts.


Originally posted by: Creig
Why would they need to shift anything away from Intel? They still have all the ATI employees and engineers. They will probably downsize ATI a bit, just to get rid of managerial and executive positions that are now duplicated between ATI/AMD, but ATI can (and is) continuing business as usual.
They're not going to shift focus away from Intel by pouring R&D into the discrete GPU business, unless you're telling me they now plan to compete with nVidia, too. And if that's the case, it's inevitable that much of their resources would be divided/shifted so that they could combat not only Intel but nVidia as well. Does that really sound logical to you? 0_o


Originally posted by: Creig
I don't think YOU'RE looking at the big picture. ATI now has access to all the patents, licensing and technology of AMD and vice versa. Both ATI and AMD are now in an even stronger position to bring new technology to their respective production lines than they were before the acquisition. ATI wasn't in any danger of going under before the buyout and now they have even greater technological resources to draw upon. It makes no sense to say AMD bought ATI for $5.4 billion only to tell ATI to close up shop.
Nobody is claiming AMD is telling ATI to "close up shop." The high-end discrete market is just one portion of ATI's business, and the one where it remains in question whether AMD will continue to pursue it post-R600. You seem to think otherwise, yet your only reason is that it'll be another area where AMD can make a buck. I've consistently refuted that claim, and yet that's all you and apoppin can resort to.


Originally posted by: Creig
I'm sure that if Nvidia thought they would do better by not offering a high end card, they would have by now. They aren't coming out with cards like the G80 just for the sheer joy of building it. They're doing it to make a profit. The same applies to ATI.
The discrete GPU is their core market. That isn't the same for AMD. And because ATI is now AMD, that still applies, because like I said time and time again, this was an acquisition, not a merger.

Originally posted by: josh6079
If the integrated market was really that enticing for businesses, Nvidia and ATI would have been competing in the integrated sector instead of being the discrete powerhouses. I mean, if the integrated solution is really that appealing for large corporations such as Nvidia and ATI, they both would have been making nothing but integrated solutions for years so as to reap the rewards from its huge user base.
nVidia and ATI are competing in that market, or beginning to, but it's a low-margin market, and therefore it has typically been at the bottom of their priority lists (at least for nV).

Originally posted by: josh6079
An integrated solution isn't demanding enough to warrant the design of a CPU/GPU substrate. Also, Nvidia isn't planning on making CPUs just so that it can sit out with an overpriced, high-end, discrete video card while making integrated CPU/GPU solutions. These first CPU/GPU designs will have more power than a simple integrated Intel part, otherwise they wouldn't be pushing for the design.
It's still not going to offer what a high-end, or even midrange, discrete GPU of the time will be able to. That should be common sense at this point.

Nelsieus



 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
but is it a bottleneck that would affect system performance?
Define system performance. If you mean gaming performance, then yes, in some cases. Why do you think Oblivion gives better minimum frame rates as you increase the FSB?
nVidia and ATI are competing in that market, or beginning to, but it's a low-margin market, and therefore it has typically been at the bottom of their priority lists (at least for nV).
You just said that:
...the integrated market largely outweighs the discrete desktop, which is why Intel continues to be the number one supplier of graphics.
...how can it be low margin if it's cheap to produce and sells like wildfire?
It's still not going to offer what a high-end, or even midrange, discrete GPU of the time will be able to. That should be common sense at this point.
Unless the discrete market is without competition and dragging its technology because of it.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Elfear
Originally posted by: Crusader


You obviously haven't read my direct response to this, or choose to ignore it, because most of you would rather attack someone than take into account their actual viewpoints.

Crusader's view:
Doesn't matter what happens to "ATI" branded video cards. They can go under. That'd be great. Why? This is a free market; someone will take their place if there's a profit to be made.
ATI isn't necessary to be around; someone else will step in.
That's why I don't care. Kinda funny watching the ATI fanboys squeal though, I can admit that much.

I'm guessing no one here is in any sort of private business? There are plenty of capable companies that would get into the market if they could A) survive Nvidia, and B) consider it worthwhile/profitable.


I'll take a stab at it. What about barriers to entry? It's not like opening up a competing lemonade stand across the street from your neighborhood rival. To get into the graphics card business takes some serious capital and some very talented engineers and programmers, and even then you're starting out behind the big boys by a long shot. If Nvidia and ATI have been in the graphics business for years and years now (i.e. they have LOTS of experience) and they've been working on G80/R600 for a long time now, how is an upstart company supposed to compete with them directly? They couldn't, not for many years to come, and it wouldn't be a very lucrative business getting there.

That's a very basic business principle. Industries with little threat of substitutes, high entry barriers, and heavy capital requirements don't have new guys popping up all over the place. It would be a lose-lose situation for everyone if ATI stopped making high-end video cards, no matter which team you root for.

You never did answer my rebuttal here, Crusader. I'm not saying you're totally wrong, but you definitely have some flaws in your logic.

Actually I missed it. Sorry. I don't like passing by legit posts and responding to all these flaming tools instead.

I've considered what you said in the past before I came to the conclusion that I hold currently.
And you are correct, 100%. But I do believe there are enough free-standing companies (with no alliances preventing them from taking up the task) that could fill that void pretty well (the void ATI would leave).

Another forum member named some of those possibilities, but I think Intel themselves could enter the market if they wanted to and saw profitability. Or even Microsoft. There's nothing stopping them from purchasing PowerVR and TSMC and immediately starting to produce GPUs if the cost is worth it.

I don't believe it is worth it for anyone, unless someone comes across some amazing engineers that really produce something great (like the R300 or NV40). But otherwise I don't see anyone just saying "meh, let's give this a shot".
And I don't see ATI saying "let's spend part of our resources holding the high-end GPU crown... at the expense of our war with Intel... which we're barely making ground in anyway... and the ground we do hold is VERY hard fought-for".

Ruiz wants Intel. I don't think he's going to be like "let's go after the niche market Nvidia holds... yeah!"
If they don't succeed in CPUs first, they are cooked. Intel will eat them alive if they don't maintain competitiveness in the CPU market... now forgoing high-end, costly, unprofitable GPUs? It doesn't make sense.

They can make a lot more money selling integrated parts, low-power-consumption device GPUs (PDAs, small electronics), etc.

It's a safe bet they are targeting the profitable areas and high-growth markets. They will fight the high-end GPU war if they really have something that destroys G80... but that is so highly unlikely at this point.
ATI probably wouldn't have essentially begged for AMD to take them over if that was the case. Why give up your company if you are about to destroy your competitor, Nvidia?
There is no reason to.
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Why give up your company if you are about to destroy your competitor, Nvidia?
There is no reason to.
I can think of about 5.4 billion reasons...
 

XNice

Golden Member
Jun 24, 2000
1,562
0
76
Here's more information that should help people understand the AMD/ATI relationship, post-acquisition.

AMD & ATi Acquisition Complete
"We thought we?d give you a little insight into ATi and AMDs strategic vision by posting the slide presentation that we received after the press release announcing AMD had completed the acquisition of ATi."
 

Elfear

Diamond Member
May 30, 2004
7,116
695
126
Originally posted by: Crusader


Another forum member named some of those possibilities, but I think Intel themselves could enter the market if they wanted to and saw profitability. Or even Microsoft. There's nothing stopping them from purchasing PowerVR and TSMC and immediately starting to produce GPUs if the cost is worth it.

I don't believe it is worth it for anyone, unless someone comes across some amazing engineers that really produce something great (like the R300 or NV40). But otherwise I don't see anyone just saying "meh, let's give this a shot".

Your point here is exactly what has me scratching my head at your glee that ATI may leave the high-end market. If you concede that filling ATI's place will be very hard if not impossible (especially in the short term), then who will compete with Nvidia directly to drive down prices and keep innovative products coming our way?

I'm not saying ATI won't leave the high-end market (although it's unlikely IMO), but I'm trying to figure out WHY you would want them to, besides some personal vendetta against them. It just seems like a lose-lose situation for consumers for many years if that happens.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
but is it a bottleneck that would affect system performance?
Define system performance. If you mean gaming performance, then yes, in some cases. Why do you think Oblivion gives better minimum frame rates as you increase the FSB?

So are you thinking of sending data directly from CPU to GPU?

Extremely unlikely IMO, but if you think it's possible, then I agree with your logic.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,520
0
76
Originally posted by: Elfear
Originally posted by: Crusader


Another forum member named some of those possibilities, but I think Intel themselves could enter the market if they wanted to and saw profitability. Or even Microsoft. There's nothing stopping them from purchasing PowerVR and TSMC and immediately starting to produce GPUs if the cost is worth it.

I don't believe it is worth it for anyone, unless someone comes across some amazing engineers that really produce something great (like the R300 or NV40). But otherwise I don't see anyone just saying "meh, let's give this a shot".

Your point here is exactly what has me scratching my head at your glee that ATI may leave the high-end market. If you concede that filling ATI's place will be very hard if not impossible (especially in the short term), then who will compete with Nvidia directly to drive down prices and keep innovative products coming our way?

I'm not saying ATI won't leave the high-end market (although it's unlikely IMO), but I'm trying to figure out WHY you would want them to, besides some personal vendetta against them. It just seems like a lose-lose situation for consumers for many years if that happens.


Dumb fanboys always want stupid things.
IMO, even people employed by Nvidia should not want that, because if that happens there will be no need for a lot of employees, and pay cuts/layoffs will follow.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
I guess the only thing I would add to this discussion is that we could easily end up with one company doing ultra high-end, discrete graphics (nVidia) if the market doesn't change paradigms. I just happen to think a paradigm-shift is coming, with 'integrated' GPUs becoming nearly the only kind of GPUs.

I think we're focusing our attention on the wrong part of the 'niche' market that purchases ultra-high-end graphics. Instead of looking at this market as defined by its interest in ultra-high-end graphics, we can look at it as defined by its willingness to upgrade. For most consumers, computers are discrete products with a single, integrated life-cycle. Very few computer users--when compared to the entire user base--look at the individual components of their computers as having individual life-cycles. That's like breathing for the AT crowd, but it isn't the norm.

I think what is driving the AMD/ATI decision is the belief that most consumers treat their computers as an integrated whole. They don't want to bother with continual upgrading of discrete components. They'd rather just purchase a new computer that meets their updated needs. In this type of environment, at every level--low, mid and high-end--integrated solutions are going to be more efficient, economical and profitable for the companies that can deliver them.

AMD has seen Intel morph into a 'platform' company and of course they want to do the same in acquiring ATI. The point of disagreement I have with most of the posters who believe that AMD/ATI will abandon the high-end, is that I think that both companies, Intel and AMD, will start expanding the 'platform' model upwards, to include higher-end video performance.

Don't get me wrong, I think the AMD/ATI merger is a risky bet, with commensurately high payoffs and penalties. If they can market a powerful, unified CPU/GPU architecture as a 'gaming' platform in the way Intel marketed Centrino as a 'mobile' platform, I think that they'll capture a lot of gamers who don't want the hassle of worrying about all those minimum/recommended requirements on the box. The biggest part of the gaming crowd is in the console world, where if it says it's "made for Xbox 360" they know what they're going to get and they know their machine can run it.

I think the biggest part of the gaming market wants simplicity--even much of the market that wants top of the line performance.

That certainly isn't the AT crowd (nor me), but I think that other group is where the real money is.

Just my thoughts... I have no clue how this thing is going to turn out, but it should be fun watching!
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Intel will eat them alive if they don't maintain competitiveness in the CPU market... now forgoing high-end, costly, unprofitable GPUs? It doesn't make sense.
Why has Nvidia "dominated" the GPU sector if it isn't profitable? It's the same question I brought up with Nelsieus. There is quite a market for high-end, discrete video cards, and both Nvidia and ATI have been battling for it for years. If there weren't any profit to be had from making such monster cards, both Nvidia and ATI would have been competing in nothing but the integrated segment, where R&D is next to dirt cheap and sales are huge.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: josh6079
...how can it be low margin if it's cheap to produce and sells like wildfire?
Because it's also cheaply sold, meaning low profits, which in this situation equals lower margins.

And because we know nVidia cares very deeply about their margins (which have been doing quite well), we can assume they would be hesitant to occupy an area, at least at this time, that wouldn't give them worthwhile gains. This is exactly why we don't see them in the DLP/TV sectors, currently occupied only by ATI, or even the cell-phone market (until as of late). nVidia has stated many times via their conference calls (which I can look up if you'd like) that they tend to stay out of these areas because they deliver lower-than-desired margins.

I know it may be hard to believe, but there are actually economics and logistics involved with this thinking and nVidia, ATI, Intel, and AMD's business strategies. :roll:
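
To put rough numbers on that margin point, here is a tiny illustrative calculation; the prices and unit costs are made up, not actual ATI or nVidia figures:

```python
# Purely illustrative numbers -- not real ATI/nVidia figures.
def gross_margin(price, unit_cost):
    """Gross margin as a fraction of the selling price."""
    return (price - unit_cost) / price

# An integrated part is cheap to make but also cheaply sold;
# a discrete high-end card costs more to build but sells far dearer.
integrated = gross_margin(price=25.0, unit_cost=20.0)
discrete = gross_margin(price=400.0, unit_cost=240.0)

print(f"integrated: {integrated:.0%}, discrete: {discrete:.0%}")
# -> integrated: 20%, discrete: 40%
```

High unit volume can still make the integrated business worthwhile in absolute dollars, which is the other side of this argument.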

Originally posted by: josh6079
Unless the discrete market is without competition and dragging its technology because of it.

No, because the discrete GPU has *far* more potential than a CPU/GPU. It's not feasible, at this time, that AMD's Fusion plans will rival anything in the high end, or even midrange, and it has nothing to do with how much more advanced the discrete GPU will be, but the fact that the discrete GPU will have capabilities that are not evident in AMD's plans.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
Intel will eat them alive if they don't maintain competitiveness in the CPU market... now forgoing high-end, costly, unprofitable GPUs? It doesn't make sense.
Why has Nvidia "dominated" the GPU sector if it isn't profitable? It's the same question I brought up with Nelsieus. There is quite a market for high-end, discrete video cards, and both Nvidia and ATI have been battling for it for years. If there weren't any profit to be had from making such monster cards, both Nvidia and ATI would have been competing in nothing but the integrated segment, where R&D is next to dirt cheap and sales are huge.

The battle weakened ATI to the point where they got bought out by AMD. If there were huge profits to be had that never would have happened.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
So are you thinking of sending data directly from CPU to GPU?
I'm not thinking of it, DAMIT is.
Because it's also cheaply sold, meaning low profits, which in this situation equals lower margins.
True. I guess it would just be a matter of higher price/lower sales vs. lower price/higher sales.
No, because the discrete GPU has *far* more potential than a CPU/GPU.
How do we know if it has more potential if such technology as the CPU/GPU isn't even around to compare it with?

I just don't think that AMD would bother spending that amount of money on ATI just for some integrated GPUs, whether they're on the CPU substrate or not. The kind of R&D required for making such a chip is not something that integrated competition would warrant.
The battle weakened ATI to the point where they got bought out by AMD. If there were huge profits to be had that never would have happened.
Nvidia did it for the money, not to "battle" ATI over "unprofitable GPUs." The high-end segment has a decent market to it. It's not going anywhere.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nrb
Originally posted by: apoppin
i seriously doubt that they will give up the 'high end' . . . AMD will not accept 2nd best nor buying GPUs from a monopoly....
But why would AMD care about either of those things? Your argument seems to be based purely on emotion rather than business. You think AMD will somehow feel degraded or inferior because they are participating in a market without making the most powerful component available in that market, and will therefore pull out all the technological stops in order to restore their sense of pride and self-respect.

I just don't buy that. The only reason why AMD would concentrate resources on any market is if they think they can make a profit by doing so.

And why on Earth would AMD care about how much money you have to pay for a video card they aren't supplying? Why would they care whether Nvidia has a monopoly or not, so long as they aren't losing money as a result?

If you want to convince people you need to come up with a business case for AMD participating in what is currently a non-profitable market for ATI.

clearly it IS a profitable market . . . AMD did not buy a failing business . . .
:thumbsdown:

if AMD wants to, they may take even more of nvidia's high end - to make it even MORE profitable at their expense . . . might easily do it if the r600 is "another r300" . . . AMD may well like keeping the graphics 'performance crown' title --permanently.
:Q

anyway, that's your first mistaken assumption - a giant false premise on which the rest of your illogical theory falls flat on its face.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: tanishalfelven
IMO, even people employed by Nvidia should not want that, because if that happens there will be no need for a lot of employees, and pay cuts/layoffs will follow.

That wasn't even subtle! :laugh:

Still, there will be room for the viral marketers, as they will need to convince you to buy the new card that has increased software.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
How?? without passing through memory..? so GPU is stalled if CPU is stalled?
no buffer?
To be honest, I don't know any definite means pertaining to "How??". That's why AMD has intrigued enthusiasts with such an idea.

This gives a nice explanation of DAMIT's plan concerning the CPU/GPU hybrid: Click

Basically, they vaguely explain the different methods by which CPUs and GPUs operate and what obstacles may be involved in making the two cores interact.

Parallelism

-- CPUs use two types, and which one is used is determined by the CPU's own architecture as well as the architecture of the program:

Direct Parallelism
Programs are compiled by software that already "knows" how tasks should be delegated, so the compiler "toe-tags" tasks in advance. The processor reads these "tags" and assigns the appropriate logic units, without much decision involved on the CPU's part.

Implied Parallelism
The processor analyzes a program that would otherwise run perfectly well in a single-core system, but divides the stream into tasks as it sees fit. This is how Intel's hyperthreading architecture worked during its short-lived, pre-multicore era; and it's how x86 programs can run in multicore x86 systems, running Core 2 Duo or Athlon processors, today.

They say there are methods by which programmers could help initiate Implied Parallelism, but that programmers are often unwilling to make the minor changes to their code, even with ample opportunity.

-- GPUs use one type:

"Data-Structuring Parallelism"
Instead, since their key purpose is often to render vast scenes using the same math over and over, often literally millions of times repeated, GPUs parallelize their data structures. They don't delegate multiple tasks; they simply make the same repetitive task more efficient by replicating it over multiple pipes simultaneously. It's as though the same doctor could perform the same operation through a monitor, on multiple patients with the same affliction simultaneously.

Because of the way GPUs carry out instructions...
co-opting the GPU for use in processing everyday, complex data structures, yields unprecedented processing speed, on a scale considered commensurate with supercomputers just a few years ago.

This is why F@H is much more effective on a GPU than on a CPU.

However, unlike Implied Parallelism in multicore CPUs, a shift to SIMD-style (Single Instruction, Multiple Data) processing would...
mean a monumental transformation in how programs are written. We're not talking about compiler switches any more.

This raises the question of how exactly such technology would be implemented. This is why I myself have no idea exactly how such a chip would coordinate two different types of cores inside itself.
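
To make the contrast concrete, here is a minimal Python sketch of the two styles, with NumPy's vectorized ops standing in for the GPU's replicated pipes; the task functions are made-up stand-ins, not anything from the article:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# CPU-style task parallelism: distinct tasks are delegated to separate
# workers, the way a compiler "toe-tags" work in advance. Both task
# functions are made-up stand-ins for real workloads.
def simulate_physics(frame):
    return sum(frame) * 0.3

def mix_audio(frame):
    return max(frame) * 1.7

frame = list(range(100_000))
with ThreadPoolExecutor(max_workers=2) as pool:
    physics = pool.submit(simulate_physics, frame)
    audio = pool.submit(mix_audio, frame)
    results = (physics.result(), audio.result())

# GPU-style data parallelism (SIMD): one instruction applied across a
# million elements at once -- the same repetitive task replicated over
# many pipes, like the one doctor operating on many patients at once.
pixels = np.arange(1_000_000, dtype=np.float32)
shaded = pixels * 0.5 + 16.0  # a single vectorized op shades every "pixel"
```

The first half needs the programmer (or compiler) to decide how to split the work; the second half needs the data itself laid out so one instruction can sweep it, which is exactly why the quote calls it a monumental transformation in how programs are written.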

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: thilan29
Originally posted by: redbox
This would provide AMD a way to still compete in the high-end graphics segment without having to lose a lot of revenue. Apoppin does have a point: with AMD's ambitions, ATI is a rather good fit regardless of the means of takeover or merger. My question is just how far AMD/ATI is willing to take R600 - will they stick just with the first card or are they going to do the refreshes also? It seems kind of silly to pour all that R&D into a card and then not take it to full term. Much of the groundwork is already laid for them; it would be ridiculous not to take advantage of it.

Fusion will be nowhere near the performance levels of discrete graphics though, will it? I suppose they could use some of the tech in Fusion and transplant it into a discrete card, and maybe they'll use some of the tech in R600 and put it into Fusion. That way they don't have to spend too much more on R&D for discrete graphics and so may be able to stay with it. As long as AMD/ATI stays with graphics at a general level, I think it's feasible that they could continue to make discrete cards... maybe not in the very high end but at least in the mid-high end.

Sorry it took so long for me to respond. I think that Fusion will be aimed primarily at notebooks and integrated markets. However, no one knows, and AMD leaves it kind of vague. Here is a quote from the article:
AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today's CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing. With Fusion processors, AMD will continue to promote an open platform and encourage companies throughout the ecosystem to create innovative new co-processing solutions aimed at further optimizing specific workloads. AMD-powered Fusion platforms will continue to fully support high-end discrete graphics, physics accelerators, and other PCI Express-based solutions to meet the ever-increasing needs of the most demanding enthusiast end-users.

They say they are planning on this Fusion CPU/GPU to provide increases in 3D graphics, digital media and high-performance computing (whatever that means). They hint that the system will use co-processors to attain the desired functions, and go on to say that besides the co-processors they will fully support high-end discrete graphics cards on the PCI Express bus. So that leaves the question: if we have a CPU/GPU, several co-processors, and discrete graphics, then how will the work be split up? Why even have the GPU integrated into the CPU if you are going to use discrete graphics cards anyway?

The main problem I see with Fusion is what memory the CPU/GPU will be drawing from. Right now system RAM is a lot slower than graphics RAM. Also, with all of these co-processors, won't there be a lot of bus traffic? How will that affect performance? It would work a lot better if AMD used their direct link idea, or whatever it is. Also, what changes in software are going to be needed? They have a long uphill battle if they want to take this idea further than the drawing board. Not to mention Intel is for sure not going to make the climb very easy.

Another point I thought was interesting was this one:
With the development of Fusion and upcoming integrated AMD platforms, it is unknown what will happen to NVIDIA's chipset business, which currently relies mainly on AMD chipset sales.

Nvidia's GPU business might not have any competition once ATI leaves for good, but their chipset business just got messed with a bit.
The high-end GPU doesn't bring in much money, and that is the only field Nvidia will own if no one produces high-end GPUs past R600.
Nvidia now has yet another company to compete with for chipset sales - a market that usually does pretty well.
Nvidia loses a company that did a good job of selling chipsets for them.
So that leaves Nvidia dominating only a market that doesn't make very good returns.
They traded 2 significant disadvantages for 1 so-so advantage. Doesn't look good for Nvidia.
 

Ibiza

Member
May 19, 2006
42
0
0
Originally posted by: Ulfhednar
Originally posted by: Crusader
Nvidia isn't a 2-dollar-ho like ATI.
Christ on a cross! :laugh: Please tell us what drugs we can blame your stupidity on.

Well we know what drugs you are on Ulfhednar.

Originally posted by: Ulfhednar
Originally posted by: JohnCU
paxil 25mg CR (everyday)
I was on that and it gave me horrible side-effects like feeling faint and cold sweating, it also ruined my appetite and I didn't eat for days at a time.

I'm currently on Sertraline (Zoloft) 200mg daily for depression/anxiety, it's not bad but it's not strong enough as I still have plenty of anxiety attacks in social situations.

Also on Zopiclone (Zimovane) 7.5mg daily for insomnia and Chlorpromazine (Largactil) 100mg a day for other stuff.

Pretty much a zombie most of the time. :frown:

Original thread here http://forums.anandtech.com/messageview...atid=38&threadid=1949653&enterthread=y

You sound like a disturbed social misfit to me.

Is compulsively trolling on internet forums one of the symptoms of your illness? I'm genuinely interested.


 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
OH MY GOODNESS. AMD/ATI won't leave the high-end GPU market. It doesn't make any business sense whatsoever. You guys are looking at everything from an ATI-vs-NV point of view. Anyone who runs a business will understand me. I can't believe it's even being discussed in such fashion here.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Josh, your copy-and-pasted parallelism logic is interesting, but it does not imply that CPU and GPU operations directly interact with each other without the use of system memory. As long as system memory is being used, the FSB speed still matters and there isn't an improvement in performance.

Memory is, in comparison to CPU registers, a bottleneck. That is why increased cache often improves performance.

I agree with you that GPU/CPU integration is a great idea, in that it will reduce manufacturing cost, reduce the size/number of components required (for things such as PDAs, cellphones, ultracompact notebooks) and reduce the electricity required to run them. However, I do not see the improvement in performance that you have implied it will have. Perhaps you are mistaken?
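
The register/cache/memory gap described here is easy to demonstrate. A minimal sketch (the array size and stride are arbitrary choices, and exact timings vary by machine): it sums the same number of 8-byte values twice, once contiguously and once with a large stride, so the second pass defeats the cache.

```python
import time
import numpy as np

data = np.arange(16_000_000, dtype=np.int64)  # 128 MB, far larger than any cache
n = len(data) // 64  # both passes touch the same number of elements

start = time.perf_counter()
contiguous_sum = data[:n].sum()  # sequential: one 64-byte cache line serves 8 values
t_contiguous = time.perf_counter() - start

start = time.perf_counter()
strided_sum = data[::64].sum()   # 512-byte stride: every access pulls a fresh line from RAM
t_strided = time.perf_counter() - start

print(f"contiguous: {t_contiguous * 1e3:.2f} ms, strided: {t_strided * 1e3:.2f} ms")
```

The strided pass is typically several times slower even though it does the same number of additions, which is the whole point about memory, not arithmetic, being the bottleneck.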
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: lopri
OH MY GOODNESS. AMD/ATI won't leave the high-end GPU market. It doesn't make any business sense whatsoever. You guys are looking at everything from an ATI-vs-NV point of view. Anyone who runs a business will understand me. I can't believe it's even being discussed in such fashion here.

ATI's ROI in the high-end market sucks. I think the R520/R580 is superior to the G70/G71, but it came at a tremendous (and potentially fatal) cost to ATI.

The R580 die is much larger than the G71's, so it costs much more to produce. Also, ATI has recently been constantly late with their products, and the R580 is NOT THAT MUCH superior to the G71. The result? No pricing leverage and low margins, even for high-end products.

If you can produce integrated solutions at an extremely low cost (low investment and manufacturing costs, not selling price) and maintain parity in other aspects, you can make much more money than you could selling low-margin R580s.

Chances are, you will see AMD milk R600 technology for its integrated CPU+GPU solution for the next five years, with no advances in performance.
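
The die-size point can be made concrete with the classic Poisson yield model: a bigger die is more likely to catch a defect, so good dies per wafer fall faster than area alone suggests. A sketch, with an assumed defect density and approximate die areas (both are illustrative guesses, not fab data):

```python
import math

def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    # Gross dies per wafer: wafer area / die area, ignoring edge loss.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return wafer_area / die_mm2

def poisson_yield(die_mm2, defects_per_mm2=0.002):
    # Classic Poisson yield model: a larger die is likelier to catch a defect.
    return math.exp(-defects_per_mm2 * die_mm2)

for name, area_mm2 in [("G71 (~196 mm^2)", 196), ("R580 (~352 mm^2)", 352)]:
    good_dies = dies_per_wafer(area_mm2) * poisson_yield(area_mm2)
    print(f"{name}: ~{good_dies:.0f} good dies per 300 mm wafer")
```

Under these made-up assumptions the R580-sized die yields well under half as many good chips per wafer as the G71-sized one, despite being only about 1.8x the area - the "no pricing leverage, low margins" squeeze in a nutshell.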
 

nrb

Member
Feb 22, 2006
75
0
0
Originally posted by: apoppin
clearly it IS a profitable market . . . AMD did not buy a failing business . . .
:thumbsdown:
If ATI were as successful as all that, why sell the business? It would make far more sense to wait two years and double the value of the company, then sell.

However, the more important question is where any profit was coming from. Although ATI has been making a reasonable profit, it has not been the high-end graphics card sector producing it. ATI deals in a number of other market areas, and it is from those areas that all of ATI's profit has been coming. The reality is that other segments have effectively been subsidising a loss-making high-end sector for the past couple of years now.

Originally posted by: dravyn
Why would AMD stop making high-end GPUs when it's the segment that gets the most limelight? Wouldn't having the high-end performers be good for PR and sales marketing?
I've never been entirely convinced by that argument, tbh. Imagine that ATI is producing a $600 graphics card that is unambiguously superior to the $600 card produced by Nvidia, but that Nvidia's $200 card is unambiguously superior to ATI's $200 card. Now imagine that you are in the market for a $200 video card. Would you say to yourself "Aha! A completely different ATI product is better than a completely different Nvidia product, therefore I should clearly buy the $200 ATI card, even though it is inferior in every way to its $200 Nvidia competition"? Or would you buy the best product available at that price? Most normal, sane people buy a product on the merits of the product itself, not those of a completely different product.

Originally posted by: dravyn
Wouldn't this be bad business for AMD? Since ATi was making money in all sectors, why pretty much cut off 50% of ATI's revenue (high end, meaning that in the long term, high end would turn into mid, then low, etc.)? Isn't that a bad business strategy?
As I said before, it is not simply a question of revenue, it's a question of profit. Imagine you have $1 billion. You can either invest it in something which will result in your money turning into $1.1 billion, or in something which will turn it into $2 billion. Which do you choose?

If AMD remains in the high-end graphics sector they will have to keep ATI people working in that sector, and invest a lot of money in that sector. If they do this, then they cannot use the same people and money in other sectors. If the high-end sector makes a profit, that's not enough: it has to make more profit than the same people and the same money would if they were invested elsewhere. If that's not true, AMD will move the people and the money.
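
nrb's $1 billion example is plain opportunity cost; spelled out with his own hypothetical numbers:

```python
# nrb's hypothetical numbers from the post above -- not actual AMD financials.
investment = 1_000_000_000            # the same money and people, two possible homes

high_end_gpu_outcome = 1_100_000_000  # "turns into $1.1 billion"
other_sector_outcome = 2_000_000_000  # "turns into $2 billion"

gpu_profit = high_end_gpu_outcome - investment
other_profit = other_sector_outcome - investment

# The high-end sector is profitable in isolation, yet choosing it still
# "costs" the difference against the best alternative use of the money.
opportunity_cost = other_profit - gpu_profit
print(f"high-end GPU profit: ${gpu_profit:,}")
print(f"best alternative:    ${other_profit:,}")
print(f"cost of choosing the GPU sector anyway: ${opportunity_cost:,}")
```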
 