ATI+AMD confirmed


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I wonder where Micro$oft is in all of this. These kinds of things that hinder the high-end PC market or enthusiast-level support are something I can't see Porsche or Ferrari doing to their customers. PC gamers have a hard enough time as it is getting what they want for gaming. Most people's gaming rigs cost more than any console, and we're the ones who spend the most money on more hardware when a game we want to play comes out. I've seen game developers catering to consoles a lot more now (look at GRAW and Oblivion, for example).

What if PC gaming is starting to become more of a burden for major corporations, pushing them to clutch at console designs owned by other major corporations? Why is it increasingly difficult to give the people who pay the most for something what they want?

I got out of console gaming to come to PCs. Plenty of options, everything is customizable, and it's the complete all-in-one machine. Yet now everything is starting to blend, and less and less of that custom feel will remain a part of PCs. How long will it be before certain video cards (if they are still around) require certain monitors? How long will it be before PSUs are limited to a specific motherboard/GPU combo? How long will it be before all we are buying are consoles on steroids?

I know I'm just speculating, and the chances of the above happening are probably the same as Dick Cheney learning the difference between a bird and a person (no offense, political people), but I can't help but feel that all of these competitive "wars" shouldn't ever reach a resolution. Being competitive is what has driven things to where they are today.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin
not true, Fox5 . . . let me refresh your memory . . . ATi was a much bigger player than nVidia . . . but they were mid/low end. Their first "performance" GPU was the Rage Fury 32 - its IQ was noticeably better than the faster GeForce256's. With the Radeon 8500, ATi narrowed the performance gap with the Ti4200 series and then blew them away with the 9700p.

That doesn't mean diddlysquat, apoppin. During that era gamers bought nvidia cards (except for a tiny minority); everything else ended up in OEM or price-conscious systems, and developers developed for nvidia.

The Rage's IQ and the 8500's (eventual and excruciatingly long time coming) performance were irrelevant at the time. As I said above, gamers bought nvidia (and they still do...).

as usual, you post nothing related to what really happened. The Rage Fury 32 - an excellent piece of HW - was a decent 'alternative' to nVidia's offering . . . especially for people who liked better IQ - like me. ATi's drivers were always inferior - until the 9700p.

everything that doesn't place nVidia on your Altar of Worship is "irrelevant" and ridiculed . . . especially the facts. . . . it was far easier to get agreement from Rollo on 'history'.

:roll:

there is NO point to my discussing this with you . . . again

peace and aloha

The Rage Fury 32 actually had good performance, a good feature set, and better IQ than nvidia, but it was still largely overlooked. IQ enthusiasts of the time were buying the G400, and hardcore gamers were buying nvidia or 3dfx. Plus, the Rage Fury 32 was largely overshadowed by the GeForce. The Rage Fury 32 was the dual-chip card, right? I recall it had some problems related to that as well - for instance, because of the way it rendered, it increased the latency between frames, and sometimes performance could take a severe dip.

The Radeon made a decent splash - actually a rather substantial one initially - but it was soon found that its low fillrate hindered it in many games (though it also performed great in many others thanks to ATI's advanced memory-efficiency techniques). Then they kind of fell silent for a while, until the Radeon 8500, which was a solid piece of hardware once again hindered by poor drivers, released too late after the GeForce 3 yet without enough of a performance increase to warrant upgrading from it, and eventually outperformed by the GeForce 4 Ti. I never experienced an 8500 first hand, but it was supposed to be, like the original Radeon, very good for image quality (for the time). By this point, color accuracy didn't really matter anymore since everything was 32-bit (and your only options were nvidia and ati anyway), though the analog output of nvidia cards still sucked (I don't know how ATI compared).

The Radeon 9700 Pro was a wonderful breakout card, dominant in every way, but it's sad to see ATI's poor management keep ATI from being #1 today. IMO, there is really no reason they couldn't have stayed on top from that day; their main mistake wasn't the lack of SM3.0, it was marketing and pricing. Oh, and it would have helped if they had added something substantial to their later cards, maybe 8x AA or something, since their cards were faster than nvidia's (on the top end anyhow - poor pricing brackets on the lower models), but the games at the time were generally too CPU-limited to show the difference between the X800 and the 6800 series. It's taken more recent releases to show the X800 series as noticeably faster (looks like the X1800/1900 series will share the same fate). Damn, that R300 tech was good though; they've held the performance crown straight since then while barely modifying a thing until the X1800 (which stands to show a very large performance improvement over the 7800 when more games come out to take advantage of it, though since ATI ceded leadership in the market it's likely games will be developed to nvidia's strengths now).
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: Hard Ball
http://blog.pcmag.com/blogs/miller/archive/2006/07/24/1382.aspx

AMD says it plans to continue to develop "best-of-breed" discrete products, but it also says that by 2008, it plans to be producing products that integrate microprocessors and graphics processors to serve "the growing need for general-purpose, media-centric, data-centric and graphic-centric performance."

It seems that we will have about 24 months more of discrete ATI graphics solutions on PCIe; then all bets are off after that.

He said he expected the graphics processor (GPU) would remain separate in some markets, notably those that require the fastest 3D rendering, such as high end gaming, "for as long as I can see." But he said in other markets, where you want the lowest cost or the lowest power, it could become a single chip. In particular, he said he could see the combined company creating a platform for the needs of emerging markets through a better job of integrated graphics, consistent with AMD's 50x15 vision.

Clearly this will be the focus of the future graphics division of AMD-ATI: integration, digital media, low cost/low power but decent performance. High-end development seems poised to be scaled back substantially.

It doesn't say that anywhere in any of those quotes. Nothing there says it's going to concentrate purely on the integrated market, which you've taken it to mean. Where's the quote that says high-end development will stop? "He said he expected the graphics processor (GPU) would remain separate in some markets, notably those that require the fastest 3D rendering, such as high end gaming, 'for as long as I can see'" is the exact opposite of that.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
People "like me" . . . are an extreme minority. . . .

And like most minorities, i get disproportionate coverage in discussions/forums/etc and hold an inflated opinion of my own self worth...
*fixed*

an excellent description of yourself

good job

edit ---------> almost forgot

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Why don't you take your own advice again, apoppin? It does wonders for the forum...

Oh, and I'm definitely part of the majority when it comes to video cards and always have been...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Why don't you take your own advice again, apoppin? It does wonders for the forum...

Oh, and I'm definitely part of the majority when it comes to video cards and always have been...
the mindless majority?

i enjoyed my vacation . . . now i'm baack in a troll-fighting mode

beware



and here to 'counter' your myopic view
your extreme self-importance and confusion is beyond any help, however
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
You need psychiatric treatment badly, mate - either that or you need to lay off the mind-altering substances...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
You need psychiatric treatment badly, mate - either that or you need to lay off the mind-altering substances...

unfortunately there is very little help for your type of dementia

try to stop posting . . . no one would miss you
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Guys, both of you can offer relevant material in certain situations. Stop trying to downplay each other already; there isn't any more red vs. green to keep clutching at anyway. Who cares if you think the other one is trolling? Most of the people who care to read this thread have a pretty good sense of who or what they should find relevant. Stop trying to make the other one look like an idiot through e-smack talk.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Originally posted by: Barkotron
Originally posted by: Hard Ball
http://blog.pcmag.com/blogs/miller/archive/2006/07/24/1382.aspx

AMD says it plans to continue to develop "best-of-breed" discrete products, but it also says that by 2008, it plans to be producing products that integrate microprocessors and graphics processors to serve "the growing need for general-purpose, media-centric, data-centric and graphic-centric performance."

It seems that we will have about 24 months more of discrete ATI graphics solutions on PCIe; then all bets are off after that.

He said he expected the graphics processor (GPU) would remain separate in some markets, notably those that require the fastest 3D rendering, such as high end gaming, "for as long as I can see." But he said in other markets, where you want the lowest cost or the lowest power, it could become a single chip. In particular, he said he could see the combined company creating a platform for the needs of emerging markets through a better job of integrated graphics, consistent with AMD's 50x15 vision.

Clearly this will be the focus of the future graphics division of AMD-ATI: integration, digital media, low cost/low power but decent performance. High-end development seems poised to be scaled back substantially.

It doesn't say that anywhere in any of those quotes. Nothing there says it's going to concentrate purely on the integrated market, which you've taken it to mean. Where's the quote that says high-end development will stop? "He said he expected the graphics processor (GPU) would remain separate in some markets, notably those that require the fastest 3D rendering, such as high end gaming, 'for as long as I can see'" is the exact opposite of that.


NO, it certainly doesn't say that. But concentrating on integrated and platform technologies necessarily means fewer resources for discrete cards in the future. No one is saying that AMD is actually deliberately ending the delivery of discrete solutions any time soon; only that much more of the engineering resources, as well as the market focus, will shift toward making platform integration and widespread adoption of Torrenza a reality, and eventually toward on-die integration with CPUs that can perform geometry computations. Where the focus of the company goes, so go the money and the personnel.

I think we can be fairly certain that, with the goal of achieving a higher level of integration in '08 by greatly accelerating the developments already in motion, much of AMD-ATI's resources will go toward that end from the near future (after the delivery of R600) until '08. And I can tell you that these extra resources certainly won't come from the K8L development team.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: josh6079
there isn't anymore red vs. green to continue clutching at anyway.

Yes, the world has changed. Now it's only amd versus intel (how could this make fanboys?), with nv left in the corner to fade away. It will all be owned by microsoft in a couple of years, I suppose, and innovation will die. :thumbsdown:

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Do you think the Green Fanboys will have a civil war once they have no one left to argue with, and start arguing with themselves? Yup, Microsoft will dominate the world!!!!

Ok on to more relevant stuff.

If nVidia picks up the slack from ATi - IF they supposedly don't make any more discrete GPUs - wouldn't that be a monopoly, since it's pretty much only nVidia in that market now? Wouldn't there be any legal ramifications because of this merger in the next 5 years, when no other company can do anything because it's all owned by nVidia?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Drayvn

If nVidia picks up the slack from ATi - IF they supposedly don't make any more discrete GPUs - wouldn't that be a monopoly, since it's pretty much only nVidia in that market now? Wouldn't there be any legal ramifications because of this merger in the next 5 years, when no other company can do anything because it's all owned by nVidia?
NVIDIA will not have a monopoly on the video card market as a whole, just the high end sector. Intel, XGI, S3, ATI, Matrox, etc. are still around for mid to low end cards and integrated video.

Don't be surprised if NVIDIA buys out the high-end ATI tech from AMD. There is just not a lot of money in the high end for more than 1 or 2 companies.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Wreckage
Originally posted by: Drayvn

If nVidia picks up the slack from ATi - IF they supposedly don't make any more discrete GPUs - wouldn't that be a monopoly, since it's pretty much only nVidia in that market now? Wouldn't there be any legal ramifications because of this merger in the next 5 years, when no other company can do anything because it's all owned by nVidia?
NVIDIA will not have a monopoly on the video card market as a whole, just the high end sector. Intel, XGI, S3, ATI, Matrox, etc. are still around for mid to low end cards and integrated video.

Don't be surprised if NVIDIA buys out the high-end ATI tech from AMD. There is just not a lot of money in the high end for more than 1 or 2 companies.

That would be sad, and it would likely mean that AMD's integrated business will suck. (Hasn't the high end been the area where ATI has been losing the least to nvidia? They got crushed in the mid and low end.)
So I guess nvidia will have a Creative-like monopoly: the only major player in town, yet still a minority of the market due to the rest being integrated? Except sound really stopped progressing years ago and is still largely tied to the CPU; graphics are neither of those.
Oh, and Intel and S3 withdrew from discrete cards, and I believe XGI has as well. Matrox is not really in graphics anymore; they mainly sell various other adapters now, having sold off their graphics division years ago, and only produce G450-based cards for computers that just need 2D output.
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
Rumor that ATI TECHNOLOGIES INC (ATYT) will get a $23 bid from (INTC) over current (AMD) bid
... I obviously need to brush up on my jargon here, what exactly does $23 mean? $23 million? Billion? Per share?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
In six months Intel announces a new AGP 5.0 spec based on modified PCI-E underpinnings. AMD and Intel just can't seem to come to a licensing agreement on the new technology - Intel destroys ATi's high-end discrete market without benefiting themselves directly (which would circumvent antitrust statutes).

Not saying it will happen, but don't be shocked if it does. Even if ATi manages to pull down 75% of the AMD enthusiast market (where they are currently a clear minority), that would leave them 15% of the discrete market. ATi would be relegated to an engineering team for prospective changes in development ideas. Sure, they will make sure that AMD has a better integrated solution than Intel - that won't take much, and it will be a big boost for AMD - but Intel's larger market presence, not to mention its ability and now very real capability to cut them out of the 80% of the market it holds at a whim, must be taken into consideration. Given Intel's practices in the past, I honestly expect this to be more a question of when than if. Two years ago this could have been a great move for AMD; they dominated the gaming platform with their clearly superior performance and platform - now their performance is decidedly below Intel's and their platform prospects look, at best, to be equal to Intel's.
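Back-of-the-envelope, here is where that 15% comes from - it only works if you assume AMD platforms are roughly 20% of the discrete market, which isn't stated outright above:

```python
# Back-of-the-envelope check of the "15% of the discrete market" figure.
# Assumption (mine, not stated in the post): AMD platforms account for
# roughly 20% of the discrete/enthusiast GPU market.
amd_platform_share = 0.20  # assumed share of discrete GPUs sitting in AMD systems
ati_share_of_amd = 0.75    # hypothetical best case: ATi wins 75% of AMD systems

print(f"{ati_share_of_amd * amd_platform_share:.0%}")  # -> 15%
```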

What would be more threatening to nvidia would be if AMD and Intel started producing Cell-like CPUs.

This is seriously one of the most comical posts I have seen in a long time. You know who came up with the design metrics used for Cell? That would be Sony. They had the same idea you do; in fact, they were very public about how Cell was going to negate the need for GPUs altogether and make them obsolete. Then they realized something - they were fvcking morons.

It really is quite simple - you have a limited amount of die space. To maximize functionality you design the chip to make maximum utilization of its resources. If you try to make a CPU that is also a GPU, you are going to get your head kicked in by anyone who decides to build comparably sized chips for the GPU AND the CPU. Reality check here - we are a great number of orders of magnitude short of where we need to be in terms of GPU performance to come remotely close to real-time CGI. Reality check number two - CPUs are several orders of magnitude slower than current GPUs. Cell-type processors taking over the graphics end is a dream that comes up every so often, always with the same moronic idea that GPUs are nigh-stationary targets like CPUs. Simply take a look at reality - every three years GPUs are pushing ~250% of the performance of their earlier counterparts; CPUs are closer to 25%. The gap between CPUs and GPUs is GROWING at an exponential rate right now - convergence to one chip isn't going to happen anywhere remotely close to five years from now.
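Taking those rough figures at face value (~2.5x every three years for GPUs, ~1.25x for CPUs - ballpark numbers, not measurements), the compounding looks like this:

```python
# How the GPU-vs-CPU performance gap compounds under the rough figures above:
# GPUs ~2.5x their performance every 3 years, CPUs ~1.25x every 3 years.
gpu_factor_per_3yr = 2.5
cpu_factor_per_3yr = 1.25

gap = 1.0  # relative gap, normalized to 1x today
for year in range(0, 16, 3):
    print(f"year {year:2d}: gap = {gap:5.1f}x")
    gap *= gpu_factor_per_3yr / cpu_factor_per_3yr  # the ratio itself doubles every 3 years

# Prints 1x, 2x, 4x, 8x, 16x, 32x: the gap doubles every 3 years, i.e. grows exponentially.
```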

i am talking about when the GeForce256 was King of Performance . . . the Rage Fury 32 was ATi's first performance GPU . . . and the performance gap with the GeForce256 was small - if you factored in IQ. By then Matrox was overpriced and becoming niche, and 3DFX was old tech and on the way OUT.

The Rage Fury 32 wasn't a GPU at all - it was just a rasterizer. Also, the Rage 128 predates the Fury 32, utilized the same chip and had the same feature set. It was a competitor to the TNT2, and even in the best-case scenario it was obliterated by it. Comparing it to the GeForce is nuts. Forget performance altogether - you were forced to disable almost every feature in game because the chip didn't support them - its base filtering was utterly horrific, it had staggering problems with correctly rendering z depth, and it couldn't handle any sort of anisotropic filtering at all. In all seriousness, it was one of the most hideously poor 3D rendering chips in the history of the industry strictly on an IQ basis. I still own one - I am quite aware of how bad it was. The only thing the Rage ever had going for it vs nV on an IQ basis was the fact that it had brighter colors. That was it in its entirety, and that was almost always solved by proper calibration. nV had issues with their low-quality partners providing less than optimal filtering circuitry, which resulted in very poor output for some of the lesser-quality parts, but the chip itself was outputting definitively superior visuals until, at the very earliest, the NV30 hit. This wasn't due to ATi having improved their IQ at all, but because nV had dropped down to their level. I have done a test running a Gainward GF2 Pro against a BBA R9500 Pro double-blind, and every person who saw it thought the nV-based part looked clearly better (this was with identical settings to remove the performance variable) - both 2D- and 3D-wise. The reality is that ATi was still using Voodoo1-style accuracy for their base blend operations up until very recently, and they have removed the capability of handling wide-footprint Z accuracy from all of their newer chips. Both of these are IQ-reducing tactics to gain speed - that has been ATi's mode of operation up until very recently (and even now they steadfastly refuse to fix the issue with handling Z accuracy properly).

For the record, I have purchased one nVidia board in the last six years - all the rest of my money went to ATi (including currently) - but I certainly am not delusional.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The Rage Fury's "History"

in short it was a GPU . . .

Later, ATI developed a successor to the original Rage 128, called the Rage 128 Pro. This chip carried several enhancements, including an enhanced triangle setup engine that doubled geometry throughput to 8 million triangles/sec, better texture filtering, DirectX 6 texture compression, AGP 4X, DVI support, and a Rage Theater chip for better video encoding/decoding. This chip was used on the gamer-oriented Rage Fury Pro boards and the business-oriented Xpert 2000 PRO. Rage 128 Pro was generally an even match for Voodoo 3 3500, RIVA TNT2 Ultra, and Matrox G400 MAX.

The Rage 128 graphics accelerator is the final revision of the Rage architecture.
i had this last revision of the Rage - the Rage Fury with 32 MB of SDRAM - which ATi targeted at PC gamers . . . and then i got the GeForce 256 [my first and last nVidia card] to replace it [the GeForce was much faster, but i didn't like its IQ - the nVidia GPU's colors were "washed out"]


EDIT: Good God! . . . i should have kept mine . . . those things are going for $200
:Q
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
GPUs are ***programmable***; there was *nothing* programmable about any ATi chip prior to the first Radeon...

I don't know why the hell you are crowing about a triangle setup engine; it's been a standard feature (at least on graphics chips that are actually worth a damn) since the Riva 128!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
No, I (and most other definitions of GPU you'll find) don't agree with wikipedia at all there. The TNT & below, plus everything below the Radeon, were rasterisation devices, not GPUs.

I have to admit the Rage (aptly named) did manage to cause some amusement - I remember the "unique armor" for the avatar in Ultima Ascension, exclusively for ATi users - gangplank shoes, busted-barrel crowns, etc. - I should look up the screenshots if they are still out there...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
please post YOUR definition of GPU . . . using respected links, of course for *most definitions of GPU*

unless i am mistaken, we were using the term "GPU" way back then.
:Q

EDIT: way back then?

'99



and i think you are confusing the Rage Fury 32 with the Maxx . . . now that was buggy as hell.
 