Phenom in perspective


Darkskypoet

Member
Feb 15, 2007
42
0
0
Good god man, the point being that AMD only has one thing next in the pipeline, whereas Intel has the capacity to have more than one in the pipeline... And you specifically mentioned the C2D as if it had magical properties that a further die shrink or speed ramp of Core / P4 would not have had.
Regardless, the manufacturing realities being what they are, Intel doesn't have to beat AMD performance-wise to hold on to the majority of its market share; AMD does. Intel affects AMD more than AMD affects Intel. As we agree there is not equal choice to push new tech on either side, with AMD having no choice and Intel a ton of it... then I really don't see what you are arguing about.

Unless you disagree that Intel forced a price war, rather than the imminent release of the next thing from AMD. By saying that, you insinuate AMD had a choice to stop innovating, and somehow not move aggressively towards the next step. As AMD moved to increase capacity, modernize production facilities, and was already deeply involved in a long design process for their next core, there is no proof that C2D in any way accelerated the release of the next big thing. There is proof that AMD's performance dominance caused Intel to massively change spending, cancel projects, et cetera to accelerate C2D development.

In sum, because Intel has choices that AMD does not have, you cannot show a causal link between the C2D release and a change in behavior for AMD, as AMD couldn't (and still mostly can't) produce enough to gain any large measure of market share. They can only raise their ASP. C2D, then, only affects ASP, not market share, for AMD. This is borne out by the numbers.
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
Were it not for performance competition from Core 2, Barcelona and Phenom might have been released at lower clock speeds and/or earlier steppings, or not until a much later time, and all we'd have from AMD is a couple of speed bumps of Brisbane.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Originally posted by: bfdd
Originally posted by: eye smite
I'll say it again then, a little differently. AMD went from the market leader to trying to keep up, and that's because of implementation of their existing goals with no flexibility to market demands. As the Phenom matures over the next few weeks and months and AMD focuses strongly on developing this product, you're going to see Intel doing double steps to stay ahead. AMD changed the pace first, Intel switched gears and outpaced them. Now the pace is set and you're going to see real rumblin' bumblin' jumps across the boards as AMD focuses all they have on CPU development.

Again, like I said, I just don't see it happening this time around. A 10% clock-for-clock lead, plus a ton of headroom to release even faster CPUs. AMD's fastest quad you can buy right now, till the 2.4GHz gets ramped up, is 2.3, and it doesn't have shit on the Q6600 Conroe G0, so what makes you think they're going to catch up so soon, when they're not even competing against Intel's previous model? If the Phenoms were quite a bit cheaper, the 2.4 for like $220 or something, I would even consider buying one if the Spider quad-CrossFire platform paid off, but they're not. Mobos are expensive, CPUs are too expensive, it's not worth even considering at this point.

Yep. No processor series in history has leapfrogged a superior competitor without a major change (e.g. P4 Willamette to Northwood vs. the Athlon XP). The P4/PD could never meet and beat K8, even with ridiculous caches and ridiculous MHz ramping.

AMD will need a serious redesign to match Kentsfield, let alone Yorkfield and the future.
 

Darkskypoet

Member
Feb 15, 2007
42
0
0
@bfdd I agree that they really needed to price these chips lower. However, as I have heard non-AMD sources mention 3.5GHz on air for B3, I don't think it'll be that dire. Additionally, 4GHz-plus with Penryn isn't exactly stock cooling either... Intel has TDP to think about too. The OC headroom on C2D is phenomenal, but it should be, considering the money Intel dumps into manufacturing capabilities. I think if AMD can release at up to 3.2ish GHz at 65nm, they'll be OK, as long as 45nm via Chartered, TSMC, UMC, or themselves comes in around 5 or 6 months from now. Intel has no desire to slash prices on all existing inventory in channel, en route to channel, in assembly plants, etc. to drop in higher speeds at low prices. AMD, however, having much less in the way of channel stock on Phenoms, will have no such qualms. However, B3 would have to be a godsend for that to happen... Worst case is that the current ~$600 champ bumps down to $300, and the Q6600 (or its Penryn equivalent) disappears, as having a Q6600 variant out there at $200 or less would hurt ASP. I bet they sell more Q6600s now than any other speed grade, and because of their OCing headroom, much of our market won't give them 2x or 3x the money for the same core... (IMHO)

Also, current results from Phenom are not indicative of the family... They are indicative of a broken chip (one that can't take advantage of microcode updates, thus the need for a BIOS flash) and launch issues. Intel had issues with the TLB in C2D as well, but they fixed it with a microcode update... Yet another reason to flog them all at $200.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Originally posted by: Darkskypoet
@bfdd I agree that they really needed to price these chips lower. However, as I have heard non-AMD sources mention 3.5GHz on air for B3, I don't think it'll be that dire. Additionally, 4GHz-plus with Penryn isn't exactly stock cooling either... Intel has TDP to think about too. The OC headroom on C2D is phenomenal, but it should be, considering the money Intel dumps into manufacturing capabilities. I think if AMD can release at up to 3.2ish GHz at 65nm, they'll be OK, as long as 45nm via Chartered, TSMC, UMC, or themselves comes in around 5 or 6 months from now. Intel has no desire to slash prices on all existing inventory in channel, en route to channel, in assembly plants, etc. to drop in higher speeds at low prices. AMD, however, having much less in the way of channel stock on Phenoms, will have no such qualms. However, B3 would have to be a godsend for that to happen... Worst case is that the current ~$600 champ bumps down to $300, and the Q6600 (or its Penryn equivalent) disappears, as having a Q6600 variant out there at $200 or less would hurt ASP. I bet they sell more Q6600s now than any other speed grade, and because of their OCing headroom, much of our market won't give them 2x or 3x the money for the same core... (IMHO)

Also, current results from Phenom are not indicative of the family... They are indicative of a broken chip (one that can't take advantage of microcode updates, thus the need for a BIOS flash) and launch issues. Intel had issues with the TLB in C2D as well, but they fixed it with a microcode update... Yet another reason to flog them all at $200.
Oh, I know 4GHz+ isn't on stock cooling, but it seems like it would be easier for Intel to make a more efficient stock air cooler to cool stock 4GHz chips than it would be for AMD to compete with a 4GHz C2Q/C2D. I would LOVE to see the Phenom 2.2 priced well below $200. If the B3 stepping is really what the rumors say it is, it'd be awesome to purchase for a cheap quad, especially if the quad CrossFire scales well. I'm more interested in quad CrossFire and how that performs than I really am in the new Phenom. If it does well, you could very well see gamers' rigs going that direction vs. Intel and NVIDIA offerings, unless NVIDIA follows up right quick with cheap available cards to put in tri/quad SLI configurations, which again could help sell AMD CPUs if NVIDIA releases a chipset for AMD.
 

BigDH01

Golden Member
Jul 8, 2005
1,630
82
91
Originally posted by: eye smite
It really is a great job by AMD. I have been a fan of AMD since the K6-2 with 3DNow!. All of my systems save one are AMD, and that one is a P4 3.2GHz my company sold me for $100 when they laid me off; otherwise I wouldn't own it, as P4s were garbage IMO. I don't see any problems with yesterday's launch at all; it was pre-production chips and MBs you will obviously see mature. AMD has earned the rep of being a good developer of their products while maintaining backwards compatibility, i.e. Athlon XP and Athlon 64. You can't say that with Intel and their constant CPU and chipset changes. Let us not forget that for 3 years AMD trounced Intel with the Athlon 64; parts of that CPU Intel still hasn't equaled in innovation. Can anyone say FSB bottleneck? I see that people's expectations are too high, and AMD made too early a release of Phenom because of the noisy minority screaming for a new CPU. Many companies and home users are still on the rotting (and always was rotting) P4s. I have 8 AMD-based systems here ranging across Athlon 64s, X2s, and one Turion. I run World Community Grid through the BOINC client, and even the Turion, at 1400MHz slower, is turning in more units than that P4 system I have. How many people out there are still on Windows 98, and how many of those systems are physically dying and having to be replaced? What would you rather see them buy in the $300-400 price range, a Celeron D or an Athlon X2 system? Say it's your neighbor and they ask you to help with their system from time to time; which one would you rather be frustrated by?

If that was a 3.2C Northwood you bought from your company then you got a great deal. It was faster than the Athlon XPs of that time.

I see a problem with yesterday's launch. They launched a chip that is slower than Intel's slowest quad core part and which requires more power. The only thing competitive about their chip is value. This is not the position to be in if you are getting destroyed by the market leader and squeezed into margins you can't maintain to survive. I would also hesitate to call the products launched yesterday as "pre-production." I believe those are the products you will be buying at the store.

What's so great about AMD's backwards compatibility? The upgrade to Athlon 64 from Athlon XP required a new motherboard. And the first A64FXs required socket 940 boards with registered RAM. How long were those around? Oh yeah, then there's the replacement of socket 939 with AM2. Many people around here were upset about that. I had already sold my X2 for a C2D by the time that happened so I didn't care as much. Regardless, don't act as if Intel is the only company to obsolete a motherboard/socket.

I would hardly say the A64 trounced the P4 for 3 years. In case you've forgotten, the P4 remained competitive in many areas. Look up encoding or 3d rendering. It was hardly a clean sweep for the A64. I realize that many views have been distorted by a gaming-only mindset and the poor reputation of the P4, but for some people the P4 was the chip to have. It wasn't until the high clocked 64s and X2s that AMD finally took a definitive lead in just about everything.

I won't address the innovation comment. I think it's quite obvious that if we define aggregate innovation as being able to produce the fastest chip using the least amount of power at a competitive price then there is a clear winner. If you want to define innovative solely by the interconnect, then so be it. I think that's a fairly limited comparison and ignores architectural advances that exist in Core but to each their own. Your argument would also be more germane if the FSB really was a bottleneck. All indications I've seen are that the FSB is not a huge bottleneck. It certainly doesn't stop Core from outperforming Athlon64/Phenom (at least on the desktop).

AMD released the Phenom because it has to try and stop the bleeding. It can't just scrap all those expensive MCM chips it's produced even if they do have errata. They aren't in business to lose money.

Why would I be limited to a Celeron D in a 300-400 dollar system? I can get an Allendale for only slightly less than $20 more than an X2 3800+. And that's only if I opt for the OEM 3800+. If I go retail then the difference is less than $15. Worth it in my opinion. Tack on another 50 dollars or so for a motherboard with integrated graphics and you are well on your way to a dirt cheap Core 2 system, albeit one with crappy graphics performance. (All prices from Newegg)

My first AMD chip was the K6. I've owned just about every iteration of AMD chip after that. It doesn't mean I blindly worship AMD. This launch is certainly lackluster if not a complete failure.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: bfdd
Oh, I know 4GHz+ isn't on stock cooling, but it seems like it would be easier for Intel to make a more efficient stock air cooler to cool stock 4GHz chips than it would be for AMD to compete with a 4GHz C2Q/C2D. I would LOVE to see the Phenom 2.2 priced well below $200. If the B3 stepping is really what the rumors say it is, it'd be awesome to purchase for a cheap quad, especially if the quad CrossFire scales well. I'm more interested in quad CrossFire and how that performs than I really am in the new Phenom. If it does well, you could very well see gamers' rigs going that direction vs. Intel and NVIDIA offerings, unless NVIDIA follows up right quick with cheap available cards to put in tri/quad SLI configurations, which again could help sell AMD CPUs if NVIDIA releases a chipset for AMD.

Haven't you seen the new Extreme Edition HSF? It's nothing fancy but an improvement on the older style HSFs.

I doubt a Phenom 9500 will drop below $200; on simple price/performance alone it deserves a price point of $200.

B3 is meant to allow AMD to scale Phenoms to HIGHER clock speeds, at HIGHER price points, to compete with Intel's higher-end offerings. You are hoping for B3 to push prices down, which I doubt will happen. AMD may decide to cut prices for competitive reasons, but it will have nothing to do with B3.

I'll reserve judgement on quad CF until I see some actual benches. I'm not a fan of multi GPUs to be honest, it is very energy inefficient and I believe you will see the law of diminishing returns kick in with quad CF.

Remember, having 4 512MB cards in CF does NOT mean you have the equivalent of a 2GB monster on your hands; you are still effectively limited to 512MB of VRAM, as the data is mirrored in the RAM of all 4 cards. This is the Achilles' heel of CF (and SLI), and you can already see this VRAM limitation when comparing SLI between the 8800GT (512MB) and 8800GTX/Ultra (768MB) at higher resolutions.
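The mirrored-VRAM point above can be sketched in a couple of lines (a hypothetical helper for illustration, not any real driver API): under alternate-frame rendering, every card holds a full copy of the textures and buffers, so usable VRAM is bounded by a single card, not the sum.

```python
def effective_vram_mb(cards_mb):
    """Approximate usable VRAM for an AFR CrossFire/SLI setup.

    Framebuffers and textures are mirrored on every card, so the
    usable pool is limited by the smallest card, not the total.
    """
    return min(cards_mb)

# Four 512MB cards in quad CF: still ~512MB usable, not 2048MB.
print(effective_vram_mb([512, 512, 512, 512]))  # 512
```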
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
The Northwoods as a whole were not faster than overclocked XPs. I had an OCed NW 2.6 at 3.06GHz; my buddy had his XP at around 2.8GHz and cleaned my clock. I did everything I could to boost my favourite top-of-the-line Intel rig but could not touch his scores.

Current Windsor 65nm hardware from AMD is right behind the Kents, with Phenom coming a little closer when put clock for clock (overclocked or not).
Intel holds the crown and AMD will continue with innovation; let's just put the fanboyism away and move on.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: BigDH01
If that was a 3.2C Northwood you bought from your company then you got a great deal. It was faster than the Athlon XPs of that time.

I see a problem with yesterday's launch. They launched a chip that is slower than Intel's slowest quad core part and which requires more power. The only thing competitive about their chip is value. This is not the position to be in if you are getting destroyed by the market leader and squeezed into margins you can't maintain to survive. I would also hesitate to call the products launched yesterday as "pre-production." I believe those are the products you will be buying at the store.

What's so great about AMD's backwards compatibility? The upgrade to Athlon 64 from Athlon XP required a new motherboard. And the first A64FXs required socket 940 boards with registered RAM. How long were those around? Oh yeah, then there's the replacement of socket 939 with AM2. Many people around here were upset about that. I had already sold my X2 for a C2D by the time that happened so I didn't care as much. Regardless, don't act as if Intel is the only company to obsolete a motherboard/socket.

I would hardly say the A64 trounced the P4 for 3 years. In case you've forgotten, the P4 remained competitive in many areas. Look up encoding or 3d rendering. It was hardly a clean sweep for the A64. I realize that many views have been distorted by a gaming-only mindset and the poor reputation of the P4, but for some people the P4 was the chip to have. It wasn't until the high clocked 64s and X2s that AMD finally took a definitive lead in just about everything.

I won't address the innovation comment. I think it's quite obvious that if we define aggregate innovation as being able to produce the fastest chip using the least amount of power at a competitive price then there is a clear winner. If you want to define innovative solely by the interconnect, then so be it. I think that's a fairly limited comparison and ignores architectural advances that exist in Core but to each their own. Your argument would also be more germane if the FSB really was a bottleneck. All indications I've seen are that the FSB is not a huge bottleneck. It certainly doesn't stop Core from outperforming Athlon64/Phenom (at least on the desktop).

AMD released the Phenom because it has to try and stop the bleeding. It can't just scrap all those expensive MCM chips it's produced even if they do have errata. They aren't in business to lose money.

Why would I be limited to a Celeron D in a 300-400 dollar system? I can get an Allendale for only slightly less than $20 more than an X2 3800+. And that's only if I opt for the OEM 3800+. If I go retail then the difference is less than $15. Worth it in my opinion. Tack on another 50 dollars or so for a motherboard with integrated graphics and you are well on your way to a dirt cheap Core 2 system, albeit one with crappy graphics performance. (All prices from Newegg)

My first AMD chip was the K6. I've owned just about every iteration of AMD chip after that. It doesn't mean I blindly worship AMD. This launch is certainly lackluster if not a complete failure.

You raise many good points. NetBurst was ultimately a failure because Intel ran into severe thermal issues with Prescott. The architecture itself, whilst lacking in IPC, was capable of clocking into the stratosphere; it was simply overcome by current leakage on the 90nm process. 65nm actually fixed the leakage issues somewhat, and people were overclocking to 5GHz+ on air with the later 65nm P4s, but by then the world had moved on to dual core, and even 65nm PDs were relative ovens compared to the more efficient X2s. Then of course came C2D, and the rest, as they say, is history...

You are correct in saying that P4 was competitive in rendering and encoding, mainly thanks to its SSE2 implementation, which AMD did not incorporate until their Venice A64s, if my memory serves me correctly.

People used to harp on the gaming prowess of A64 compared to P4, and whilst at low resolutions and detail settings it did trounce P4, in 'real world gaming' situations the difference was nowhere near as apparent.

Notice how nowadays, people emphasize the importance of GPU over CPU in the overall gaming experience? 'GPU limited' is a popular term amongst enthusiast gamers nowadays.

The same applied between the P4 and A64, once you ran games at higher details and resolutions, the limiting factor was predominantly the GPU, not the CPU. I tried pointing this out way back in 2003, to be shot down by fanboys saying 'STFU, look at the benchmarks, A64 pwns P4 at gaming!'.

In terms of upgrade paths, both AMD and Intel have been guilty of killing off upgrade paths on their platforms.

Since 2003, AMD has released S940, S754, S939, and AM2 (and now AM2+ obviously) for the desktop. That's roughly 1 new socket every year, and the one that was most disappointing was the discontinuation of S939, as it left a lot of enthusiasts out in the cold.

Intel, whilst having stayed on S775 for the past few years, tends to introduce new VRM requirements with each new generation of processor, and keeps introducing new chipsets like it's going out of fashion. Sometimes the new chips will work on older boards, sometimes they won't; it's really a crap shoot.

A lot can be argued over the technical merits of the native vs MCM approach, and frankly I would leave it to more technically savvy people to debate those points, but at the end of the day it's not hard to see who won this round. Intel's MCM quads may not be the most elegant design, but the fact remains that Intel's slowest quad is faster than AMD's fastest quad, and that's a rather damning fact. Even with the native approach, Phenom couldn't achieve clock-for-clock parity with a 12-month-old C2Q design, let alone the upcoming 45nm versions.

People have to understand the relative strengths and weaknesses of Phenom vs Core 2. K10 is strongest in servers, where its scalability with increased core/socket count is far superior to the FSB-constrained Core 2 Xeons.

However, desktop performance is the strength of Core 2. The bandwidth advantage of HT vs the FSB really doesn't mean much in typical desktop applications. That is the hard truth. Sure, AMD does show better scaling on heavily threaded applications (see Cinebench scaling vs Core 2), but it's still merely playing a game of catch-up, rather than leapfrog. Ultimately, IPC still rules on the desktop, and all that extra bandwidth means little in real-world desktop usage.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: AlabamaCajun
The Northwoods as a whole were not faster than overclocked XPs. I had an OCed NW 2.6 at 3.06GHz; my buddy had his XP at around 2.8GHz and cleaned my clock. I did everything I could to boost my favourite top-of-the-line Intel rig but could not touch his scores.

Current Windsor 65nm hardware from AMD is right behind the Kents, with Phenom coming a little closer when put clock for clock (overclocked or not).
Intel holds the crown and AMD will continue with innovation; let's just put the fanboyism away and move on.

Well, considering 3.06GHz is hardly a 'stellar' overclock on a Northwood C, whilst 2.8GHz is an EXCEPTIONAL overclock on an XP, what did you expect? You're comparing a shitty (no offence) Northwood overclock to an XP overclock that is, quite frankly, unrealistic except for cherry-picked XP-Ms with exceptional cooling. Most 'normal' XPs were hitting 2.3 - 2.4GHz max.

Please don't take offence when I say that I am more qualified than you to comment on this, since I owned both a Northwood C and an AXP system. My 2.6C could overclock to 3.4GHz stable, whilst my XP-M 2500+ (the best XP for overclocking) maxed out at 2.7GHz, and this was with the SP-90, one of the best Socket A heatsinks ever made.

The end result was very close between the two in terms of benchmarking, but I would still give the nod to the P4 in terms of general day-to-day use due to HT. It was simply more responsive than the XP system, and multitasking was definitely smoother. People often like to pass off HT as a gimmick, but in the days before mainstream dual core, P4s w/HT were the only CPUs with multithreading abilities, albeit limited at that.

And dude, Windsor is 90nm, Brisbane is 65nm, and it ain't anywhere close to Core 2. Even Phenom is 10% slower clock for clock, and that will grow to 15% once 45nm Core 2s hit. You want people to put away fanboyism, when you show the classic signs of fanboyism yourself. Rather ironic, isn't it?
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
I can only comment on what I remember from three years ago. I was never a master overclocker, and with it being my main rig, that was a bad thing.
Now, I do remember his rig was a mobile and he had it screaming, but his speed was somewhere around 2.8; I would have to ask him.
The thing about HT is that it was great on most of the Microsoft tools and with multiple apps running, but it did nothing for games.
Even with all the beating I took, it is still a good rig for day-to-day and other office tasks.
Little did I know back then that I would shift to AMD; having built 8085 and 80188 systems from boards, I was no stranger.

Systems today are taking on a whole new dimension with media servers, PVRs, MPCs and all the small form factors.
Up until 1997, I think, it was a little more of an undertaking to put together a system from scratch and get parts that played well together.
CPUs back then were expensive, mobos were about the same price as today ($50 to $200 and up), but RAM always seemed expensive.
The K-series were out, just trailing the early but slowly rising Pentiums, and it wasn't until K8 that we finally saw a nice jump.
Sitting here today, it's quite simple to collect a bunch of parts and build a ripping rig from either chip, and spend what you want.
 

Daverino

Platinum Member
Mar 15, 2007
2,004
1
0
I think it's delusional to think that going monolithic was not an 'engineering' choice but a 'management' one. I am certain that the engineers at AMD had a loud and active voice in what design route they took on the Phenom. This is not a software or consulting firm where design specs can come from stakeholders without knowledge of the engineering required. A LOT of people at AMD must have thought going monolithic was a good idea, not just a cabal of know-nothing pencil pushers at the top. Iteration after iteration of design and planning went into their decision, and they decided it was feasible and advantageous to go monolithic. And, in honesty, when they made that decision, based on where Intel was at the time, it could have seemed like the correct one. But the engineering was harder than expected, and Intel was able to make its dual-die quad cores exceed everyone's expectations. AMD played a pair of tens when Intel had a pair of aces. But the wagers were made a long time ago, and everyone at AMD, engineers and all, knew their cards.
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Ugh, another "just wait for the next stepping" thread. I thought once we reached official launch, rabid AMD supporters would finally give up on this phantom stepping. For months and months we heard "you can't judge Phenom until we get real benchmarks of shipping chips" and "AMD has said +40% performance per clock over C2D." Well, news flash, the chip is launched and we have real benchmarks and they are a huge letdown. No need for anyone to make excuses or apologize for AMD, the numbers are what they are.

Also, if you think Intel would still be selling NetBurst chips today if AMD did not exist, I think you're fooling yourself. Even if there were no competition, Intel would ALWAYS be motivated to innovate and create faster, more efficient chips. NetBurst did not produce the results that Intel expected at 90nm, and they had a viable alternative architecture already in the pipeline, so they made the switch, which was the correct thing to do. Core would have come to the desktop with or without AMD. If Intel stopped creating faster chips, they would die even if AMD were dead and gone, because no one would feel the need to upgrade more than once every 4-5 years. Pushing out new, exciting technology allows the company to keep growing, and even if AMD is forced out of the market, Intel will innovate. It's a simple business decision.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Originally posted by: harpoon84
Originally posted by: bfdd
Oh, I know 4GHz+ isn't on stock cooling, but it seems like it would be easier for Intel to make a more efficient stock air cooler to cool stock 4GHz chips than it would be for AMD to compete with a 4GHz C2Q/C2D. I would LOVE to see the Phenom 2.2 priced well below $200. If the B3 stepping is really what the rumors say it is, it'd be awesome to purchase for a cheap quad, especially if the quad CrossFire scales well. I'm more interested in quad CrossFire and how that performs than I really am in the new Phenom. If it does well, you could very well see gamers' rigs going that direction vs. Intel and NVIDIA offerings, unless NVIDIA follows up right quick with cheap available cards to put in tri/quad SLI configurations, which again could help sell AMD CPUs if NVIDIA releases a chipset for AMD.

Haven't you seen the new Extreme Edition HSF? It's nothing fancy but an improvement on the older style HSFs.

I doubt a Phenom 9500 will drop below $200; on simple price/performance alone it deserves a price point of $200.

B3 is meant to allow AMD to scale Phenoms to HIGHER clock speeds, at HIGHER price points, to compete with Intel's higher-end offerings. You are hoping for B3 to push prices down, which I doubt will happen. AMD may decide to cut prices for competitive reasons, but it will have nothing to do with B3.

I'll reserve judgement on quad CF until I see some actual benches. I'm not a fan of multi GPUs to be honest, it is very energy inefficient and I believe you will see the law of diminishing returns kick in with quad CF.

Remember, having 4 512MB cards in CF does NOT mean you have the equivalent of a 2GB monster on your hands; you are still effectively limited to 512MB of VRAM, as the data is mirrored in the RAM of all 4 cards. This is the Achilles' heel of CF (and SLI), and you can already see this VRAM limitation when comparing SLI between the 8800GT (512MB) and 8800GTX/Ultra (768MB) at higher resolutions.

1. I know that the new HSF for the Extreme Edition is basically the same, but they could go with something much better, charge a little premium, and hit the higher clock speeds. I'm not talking Scythe Ninja, Tuniq Tower, or TR Ultra-120 crazy, but they could make something much bigger and better.

2. I realize that it doesn't add the RAM together and only uses the RAM on the main card; that's the biggest downside to both, everyone knows that. But the raw processing power is what's amazing. I only game at 1680x1050, and current cards are more than enough in single form or CrossFire/SLI.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: bfdd
I realize that it doesn't add the RAM together and only uses the RAM on the main card; that's the biggest downside to both, everyone knows that. But the raw processing power is what's amazing. I only game at 1680x1050, and current cards are more than enough in single form or CrossFire/SLI.

Then what exactly is the appeal of quad CF in a practical sense? As you said, at lower resolutions, 'regular' CF/SLI is already more than enough for playable framerates. Quad CF in this case would purely be for bragging rights. I don't care about raw processing power if it can't be harnessed in a practical manner.

At higher resolutions, say 1920x1200 or 2560x1600, the 512MB VRAM will become a bottleneck. Just look at 8800GTS 320 SLI benchmarks to see what I mean. Past a certain point, once you run out of VRAM, performance just will not scale, no matter how many GPUs you throw into the mix.

 

Darkskypoet

Member
Feb 15, 2007
42
0
0
Originally posted by: Daverino
I think it's delusional to think that going monolithic was not an 'engineering' choice but a 'management' one. I am certain that the engineers at AMD had a loud and active voice in what design route they took on the Phenom. This is not a software or consulting firm where design specs can come from stakeholders without knowledge of the engineering required. A LOT of people at AMD must have thought going monolithic was a good idea, not just a cabal of know-nothing pencil pushers at the top. Iteration after iteration of design and planning went into their decision, and they decided it was feasible and advantageous to go monolithic. And, in honesty, when they made that decision, based on where Intel was at the time, it could have seemed like the correct one. But the engineering was harder than expected, and Intel was able to make its dual-die quad cores exceed everyone's expectations. AMD played a pair of tens when Intel had a pair of aces. But the wagers were made a long time ago, and everyone at AMD, engineers and all, knew their cards.

I think it's delusional to accept that everyone thought a monolithic die was a great idea. Yes, there will be debates, but ultimately management decides (and hopefully these are engineers too) what goes forward as a project. Regardless of where Intel is, what matters is where AMD's manufacturing process is and their capacity to produce chips. Both were limited, and in both cases MCM makes more sense. Intel, more than any other IC firm, can dedicate an entire fab to making a monolithic die work without harming production of what's selling now.

I just get the sense that the decision to go monolithic was made on a less than sound strategic basis. As an engineering feat, lovely: lots of transistors, hugely complex, yay! As for good business sense, or utilization of scarce fab space and small wafers? Bad choice. Plain and simple, whether the engineers weren't reined in or leadership decided to gamble, either way: bad idea, bad leadership, bad management. Lots of cojones, though.
 

Darkskypoet

Member
Feb 15, 2007
42
0
0
Originally posted by: SexyK
Ugh, another "just wait for the next stepping" thread. I thought once we reached official launch, rabid AMD supporters would finally give up on this phantom stepping. For months and months we heard "you can't judge Phenom until we get real benchmarks of shipping chips" and "AMD has said +40% performance per clock over C2D." Well, news flash, the chip is launched and we have real benchmarks and they are a huge letdown. No need for anyone to make excuses or apologize for AMD, the numbers are what they are.

Also, if you think Intel would still be selling NetBurst chips today if AMD did not exist, I think you're fooling yourself. Even if there were no competition, Intel will ALWAYS be motivated to innovate and create faster, more efficient chips. NetBurst did not produce the results that Intel expected at 90nm and they had a viable alternative architecture already in the pipeline, so they made the switch which was the correct thing to do. Core would have come to the desktop with or without AMD. If Intel stops creating faster chips, they will die even if AMD is dead and gone because no one would feel the need to upgrade more than once every 4-5 years. Pushing out new, exciting technology allows the company to keep growing, and even if AMD is forced out of the market, Intel will innovate. It's a simple business decision.


Not another "wait till the next stepping" thread; however, B3 will be better, as most later steppings are. I waited to get a G0 over a B3 from Intel for clients' Q6600s... Something wrong with that? It made it easier to hit 3.6GHz on air. So if I do the same for AMD's B3, is that bad? Or wrong?

The main point is that something is wrong with the benchmarks. If they all uniformly sucked, that's fine: bad chip. But when you get unreasonable, nonsensical results, you wait and investigate, same as I'd do for Intel when checking new silicon. Prescott sucked, we figured it would, but a lot of us waited to see how it was going to go. Poorly; hated them vs. Northwoods. But if Prescott, or C2D for that matter, had launched with weird results and some strange issues, that should at least give people pause to find out what happened. I'm that way because I'm curious, and also because if a chip is going to suck, let it suck with all its faculties. If a BIOS flash fixes a few things (there's no updating microcode for AMD chips to fix issues) and the results turn out to be a little less neurotic, cool. If not, and they're just broken past a certain frequency, chalk up another failure for the stupidly large monolithic core on 65nm.

However, an architecture can redeem itself. Simply look back to Thoroughbred A vs. Thoroughbred B: massive difference there, and only one spin away. We'll have these two architectures for the better part of a year or more; lots can happen. Right now, though, it's more curiosity about the why for me. Aren't you curious? Didn't you interrogate the results? Don't they make you ask a few questions? If not, that's cool, man; I build C2Ds for people all the time, especially for designer stations. It's the best game in town. But I like asking why, as do many people involved in this stuff... or so I had thought.



As to the second part: if there were no competition, Intel would produce whatever made it the most money and allowed it to engage in as close to perfect price discrimination as possible. That's the definition of a monopoly. As well, you would see hard locks limiting overclocking in as draconian a fashion as possible, as that ensures better price discrimination.

The core type is inconsequential. If there is only one player, they seek monopolistic rent and price discrimination; they also don't run that many design teams in parallel, as it costs shareholders too much money.

Considering the P4 was an exercise in marketing, chasing higher clock speeds because AMD beat them to certain milestones, it's debatable whether they would have gone NetBurst at all without AMD. The cheaper, more profitable bet was incremental changes to the P3, following their past history absent a competitive AMD: incremental changes to the Pentium line... P5, P6, PII, PIII, P-M, etc.

 

DrMrLordX

Lifer
Apr 27, 2000
22,000
11,563
136
So do we have confirmation that B2 stepping Phenoms have broken dual-channel memory support as a function of their memory controller? Or is this an AM2+ BIOS issue on boards used in testing? I took a quick look at the Sandra memory benchmark numbers on Tom's review (yech) and did notice that the memory bandwidth numbers looked stupidly low for Phenom.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: eye smite
I wasn't being shortsighted about where the competition is important; I was focusing on the fact that if AMD hadn't been so competitive, you'd still be deluded that NetBurst was a great CPU when it was garbage. When P4s first came out, you could clock them down to 1GHz, the highest the P3 went at the time, and the P3 would run circles around it, so we knew way, way back then Intel was building garbage. It took years for Intel to earn my contempt on this, and it sure isn't going to disappear overnight. When Intel shows business dealings and customer-based decisions like AMD has, then they might get a bit of respect from me. Oh my, look: it's only been a year and a half, and they're going to retrain the whole Intel base again with no backwards compatibility for current system builders and owners when the new core comes out with DDR3 support. Go figure.

Where do you guys come up with crap like this? My newest system has an LGA 775 "socket", with a Kentsfield Q6600 in it now, yet it can run the latest Penryn. Oh yeah, and it has DDR2 in it at the moment, yet can also use DDR3 anytime I see fit. On the other hand, AMD has changed sockets and chipsets three times in the last 16 months.

Originally posted by: BigDH01
I would hardly say the A64 trounced the P4 for 3 years. In case you've forgotten, the P4 remained competitive in many areas. Look up encoding or 3d rendering.

You're right, the A64s absolutely trounced all of the P4s in everything except video encoding and/or 3D rendering. That would make it the best processor for what, 97% of the population? That means he was right.

Why would I be limited to a Celeron D in a 300-400 dollar system? I can get an Allendale for only slightly less than $20 more than an X2 3800+. And that's only if I opt for the OEM 3800+. If I go retail then the difference is less than $15. Worth it in my opinion. Tack on another 50 dollars or so for a motherboard with integrated graphics and you are well on your way to a dirt cheap Core 2 system, albeit one with crappy graphics performance.

Very good point.

Originally posted by: harpoon84
People used to harp on the gaming prowess of A64 compared to P4, and whilst at low resolutions and detail settings it did trounce P4, in 'real world gaming' situations the difference was nowhere near as apparent.

Notice how nowadays, people emphasize the importance of GPU over CPU in the overall gaming experience? 'GPU limited' is a popular term amongst enthusiast gamers nowadays.

That depends completely on which games you're playing, actually. While nearly all FPSs are more or less totally GPU-limited, other games, like flight simulators and online multiplayer RPGs, are about as CPU-limited as FPSs are GPU-limited.
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
Yes, the low numbers on the RAM caught me off guard too. From what I understand, the RAM controller now runs 200MHz slower than the clock speed, killing the excellent bandwidth we used to see on AMD procs. It sucks when you see new DDR2 tested on an Intel platform doing 7200 where AMD's did 9100 (YMMV). I hope they fix this, but it appears to be a deliberate tweak on the new controller. What may make up for it is the split-channel ability of the two cores to each get one channel on AM2+ boards.
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
Originally posted by: SexyK
~
Also, if you think Intel would still be selling NetBurst chips today if AMD did not exist, I think you're fooling yourself. Even if there were no competition, Intel will ALWAYS be motivated to innovate and create faster, more efficient chips. NetBurst did not produce the results that Intel expected at 90nm and they had a viable alternative architecture already in the pipeline, so they made the switch which was the correct thing to do. Core would have come to the desktop with or without AMD. If Intel stops creating faster chips, they will die even if AMD is dead and gone because no one would feel the need to upgrade more than once every 4-5 years. Pushing out new, exciting technology allows the company to keep growing, and even if AMD is forced out of the market, Intel will innovate. It's a simple business decision.

Actually, I think Intel would still be selling some Itanic-based junk with HT in it that would barely beat a Sempron. To be more respectful: while I personally thought Conroe sucked, I admire the Harpertowns and Yorkfields. Nehalem intrigues me, but not enough to cross the bus divide.
 

Darkskypoet

Member
Feb 15, 2007
42
0
0
The memory scores are sad... Some reviewers mentioned it didn't matter whether one stick or two were in the system; it performed the same, which is why I say wait a bit for some sensible results. Memory scores should break 10000, not sit around 6K like some sites showed. Damn, my OC'd X2 3600 beats the crap out of it; that's not right. Never mind the impeded cache speed (the L3 running at 2GHz because of another issue with the BIOS, or the TLB of the cache).
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The L3 cache runs at the HT speed by design. It's not a bug, although you could call it an issue.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
While there are rumors of a TLB bug, just to be clear, that is not the reason the L3 cache runs at 2GHz.
 