Desktop CPU upgrades have now shifted to a 20-year cycle.

Page 6 - AnandTech community

cytg111

Lifer
Mar 17, 2008
23,550
13,115
136
This has just been on my mind since I first saw this thread, and I'm not sure if it's already been mentioned, but does the slowdown in CPU performance development have anything to do with the not-so-fierce competition from AMD's side for the past couple of years?

Yes.
 

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
I think there have been numerous threads of people expressing the opinion that the upgrade cycle is getting longer, but 20 years is a heck of an exaggeration.

You have to look at the reasons that drive home users to upgrade.
For gamers and those in technical fields who need the extra horsepower, the upgrade cycle will always be short (1-3 years), but in truth, these users are the minority.

For the casual users, which I imagine are the vast majority, yes, the upgrade cycle will be much longer. Had we stopped at Word and Excel, yes, there would be a 20-year upgrade cycle, provided you're okay with looking at an ugly giant beige case. But more demanding internet content, antivirus, Flash games, and ads are probably what drive the perception that one's computer is slow and an upgrade is needed.

With that being said, the typical internet speed for the vast majority of internet users has not increased much in the past few years and probably will not, so the amount of content streaming from the internet that could clog your CPU cycles will probably not get drastically worse. We've enjoyed a smooth ride since the old 13" CRT monitor days, with each improvement in larger screen technology driving the need for processing power to display what's on the screen. We are now approaching the size limit of what's reasonable for a desktop monitor. 1080p on a 27" monitor already looks really good, so don't expect the typical home user to be too driven to upgrade to the latest Skylake in an effort to display more stuff on screen. I wouldn't be surprised at all if this upgrade cycle goes to 7 years or even a bit more. The majority of non-gamers I know are running fairly old computers (older than 5 years).

That being said, I do believe VR will provide quite a boom in home computing demand, as VR will hopefully take center stage as the new must-have for all home entertainment systems. A leap in internet bandwidth (fiber optics everywhere streaming 4K) would also all of a sudden shorten the upgrade cycle once more.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Ask yourself how nVidia destroyed AMD in the discrete graphics segment. Performance/watt again.

I'd say it's Nvidia's superior performance per transistor and die size, its marketing, and AMD's inability to compete directly thanks to constant financial losses.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
'Tis price:performance, nay?

My ol' 965 BE cost me £80. An i5 2500K cost £160, offering around 40-45% better performance. For 200% of the price, I'm expecting at least 200% of the performance.

There's just nothing to upgrade to. Sure, there are those costly Intel proccies, but damned if I'm paying more than double for just a bit more than half again as much performance.

'Tis the used market that people are buying from, I'd wager. A used i5 2500k goes for £110, and a used 965 BE goes for £60. I've seriously considered going for the used 2500k, but...Eh. If it ain't broke, don't fix it.
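The arithmetic above can be sketched as a quick ratio check (figures from the post; the function name is my own, purely illustrative):

```python
def value_ratio(price_old, price_new, perf_gain):
    """Compare cost scaling against performance scaling between two CPUs.

    perf_gain is the fractional performance uplift of the new part
    (e.g. 0.45 for "around 45% faster").
    """
    price_mult = price_new / price_old  # how much more you pay
    perf_mult = 1.0 + perf_gain         # how much more you get
    return perf_mult / price_mult       # >= 1.0 means the price scaling is justified

# Figures from the post: 965 BE at £80 vs i5 2500K at £160, ~45% faster.
ratio = value_ratio(80, 160, 0.45)
print(f"{ratio:.3f}")  # 1.45x the performance for 2x the price -> 0.725
```

By this yardstick, the 2500K delivers only about 72.5% of the per-pound value of the 965 BE, which is the complaint being made.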
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
We are now approaching the size limit of what's reasonable for our desktop monitor. 1080P on a 27" monitor already looks really good so don't expect the typical home user to be too driven to upgrade to the latest skylake in an effort to display more stuff on screen.
CPU has nothing to do with resolution.
 

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
CPU has nothing to do with resolution.

Not directly, but higher resolutions and bigger screens lead to more Flash/scripts/programs running on your screen. Compare web layouts from the old 640x480 days to now. There's much more going on when you load up a webpage, because the majority of them assume you have at least a 1366x768 or even a 1080p display, so they dump more visuals at you. More screen area or higher resolution lets you do more things, which in turn increases the need for more powerful CPUs. What I'm saying is that our screen size and resolution are near the end of the desktop road for most people.
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
Provided your desktop case is metal, it should be shielded from emp. Make sure you've got a good surge protector though.
I've read tales about telegraph wires acting as long antennas and starting fires in telegraph offices. Surge protector or no, our cases are imperfect Faraday cages at best. When the defense department first learned of nuclear-driven EMPs back in the '60s (I had a high school classmate tell me what her boyfriend was up to, if you can believe it), they discovered that Faraday cages weren't enough. EMP-proofing is hard.

We're all dead meat the next time the Sun burps in our direction. We ought to be buying EMP-resistant gear now, but no one will. Try driving your car . . .

Back to the main thread: jetliners let you know what climax design looks like. People still ride on 20-year-old planes. Heck, people still fly in 50-60-year-old propeller things, though 40-50-year-old planes are more common.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
Not directly, but higher resolutions and bigger screens lead to more Flash/scripts/programs running on your screen. Compare web layouts from the old 640x480 days to now. There's much more going on when you load up a webpage, because the majority of them assume you have at least a 1366x768 or even a 1080p display, so they dump more visuals at you. More screen area or higher resolution lets you do more things, which in turn increases the need for more powerful CPUs. What I'm saying is that our screen size and resolution are near the end of the desktop road for most people.
Sure, but for what you're saying, years have to pass and everybody has to move to a higher res; someone getting a 4K monitor today won't have to deal with 4K amounts of Flash on webpages or whatever else.
That by going to 4K you automatically load up more programs at once and use cascading or whatever to see them all at once, I don't buy.
You don't have to see everything at once to do a lot of work; if you're not doing it at 1080p, you won't do it at 4K.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
For 200% of the price, I'm expecting at least 200% of the performance.
Name anything else on Earth that follows this rule. I can think of exactly zero. Why would CPUs have to follow a precedent that no other product follows?
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,929
405
126
For 200% of the price, I'm expecting at least 200% of the performance.

Name anything else on Earth that follows this rule. I can think of exactly zero.

* Small vs large coke bottle. You get more than 2x size for 2x price.

* Light bulbs. You get more than 2x wattage for 2x price.

* Small embedded CPUs. You get more than 2x performance for 2x price.

The list goes on...
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
* Small vs large coke bottle. You get more than 2x size for 2x price.

* Light bulbs. You get more than 2x wattage for 2x price.

* Small embedded CPUs. You get more than 2x performance for 2x price.

The list goes on...

Please quote the guy who said a single thing about volume per dollar. The guy I quoted mentioned volume exactly zero times.

edit: Are you able to give an example or two of this magical embedded CPU, with free performance?
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,929
405
126
Please quote the guy who said a single thing about quantity per dollar. The guy I quoted mentioned quantity exactly zero times.

Sure, but number two is about performance to some degree, and certainly number three. Anyway, one example is enough to prove the claim that zero cases exist incorrect.

There are lots of similar cases once you start thinking about it.

edit: Are you able to give an example or two of this magical embedded CPU, with free performance

Hmm... free performance? I said 2x performance for 2x price. So you don't get it for free, you have to pay twice as much.
 
Last edited:

westom

Senior member
Apr 25, 2009
517
0
71
I'd say it's Nvidia's superior performance/transistor/die size, marketing, and inability for AMD to directly compete thanks to constant financial losses.
AMD's problems become obvious once you learn some well-proven concepts in innovation. AMD once made inferior CPUs. Then AMD bought an Asian CPU design house and decreed that no management was allowed to talk to or interfere with those designers. This happened when Intel stumbled with the Pentium 4. Intel's architects, unfortunately, had designed a new Pentium that required too much silicon, and Intel had to cut features to make manufacturing possible.

Intel's modified design meant (for example) that the order for loading the SI, DI, and CX registers (loaded for the various CMPSx, STOSx, and MOVSx instructions) was critical. Compilers had to be rewritten to optimize how those instructions executed; otherwise an Intel CPU could stall due to congestion in its execution pipelines. AMD processors did not need optimized code, and they were faster when programs were not compiled in an optimized manner.

Moving on - transistors became so tiny that CMOS gate oxides (silicon dioxide - glass) were only three atoms thick. This caused AMD and Intel CPUs to leak current and run hot. Everyone knew the solution required a high-k material, but no one could make one work. At one point, IBM tried one in production; the ICs failed (peeled off) during manufacturing.

Intel ignores concepts taught in business schools. Spreadsheet logic can only report on what happened four-plus years ago. Bean-counter management is why innovation gets stifled, and then profits diminish four or ten years later.

Intel follows Moore's Law. That means making decisions based upon the product - not upon half-truths on spreadsheets. Intel realized Moore's Law would be violated without some solution to current leakage in gates - and the resulting high temperatures. So Intel did something that most companies (that later fail) refuse to do: it risked the entire company on a high-k material involving hafnium. Intel committed designs (that would appear two or more years later) to using hafnium. Since the president came from where the work gets done (not from the finance department and not from business-school philosophies), he could risk the company on what he believed was possible.

If hafnium did not work, Intel faced major new-product disruptions for years. That is a major risk.

Today Intel processors are faster and cooler than AMD's - because Intel took the risk and made hafnium work. AMD did what business-school graduates do: make decisions based on costs and profits. So AMD no longer has those profits. Only innovation makes profits happen. AMD decided to play it safe - and not innovate - as any business-school graduate is trained to do.
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Anyway, one is enough to prove the point that zero cases exist to be incorrect.
You should do some research into strawman arguments. Nobody in this thread has said this, including myself. <--- Edit: I'm sorry, I had forgotten that I phrased my response the way that I did.
I retract the strawman accusation.
Hmm... free performance? I said 2x performance for 2x price. So you don't get it for free, you have to pay twice as much.
You get more than 2x performance for 2x price.
Nope, as quoted here, you said that you get free performance, by buying some unnamed embedded CPU.
 
Last edited:

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Name anything else on Earth that follows this rule. I can think of exactly zero. Why would CPUs have to follow a precedent that no other product follows?

Playstation 2 - £300 - Launch
Playstation 3 - £425 - Launch
Playstation 4 - £340 - Launch

AMD Radeon HD 6670 1GB DDR3 VTX3D - £60 - Post-Launch
AMD Radeon HD 7850 2GB GDDR5 Sapphire - £125 - Post-Launch.

Questions?

Nope, as quoted here, you said that you get free performance, by buying some unnamed embedded CPU.

Uh, free performance?

Are you being serious here? I ask because playing stupid does you no favour. You see, I made a rather popular comparison, known as price:performance. I mentioned it in my earlier post.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
* Light bulbs. You get more than 2x wattage for 2x price.

That doesn't say anything. It's like saying a twice-as-expensive car uses twice as much fuel.

Instead, just say you can get twice the lumens (or more) for twice the price; then you'd have a valid argument. Assuming those required prices match real-world pricing.
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Playstation 2 - £300 - Launch
Playstation 3 - £425 - Launch
Playstation 4 - £340 - Launch
Okay, that's a great example. It's actually the kind of example I was trying to think of/find (edit: the word I was seeking here is 'imagine', not 'think of/find'). I do not at all follow consoles, so it never even crossed my mind.
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Uh, free performance?

Are you being serious here? I ask because playing stupid does you no favour. You see, I made a rather popular comparison, known as price:performance. I mentioned it in my earlier post.
Small embedded CPUs. You get more than 2x performance for 2x price.
Are you saying that over in England, you don't call receiving something for which you did not pay getting it for "free"?
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Are you saying that over in England, you don't call receiving something for which you did not pay getting it for "free"?

I don't know how things work in the dark lands of the south, but up here in the dignified north, when we pay for something, we give currency in exchange. If the thing we're buying is an improved model with a better price:performance curve, that's called a better deal than the one offered with the older model.

Now, if we got a complimentary bar of chocolate when we buy the new model from a particular store, that bar of chocolate would be considered free. But the new model? Still gotta pay for it.
 

MrTeal

Diamond Member
Dec 7, 2003
3,586
1,748
136
Which GPU and resolution did you use in 2009? Because after all, the GPU is still the most important part of getting good FPS. I used an AMD HD 7750 with a 64 X2 4600+ at 1440x900 and thus got good FPS (30 fps average), so you might be using a weaker GPU or a higher resolution, hence the poor FPS.
Yes, a G1820 will beat an E6600, but that is hardly the point. The point is that a C2D or Athlon X2 would still fare relatively well (25-30 fps average at least) in today's games, provided it's paired with a mid-range GPU like a GTX 750 Ti and tested at 720p medium settings, instead of forcing the poor grandpa CPUs from '05/'06 to perform at 1080p high.

You most likely won't be able to upgrade the GPU on a 2015 computer to anything remotely modern in 2035. Even 10 years ago PCIe was pretty new and there were still computers selling with AGP. If your system was even a top of the line FX-51 from a dozen years ago, the fastest card you could upgrade your computer to would be an HD4670 (or HD3850).

Even 10 years from now, it's not unlikely that most modern video cards will use a connector other than PCIe x16, and while you might still see some PCIe cards produced for a while, they'll be increasingly lower-power ones and the market will be marginalized.
 
Last edited:

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Ask yourself how nVidia destroyed AMD in the discrete graphics segment. Performance/watt again.

I think that's incredibly misleading -- I suspect many former Radeon customers are buying APUs now instead of dedicated cards. So nVidia "destroying" AMD is hyperbole.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
20 years. 20 years. We could be looking at a global-scale nuclear winter, an alien invasion, a quantum computing paradigm shift, EmDrive flying cars, and a human colony on Mars.
I'm not putting money down on anything 20 years out.

I think the alien invasion is more likely than me running the same desktop that I have right now in 2035.
 