ShintaiDK
Lifer
- Apr 22, 2012
I'd say most people want performance/$, which has at best stagnated, maybe even gone back a bit.
Ask yourself how nVidia destroyed AMD in the discrete graphics segment. Performance/watt again.
This has just been on my mind since I first saw this thread, and I'm not sure if it's already been mentioned, but does the slowdown in CPU performance development have anything to do with the not-so-fierce competition from AMD's side for the past couple of years?
I think there have been numerous threads of people expressing the opinion that the upgrade cycle is getting longer, but 20 years is a heck of an exaggeration.
Cpu has nothing to do with resolution.

We are now approaching the size limit of what's reasonable for our desktop monitor. 1080p on a 27" monitor already looks really good, so don't expect the typical home user to be too driven to upgrade to the latest Skylake in an effort to display more stuff on screen.
Provided your desktop case is metal, it should be shielded from EMP. Make sure you've got a good surge protector though.

I've read tales about telegraph wires acting as long antennas and starting fires in telegraph offices. Surge protector or no, our cases are imperfect Faraday cages at best. When the defense department first learned of nuclear-driven EMPs back in the '60s (I had a high school classmate tell me what her boyfriend was up to, if you can believe it), they discovered that Faraday cages weren't enough. EMP-proofing is hard.
Sure, but for what you are saying, years have to pass and everybody has to move to a higher res; someone getting a 4K monitor today won't have to deal with 4K amounts of flash on webpages or whatever else.

Not directly, but higher resolution and a bigger screen lead to more flash/scripts/programs running on your screen. Compare the web layout from the old 640x480 days to now. There's much more going on when you load up a webpage, because the majority of them assume you have at least a 1366x768 or even a 1080p display, so they dump more visuals at you. More screen area or higher resolution lets you do more things, which in turn increases the need for more powerful CPUs. What I'm saying is that our screen size and resolution are near the end of the desktop road for most people.
For 200% of the price, I'm expecting at least 200% of the performance.

Name anything else on Earth that follows this rule. I can think of exactly zero. Why would CPUs have to follow a precedent that no other product follows?
This seems like a troll thread.
For 200% of the price, I'm expecting at least 200% of the performance.
Name anything else on Earth that follows this rule. I can think of exactly zero.
* Small vs large coke bottle. You get more than 2x size for 2x price.
* Light bulbs. You get more than 2x wattage for 2x price.
* Small embedded CPUs. You get more than 2x performance for 2x price.
The list goes on...
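The pattern the list describes, better-than-proportional returns as you spend more, is just a value-per-dollar ratio that rises with price. A minimal sketch of that comparison; the prices and quantities below are made-up illustrations, not figures from this thread:

```python
# Made-up illustrative figures: if 2x the price buys MORE than 2x the
# quantity (or performance), the bigger option wins on value per dollar.
small = {"price": 1.00, "units": 0.50}   # hypothetical small bottle
large = {"price": 2.00, "units": 1.25}   # more than 2x units for 2x price

def units_per_dollar(item):
    """Value metric: how much you get per unit of money spent."""
    return item["units"] / item["price"]

print(units_per_dollar(small))  # 0.5
print(units_per_dollar(large))  # 0.625 -- better value at the higher price
```

The "For 200% of the price, at least 200% of the performance" demand is exactly the requirement that this ratio not fall as price doubles.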
Please quote the guy who said a single thing about quantity per dollar. The guy I quoted mentioned quantity exactly zero times.
edit: Are you able to give an example or two of this magical embedded CPU with free performance?
AMD's problems become obvious once you learn some well-proven concepts in innovation. AMD once made inferior CPUs. But AMD bought an Asian CPU design house, then said no management was allowed to talk to or interfere with those designers. This happened when Intel stumbled with the Pentium 4. Intel's architects, unfortunately, designed a new Pentium that required too much silicon, and Intel had to eliminate features to make manufacturing possible.

I'd say it's Nvidia's superior performance per transistor and die size, its marketing, and AMD's inability to compete directly thanks to constant financial losses.
You should do some research into straw-man arguments. Nobody in this thread has said this, including myself. <--- Edit: I'm sorry, I had forgotten that I phrased my response the way that I did.

Anyway, one example is enough to prove the claim that zero cases exist incorrect.
Hmm... free performance? I said 2x performance for 2x price. So you don't get it for free; you have to pay twice as much.
Nope, as quoted here, you said that you get free performance by buying some unnamed embedded CPU:

You get more than 2x performance for 2x price.
Okay, that's a great example. It's actually the kind of example I was trying to think of/find (edit: the word I was seeking here is 'imagine', not 'think of/find'). I do not at all follow consoles, so it never even crossed my mind.

Playstation 2 - £300 - Launch
Playstation 3 - £425 - Launch
Playstation 4 - £340 - Launch
Uh, free performance?
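Taking the launch prices above at face value, the generation-over-generation price change is easy to check (a quick sketch of the arithmetic only; the performance side of each console is not quantified here):

```python
# Launch prices as quoted above, in GBP.
launch = {"PS2": 300, "PS3": 425, "PS4": 340}

ps3_vs_ps2 = launch["PS3"] / launch["PS2"]  # ~1.42: PS3 launched ~42% dearer
ps4_vs_ps3 = launch["PS4"] / launch["PS3"]  # 0.80: PS4 launched 20% cheaper
print(round(ps3_vs_ps2, 2))  # 1.42
print(round(ps4_vs_ps3, 2))  # 0.8
```

A newer, faster console launching at a lower price than its predecessor is the point being argued: the extra performance came without a price increase.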
Are you being serious here? I ask because playing stupid does you no favour. You see, I made a rather popular comparison, known as price/performance. I mentioned it in my earlier post.
Small embedded CPUs. You get more than 2x performance for 2x price.

Are you saying that over in England, you don't call receiving something for which you did not pay getting it for "free"?
Which GPU and resolution did you use in 2009? Because after all, the GPU is still the most important part of getting good fps. I used an AMD HD 7750 with an Athlon 64 X2 4600+ at 1440x900 and got good fps (30 fps average), so you might have been using a weaker GPU or a higher resolution, hence the poor fps.
Yes, a G1820 will beat an E6600, but that is hardly the point. The point is that a Core 2 Duo or Athlon X2 would still fare relatively well (25-30 fps average at least) in today's games, provided it's paired with a mid-range GPU like a GTX 750 Ti and tested at 720p medium settings instead of forcing the poor grandpa CPUs from '05/'06 to perform at 1080p high.
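The 720p-medium argument can be made concrete with a toy frame-time model: a frame costs roughly the larger of the CPU and GPU per-frame work, so shrinking the GPU load at lower settings exposes what the old CPU alone can deliver. All numbers below are hypothetical illustrations, not benchmarks:

```python
# Toy model: fps is limited by the slower of CPU and GPU per-frame work.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

old_cpu_ms = 33.0        # hypothetical per-frame CPU cost of a 2005/06 dual-core
gpu_1080p_high = 45.0    # hypothetical GPU cost per frame at 1080p high
gpu_720p_medium = 15.0   # hypothetical GPU cost per frame at 720p medium

print(round(fps(old_cpu_ms, gpu_1080p_high)))   # 22: GPU-bound, CPU limit hidden
print(round(fps(old_cpu_ms, gpu_720p_medium)))  # 30: CPU-bound, ~30 fps average
```

This is why pairing the old CPU with a capable mid-range GPU at 720p medium is the fairer test of what the CPU itself can still do.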
20 years. 20 years. We could be looking at a global-scale nuclear winter, an alien invasion, a quantum computing paradigm shift, EmDrive flying cars, and a human colony on Mars.

I'm not putting money down on anything 20 years out.