I wonder for how long we are going to keep using ATX...
Obviously Haswell would be better for a 10-year period. But the entire concept of a 10-year period is silly. You need every single component to last that long, and you need to be able to replace any defective component at a reasonable price, should it fail. Not to mention all the performance-related issues.
Had you bought the best 10 years ago, it would have been something like a single-core P4 2.8GHz or a K7, with an FX5800/9700 card and whatever limited memory it may have had.
I don't think using a P4 is a good example, even if it is, in fact, 10 years old.
Look at how the C2D has aged in comparison. From 2006 until today is, what, 8 years? And yet they are still totally viable, assuming you max out the RAM and add an SSD.
After the quantum leap in performance from the P4-to-C2D transition, further CPU performance increases were much more subtle, on the order of a few minuscule percentage points with each new generation.
Intel is now much more concerned with lowering power consumption and increasing IGP performance than with raw CPU performance. They also don't seem to want to go beyond quad cores in the consumer segment, even though quad cores were introduced back in 2006.
Edit: Then again, we may have 4K video and H265 by then (in 2 more years), which might cripple the C2D rigs just as much as 1080P flash video cripples P4 rigs today. Who knows.
In what context is this PC supposed to perform? Gaming for something 10 years from now? Surfing? Spreadsheets? Databases? ASCII art?
Basic web browsing, media consumption, online videos, etc. Basically, something for puttering around on the Internet.
To put this into some better context, my oldest working PC at the moment is a Q9550 (if we're strictly talking CPU only), which puts it at about five years old. That said, for its current task, which is zero gaming, just general use, it's more than sufficient. It's got 8GB RAM and a pair of Raptors in a stripe. I can easily pop in an SSD to get a bit more performance out of it if I desire, and it will be fine.
While those examples obviously don't cover everything, the point is that the foreseeable future allows for some flexibility in maintaining a PC for a while, depending on what we want it to do.
Basic web browsing, media consumption, online videos, etc. Basically, something for puttering around on the Internet.
I wouldn't count on the load for media decoding remaining the same. Historically, new algorithms have always presented a difficult, if not impossible, challenge for older CPUs to decode in real time. I remember going through different iterations of video codecs and resolutions (DivX3 to 5, XviD, 720p to 1080p H264 and Hi10). Each newer codec upped the processing workload.
4K and H265 will undoubtedly continue the trend. Who knows what newer codecs we will have in 10 years, and whether a Haswell will still be viable or just useless by then?
If, every year, we get a 5% improvement in CPU performance, that's 105% of the speed of the previous year. Compound that out over 5 years and you get about a 28% increase (1.05^5 ≈ 1.28); over 10 years, about 63% (1.05^10 ≈ 1.63).
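To double-check the compounding, here's a quick sketch; the 5%-per-year figure is just the assumption above, not a measured number:

```python
# Compound an assumed annual CPU performance gain over N years.
def relative_speed(years, annual_gain=0.05):
    """Speed relative to today after compounding annual_gain each year."""
    return (1 + annual_gain) ** years

print(f"After 5 years:  {relative_speed(5):.3f}x")   # ~1.276x, i.e. ~28% faster
print(f"After 10 years: {relative_speed(10):.3f}x")  # ~1.629x, i.e. ~63% faster
```

Note that compounding 5% for 5 years gives ~128% *of* today's speed, not a 127% *increase* — the gain itself is only about 28%.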
Unless there's some breakthrough in CPU performance, it is unlikely that codecs of the future will demand more CPU power than a top-of-the-line CPU of that era can provide, and using my 5%-per-year progression, it seems highly unlikely that Haswell will be rendered obsolete. Well, unless Intel gets TDP licked and goes core-crazy.
Another thing to consider: assuming video cards still use some (backwards-compatible) form of PCI-Express 5 years into the future, then when 4K displays come down in price, the rig will need a video card upgrade just to support those resolutions in 2D, I think. (Will current Intel IGPs, or NV/AMD cards, drive 4K already?) A new video card will also, most likely, contain hardware support for all of the newest codecs needed to drive that resolution when playing video.
(Will current Intel IGPs....(snip)....drive 4K already?)
This October Intel will be providing a driver update for Ivy Bridge that will enable 4K x 2K resolution support as well as hardware accelerated 4K video decode. You'll need to use two DP outputs to drive a 4K panel from an Ivy Bridge system, which unfortunately makes it so most existing Ivy Bridge systems won't be able to drive the higher resolution panels.
Haswell will support driving a 4K display off of a single DP output or HDMI.
Some of you might have read my other thread about an IB Celeron and my goals of having a "ten year rig". One that I wouldn't have to do a platform upgrade for 10 years.
Well, I might have been a bit premature buying an IB rig for that purpose, but I had the idea mostly AFTER I purchased it.
Assuming I do something similar for my personal rig, except with a Haswell: do you all think the HW platform is a good investment for a "ten year rig"?