In Dec 2008 I bought an i7 920, which I still use, to upgrade from a single-core P4 2.4Ghz bought in Aug 2003, so 5.5 years between them. It was 5-10x more performance, a huge difference; the P4 is barely enough to run Vista alone. This is what I consider a gigantic leap. i7 920 to 6700K doesn't come anywhere close to this kind of difference in experience. And it's been 7.5 years.
And the P4 itself was an upgrade from a Duron 700 MHz after some 4-5 years, yet another far bigger improvement than 920->6700K would be. So maybe 920->Sandy is a 'huge improvement' for you, but to me it's not even worth the 1-2 hr hassle of building the machine + reinstalling the OS. I literally wouldn't take a 2600K+MB for free...
You know I already addressed this in post #55 in this very thread?
"I went from a 1998 Pentium II 233mhz MMX (never overclocked) to a 2001 Athlon XP1600+ overclocked to 1800+ speeds, or a 7.73X increase in performance in only 3 years. Then in 2003, just 2 years after my Athlon XP1600+, I got a Pentium 4 2.6Ghz "C" that overclocked to 3.2Ghz. That means from 1998 to 2003, or in 5 years, my CPU performance increased 13.73X. By 2006, I had E6400 @ 3.4Ghz (so more than 2X Pentium 4 "C" 3.2Ghz) and by August 2007, I upgraded that to Q6600 @ 3.4Ghz (double the cores). If I apply straight up linear logic, that means 13.73X * 2X for C2D and another 2X for 2 more cores = 54.92X from Fall 1998 to August 2007.
If I were to apply the same high standards, even for a small time-span of 1998 to 2003 where CPU speed for me went up almost 14X, I might as well never upgrade again for 30 more years then. See how flawed this logic is for CPUs? "
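Just to make the stacked-multiplier arithmetic in that quote explicit, here's a minimal sketch. The factors are the ones claimed in the quote, not independent benchmarks, and the Athlon XP -> P4 step is simply implied by the 7.73X and 13.73X figures:

```python
# Cumulative "linear logic" speedup from the quoted upgrade chain.
# Factors are the ones claimed in the quote, not measured numbers;
# the Athlon XP -> P4 step is implied by 13.73 / 7.73.
upgrades = [
    ("PII 233 MMX -> Athlon XP1600+ @ 1800+ (1998-2001)", 7.73),
    ("Athlon XP -> P4 2.6C @ 3.2GHz (2001-2003)", 13.73 / 7.73),
    ("P4 @ 3.2GHz -> E6400 @ 3.4GHz (2006)", 2.0),
    ("E6400 -> Q6600 @ 3.4GHz, 2x the cores (2007)", 2.0),
]

total = 1.0
for step, factor in upgrades:
    total *= factor
    print(f"{step}: x{factor:.2f}, cumulative x{total:.2f}")

# Final line prints cumulative x54.92, matching the quote.
```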
Yes, the first part of the quote is true for me, I'm no longer a power user - I've lost interest in gaming, and the video encoding I do always runs in the background. 5 mins or 8 mins isn't much of a difference; I'm not tapping my fingers on the table waiting for something to finish.
Ya, so you are not the target market for modern PCs, it's as simple as that. I already addressed that point too. If the PC does everything a user needs, he won't upgrade. But for every 100 consumers who drop out, Intel doesn't need 100 new buyers. They just need more buyers buying higher-end i5/i7 CPUs to offset the loss of buyers who don't want new PCs; and that's exactly what's happening: record i7 sales.
Heck, my i7-920 desktop PC is still more powerful than ~80-90% of new laptops (to be fair, I upgraded the Radeon 4870 to a 7950 + added an SSD, so it's not the exact same system I bought in 2008).
Again, if your PC satisfies you, that's fine. Now for every 1 of you, there is me. I have an i7 IVB laptop that beats your i7 920, I also have an i7 6700K, and soon I will get an i7 6800K. In 3-4 years, I will sell all of these and upgrade all of these systems with new i7s or AMD's 8+ core CPUs.
Since you bought your i7 920, I built 25+ Intel systems, the worst of which had an i5 2500K. Intel loses sales from customers such as yourself since you are no longer interested, but it more than makes up for it by targeting people who love and still want desktops. Everyone who asks me for advice on what system to build gets recommended an i5/i7, and that means instead of buying junky Celerons, dual cores and i3s, every single system I built for friends/family has had a minimum of an i5 2500K since 2011. In 2015/2016, I only built i7s. That included 2 x i7 4790Ks, 2 x i7 5820Ks, 1 x i7 6700K.
And I may be in the minority, but I live in a condo and electricity is included in my rent. Whether I use only my smartphone or run 4x AMD FX 9590 systems at 100% load 24/7, my electricity bill would be the same.
Even if I paid for it directly, I really can't imagine doing all my computing on my 13" i5-5200U laptop and only firing up the desktop when I really need it just to save on the electricity bill; I practically never use my laptop at home. And that would be a bigger difference in kWh than 920->6700K...
I already explained in this thread that even without accounting for electricity costs at all, it made no sense to hold on to an i7 920/i7 860 anyway if you were smart about timing your upgrades. It's very simple:
Total Cost of Ownership or TCO.
If I spent $700-800 on an i7 920 platform in 2008-2010, by now it's worth $200 at best. That means a real loss of value of $500-600. Instead, I went i7 860 -> i5 2500K -> i7 6700K, and after reselling all those parts and reinvesting the resale value, it cost me no more than $500-600 anyway. The difference is I had a very fast system in 2009-2010, in 2011-2014 and now in 2016.
Likewise, right now I'll keep the i7 6700K + Z170 mobo temporarily, but then I'll just sell them in 3-4 years and may even reuse the DDR4 I have (more $ saved). Assuming I sell the CPU + mobo for $225 in 3-4 years, that's exactly half of what I paid for them new in 2016, which means my TCO for that platform over 3-4 years is a mere $225. I will then take the $225 resale value, put it towards an i7 9700K or i7 10700K in 2018-2019, and keep going.
That means with my strategy, I take all the hard work and stress out of it. Instead of spending $700-800+ on a new platform and sitting on it until it becomes nearly worthless, I just upgrade more frequently and reinvest the resale value. At the end of the day, over 6-8 years your TCO and mine are very similar, but I have a close to top-of-the-line rig every 3 years and you don't.
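To make that comparison concrete, here's a minimal sketch of the two strategies. The "hold" numbers and the 6700K's $450 purchase / $225 projected resale come from the posts above; the i7 860 and 2500K prices are hypothetical fill-ins that land on roughly the same $500-600 net figure quoted:

```python
# Net out-of-pocket cost = everything paid minus everything recovered on resale.
def tco(purchases, resales):
    return sum(purchases) - sum(resales)

# "Hold" strategy: buy one i7 920 platform for ~$750 and keep it
# until it's worth ~$200 (figures from the posts above).
hold = tco(purchases=[750], resales=[200])

# "Roll-over" strategy: i7 860 -> i5 2500K -> i7 6700K.
# The $450 buy and $225 projected resale for the 6700K platform are from
# the post; the earlier purchase/resale prices are hypothetical fill-ins.
roll = tco(purchases=[650, 300, 450], resales=[300, 275, 225])

print(f"hold: ${hold}, roll-over: ${roll}")  # hold: $550, roll-over: $600
```

Either way the net spend lands in the same ballpark; the difference is what you were running in the meantime.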
So I don't even need to explain that Skylake/Broadwell-E is a mediocre upgrade compared to going from a Pentium II 233mhz MMX to an Athlon XP1600+, or from a Pentium 4 C 2.6Ghz @ 3.2Ghz to an E6400 @ 3.4Ghz, because holding old parts and letting them become worthless costs more or less the same as upgrading more often. Problem solved.
This TCO concept can be applied to cars, headphones, smartphones, videocards, etc.
For example, let's say someone bought a GTX580 1.5GB in November 2010 for $499 and still has it today. That card is probably worth $50, a $450 loss of value in real $.
Here's what I did instead. When the 580 was $500, I bought an HD6950 for $230 and unlocked it to 6970 speeds. That got me 85-90% of the performance of a 580 in real-world games at the time. Then I sold the 6950 for $160 and bought a 7970 for $400. Then I sold the 7970 for $160 and bought a 390 for $245.
My total cost is:
- $230 (buy HD6950)
+ $160 (sell HD6950)
- $400 (buy 7970)
+ $160 (sell 7970)
- $245 (buy 390)
= - $555*
*But let's assume I can sell the 390 right now for $100; that would mean my TCO is - $455.
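The same ledger as a quick running tally; the numbers come straight from the list above, with the $100 390 resale being the assumption from the footnote:

```python
# GPU upgrade ledger: negative = cash spent, positive = cash recovered.
flows = [
    ("buy HD6950",         -230),
    ("sell HD6950",        +160),
    ("buy 7970",           -400),
    ("sell 7970",          +160),
    ("buy 390",            -245),
    ("sell 390 (assumed)", +100),  # hypothetical resale from the footnote
]

net = sum(amount for _, amount in flows)
print(f"Net spend over the whole chain: ${-net}")  # -> $455
```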
That means if Buyer 1 bought a $500 GTX580 and held on to it until now, they had good performance for 2 years and it was downhill from there. I had great performance every year until now, and we both spent the same amount of $ in real terms. It actually cost me way less, since I can easily sell the 390 for way more than $100 now.
Now take this exact same concept and apply it to CPU platform upgrades. :thumbsup:
Note: In reality I actually paid $0 for my GPU upgrades since my ATI/AMD cards made $ from 2008, but the TCO concept still applies, and it's a highly effective way to understand the true cost of owning something imho. That's why I wouldn't buy a pre-built rig, since it's way too hard to resell those parts separately.
If you start thinking about the concept of TCO, you no longer think of it as setting aside $1500-2000+ for a new PC. Otherwise, you'd be making the exact same mistake you just made with your last rig: you'd go out and drop $1500-2000 on new parts and again hold them for 6-8 years while they depreciate until they're worth $100-200 at most. Instead, learn the concept of TCO, resell, and roll over/reinvest the proceeds towards new parts. This way, you'll end up "wasting" the same $1500 over 8 years, but you'd actually have a fast system every year of those 8 years!