Worthy upgrade from 2600k yet?

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

TFchris

Member
Feb 10, 2013
28
0
66
I still use an i7 2600K. There isn't much to upgrade to these days anyway: the 3770K is just a die-shrink of it, and Haswell is barely any improvement over Sandy/Ivy in terms of performance.

Until skylake comes out, I'm not interested.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I have an i7-2600K @ 4.4 GHz still chugging along. I still run CrossFire ATI 6950s (unlocked to 6970s and OC'd some) and can play games at the settings I want just fine.

I haven't really seen a performance or pricing justification to think about upgrading my main system. I use it for gaming, development, and other assorted things like surfing the net or watching videos. Nothing crazy, and easily accomplished with what I have, with no apparent bottlenecks.

Could I upgrade and get a bit of a boost? Sure, but would it be really noticeable? I doubt it. I had a feeling when I put this rig together that it would last more than the 1/2 year cycle I had been doing. Good thing, too, because I'm not single anymore and have far less discretionary cash to spend on things like new computers all the time.

My brother had the same config, but thanks to the Bitcoin bonanza he sold his CF 6950 setup and got a single 770 4GB OC. He's really happy with the upgrade. Not sure if that kind of deal is still possible, but worth a thought. Then later on you could go SLI 770 4GB and be in beast mode for a few more years yet.
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
I am on a Phenom 955 @ 3.7, 16GB, and a 6800-series card (6870? I don't really know). I run 1920x1600 and games run well and look detailed.

I'd benefit more going to SSD or even a larger monitor.

At the same time, if I had disposable cash to do it, or if one of my components became the 'hot' one, I'd buy, or sell and then re-buy, in a heartbeat.

In the past I've had older tech regain value at points, and I was able to leverage it into better tech for free or even at a profit.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,880
1,550
126
Well, I'm back, like the Terminator . . .

If you've decided to give up on OC'ing, or on hardware config and tweaking in general, you can exclude yourselves. It's probably useful for us SB-K bystanders to look at overclocking guides for the IB and Haswell chips.

They "rename" stuff. I had a little dueling match with someone here -- probably in another thread, dunno . . . I was attempting to give advice without any IB or Haswell firsthand experience, relying instead on the descriptive articles analyzing Haswell before or at the time of its release, OC'ing guides, and other sources.

I think that certain BIOS features for an older processor line (like SB-K) have their analogs in the Z77 and Z87 chipsets -- or more specifically, the variation isn't as likely for Z77, because it was still socket 1155. So they may change some features, or change naming conventions, or both.

Some of the SB boards allowed BIOS changes to "VCCSA," the system agent voltage, and in my own BIOS this seems keyed to the iGPU. The Haswell boards allow it, and it may need adjustment. There is a "CPU Ring Voltage" allowing tweaks to the L3 cache voltage and speed, and I/O Analog and I/O Digital voltages -- which I suspect are really refinements of VTT and VCCIO. People can correct me about this, but if they do, they should be familiar with both SB socket 1155 and Haswell skt 1150.

Meanwhile, I feel ignorant for not taking the time to explore further the possibilities with my current ("pooper" GTX 570) VGA card and "iGPU mode" with the Intel HD 3000 Gfx. I was running three displays this afternoon-- two off the NVidia and one ("primary") off the motherboard. I should've tried this a long time ago, because it's fairly easy and the inconvenience of resetting the desktop and "main monitor" is not all that troublesome.

What I discovered with this was that you can configure two dGPU monitors and one iGPU monitor, but you can't easily get your games to play on the dGPU if your boot monitor is on the HD 3000 [and maybe 4000 etc. -- just guessing]. The GeForce 570 only allows two monitor connections -- apparently newer graphics cards like the GTX 770 have no such limitation, in which case you would prefer dGPU mode for any use of Lucid, or even better, no use of it at all. MC continues to run my HDTV as before -- from the dGPU -- but you really have to connect a third monitor to the motherboard's Intel HD to have two monitors that work with Lucid. You may ask "why Lucid?" The answer is that once the iGPU is set to drive one monitor for your main desktop, it's either crippled with a maximum 512MB RAM allocation, or it's enabled for "multi-monitor" with Lucid to share the dGPU's VRAM.

So I found out that using the iGPU independently is fine, but games are grainy, slow, and so "retro." You can switch VIRTU on and off in this scenario. If it's switched on, the games on monitors connected to the motherboard suddenly behave as though you were running in dGPU mode -- all monitors connected to the dGPU. So there's no searing performance improvement -- there may even be a slight hit. But it's a viable configuration for me until I can replace this GTX 570 clunker.

Somebody mentioned a 770 GTX here, and it would seem this is a better use of money for an addict than building a new machine. Or maybe a coupla . . . SLI . . .

So many little projects to do . . . while I save up some money for "after the Haswell-E release."
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Going to a 770 from a 570 is freaking gigantic with current games at 1080p and beyond.

Second best would be going from a 64GB SSD to say a 250GB Samsung 840 Evo ($170ish IIRC). SSDs really start flying at the 250GB size.

I sincerely doubt you'd notice the difference between a 2600K @ 4.7 and a 4770K @ 4.4 (typical air OC). If you were running iGPU yes, but dGPU? Not a chance.
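For what it's worth, that claim can be sanity-checked with a back-of-the-envelope throughput model. The ~10% combined IPC uplift from Sandy Bridge to Haswell used below is a rough illustrative assumption, not a measured figure:

```python
# Rough per-core throughput model: clock (GHz) x relative IPC.
# The 1.10 IPC factor for Haswell vs. Sandy Bridge is an illustrative
# assumption for this sketch, not a benchmark result.
def relative_throughput(clock_ghz: float, ipc_factor: float) -> float:
    return clock_ghz * ipc_factor

sandy = relative_throughput(4.7, 1.00)    # 2600K @ 4.7 GHz
haswell = relative_throughput(4.4, 1.10)  # 4770K @ 4.4 GHz (typical air OC)

gap_pct = (haswell - sandy) / sandy * 100
print(f"Haswell advantage: {gap_pct:+.1f}%")  # about +3%
```

If the assumed IPC gap is anywhere near right, the higher Sandy Bridge clock claws most of it back, which is consistent with the "not a chance you'd notice" verdict for dGPU gaming.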
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Looking at the trend, we will still be asking this question for Haswell, its successor, and its successor's successor.

And the successor to that.
That is... if it doesn't go in reverse.

Better bury the CPU upgrade itch. Last time I bought the hw but didn't even bother to plug it into my own PC because of the hassle. Instead it went to one of the kids. Fun to have a mobile CPU in a desktop though. Lol
 

BonzaiDuck

Lifer
Jun 30, 2004
15,880
1,550
126
Arkaign said:
Going to a 770 from a 570 is freaking gigantic with current games at 1080p and beyond.

Second best would be going from a 64GB SSD to say a 250GB Samsung 840 Evo ($170ish IIRC). SSDs really start flying at the 250GB size.

I sincerely doubt you'd notice the difference between a 2600K @ 4.7 and a 4770K @ 4.4 (typical air OC). If you were running iGPU yes, but dGPU? Not a chance.

Perhaps a caveat: it depends on HOW you're "running the iGPU." As standalone, it is little better than the NVidia onboard graphics of my Mom's Gigabyte 610i mATX board. Hourglasses, hesitation -- all because of the graphics processor. It was half the equation for upgrading her system in 2010-2011: for what Mom does, we spent maybe $30 on a budget GeForce card, an XFX GT 460 or something like that. "Free at last! Free at last!" And then we added a 128GB Elm Crest SSD (SATA-III) to her SATA-II port and replaced the HDD -- "Thank God Almighty, I'm free at last!"

And actually, with 512MB of RAM allocated to it, you discover that the iGPU isn't a "slug" for most things: you could certainly use it for HTPC functions. It WILL run games. For various computing sessions, with your desktop monitor connected to it, it is fine.

But if your card is limited to two monitors when you want to deploy three or four, the iGPU becomes a more attractive resource. You can dedicate a GeForce-connected monitor to an HDTV, and you can also use the iGPU together with the dGPU card and its VRAM to get similar performance, as though from a third graphics card.

NVidia had prepared its SLI software to do roughly what the Intel HD does with a single card. [And you can't use more than a single card with the Intel HD with VIRTU.]

It boils down to a need to use more monitors for specialized purposes. And -- the convenience of simply switching Lucid on and off in software with two mouse-clicks.

ADDENDUM/AFTERTHOUGHT: You're right: there was little difference between my 2600K @ 4.6 with the 570 GTX [dGPU mode] and Cinebench results for an i7-4770K @ 4.4. In fact, I actually came out a smidgeon better at 4.7 GHz. But that's only one particular benchmark. Even so -- probably indicative of "future-proof" for the near future.

Z15CAM said:
. . . .

This is the way my i7 2700K scales with EIST ENABLED :
4600 Mhz:
With a 46 Multiplier and - 0.005v offset the system Idles at 1600Mhz @ 0.992v/34C, at 4.6G under heavy application load like x264 encoding voltage ramps to 1.336v/47C then at 4.6G under Prime95 large FFT Stress Test voltage ramps to 1.376v/68C. NICE.
. . . .

Just a thought you may be inclined to file away. The reason it came up: I thought I'd "vet" my OCs again with LinX and Prime95. I had mentioned that there was a second voltage setting on the P8Z68-[ . . V, Pro, Deluxe] boards that seems to work mV-for-mV with the Offset Voltage: "Add'l Voltage for Turbo." Also -- the observation from various sources that too much LLC will cause overshoots, or even a VCORE higher than VID. But set the LLC issue aside for a moment.

Looking at your setting for 4.6, you're using an offset just 10 mV lower than mine if I understand correctly. So if you left the "Add'l Voltage for Turbo" ["AI Tweaker," "CPU Power Mgmt"-> . . ] at its default "Auto" setting, how much extra voltage is the latter providing in addition to the Offset Voltage setting? The only way to know would be through measuring load voltage minimums from similar stress tests -- between "Auto" and some fixed setting.

I say this because, with Offset +0.005V and Add'l Voltage at (+ only) 0.008V, my Prime95 large-FFT load-voltage minimum is 1.322V -- while running Media Center "Live TV" in the background. I'm unfamiliar with x264 encoding, but how is it that such a load shows 1.336V while Prime95 gives 1.376V? Or did you read the VCORE from the monitor at the wrong time -- maybe as a "Maximum" value that would show the unloaded turbo voltage?

The only way to know for sure whether "Auto" Add'l Turbo Voltage gives the same 0.008V extra, or something higher, would be to recheck the Prime95 large-FFT load at its minimum. Either the two programs won't give the same statistics, or you misread your Prime95 load voltage.
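The bookkeeping being argued about here is simple addition. In this sketch, the VID and droop values are hypothetical placeholders chosen only to show how the pieces combine (1.322V is the Prime95 minimum quoted above):

```python
# Sketch of offset-mode VCORE bookkeeping on these boards:
#   loaded VCORE ~= VID + Offset Voltage + Add'l Turbo Voltage - LLC droop
# The VID and droop numbers below are hypothetical placeholders for
# illustration, not readings from either poster's board.
def load_vcore(vid: float, offset: float, addl_turbo: float, droop: float) -> float:
    return vid + offset + addl_turbo - droop

vmin = load_vcore(vid=1.330, offset=0.005, addl_turbo=0.008, droop=0.021)
print(f"Prime95 load minimum: {vmin:.3f} V")
```

The point of the comparison: with Add'l Turbo Voltage left on "Auto," its contribution is an unknown in this sum, so the only way to back it out is to fix it to a known value and re-measure the load minimum.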

With MC running in the background, I DID discover an instability @ 4.7 with LinX after 30+ iterations -- not so likely to happen after that many runs, if you assume something like a Poisson distribution of errors. So my adjustment added 5mV to the load turbo voltage to set things straight. That gave a clean 50 iterations while watching TV.
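To make that stability hunch concrete: if each LinX iteration failed independently with some small probability p (the Poisson-ish picture), the chance of a long clean streak before the first error falls off quickly with p. The 2% per-iteration figure below is purely illustrative:

```python
# If each LinX iteration fails independently with probability p_fail,
# the chance of n consecutive clean iterations is (1 - p_fail)**n.
# The 2% per-iteration failure rate is an illustrative assumption,
# not a measured error rate.
def p_clean_streak(p_fail: float, n_iters: int) -> float:
    return (1.0 - p_fail) ** n_iters

for n in (30, 50):
    print(f"P(clean {n}-iteration run at p=2%): {p_clean_streak(0.02, n):.3f}")
```

So an error first appearing only after 30+ iterations points to a marginal setting rather than gross instability, and a small VCORE bump like the 5mV added here is the proportionate fix.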

What impressed me more -- whether it was the 1.65+ PLL Voltage setting or some other tweak that I made -- my load LinX temperature "average-of-maximums" has dropped to 72.5C (4.7 Ghz) prevailing at closer to 69C, and the 4.6 Ghz equivalent has dropped to 68C prevailing around 65C. Room ambient is 77F. I think I can find another 2C drop by plugging an air-leak with some carefully-cut art-board.

I'm thinking you have water-cooling, and I have the old D14 . . .

ADDENDUM/CORRECTION: For 4.6, I'd revised my VCORE settings from +0.005V/0.008V to either +0.005V/0.016V or +0.010V/0.012V. It passed stressing at the lower settings, but I didn't like the spreads of controlled GFLOPS, so I upped it a tad. These were the settings that gave me 1.322V minimums at highest Prime95 loading.
 
Last edited:

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
krumme said:
And the successor to that.
That is... if it doesnt go in reverse.

Better bury the CPU upgrade itch. Last time I bought the hw but didn't even bother to plug it into my own PC because of the hassle. Instead it went to one of the kids. Fun to have a mobile CPU in a desktop though. Lol

Desktops, who cares? Heck, the NUCs and laptops that people actually want to buy are better off with Atom than Broadwell/Skylake/whatever, if it means saving $200 for a negligible real-world performance loss and better battery life.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,880
1,550
126
StrangerGuy said:
Desktops, who cares? Heck, the NUCs and laptops that people actually want to buy are better off with Atom than Broadwell/Skylake/whatever, if it means saving $200 for a negligible real-world performance loss and better battery life.

Soon, I'll probably make a "big buy" that includes both a decent ultra-book or surface laptop/hybrid and a small tablet. One, the other or both.

Heck. I want to watch cable-premium HDTV while I work in my garden or sit in the garden with a dram of Grappa and some Espresso . . . .
 

nwo

Platinum Member
Jun 21, 2005
2,308
0
71
Nearly 1.4V on a Haswell?

How the heck do you cool that thing when it's at full load?
 

FordGT

Member
Jul 11, 2008
37
0
0
I'm still running a 2600K with a Z68 board and it does everything I ask of it.


One reason I kept my current system was because of the cheap thermal paste used on IVB and HSW chips. Solder is king.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
BonzaiDuck said:
Soon, I'll probably make a "big buy" that includes both a decent ultra-book or surface laptop/hybrid and a small tablet. One, the other or both.

Heck. I want to watch cable-premium HDTV while I work in my garden or sit in the garden with a dram of Grappa and some Espresso . . . .

A Surface and a tablet seems a bit redundant, no?
 

stockwiz

Senior member
Sep 8, 2013
403
15
81
FordGT said:
I'm still running a 2600K with a Z68 board and it does everything I ask of it.


One reason I kept my current system was because of the cheap thermal paste used on IVB and HSW chips. Solder is king.


Yep... I don't drink the "it does everything that is required of it at stock speeds" Kool-Aid. I consider a 2600K with solder superior to the chips with thermal paste, and I'm not going to delid to fix a built-in flaw... plus these new chips just aren't a big enough jump over a 2600K anyway. The chipset improvements aren't even a big deal.
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
Arkaign said:
My brother had the same config, but thanks to the Bitcoin bonanza he sold his CF 6950 setup and got a single 770 4GB OC. He's really happy with the upgrade. Not sure if that kind of deal is still possible, but worth a thought. Then later on you could go SLI 770 4GB and be in beast mode for a few more years yet.

Yep, still feel like I'm in beast mode with my setup. I only play on a 1920x1200 Dell IPS monitor right now. It's sort of overkill for that setup even with new games. I can turn up almost all the eye candy and still have playable framerates.

I paid the $150 price deal at Microcenter when those 2600K chips were first released. Got that ASRock Fatal1ty board for a decent sale price too -- was like $150 as well, I think. I already had a 1250-watt 80+ Gold OCZ power supply I had managed to snag for $80. Yes, you heard right: $80. I love store-closing clearances. On top of that I picked up a 250GB 4th-gen SSD for $200 a few years ago, and have it paired with a 2TB Seagate drive for my storage. Picked up the XFX 6850s for $300 for the pair. I have 32GB of the G.Skill RipJaws 1866s I picked up for $100 total, back when DDR3 was very cheap for a bit. Also, I am still rocking my old Rocketfish (rebadged Lian Li) full-tower case I picked up on that old Best Buy clearance for $25. I have also thrown in a blu-ray burner, various fans, and my Corsair H80, picked up on sale too. My total build has been great the last few years for anything I wanted, and it was fantastic for the price -- I think I was at just $1250 total for the whole setup. Of course some of the components were reused: the case, fans, basic HDD, and PSU, for example. In any future build those components will be reused again.
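For the curious, a quick tally of the itemized prices above. Items without a stated price (the H80 cooler, fans, and blu-ray burner) are omitted, which accounts for most of the gap to the quoted ~$1250 total:

```python
# Tally of the itemized prices from the post above; components without
# a stated price (H80 cooler, fans, blu-ray burner) are omitted.
parts_usd = {
    "i7-2600K (Microcenter deal)": 150,
    "ASRock Fatal1ty board": 150,
    "OCZ 1250W 80+ Gold PSU (clearance)": 80,
    "250GB SSD": 200,
    "XFX 6850 pair": 300,
    "32GB G.Skill RipJaws 1866": 100,
    "Rocketfish/Lian Li case (clearance)": 25,
}
print(f"Itemized total: ${sum(parts_usd.values())}")  # $1005
```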

When I did the build I only purchased the CPU, motherboard, memory, video cards, CPU cooler, and SSD. Everything else I already had. When I go to do my upgrade, I'll only need to change out the CPU, motherboard, and vid cards -- possibly the CPU cooler. I used to snag good deals on upper-end products so that I could resell them later for close to what I paid, and I only did so when I could snag the new stuff at super prices too. I'm pretty sure I could get $500+ for my current CPU/mobo/vid cards at least, which would go towards the upgrade cost. Still, I would be spending at least another $500 on top of what I get for my current stuff. That was easy enough to do when I was single and living in a cheap rented place. Not so easy when one is married, paying a mortgage, and has a kid due in a few months.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,880
1,550
126
2is said:
A Surface and a tablet seems a bit redundant, no?

Yes, now that I think of it. There are several possibilities: I could buy a laptop of two- or three-year-old technology and see how much I can wring out of it. Or a Surface would provide BOTH tablet and "PC" functionality to some extent. Or, allowing for a decline in my ability to keep up and catch on, I might feel better off with an Ultrabook.

I COULD tell you about how, in 1983, I built myself something in Grandma's "overnite"-size Samsonite that looked forward from "transportable" to "laptop," at least to attempt dispelling doubts about my seniority in the longer history of this business. But anyway, part of my interest in multiple devices would feed a crash effort to turn both myself and my family in a mobile-enabled direction, even if we're not all goosey-gooh-gah about what a cellphone can do.

stockwiz said:
Yep... I don't drink the "it does everything that is required of it at stock speeds" Kool-Aid. I consider a 2600K with solder superior to the chips with thermal paste, and I'm not going to delid to fix a built-in flaw... plus these new chips just aren't a big enough jump over a 2600K anyway. The chipset improvements aren't even a big deal.

Those were mostly my sentiments, especially as I tightened my belt last year heading into the regular uncertainty of temporary income loss. Did I really want to go to the trouble of building an Ivy Bridge or Haswell system, and feel compelled to take on both the work and the risk?

But I'd keep an open mind to improvements in instruction set, wattage, and new features, even given the thermal dilemma those chips present. And -- yes for indium solder! I was just notified by the Egg that the skt-2011 ASUS X79 Deluxe motherboard had been restocked (most likely with a more refined BIOS), but I'm not going there. I will wait for Haswell-E. I might even wait until the Haswell platform has matured six months or more!

HumblePie said:
[Your dissertation on swapping parts, recycling or selling your system.]

This is getting to be like new-vs-used cars, from when people felt prosperous enough to get new chrome every three years and pursue the dealer-promoted trade-in. Because I made money with my computers, I seldom got rid of them until they'd become obsolete enough that you could only sell the parts for exotic metals or take them to the enviro-friendly recycling center. You once could donate to churches and schools; today, Goodwill doesn't accept used computers -- at least not my local facility.

But I spread the cast-offs among the fam-damn-ily, and I accept the extra costs of building the new machine I want, as opposed to the $600 OEM knockoff with bells, whistles, locked-in connection to OEM update and service, and the extended warranty.

HERE'S MY PARTING ANNOUNCEMENT: Over at "Cases & Cooling" there is a thread on the much-anticipated Noctua "D15," or "double-U14S," cooler. I'll be posting my summary of a project that consumed too much effort for meager returns, but here are the essentials.

For any heatpipe cooler, you can improve cooling efficiency with various low-tech strategies, but essentially they involve pressurizing your case and forcing most of the intake air to pass through the cooler fins so it is quickly and efficiently exhausted.

So: a Noctua NH-D14 with a 140mm exhaust-fan kluge ducted to the rear cooler fins. I finally stopped deliberating about how best to do it and did it the easiest, simplest way, limiting the scope to avoid putting a duct shroud over the whole cooler.

By my best estimation and measurement, at 77F room ambient, I have reduced my maximum temperatures, averaged across cores, by another 2C by plugging holes.

The good news: @ ~1.36 to 1.38V VCORE [loaded and unloaded-turbo], 4.7 Ghz, LinX with Media Center feeding CNN, the average maximum doesn't exceed 70C and the prevailing instantaneous or average-of-averages is between 67C and 68C.

I figure that's about the limit for this cooler, though I could try a "duct-box" with cardboard prototype pieces. Yet it stands to reason that if all the air already passes through the fins before exhaust, it wouldn't much matter. So we await reviews of Jakob Dellinger's new creation, to appear between April and August. I'd say it's a 50-50 proposition whether I'll water-cool a Haswell-E. Dime for a rhyme: we'll wait and see . . .

PS: Water-cooling aficionados would envy my machine for its near-perfect silence. Plus -- thermal control of all fans with no $80 add-on controller and front panel. Works for me!
 
Last edited: