There we have confirmation from guskline running the same linpack binaries. It has to be some voltage-setting problem in the BIOS (being much over the recommended value). Or a dud CPU...
What is the recommended value for the voltage?
Just bear in mind that TDP is about cooling, not CPU power consumption. It's perfectly possible for a CPU to draw more power than its TDP until it has raised its temperature enough that it needs to slow down. The cooling system must be able to remove at least TDP watts to be sufficient, but a CPU can and will exceed that for periods of time. What a processor must do, however, is ensure that over seconds and minutes it doesn't exceed TDP on average.
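Loosely, that behaviour can be sketched as a running-average power limit. The numbers below (a 125 W TDP, a 150 W burst) are illustrative only, not any vendor's actual algorithm:

```python
# Illustrative sketch only: a CPU may burst above TDP so long as the
# running average of its power draw stays at or below TDP.
TDP_WATTS = 125.0  # hypothetical rated TDP

def average_power(samples):
    """Mean power over a window of one-second samples, in watts."""
    return sum(samples) / len(samples)

# Burst to 150 W for 10 s, then fall back to 100 W for 10 s:
draw = [150.0] * 10 + [100.0] * 10
print(max(draw) > TDP_WATTS)             # True: instantaneous draw exceeds TDP
print(average_power(draw) <= TDP_WATTS)  # True: the average does not
```

So a cooler sized for exactly TDP watts is still "sufficient" even though the chip briefly pulls more than that.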
Interesting. Just to ensure we are talking apples-to-apples: are your power numbers from a Kill-A-Watt or from a software measurement utility? And did you run with problem size 43122 using 8 threads? What GFlops did you get? And lastly, what is your CPU voltage at load?
With DDR3-1866 10-10-10-28-T2, I'm currently peaking at ~85 GFlops w/8 threads. Kill-a-watt reports pulling ~287W, and my CPU voltage is ~1.377V. Loaded CPU temps are ~58C with 19C ambient.
I doubt a dud CPU. I'm thinking that Idontcare's Asus mb (top end) lives, breathes and sleeps overclocking, so I wonder if even at stock the BIOS is upping voltages?
Sorry Idontcare, I'm now at work and not near the computer. I ran a short 5-pass run of the LinX program with the AMD changes per the link. I also have a Kill-A-Watt, and did not measure CPU voltages during the run.
I don't have all the sophisticated equipment you do, but I will try to replicate your run tonight. BTW, my RAM was set at 1600 even though it's rated at 1866.
I'll try to be back tonight. You might want to try running a short sequence with the RAM at 1600 to see if that makes a difference.
Idontcare, does your BIOS have an Optimized Default setting? You may want to try that.
You should use all RAM possible, because the load increases as you use more RAM, and so as to mirror IDC's setting. Don't worry about finishing the tests (using 14GB takes a long time); it just has to run long enough that the computation actually starts (as opposed to just allocating the RAM, which can take a minute or two).
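For what it's worth, the "14GB" figure follows directly from the problem size: double-precision Linpack factors an N x N matrix at 8 bytes per element. A quick sanity check, assuming the problem size 43122 mentioned earlier in the thread:

```python
# Dense double-precision Linpack: the main matrix alone is 8 * N * N bytes.
N = 43122                      # problem size from the runs above
matrix_bytes = 8 * N * N
gib = matrix_bytes / 2**30
print(round(gib, 1))           # about 13.9 GiB -- i.e. the "14GB" setting
```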
Any idea roughly on what ram setting you used? Was it the default 1GB?
If you can't measure the CPU voltage, that's alright; I'd be happy just to hear your CPU-Z reported value when loaded. If yours is ~1.35V then that is one thing (being comparable to mine); if it is ~1.25V or even lower, then that is a whole other thing, and it would explain the power-usage differences.
It is the latest BIOS, and I thought I told the BIOS to load optimized defaults (F5), but I will do it again just in case I failed at that before.
I'm trying to remember the RAM setting; I think 3 GB, but I'm not sure. I'm going home for lunch so I can do a run of about 1/2 hr. How do I measure the peak wattage on my Kill-A-Watt without continually looking at its digital screen? I have the smaller Kill-A-Watt. Is there a software program from Kill-A-Watt?
As to the CPU voltage: under AIDA64 sensor readings, my CPU voltage with the default Optimized settings drops as low as 0.9 V. In the OC state (4.53 GHz, 21 x 215, with vcore at 1.4437 V), the vcore stays at 1.44 V.
I'll try to run both an Optimized Default state and an OC state and monitor both Max wattage and max vcore.
Also, did you check to see if turbo core was functioning properly? I know Asus uses a multicore tweak to push all the cores to the maximum turbo setting.
OK, I'm on my lunch break and ran the LinX program, filling up the RAM and using the same problem size.
Stock:
Max watts - 241
vcore - 1.296 V
CPU temp - 41C overall, 27C cores

OC to 4.53 GHz with vcore set to 1.4437 V for max stability:
Max watts - 344
vcore - 1.44 V
CPU temp - 51C overall, 41C cores
Hope this info helps. BTW, I ran each test for about 20 minutes.
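As a rough cross-check of those two readings: to first order, dynamic CPU power scales as frequency times voltage squared. Assuming a 4.0 GHz stock clock for the FX-8350 (an assumption; the posts above don't state it), the listed settings predict roughly the measured jump:

```python
# First-order dynamic power model, P ~ f * V^2 (switched capacitance held
# constant). Wall watts include the whole system, so this is only a ballpark.
f_stock, v_stock = 4.00, 1.296   # assumed stock clock (GHz) and reported vcore
f_oc, v_oc = 4.53, 1.4437        # OC settings from the post above
scale = (f_oc / f_stock) * (v_oc / v_stock) ** 2
print(round(scale, 2))           # ~1.4x expected CPU power increase
```

The measured wall-power jump, 344 W / 241 W ≈ 1.43x, lands close to that, though the comparison is loose since the wall readings also include the rest of the system.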
Would it be worthwhile for me to upgrade to an FX 8350 from an FX 8120? Right now it is running stock speeds, but it can go as far as 4.5 GHz stable. I ran it there for a few weeks, stable.
Your stock temps and Vcore are astonishingly lower than mine.
According to Coretemp, the VID for my FX8350 is 1.3875V and it peaks around 60C under IBT.
Did you happen to notice the GFlops for your stock run? I'd be curious to know whether you are seeing ~85 GFlops, or whether you end up with lower GFlops because your mobo engages a current-restrictor mechanism of some kind on the CPU.
I'll check that tonight. BTW, what CPU cooler are you using? I have a Corsair H100 with 4 fans, with cool-air intake at the top of the CM HAF 932 Advanced. From what I read, you need a decent water-cooling system to keep the heat from spiking when you crank up these 8-core/4-module chips.
That's a tough one to answer. If you can run your 8120 stable at 4.5 Ghz upgrading will not result in a much higher OC, even with a 8350 unless you have a custom water cooling system. I use a Corsair H100 and I notice that you use a H80. My 8350 is solid at 4.53Ghz and uses less power than my 8150 did. Where you will notice the difference most is at stock. The 8350 really runs very nice at stock and "feels" faster than the 8150 did at stock.
It's really a question of money.
Right now I am just using the stock HSF that came with the FX8350.
Having trouble with the GFlops reading, but at stock with your settings the max watts were 251.
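For context on what ~85 GFlops means here: Piledriver is usually credited with 8 double-precision FLOPs per cycle per module (two 128-bit FMACs with FMA support). Treating that figure, and a 4.0 GHz clock, as assumptions rather than gospel:

```python
# Back-of-the-envelope theoretical peak for an FX-8350, under the assumptions
# stated above (8 DP FLOPs/cycle/module, 4 modules, 4.0 GHz).
modules = 4
flops_per_cycle_per_module = 8
clock_ghz = 4.0
peak_gflops = modules * flops_per_cycle_per_module * clock_ghz  # 128 GFlops
measured_gflops = 85.0   # from Idontcare's run earlier in the thread
efficiency = measured_gflops / peak_gflops
print(f"{efficiency:.0%}")  # prints "66%": roughly two-thirds of peak
```

Running Linpack at around two-thirds of theoretical peak is plausible, so a stock run landing well below ~85 GFlops would indeed point at throttling or a current limit rather than the benchmark itself.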