Originally posted by: Smartazz
What really kills processors during overclocking? Voltage, temperature or both?
GPUs typically run at 500-600MHz. CPUs are up at 2-3GHz.
Originally posted by: Smartazz
Yeah, well GPUs can safely hit 80C-90C degrees, I don't see why a CPU couldn't do that and GPUs do last a long time too.
Originally posted by: Imyourzero
I remember reading that it's voltage, not necessarily heat, that diminishes the life of a CPU. I can see how the two are directly proportional under normal circumstances, but there are special cases, and that's what I wonder about. For example, say you implement a high-tech cooling solution (i.e., not air) to achieve a stable overclock. What if you have to raise the voltage on a given CPU to 1.55 or 1.6V, but you're able to keep the CPU nice and cool? Will its life still be shortened by the increased voltage even though temps are well within the normal operating spec?
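Imyourzero's question can be put into a back-of-envelope model. Electromigration lifetime is often approximated with Black's equation, MTTF = A * J^(-n) * exp(Ea / kT), which has separate current-density (voltage-driven) and temperature terms. The constants below (n = 2, Ea = 0.7 eV, 350 K baseline) are illustrative assumptions, not data for any real CPU:

```python
import math

K_BOLTZMANN = 8.617e-5  # Boltzmann constant in eV/K

def relative_mttf(j_ratio, temp_k, n=2.0, ea=0.7):
    """Lifetime relative to a baseline at current-density ratio 1.0 and 350 K.

    Black's equation: MTTF = A * J**(-n) * exp(ea / (k * T)).
    The prefactor A cancels when taking a ratio.
    """
    base = 1.0 ** (-n) * math.exp(ea / (K_BOLTZMANN * 350.0))
    now = j_ratio ** (-n) * math.exp(ea / (K_BOLTZMANN * temp_k))
    return now / base

# Raising voltage raises current density roughly in proportion, so even
# with the die held at the stock 350 K, a ~20% overvolt cuts the modeled
# lifetime, while cooling alone at stock voltage extends it:
print(relative_mttf(1.2, 350.0))  # ~0.69: shorter life despite equal temps
print(relative_mttf(1.0, 330.0))  # >1: cooler chip at stock voltage lasts longer
```

Under this toy model the answer to the question is yes: the J^(-n) term degrades lifetime on its own, even when exotic cooling keeps the exp(Ea/kT) term at its stock value.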
Originally posted by: corkyg
What comes first, voltage or heat? In OC'ing, the voltage is increased, and that in turn creates higher speeds which create heat. The simple answer is BOTH, but start with voltage as the basic initiating cause.
With today's vastly increased native speeds, why overclock? It won't be reflected in most serious apps.
Originally posted by: Rubycon
Originally posted by: Imyourzero
I remember reading that it's voltage, not necessarily heat, that diminishes the life of a CPU. I can see how the two are directly proportional under normal circumstances, but there are special cases, and that's what I wonder about. For example, say you implement a high-tech cooling solution (i.e., not air) to achieve a stable overclock. What if you have to raise the voltage on a given CPU to 1.55 or 1.6V, but you're able to keep the CPU nice and cool? Will its life still be shortened by the increased voltage even though temps are well within the normal operating spec?
The problem is that people are doing this, but the cooling methods are unusual for PC use (e.g., expendable refrigeration such as dry ice or liquid nitrogen), so continuous operation at these voltages is not possible or practical. I would imagine that running with that much overvoltage will cause a breakdown faster than the moderate overvoltage needed to get a CPU running on more conventional cooling.
Originally posted by: GuitarDaddy
I hang out quite frequently over at the extreme forums, where phase-change (refrigerated) cooling is the norm and can be used for long periods, as opposed to DI or LN2, and I can tell you that prolonged extreme overvolting can and will kill a chip rather quickly. But the folks who are that extreme change CPUs about as often as they change socks, so it really doesn't bother them. For the real hardcore guys who use cascade, DI, and LN2 to shoot for WRs, it doesn't seem to be a problem, because they only run that way for a relatively short time.
Bottom line: extreme overvolting for extended use will kill the CPU regardless of what cooling method you use.
Originally posted by: Matthias99
Originally posted by: corkyg
What comes first, voltage or heat? In OC'ing, the voltage is increased, and that in turn creates higher speeds which create heat. The simple answer is BOTH, but start with voltage as the basic initiating cause.
You can OC without overvolting. And upping the voltage while keeping the same clock speed will produce substantially more heat. Clock rate and voltage are totally independent, although you may need to increase the voltage to make the processor run stably at a higher speed.
With today's vastly increased native speeds, why overclock? It won't be reflected in most serious apps.
If you just use your system to browse the web and send email... yes, it's kind of pointless, although some people just like playing around with their hardware.
If you encode video regularly, higher CPU speeds will reduce encoding times. Some games are also sensitive to CPU speed. Distributed computing workloads will also complete faster.
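Matthias99's point about voltage producing more heat than clock speed follows from the standard CMOS dynamic-power relation, P ≈ C·V²·f: heat scales with the square of voltage but only linearly with frequency. A minimal sketch with illustrative multipliers, not figures for any specific chip:

```python
# Dynamic (switching) power of CMOS logic scales roughly as P = C * V^2 * f.
# Raising voltage at a fixed clock raises heat output quadratically;
# raising the clock alone raises it only linearly.

def dynamic_power_ratio(v_ratio, f_ratio):
    """Power relative to stock, given voltage and frequency multipliers."""
    return v_ratio ** 2 * f_ratio

print(dynamic_power_ratio(1.10, 1.00))  # +10% volts, same clock: 1.21x power
print(dynamic_power_ratio(1.00, 1.10))  # same volts, +10% clock: 1.10x power
print(dynamic_power_ratio(1.10, 1.10))  # both together: ~1.33x power
```

This is why a voltage bump with no clock change still runs hotter than a clock bump alone, as claimed above.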
Originally posted by: Special K
Clock rate and voltage are not totally independent. The faster you clock the processor, the less time the circuits have to charge and discharge the internal capacitances. If you continue to increase the frequency while leaving the voltage the same, eventually a circuit somewhere in the processor will no longer be able to charge/discharge its capacitance to an acceptable voltage level within the allotted time (i.e., the clock cycle). This in turn leads to what most people call "instability" or "a glitch".
Increasing the voltage allows the transistors to source/sink more current, allowing the processor to charge/discharge the capacitors faster.
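Special K's charge/discharge argument can be sketched with a first-order RC model: a node charging toward supply V follows v(t) = V·(1 − exp(−t/RC)), and downstream logic only registers the new value once v(t) crosses a fixed threshold. All the numbers here (R, C, threshold, the 0.1 ns period) are toy assumptions chosen to make the effect visible, not real circuit values:

```python
import math

def settles_in_time(v_supply, period_s, r=1e3, c=1e-13, v_threshold=0.9):
    """True if an RC node crosses the switching threshold within one clock period.

    v(t) = v_supply * (1 - exp(-t / (R * C))); a shorter period leaves
    less time to get above v_threshold.
    """
    v_end = v_supply * (1.0 - math.exp(-period_s / (r * c)))
    return v_end >= v_threshold

# With a 0.1 ns period (equal to the RC time constant), the node at 1.30 V
# tops out around 0.82 V and misses the 0.9 V threshold; at 1.45 V the same
# path reaches about 0.92 V and makes timing. That is why bumping voltage
# can stabilize an overclock that fails at stock volts.
print(settles_in_time(1.30, 1e-10))  # False: path misses timing at stock volts
print(settles_in_time(1.45, 1e-10))  # True: overvolting closes the timing gap
```

The same model also shows the two earlier posts are both right: voltage and clock are separate knobs, but the reachable clock depends on the voltage.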