Considering how aware we're all supposed to be of global warming, it has always surprised me that no one seems to question why PCs kick out so much excess heat, especially since at the moment (when the weather is *very* warm around here) the PCs are making things almost unbearable in the office...
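To put a very rough number on it (a back-of-envelope sketch; the 150 W per desktop and the headcount are my own guesses, not measured figures):

```python
# Rough back-of-envelope: how much heat a small office's PCs dump into the room.
# Assumptions (my guesses, not measurements): each desktop draws ~150 W under
# load, and essentially all of that electrical power ends up as heat, since a
# PC does no mechanical work on its surroundings.

watts_per_pc = 150   # assumed typical desktop draw under load
num_pcs = 10         # assumed number of PCs in the office

total_heat_watts = watts_per_pc * num_pcs
print(f"{total_heat_watts} W of heat")  # 1500 W: roughly one electric space heater
```

If those guesses are anywhere near right, a ten-desk office is effectively running a space heater all day, which would go a long way towards explaining the temperature in here.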
I mean, the excess heat from a PC (be it desktop or laptop) is waste heat, and waste heat is a sign of bad design (so I've heard): either the components are drawing more power than they actually need, with the surplus left to radiate away, or the PC is producing waste heat as part of its normal functioning, which is an inefficient side effect. So why aren't PCs built to counter this, either by drawing only the power they need or by avoiding the wasteful heat output from their components?
I've heard that it comes down to the inefficient design of the original chips, and that every revision/upgrade not only carries over all of the inefficiencies of the older chips but adds new ones of its own, because a total redesign aiming for something close to 100% efficiency would be prohibitively expensive for the chip manufacturers, whereas building on the old chips costs far less in research and design. I've no idea if that's true or not (I know nothing of CPU design or manufacturing; it might as well be magic to me), but I'd be interested to know why such (seemingly) inefficient components are the norm nowadays.