Before going too crazy on this, check out the facts, and I do NOT mean the stories you find all over the authoritative Internet! I was judging at a local Science Fair recently, and one project set out to find the real facts on this. The student had used a power meter to actually MEASURE the power drawn by items that were turned off. Some things, like a programmable thermostat that cuts your furnace's gas use at night, do generate considerable savings (on gas, not so much on electricity). But unplugging your TV? Or how about the microwave oven? He told me he was surprised by the microwave result, because he had read "on the Internet" that the always-running clock display wastes a lot of power. His measurement worked out to less than $2.00 per year! The TV might have been twice that.
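If you want to run the numbers yourself, the arithmetic is simple: watts drawn while "off", times hours in a year, divided by 1000 to get kilowatt-hours, times your utility's rate. Here is a quick Python sketch of that calculation (the 2 W draw and the $0.12/kWh rate are illustrative numbers I picked, not the student's actual readings, but they land right around his ~$2/year figure):

    # Annual cost of a device's standby ("off") power draw.
    # NOTE: the 2 W draw and $0.12/kWh rate are assumed example
    # values, not the student's actual measurements.
    standby_watts = 2.0
    rate_per_kwh = 0.12
    hours_per_year = 24 * 365
    kwh_per_year = standby_watts * hours_per_year / 1000   # ~17.5 kWh
    cost_per_year = kwh_per_year * rate_per_kwh            # ~$2.10
    print(f"{standby_watts} W standby costs about ${cost_per_year:.2f}/year")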
By the way, the TV is a good example of facts changing over time. At one time the "instant on" feature of TVs actually did consume enough power in the "off" state to be of interest. That feature, on a CRT set with tube amplifiers, kept the filaments of the picture tube and a few other tubes warm all the time so they could start working "instantly" when the TV was turned on. The current involved was in the one-amp range. But today, with solid-state circuits, the only heated filament left is in the CRT itself (or none at all if your TV is a flat-screen LCD), and the current consumed by the tiny circuit that watches for you to push the "On" button is minuscule!
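To put that old instant-on drain in perspective, here is a rough back-of-envelope in the same style (the 1 A at 120 V and 20 idle hours a day are my assumptions for illustration; many sets actually idled the filaments at reduced voltage, so the real cost would have been lower):

    # Ballpark for an old "instant on" TV's idle filament drain,
    # assuming a ~1 A heater string across a 120 V line at full power
    # while the set sits "off" about 20 hours a day. Both figures are
    # illustrative assumptions, not measurements from the post.
    watts = 1.0 * 120.0          # ~120 W while "off"
    idle_hours = 20 * 365        # hours spent "off" per year
    kwh = watts * idle_hours / 1000
    print(f"about ${kwh * 0.12:.0f}/year at $0.12/kWh")   # roughly $105

Even a fraction of that is a couple of orders of magnitude more than the microwave clock, which is why it mattered then and doesn't now.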