Did you know that by getting the most power out of your computer you are contributing to global warming? To run all the latest games and software you need as much processing oomph as possible, and that takes more energy.
London's Battersea power station is now a wreck, literally a shell of what it was.
Battersea Power Station was operational from 1937 until 1982
It fell victim to the realisation that power generation is not a one-way street: the more power it produced, the more it cost the environment.
This realisation has changed our view of almost everything that consumes power, but not so much computers.
Only recently have they become numerous enough to make an energy difference to our world, and more recently still, their power consumption has rocketed.
"In the mid 90s when the original Pentium processor was introduced, the average computer system could work with a 130/140 watt power supply, which is much lower that it is today," said Scott Richards of computer component manufacturer Antec.
"The processor was probably 15 watts of consumption and the graphics cards was about 10 watts of consumption. Then you had your hard drive and your floppy drive, so even given the 10 or 20 percent headroom you need to operate the computer you could easily do a 130/140 watt power supply.
"Today we are selling power supply units at 1,200 watts."
And while the demand for power within the computer has been growing, the computer's power supply unit has been making things worse.
Its job is to convert mains electricity into the low-voltage current the computer's components use, but until now efficiency has not been a consideration.
"Just 10/15 years ago power supply unit efficiency of about 50% was considered good. So if you wanted 100 watts, you had to draw 200 watts from the mains and the other 50% was lost in heat," said Mr Richards.
"Over time the power supply efficiency has slowly and surely got better. Two or three years ago 70% was considered a reasonable standard of efficiency."
More recently, power supply manufacturers have got together and set up an organisation called 80 Plus, which certifies units that are at least 80% efficient, in the hope of saving some power.
"What we really need is to get more consumer education, particularly big business," Mr Richards added. "If you look at some of the big buildings in London - how many PCs do they have? Thousands. If those PCs are two or three years old they could be working at 60/70% efficiency."
But it is not only the power supply that is getting an eco-overhaul; the processor itself is being tweaked, and as multi-core processors become the norm they are going to need it.
Bruce Shaw of integrated circuit supplier AMD said: "One core may be running at full speed but the others may be sitting around waiting for it, depending on the type of application it's running.
"What we are now seeing is the ability to take those independent cores and turn them down from the idle state so they are not sitting consuming power idly and just generating pure heat.
"As long as the processor is consuming electricity, heat is coming off of it. That heat has to be taken away. We are able to power different zones of the processor and if it isn't working hard like the rest of the system let it go to sleep."
And leaving your computer asleep, or on standby, raises not only your electricity bill but also your carbon footprint.
"What I recommend people do if they don't want their computer on standby is to go to the store and invest in a power strip and plug all three devices into it," explained Mr Richards.
"Then when you want to turn off the power to your computer turn off the power to the power strip.
"It's the only way you can cut off power to your computer and the other peripherals."