Owing to the soaring cost of energy, I decided to invest in a plug-in power meter: a handy little device that monitors the volts, amps and watts being drawn from a power socket and, most importantly, the watt-hours of energy consumed. I decided to plug it into the socket which supplies my entire computer rig to see what my computing is costing me.
I had my desktop PC, my file server, an LCD monitor and speakers running. One hour later the meter was reading 0.34kWh, which means that keeping this lot running for 24 hours would use 8.16kWh. It doesn't look like much, but let's do the maths!
Putting these figures into pounds and pence is a convoluted process, and different for everyone, but here is how I worked it out. I assumed that the more expensive units (the first 182kWh per quarter) were used up by general household usage, leaving just the daytime and night-time rates (17 hours @ 12.75p/kWh and 7 hours @ 5.4p/kWh). So if my usual setup uses 0.34kWh every hour, it will use 5.78kWh @ 12.75p plus 2.38kWh @ 5.4p, making the cost per 24 hours 73.695p + 12.852p = 86.547p, which is £315.90 over a whole year.
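For anyone who wants to check the sums or plug in their own tariff, here is a quick sketch of that calculation in Python. The rates, hours and hourly usage are the figures from my own bill above; swap in your own.

```python
# Sanity-check the tariff sums: a two-rate (day/night) electricity tariff
# applied to a constant load. Rates are in pence per kWh.
DAY_RATE = 12.75    # daytime rate, applies for 17 hours a day
NIGHT_RATE = 5.4    # night-time rate, applies for 7 hours a day
usage_per_hour = 0.34  # kWh drawn by the computer rig each hour

day_kwh = usage_per_hour * 17    # energy used at the day rate (5.78 kWh)
night_kwh = usage_per_hour * 7   # energy used at the night rate (2.38 kWh)
daily_cost = day_kwh * DAY_RATE + night_kwh * NIGHT_RATE  # pence per day

print(f"Cost per 24 hours: {daily_cost:.3f}p")           # 86.547p
print(f"Cost per year: £{daily_cost * 365 / 100:.2f}")   # £315.90
```

Running it reproduces the 86.547p per day and £315.90 per year worked out above.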
Now that is a LOT! Even though I no longer run my computers 24/7, the meter still read 5.14kWh after 24 hours. By the same calculation, that is still roughly £199 per year spent just on running my computers.
The fridge/freezer is often cited as the most wasteful and costly home appliance, yet mine uses just 1kWh every day, which works out at only £38.72 per year. That means my computers are over five times more expensive to run. And I consider my own setup and usage modest among my peers: I know people who run several home servers 24/7, and I wonder what sort of bills they are racking up. Furthermore, what must it cost to run a warehouse-sized data centre these days?!
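The comparison can be sketched the same way, using a blended rate that averages the day and night tariffs over a 24-hour cycle (this assumes both appliances draw power evenly through the day, which is roughly true for a fridge and a fair approximation for my usage).

```python
# Compare annual running costs of two always-on loads under a two-rate
# tariff, using a blended pence-per-kWh rate averaged over 24 hours.
blended_rate = (17 * 12.75 + 7 * 5.4) / 24   # ~10.61p per kWh

def annual_cost(kwh_per_day):
    """Annual cost in pounds for a given daily energy use."""
    return kwh_per_day * blended_rate * 365 / 100

computers = annual_cost(5.14)  # my reduced (non-24/7) computer usage
fridge = annual_cost(1.0)      # fridge/freezer at 1 kWh per day

print(f"Computers: £{computers:.2f}/yr, fridge: £{fridge:.2f}/yr")
print(f"Computers cost {computers / fridge:.1f}x the fridge")
```

The ratio comes out at a little over five, matching the comparison above (small differences in the pound figures are just rounding).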
The cost of computing? Very expensive indeed.