After discovering that running my computers (modestly used as they are) accounts for over half of my annual electricity bill, I started thinking about how they could be put to better use.
I remembered a little-known yet well-established concept: volunteer computing, which I used to take part in years ago (I can’t recall why I stopped). It involves installing a program on your PC that uses your spare CPU cycles to work on scientific projects. A central server for each project hands out small chunks of work to every volunteered computer over the internet and collates the results.
It would appear that this technology has come along in leaps and bounds since I last looked into it, as there are now dozens upon dozens of these projects, all of which have settled on a common framework known as BOINC (Berkeley Open Infrastructure for Network Computing).
I have a very powerful desktop machine (a quad-core 2.5GHz Phenom), which means that during daily use I barely scratch the surface of my computer’s ability.
With BOINC installed, however, CPU usage sits constantly at nearly 100%, yet I notice no loss of performance while running other applications.
It makes me wonder how much faster big scientific problems could be solved if every computer in the world were running this software.
The BOINC client is available for most common platforms and operating systems, and since it is open source it can be ported to potentially any system.
Most Linux distributions provide the software in their repositories via package-management tools such as Yum and Aptitude. If you have a powerful computer that does not use 100% of its CPU time, can you justify not having this software installed?
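For the common package managers mentioned above, installation looks roughly like this. This is a sketch: exact package names vary between distributions (on Debian and Ubuntu the client and the graphical manager are typically split into `boinc-client` and `boinc-manager`; other distros may bundle them differently).

```shell
# Debian/Ubuntu (APT): install the core client and the graphical manager
sudo apt-get install boinc-client boinc-manager

# Fedora/Red Hat (Yum): package name may vary by release
sudo yum install boinc-client

# Start the client service, then query its state from the command line
sudo service boinc-client start
boinccmd --get_state
```

Once the client is running, you attach it to a project of your choice (each project's website gives you the URL and an account key) and it quietly fetches and crunches work units in the background.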
Full details and install instructions can be found at http://boinc.berkeley.edu