Linux, politics, and other interesting things
I have seen it claimed that renting a virtual server can be cheaper than paying for the electricity used by a server you own. So I'm going to analyse this with electricity costs from Melbourne, Australia and the costs of running virtual servers in the US and Europe, as these are the options available to me.
According to my last bill I'm paying 18.25 cents per kWh – that's a domestic rate for electricity use, and businesses pay different rates. For this post I'm interested in SOHO and hobbyist use, so business rates aren't relevant. I'll assume that a year has 365.25 days, as I really doubt that people will change their server arrangements to save some money on a leap year. A device that draws 1W of power left on for 365.25 days will use 365.25*24/1000 = 8.766kWh, which will cost 8.766*0.1825 = $1.5997950. I'll round that off to $1.60 per Watt-year.
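The arithmetic above can be sketched as a small Python function (the 18.25c/kWh tariff is the one from my bill; substitute your local rate):

```python
# Annual electricity cost for a constant load, at a domestic tariff.
# The 18.25c/kWh rate is from my own bill; adjust it for your own rates.
TARIFF_PER_KWH = 0.1825        # $AU per kWh
HOURS_PER_YEAR = 365.25 * 24   # assume 365.25 days per year

def annual_cost(watts, tariff=TARIFF_PER_KWH):
    """Annual cost in $AU of a device drawing `watts` continuously."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * tariff

print(annual_cost(1))  # ~$1.60 per Watt-year
```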
I've documented the power use of some systems that I own. I'll use the idle power figures because most small servers spend so much time idling that the time they spend doing useful work doesn't significantly affect the average power use. I think it's safe to assume that someone who really wants to save money on a small server isn't going to buy a new system, so I'll look at the older and cheaper options. The lowest power use on that list is a Cobalt Qube: a 450MHz AMD K6 is a really small system, but at 20W when idling it costs only $32 per annum to run. My Thinkpad T41p is a powerful little system: a 1.7GHz Pentium-M with 1.5G of RAM, a 100G IDE disk, and a Gig-E port should be quite useful as a server, which is a good use for it now that the screen is broken. That Thinkpad drew 23W at idle with the screen on last time I tested it, which means an annual cost of $36.80 – or a little less if I leave the screen turned off. A 1.8GHz Celeron with 3 IDE disks drew 58W when idling (but with the disks still spinning); let's assume for the sake of discussion that a well configured system of that era would average 60W and cost $96 per annum.
So my cost for electricity would vary from as little as $36.80 to as much as $96 per year depending on the specs of the system I choose – or $32 if the Qube's limited performance was acceptable. That's not considering the possibility of doing something crazy like ripping the IDE disk out of an old Thinkpad and using some spare USB flash devices for storage – I've been given enough USB flash devices to run a RAID array if I was really enthusiastic.
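Using the $1.60 per Watt-year figure, the annual costs of the systems above work out as follows (the wattages are the idle figures I measured; treat them as rough estimates for similar hardware):

```python
# Annual electricity cost for each candidate server, using the
# $1.60 per Watt-year figure derived from the 18.25c/kWh tariff.
COST_PER_WATT_YEAR = 1.60  # $AU

systems = {
    "Cobalt Qube (450MHz K6)": 20,           # idle watts
    "Thinkpad T41p (1.7GHz Pentium-M)": 23,
    "1.8GHz Celeron, 3 IDE disks": 60,       # rounded up from 58W
}

for name, watts in systems.items():
    print(f"{name}: ${watts * COST_PER_WATT_YEAR:.2f} per annum")
```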
For virtual server hosting the cheapest I could find was Xen Europe, which charges €5 per month for a virtual server with 128M of RAM, 10G of storage, and 1TB of data transfer; that's $AU7.38. The next best was Quantact, which charges $US15 per month for a virtual server with 256M of RAM; that's $AU16.41.
Really, for my own use if I was paying I might choose Linode or Slicehost; they both charge $US20 ($AU21.89) per month for their cheapest virtual servers, which have 360M and 256M of RAM respectively. I've done a lot of things with Linode and Slicehost and had some good experiences; Xen Europe got some good reviews last time I checked, but I haven't used them.
A Xen Europe virtual server at $88.56 per annum would be slightly cheaper than the electricity for my old Celeron system, but more expensive than the electricity for my old Thinkpad. If I needed more than 128M of RAM (which seems likely) then the next cheapest option is a 256M Xen Europe server at $14.76 per month, which is $177.12 per annum and makes my old computers look very appealing. If I needed more than a Gig of RAM then my old Thinkpad would be a clear winner; the same goes if I needed good disk IO capacity, something that always seems poor in virtual servers.
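Another way to frame the comparison is the break-even wattage: for a given monthly hosting fee, how much idle power could a home server draw before its electricity costs as much? A rough sketch, using the $AU prices quoted above (exchange rates will obviously drift):

```python
# Break-even idle wattage for a given monthly virtual-server fee,
# using the $1.60 per Watt-year electricity figure from above.
COST_PER_WATT_YEAR = 1.60  # $AU

def breakeven_watts(monthly_fee_aud):
    """Idle wattage at which a home server's annual electricity bill
    equals the annual cost of the given monthly hosting fee."""
    return monthly_fee_aud * 12 / COST_PER_WATT_YEAR

print(breakeven_watts(7.38))   # Xen Europe 128M plan: roughly 55W
print(breakeven_watts(14.76))  # Xen Europe 256M plan: roughly 110W
```

So the cheapest Xen Europe plan only beats a home server that idles above about 55W, which matches the Celeron comparison above.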
Virtual servers win when serious data transfer is needed. Even if you aren't based in a country like Australia where data transfer quotas are small (see my previous post about why Internet access in Australia sucks), you will probably find that any home Internet connection you can reasonably afford doesn't allow the fast transfer of large quantities of data that you would want from a server.
So I conclude that, apart from unusual corner cases, it is cheaper in terms of ongoing expenses to run a small server in your own home than to rent a virtual server.
If you have to purchase a system to run as a server (let's say $200 for something cheap) and account for hardware depreciation (maybe another $200 every two years) then renting might save you money. But this also seems like a corner case, as the vast majority of people who have the skills to run such servers also have plenty of old hardware: they replace their main desktop systems periodically and often receive gifts of old hardware.
One final fact worth considering is that if your time has a monetary value, and if you aren't going to learn anything useful by running your own local server, then using a managed virtual server such as those provided by Linode (who have a really good management console) will probably save enough time to make it worth the expense.