Advertising a Scam

Below is a strange Google advert that appeared on my blog. It appeared when I did a search on my blog, and it also appears on my post about perpetual motion. It seems quite strange that they are advertising their product as a scam. It’s accurate, but I can’t imagine it helping sales.

[Image: Google advert for SCAM]

I Just Joined SAGE

I’ve just joined SAGE AU – the System Administrators Guild of Australia [1].

I’ve known about SAGE for a long time; in 2006 I presented a paper at their conference [2] (here is the paper [3] – there are still some outstanding issues from that one which I’ll have to revisit).

They have been doing good things for a long time, but I haven’t felt that there was enough benefit to make it worth spending money (there is a huge variety of free things that I could do related to Linux which I don’t have time to do). But now Matt Bottrell has been promoting Internode and SAGE: SAGE members get a 15% discount [4]. As I’ve got one home connection through Internode and will soon get another, it seems like time to join SAGE.

Shelf-life of Hardware

Recently I’ve been having some problems with hardware dying. Having one item mysteriously fail is something that happens periodically, but having multiple items fail within a short period of time is a concern.

One problem I’ve had is with CD-ROM drives. I keep a pile of known good CD-ROM drives because they have moving parts and periodically break, and I often buy second-hand PCs with broken drives. On each of the last two occasions when I needed a CD-ROM drive I had to try several drives before I found one that worked. It appears that over the course of about a year of sitting on a shelf four of my CD-ROM drives have spontaneously died. I expect drives to die from mechanical wear if they are used a lot, and I also expect them to die over time as the system cooling fans suck air through them and dust gets caught. I don’t expect them to stop working when stored in a nice dry room. I wonder whether I would find more dead drives if I tested all my CD and DVD drives, or whether my practice of using the oldest drives for machines that I’m going to give away caused me to select the drives that were most likely to die.

Today I had a problem with hard drives. I needed to test a Xen configuration for a client, so I took two 20G disks from my pile of spare disks (which were only added to the pile after being tested). Normally I wouldn’t use a RAID-1 configuration for a test machine unless I was actually testing the RAID functionality; it was only the possibility that the client might want to borrow the machine that made me do it. But it was fortunate, as one of the disks died a couple of hours later (just long enough to load all the data on the machine). Yay! RAID saved me from losing my work!
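For anyone wondering what such a setup involves, here is a minimal sketch of a software RAID-1 with LVM on top, which is roughly the configuration described above. The device names, partition numbers, and the volume group name vg0 are assumptions for illustration, not the exact commands I ran:

    # create a two-disk RAID-1 array and put LVM on top of it
    mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
    pvcreate /dev/md0
    vgcreate vg0 /dev/md0
    lvcreate -L 5G -n root vg0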

Then I made a mistake that I wouldn’t make on a real server (I only got lazy because it was a test machine and there wasn’t much at risk). I had decided to instead make it a RAID-1 of 30G disks, and to save some inconvenience I transferred the LVM volume group from the degraded RAID on the old drive to a degraded RAID on a new disk. I was using a desktop machine that wasn’t designed for three hard disks, so it was easier to transfer the data in a way that didn’t need more than two disks in the machine at any time. Then the new disk died as soon as I had finished moving the LVM data. I could probably have recovered that from the LVM metadata backups, and even if that hadn’t worked I had only created a few LVs and they were contiguous, so I could have worked out where the data was.
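For reference, a rough sketch of that sort of migration (moving a volume group from a degraded RAID-1 on the old disk to a degraded RAID-1 on the new disk, never needing more than two disks installed) might look like the following. The device names are assumptions for illustration, and the final comment refers to the metadata backups that LVM keeps automatically:

    # create a new RAID-1 array with one member deliberately missing
    mdadm --create /dev/md1 --level=1 --raid-devices=2 missing /dev/sdb1
    pvcreate /dev/md1
    vgextend vg0 /dev/md1
    # move all extents off the old degraded array, then drop it from the VG
    pvmove /dev/md0
    vgreduce vg0 /dev/md0
    # if a disk dies mid-move, vgcfgrestore may recover the VG layout
    # from the automatic backups under /etc/lvm/backup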

Instead, however, I decided to cut my losses and reinstall it all. The ironic thing is that I had planned to make a backup of the data in question (so I would have copies of it on the two disks in the RAID-1 and on another separate disk), but a disk died before I got a chance to make the backup.

Having two disks out of the four I selected die today is quite a bad result. I’m sure that some people would suggest simply buying newer parts. But I’m not convinced that a disk manufactured in 2007 would survive being kept on a shelf for a year any better than a disk manufactured in 2001. In fact there is some evidence that the failure rates are highest when a disk is new.

Apart from stiction, I wouldn’t expect drives to cease working from not being used; if anything I would expect drives to last longer when not used. But my rate of losing disks in running machines is minute. Does anyone know of any research into disks dying while on the shelf?

Smoke from the PSU

Yesterday I received two new machines from DOLA on-line auctions [1]. I decided to use the first to replace the hardware for my SE Linux Play Machine [2]. The previous machine I had used for that purpose was a white-box 1.1GHz Celeron and I replaced it with an 800MHz Pentium3 system (which uses only 35W when slightly active and only 28W when the hard disk spins down [3]).

The next step was to get the machine in question ready for its next purpose: I was planning to give it to a friend of a friend. A machine of those specs made by Compaq would be very useful to me, but as it’s a white-box I’ll just give it away. So I installed new RAM and a new hard drive in it (both of which had been used in another machine a few hours earlier and seemed to be OK) and turned it on. Nothing happened, and I was just checking that it was plugged in correctly when I noticed smoke coming from the PSU… It seems strange that the machine in question had run 24*7 for about 6 months and then suddenly started smoking after being moved to a different room and turned off overnight.

It is possible that the hard drive was broken and shorted out the PSU (the power cables going to the hard drive are thick enough to damage the PSU if there was a short-circuit). What I might do in the future is keep an old and otherwise useless machine on hand for testing hard drives, so that if something like that happens it won’t destroy a machine that is useful. Another possibility is that the dust in the PSU contained some metal fragments and that moving the machine to another room caused them to short something out, but there’s not much I can do about that when I get old machines. I might put an air filter in each room that I use for running computers 24*7 to stop such problems getting worse in future though.
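When testing drives in such a sacrificial machine, the sort of checks I have in mind would be something like the following (a hedged sketch that assumes the smartmontools package is installed; the device name /dev/sdb is just for illustration):

    # ask the drive for its overall SMART health verdict
    smartctl -H /dev/sdb
    # start the drive's own extended self-test
    smartctl -t long /dev/sdb
    # read-only scan of the entire disk for unreadable blocks
    badblocks -sv /dev/sdb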

I recently watched the TED lecture “5 dangerous things you should let your kids do” [4], so I’m going to offer the broken machine to some of my neighbors if they want to let their children take it apart.

Big and Cheap USB Flash Devices

It’s often the case with technology that serious changes occur at a particular price or performance point in development. A technology sees little use until it reaches the combination of low price and high performance that everyone demands.

I believe that USB flash devices are going to be used for many interesting things starting about now. The reason is that 2G flash devices are now on sale for under $100. To be more precise, 1G costs $45AU and 2G costs $85AU.

http://www.coker.com.au/hardware/usb.html

The above page on my web site has some background information on the performance of USB devices and the things that people are trying to do with them (including Microsoft attempting to use them as cache).

One thing that has not been done much is to use USB flash for the main storage of a system. The OLPC machines have been designed to use only flash for storage, as has the Familiar distribution for iPaQ PDAs (and probably several other Linux distributions of which I am not aware). But there are many other machines that could potentially use it. Firewall and router machines would work well (see the sketch below). With 2G of storage you could even have a basic install of a workstation!
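As a hypothetical example of what running a firewall from flash might involve, the following /etc/fstab fragment mounts the root filesystem with noatime (so a write isn’t needed for every read) and puts the most write-heavy directories on tmpfs so that routine logging doesn’t wear out the flash. The device name and sizes are assumptions for illustration:

    # root filesystem on the USB flash device, no access-time updates
    /dev/sda1   /          ext2    noatime,errors=remount-ro  0 1
    # keep frequently written directories in RAM
    tmpfs       /tmp       tmpfs   size=32m                   0 0
    tmpfs       /var/log   tmpfs   size=16m                   0 0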

Some of the advantages of Flash for storage are that it uses small amounts of electricity, has no moving parts (can be dropped without damage), and has very low random access times. These are good things for firewalls and similar embedded devices.

An independent advantage of USB Flash is that it can be moved between machines with ease. Instead of moving a flash disk with your data files you can move a flash disk with your complete OS and applications!

The next thing I would like to do with USB devices is to install systems. Currently a CentOS or Red Hat Enterprise Linux install is just over 2G (I might be able to make a cut-down version that fits on a 2G flash device) and Fedora Core is over 3G. As Flash capacity goes up in powers of two I expect that 4G flash devices will soon appear on the market and I will be able to do automated installs from Flash. This will be really convenient for my SE Linux hands-on training sessions, as I like to have a quick way of re-installing a machine for when a student breaks it badly – I tell the students “play with things, experiment, break things now when no-one cares so that you can avoid breaking things at work”.
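As a rough sketch of one way such an automated install could be set up (not a procedure I have tested – the device name, mount point, and Kickstart file are assumptions for illustration), a USB flash device could be made bootable with syslinux and carry the installer kernel, initrd, and a ks.cfg:

    # make the FAT partition on the flash device bootable
    syslinux /dev/sdb1
    mount /dev/sdb1 /mnt/usb
    # copy the installer kernel and initrd from the distribution media
    cp isolinux/vmlinuz isolinux/initrd.img /mnt/usb/
    # copy the Kickstart file that automates the install
    cp ks.cfg /mnt/usb/
    # add a syslinux.cfg that boots the installer with the Kickstart file,
    # e.g. "append initrd=initrd.img ks=hd:sdb1:/ks.cfg"
    cp syslinux.cfg /mnt/usb/
    umount /mnt/usb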

The final thing I would like to see is PCs shipped with the ability to boot from all manner of Flash devices (not just USB). I recently bought myself a new computer and it has a built-in reader for four different types of Flash modules used in cameras etc. Unfortunately it was one of the few recent machines I’ve seen that won’t boot from USB Flash (the BIOS supported it but it didn’t work for unknown reasons). Hopefully the vendors will soon make machines that can boot from CF and other flash formats (the more format choices we have the better the prices will be).