
Vibration and Strange SATA Performance

Almost two years ago I blogged about a strange performance problem with SATA disks [1]. The problem was that certain regions of a disk gave poor linear read performance on some machines, but performed well on machines which appeared to be identical. I discovered what the problem was shortly after that but was prevented from disclosing the solution due to an SGI NDA. The fact that SGI no longer exists as a separate company decreases my obligations under the NDA. The fact that the sysadmins of the University of Toronto published all the most important data entirely removes my obligations in this regard [2].

In their Wiki they write “after SGI installed rubber grommets around the 5 or 6 tiny fans in the xe210 nodes, the read and write plots now look like” and then some graphs showing good disk performance appear.

The problem was that a certain brand and model of disk was particularly sensitive to vibrations. When that model of disk was installed in some machines the vibrations would interfere with disk reads. It seems that there was some sort of harmonic resonance between the vibration of the disk and that of the cooling fans, which explains why some sections of the disk were read slowly while others gave normal performance (my previous post has the graphs which show a pattern). Some other servers of the same make and model didn’t have that problem, so it seems that slight manufacturing differences in the machines determined whether the vibration would affect disk performance.

One thing that I’ve been meaning to do is to test the performance of disks while they are being vibrated. I was thinking of getting a large bass speaker, a powerful amplifier, and using the sound hardware in a PC to produce a range of frequencies, with the hard disk securely attached to a piece of plywood in front of the speaker. But as I haven’t had time to do this over the last couple of years it seems unlikely that I will do it any time soon. Hopefully this blog post will inspire someone to do such tests. One thing to note if you want to do this is that it’s quite likely to damage the speaker: sustained powerful bass sounds can melt parts of the coil in a speaker. So buy a speaker second-hand.
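For anyone who takes this up, the software side is simple. Here’s a rough sketch in Python, using only the standard library, of both halves of the test: generating a sine sweep WAV file to play through the amplifier, and timing linear reads from the disk (the /dev/sdb path mentioned in the comment is hypothetical, substitute your test disk):

```python
import math
import struct
import time
import wave

def write_sweep(path, f_start=20.0, f_end=200.0, seconds=30, rate=44100):
    """Write a mono 16-bit WAV containing a linear sine sweep from
    f_start to f_end Hz, for driving the bass speaker."""
    frames = int(seconds * rate)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        phase = 0.0
        out = bytearray()
        for i in range(frames):
            # Linearly interpolate the instantaneous frequency and
            # accumulate phase so the sweep is click-free.
            f = f_start + (f_end - f_start) * i / frames
            phase += 2 * math.pi * f / rate
            out += struct.pack("<h", int(32000 * math.sin(phase)))
        w.writeframes(bytes(out))

def read_throughput(dev, offset, length, block=1024 * 1024):
    """Time an unbuffered linear read of `length` bytes starting at
    `offset` and return MB/s.  `dev` would be something like /dev/sdb
    on the machine under test."""
    start = time.perf_counter()
    done = 0
    with open(dev, "rb", buffering=0) as f:
        f.seek(offset)
        while done < length:
            chunk = f.read(min(block, length - done))
            if not chunk:
                break
            done += len(chunk)
    elapsed = max(time.perf_counter() - start, 1e-9)
    return done / (1024 * 1024) / elapsed
```

The idea would be to hold the tone at each frequency while reading a fixed region of the disk, and record throughput against frequency to find any resonances.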

If someone in my region (Melbourne) wants to try this then I can donate some old IDE disks. I can offer advice on how to run the tests for anyone who is interested.

Also it’s worth considering that systems which make less noise might deliver better performance.

Mo Rewards

While shopping at Highpoint [1] today I noticed that they had a new loyalty system. It’s called Mo Rewards [2]; the real web site is at MoCoMedia.net [3], which is not linked from the main site because they didn’t care enough about their web presence.

mo RFID keyring tokens

The way that Mo works is that everyone gets a free RFID token similar to the two in the above photograph. The token comes with a pseudo-random seven letter code that you have to SMS to register it to your phone. You SMS the code and then receive a confirmation SMS. After that you can wave your token near a detector any time you visit the shopping center and you will receive three SMS messages with discount offers. You can send an SMS with your gender and birth-year to receive more targeted offers. To redeem offers you have to wave your token near a detector at the store so they know who is using the offers.

Then of course once the database knows that you are a regular customer at a certain shop they can send you targeted advertising to entice you to buy from that shop on every visit. I presume that they have some sort of bidding system for adverts from the shops, similar in nature to Google’s advertising.

It’s an interesting system and a lot better than most loyalty programs.

One interesting thing about this is that high quality RFID devices are being given out for free. The tokens are quite solidly constructed and could be used for a variety of other purposes. I couldn’t find anyone offering RFID tags at a reasonable price with a quick Google search (the cheapest was $75 for 100 tags – and they were the fragile ones used for marking stock in shops). So a hobbyist who wanted to do some RFID stuff could buy a cheap reader under one of the demo offers (where you get a reader and a small quantity of keys for a reasonable price) and then collect free RFID tokens from shopping centers. I expect that the number of people who would do such things is small enough to not be statistically significant and therefore not affect the business model. The tags are given out freely with no requirement that they be used for the intended purpose (Mo Rewards) rather than for your own RFID work.

Amusing Thanks.txt Entry

My SE Linux Play Machine [1] has a file named thanks.txt for users to send messages to me [2].

On a number of occasions people have offered to give me things in exchange for the password for the bofh account (the one with sysadm_r privileges). I’ve been offered stolen credit cards, a ponzi scheme of root access to servers on the net, and various other stuff. Today I received an amusing joke entry:

Hello Kind Sir,
I am Dr. Adamu Salaam, the the bank manager of bank of africa (BOA) Burkina Faso West
I am sending you this message about the $3.14159 million dollars in bank account number 2718281828450945. I will give you this money in exchange for the password to the ‘bofh’ account.

The amount of money is based on the value of Pi. The account number is based on the mathematical constant e [3].

It’s a pity that the author of that one didn’t sign their real name. Whoever created that should have claimed credit for their work.


The Future of Electric Cars

TED published an interesting interview with Shai Agassi about electric cars [1]. One idea that I hadn’t heard before is that of moving car batteries between regions as they lose capacity. An old battery for an electric car that can only handle short journeys may be useful in a region where journeys are typically short. On a similar note I expect that in a few decades the less prosperous countries will import old electric vehicles and fit them with 4 or more batteries. Last time I checked the Prius battery pack weighed about 120Kg, so the car would be usable with 4 battery packs if driven at low speeds.

Shai Agassi also gave a TED talk on this topic [2]. The real solution for the problem of providing convenient and affordable electric vehicles is to start by recharging the batteries whenever the vehicle is parked (at the office, shopping center, home, etc). Then on the rare occasions when the car is being driven for longer distances and the battery gets flat it can be swapped for a charged battery. They have apparently designed a robot for changing car batteries, so changing the battery would be like driving through a car-wash. He describes this as an economic model that decouples the expensive battery from the car, so you pay for the use of the battery not the ownership – just as with a petrol car you pay for the petrol you use not for a portion of the ownership of an oil well.

He also pointed out that cars produce 25% of the world’s CO2 emissions, so his plan for all electric cars everywhere seems to be an essential part of solving the environmental problems. He then compared this to the UK parliamentary discussion on ending slavery, at the time slaves provided 25% of the energy used by the UK. After a month of discussion the decision was made to make the moral choice and end slavery regardless of the cost.

The AP and Copyright on the Web

The New York Times has an article about the Associated Press (AP) trying to gain more control over material that it distributes [1]. The article is not clear on the details.

One noteworthy fact is that the AP apparently don’t like search engines showing snippets of their articles. This should however be an issue for the organisations that license the AP content and redistribute it (newspapers etc); they can use a robots.txt file on their web server to prevent search engines from showing snippets of their content, and then once their traffic drops dramatically they can threaten to boycott AP if they can’t do things properly. Speaking for myself, the majority of the articles I read on major news sites come from Google results; if they stop Google from indexing AP content then I will read a lot less of it. The end of the article says that there is some sort of battle in Europe between Google and newspapers. Has Google stopped respecting robots.txt? How can this be a problem? If someone copies the entire article you can sue them, and you can ask Google not to index your site. That should cover it.
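For reference, excluding a crawler is a one-file change on the licensee’s side. Here’s a minimal sketch using Python’s standard robotparser module to check the effect of such a file; the /ap/ path layout is my invention, and note that a Disallow line blocks crawling of those pages outright, which is stronger than merely suppressing snippets:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that a newspaper could serve to stop Google indexing
# its licensed AP content.  The /ap/ directory is a hypothetical
# site layout, not taken from any real news site.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /ap/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is refused the AP section while other agents are not.
google_allowed = parser.can_fetch("Googlebot", "/ap/some-story.html")
other_allowed = parser.can_fetch("SomeOtherBot", "/ap/some-story.html")
```

A newspaper wanting a middle ground (indexed but without snippets) would need crawler-specific meta tags rather than robots.txt, but the boycott threat described above only needs the blunt version.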

The AP will be going after sites that copy large portions of articles, but this is not news at all. I often see web sites copy my blog content in ways that breach the license terms. As I’m not well equipped to deal with such people I usually try to find an instance of the same splog (spam blog) copying articles from a major news site and report it to employees of the news organisation. They can often get the splogs shut down, sometimes rather quickly.

They are apparently after SEO, they want to get the top entries in search engines for their articles and not have a site that paraphrases the article or quotes it. I don’t think that my blog posts which paraphrase and quote from mainstream media articles are likely to do that, but the newspapers have to deal with the fact that when Slashdot and other popular sites reference their articles then they will lose on SEO. They should be happy that they can win most of the time.

Brendan Scott has a rather harsh take on this [2] which he unfortunately has not explained in any detail. The people who write the news articles for AP get paid for their work and then AP needs to get paid to run a viable business – which is in the public interest. It may be that the AP are doing something really bad, but the New York Times article that Brendan cites doesn’t seem to support any such claim.


Fixing the Correct Network Bottleneck

The latest news in the Australian IT industry is the new National Broadband Network (NBN) plan [1]. It will involve rolling out Fiber To The Home for 90% of the population; the plan is that it will cost the government $43,000,000,000, making it the biggest ever government project. Kevin Rudd used Twitter to say “Just announced biggest ever investment in Australian broadband – really exciting, infrastructure for the future” [2].

Now whenever someone says that a certain quantity of a resource is enough you can expect someone to try and refute that claim by mentioning that Bill Gates supposedly stated that “640K is enough” when referring to the RAM limits of the original IBM PC. As an aside, it’s generally believed that Bill Gates didn’t actually make that claim, and Wikiquote has him denying ever saying any such thing [3]. He did however say that he had hoped that it would be enough for 10 years. I needed that disclaimer before stating that I think that broadband speeds in Australia are high enough at the moment.

In any computer system you will have one or more resources that will be limited and will be bottlenecks that limit the overall performance.  Adding more of other resources will often make no difference to performance that a user might notice.

On the machine I’m using right now to browse the web the bottleneck is RAM. A combination of bloated web pages and memory-inefficient web browsers uses lots of memory; I have 1.5G of RAM, there is currently 1.3G of swap in use, and performance suffers because of it. It’s not uncommon for the machine to page enough that the mouse cursor is not responsive while browsing the web.
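Checking whether swap use has reached that level is easy to script. Here’s a sketch that parses the /proc/meminfo format; it’s shown with sample figures similar to the machine described above rather than reading the live file:

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style "Key:  value kB" lines into a dict
    mapping field name to a size in kB."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        fields = rest.split()
        if fields:
            info[key] = int(fields[0])  # values are reported in kB
    return info

def swap_pressure(info):
    """Fraction of swap space in use; a value near 1.0 suggests
    that RAM is the bottleneck."""
    total = info.get("SwapTotal", 0)
    if total == 0:
        return 0.0
    return (total - info.get("SwapFree", 0)) / total

# Sample figures: 1.5G of RAM, roughly 1.3G of a 2G swap
# partition in use.
SAMPLE = """\
MemTotal:        1572864 kB
MemFree:           65536 kB
SwapTotal:       2097152 kB
SwapFree:         733184 kB
"""

info = parse_meminfo(SAMPLE)
# On a real system: info = parse_meminfo(open("/proc/meminfo").read())
```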

My options for getting faster net access on this machine are to add more RAM (it can’t take more than 2G, so that doesn’t gain much), to use a more memory-efficient web browser and X server, or to simply buy a new machine. Dell is currently selling desktop machines with 2G of RAM; as they are 64bit systems they will use more memory than 32bit systems for the same tasks, so they will probably give less performance than my 32bit machine with 1.5G of RAM for my usage patterns.

Also the latest EeePC [4] ships with 1G of RAM as standard and is limited to a maximum of 2G; I think that this is typical of Netbook class systems. I don’t use my EeePC for any serious work, but I know some people who do.

Does anyone have suggestions on memory efficient web browsers for Linux? I’m currently using Konqueror and Iceweasel (Firefox). Maybe the government could get a better return on their investment by spending a small amount of money sponsoring the development of free web browsers. A million dollars spent on optimising Firefox seems likely to provide good performance benefits for everyone.

My wife’s web browsing experience is bottlenecked by the speed of the video hardware in her machine (built-in video on a Dell PowerEdge T105 which is an ATI ES1000). The recent dramatic price reductions of large TFT monitors seem likely to make video performance more of an issue, and also increases the RAM used by the X server.

Someone who has reasonably good net access at the moment will have an ADSL2+ connection and a computer that is equivalent to a low-end new Dell machine (which is more powerful than the majority of systems in use). In that case the bottleneck will be in the PC used for web browsing if you are doing anything serious (EG having dozens of windows open, including PDFs and other files that are commonly loaded from the web). If however a machine was used for simply downloading web pages with large pictures in a single session then FTTH would provide a real benefit. Downloading movies over the net would also benefit a lot from FTTH. So it seems to me that browsing the web for research and education (which involves cross-referencing many sites) would gain more of a benefit from new hardware (which will become cheap in a few years) while porn surfing and downloading movies would gain significantly from FTTH.

The NBN will have the potential to offer great bi-directional speeds. The ADSL technology imposes a limit on the combination of upload and download speeds, and due to interference it’s apparently technically easier to get a high download speed. But the upload speeds could be improved a lot by using different DSLAMs. Being able to send out data at a reasonable speed (20Mbit/s or more) has the potential to significantly improve the use of the net in Australia. But if the major ISPs continue to have terms of service prohibiting the running of servers then that won’t make much difference to most users.

Finally there’s the issue of International data transfer which is slow and expensive. This is going to keep all affordable net access plans limited to a small quota (20G of downloads per month or less).

It seems to me that the best way of spending taxpayer money to improve net access would be to provide better connectivity to the rest of the world through subsidised International links.

Brendan makes an interesting point that the NBN is essentially a subsidy to the entertainment industry and that copyright law reform should be a higher priority [5].


Australian Democracy is “Microsoft Compatible”

Here is the Australian Electoral Commission documentation on how to register a political party [1]. It includes the requirement for “A Microsoft compatible electronic membership list (and paper copy) providing the following information”.

So a prerequisite for registering a political party appears to be the ownership of a PC running Windows. While it may be the case that I could create a plain text file on a Linux machine and append some CR characters to each line, or create a CSV format spread-sheet/database file, the most common interpretation of this is likely to be that MS-Office is required.
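For what it’s worth, producing such a list from Linux is trivial. Here’s a sketch using Python’s csv module, which emits the CRLF line endings that Windows tools expect by default; the field names are my guess at what a membership list would contain, not taken from the AEC form:

```python
import csv
import io

# Hypothetical membership records for illustration only.
members = [
    ("Jane", "Citizen", "1 Example St", "Melbourne", "VIC", "3000"),
    ("John", "Citizen", "2 Example St", "Sydney", "NSW", "2000"),
]

buf = io.StringIO()
writer = csv.writer(buf)  # the csv module defaults to \r\n line endings
writer.writerow(["Given name", "Surname", "Address", "Suburb", "State", "Postcode"])
writer.writerows(members)
data = buf.getvalue()
# `data` now holds CRLF-terminated lines that Excel opens directly.
# On a real run, write it with open("members.csv", "w", newline="").
```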

Such blatant promotion of a software vendor in a government document is unacceptable. Anyone who wishes to use other software for their political activities should be permitted to do so without restriction.


Fads and Internet Services

Richard Glover has written a polemic about fads on the net [1]. His points are essentially good, but he does over-reach them a bit (which is part of the polemic style) and he also seems a little unimaginative about the future of technology. He starts by suggesting that Twitter [2] is a fad. Twitter has grown in popularity very rapidly, and I believe that the company is not guaranteed to last forever (the business model could fail or a better implementation of the concept could take the users). But the basic premise of Twitter (SMS length messages being published on the net in a similar manner to blog posts) is something that has been proven to work well. So even though I have not felt inclined to use Twitter (either as a reader or a writer) I don’t think that the class of service will ever go away.

He also cites an example of watching TV news on a mobile phone. While I believe he is right about that not being a viable business, it’s because TV news itself isn’t a great thing. Before the Internet was commonly used the only ways of getting news of something that happened in the last few hours were via TV or radio, and if you wanted pictures with that then TV was the only option. If you wanted quality news (in-depth coverage, insightful analysis, and a depth of detail) then you were probably out of luck, but the best option available to you was the newspaper. Watching a TV news segment on a mobile phone (or on a PC connected to the net) is not effective. What I want is a newspaper that is updated as soon as events happen and which contains full color pictures and video. There are a number of web sites which provide this service, The Sydney Morning Herald (which employs Richard Glover) [3] is one example.

I think that the current fad for mobile phone TV is like the fad for WAP [4]. Many years ago I worked for an ISP that installed a WAP server, one of my duties was to keep the WAP server running. Everyone who knew anything about technology knew that the project was going to fail, WAP phones were horribly expensive (you could expect to spend an extra 200 guilders or more to buy such a phone) and the features of WAP were not particularly exciting. Also requiring that everything be re-written for WAP was just insanity. The problem with the WAP fad was managers who knew nothing about technology making technical decisions. I’m sure that the TV on mobile phone fad is driven by managers in TV companies that are greedy for more revenue opportunities and don’t stop to think about the implications. I might watch a news show on free TV, but I’m certainly not going to pay by the minute to watch it on my mobile phone. Instead I can just go to a web site such as the SMH and read news items, see pictures, and even watch the occasional video (most TV news doesn’t really require any video – they just re-enact scenes or use stock footage to have something on screen).

Finally he makes a sarcastic reference to chocolate fondue. That reminds me, someone gave me a fondue set years ago that I haven’t used yet. I’ll have to make a chocolate fondue! For those of you who live in Melbourne, the restaurant of the Melbourne Swiss Club [5] offers cheese fondue for the main course and chocolate fondue for dessert. Fondue is still with us!


Real-World Car Safety Tests

The car safety tests that are required for every new mass-market passenger vehicle are flawed in many ways. Here is a list of the most obvious flaws (please point out any that I’ve missed):

  1. There has been no research to make accurate crash-test dummies to represent women and children, and I believe that there has been no research to make crash-test dummies to accurately represent people of racial groups that are not common in the US. Basically the medical research used to make crash test dummies was performed on male cadavers that were readily available in the US.
  2. The standard tests involve a direct collision with a centrally targeted stationary object, a direct collision with an offset stationary object, and solid objects (representing cars) hitting the vehicle from the rear and the side. These simulate crashes where little or no attempt is made to avoid the collision, so they are probably really good for protecting drunk drivers. But any sane and sober driver is probably going to make some effort to avoid the collision, and the resulting impact will not be at a multiple of 90 degrees. Note that a car directly hitting the side of a moving car is quite different from hitting the side of a stationary car (which is what is tested).
  3. There are no standard tests for the probability of a vehicle rolling in the event of a crash or of what would happen to the occupants if it was to roll. Rollover crashes are among the most dangerous…
  4. The tests do not take into account the ability of the driver to avoid a crash or minimise the damage. The ability to avoid crashes is a major advantage for cars with a low center of gravity, AWD, and traction control. It’s a major problem for vehicles with a high center of gravity and with tires that are not designed for road use (IE 4WD/SUV vehicles).

But generally the crash-test results are of some use provided that you start by looking at the results from vehicles that have good safety features such as the Audi Quattro, the AWD version of the VW Passat, a Mercedes with 4MOTION, or any other vehicle with constant four wheel drive, road tires, four wheel traction control, and a low center of gravity.

The RACV (the main car owners’ advocacy organisation in Victoria and also a major car insurance company) [1] has published the used car safety ratings report [2]. This was produced by the Monash University Accident Research Centre and is based on the analysis of 3,000,000 crashes reported to police in Australia and New Zealand. Results are only available for cars which have been in common use on Australian and New Zealand roads for some time (so there aren’t many entries for vehicles that are less than 5 years old or for particularly expensive vehicles).

The report also includes estimates on the purchase prices of some of the safest vehicles. A vehicle that is significantly better than average can be purchased for as little as $5000!

Now if you want to buy a new vehicle then choosing the latest version of a model that has rated well on the used-car tests should be safe if the new car crash tests also report good results. It seems likely that the latest Mazda 6 or VW Passat will also rate well on the used-car tests in a few years time. It’s a pity that the report didn’t note which of the vehicles that rated well have models that have good features to avoid collisions such as EBA, ABS, AWD, and traction-control.

A friend who is active in the free software community recently had a very lucky escape from a significant crash. From his description I doubt that car safety features had much to do with him escaping without injury, I think that it was mostly luck. While his car did have a good range of safety features (and was rated well on the used-car tests), a high-speed collision that involves a truck can easily result in a car being squashed flat. I have already sent him the RACV link which he is using as part of the process to decide what new car to purchase. But I think that this information needs to be spread more widely.

I have not searched for information on such analysis of crashes being performed in other countries, please leave a comment if you know of any good research that will be useful for other people. One thing to note however is that given the global scope of car manufacturing, results from one country will have some validity in others. I expect that a VW Passat that is sold in Germany or the US will be almost identical to the Australian version.


Tithing for Open Source

It’s common to hear a complaint of the form “I get paid to keep computers running not hack an OS” coming from someone who uses an Open Source OS such as Linux, BSD, or Open Solaris.

It seems to me that part of the job of keeping computers running when using Open Source software IS to hack the source and fix bugs. This takes the place of praying, begging, and having your employer pay arbitrary extra amounts of money to the vendor when you have problems with proprietary software.

It’s well understood that a good system-administrator will anticipate problems and implement solutions to them in advance. You don’t wait for a system to run out of disk space and then fix it – you install cron jobs to compress and remove old log files and have a monitoring system to tell you if disk space really runs low.
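The disk space check can be as simple as a few lines run from cron. Here’s a sketch using Python’s standard library; the path and threshold are placeholders to be adjusted for the system being monitored:

```python
import shutil

def disk_space_low(path="/", min_free_fraction=0.10):
    """Return True if the filesystem holding `path` has less than
    min_free_fraction of its capacity free -- the kind of check a
    monitoring system runs before space actually runs out."""
    usage = shutil.disk_usage(path)
    return usage.free / usage.total < min_free_fraction

# A cron job or monitoring agent would call this periodically and
# alert the sysadmin (and perhaps trigger log compression/rotation)
# rather than waiting for writes to start failing.
```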

It seems to me that the approach that many companies take towards fixing software bugs goes against this ideal. They wait for software to fail in production and then file a bug report and hope that someone else will fix it.

When allocating time for various tasks it’s not uncommon to have various amounts of staff time devoted to different servers, departments, or projects. I believe that having a fixed amount of time devoted to finding and fixing bugs in Open Source software (both current versions and pre-release versions) would save money for a company in the long term. If 10% of the time of the most skilled programmers was assigned to finding and fixing bugs in the OS then the overall quality would improve. If a company depends on Debian then it would make sense to have this 10% time include testing out the production programs on Debian/Unstable; if it depends on Red Hat Enterprise Linux then it would make sense to test them out on Rawhide. This would increase the ability of the future releases of Debian or RHEL to support the applications in question, and might also discover some application bugs.

Also it’s very important to submit patches with bug reports. It’s not uncommon for a bug report to be critically important to a user but not overly important to the rest of the world. Such bugs can stay in a bug tracking system for a long time without getting fixed. But if there is a patch submitted that includes necessary documentation patches and a description of the tests that it has passed then it will probably be easier to include it than to debate whether it’s really needed.

If a project is only running for a matter of weeks or months (EG a consulting company that comes in, deploys a “solution” and then leaves) then there is probably no benefit in doing this. But if a company is going to be running servers for many years which will periodically be upgraded then it would be a real benefit to have bugs fixed in future versions.