I need an LMTP server

I am working on a system where a front-end mail server sends mail to what it considers to be an LDA (Local Delivery Agent), which actually sends the mail to a back-end server via LMTP. I can’t remove that fake LDA from the design because it does a bunch of business-specific processing along the way.

I am working on changing the back-end from Cyrus to Dovecot. Currently the mail goes from the fake LDA to the Cyrus LMTP server. What I would like to do is to have an LMTP server run on the back-end machine that launches the Dovecot deliver program immediately and then returns an appropriate code.

So far I have been experimenting with having Postfix run on the back-end to use deliver as the real LDA. The first problem with this is that the mail will be written to the Postfix queue and then written to the mail store. Doubling the number of writes is a real problem for a system that is going to be write-bottlenecked – it would significantly increase the hardware costs. The second problem is that when an account goes over quota the back-end server would generate a bounce message; I would prefer that the front-end server generate the bounce from an un-munged message.

Basically all I need is a simple daemon (which could even be launched from inetd) that talks LMTP (a very simple cut-down version of SMTP) and executes a single command to receive the data. It might be necessary to serialise running the delivery process, in which case the mail data would have to be stored in memory and there would need to be a semaphore around executing the delivery program.
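
To make this concrete, below is a rough sketch (in Perl) of the sort of inetd-launched handler I have in mind. It assumes that the Dovecot deliver binary is at /usr/lib/dovecot/deliver and accepts -f (envelope sender) and -d (destination user) options, it keeps the message in memory, it runs deliver once per recipient because LMTP requires a per-recipient status after the data, and it omits the serialisation and most of the error handling that a real version would need.

#!/usr/bin/perl
# minimal sketch of an inetd-launched LMTP handler that pipes each message
# to a delivery program - not production code
use strict;
use warnings;

my $deliver = "/usr/lib/dovecot/deliver"; # assumed path to the Dovecot LDA
$| = 1; # the LMTP dialogue must not be buffered

print "220 localhost LMTP ready\r\n";
my $from = "";
my @rcpts;
while(my $line = <STDIN>)
{
  $line =~ s/\r?\n$//;
  if($line =~ /^LHLO\s/i)
  {
    print "250-localhost\r\n250 PIPELINING\r\n";
  }
  elsif($line =~ /^MAIL FROM:\s*<(.*)>/i)
  {
    $from = $1;
    print "250 2.1.0 OK\r\n";
  }
  elsif($line =~ /^RCPT TO:\s*<(.*)>/i)
  {
    push @rcpts, $1;
    print "250 2.1.5 OK\r\n";
  }
  elsif($line =~ /^DATA$/i)
  {
    print "354 End data with <CR><LF>.<CR><LF>\r\n";
    my $msg = "";
    while(my $d = <STDIN>)
    {
      last if $d =~ /^\.\r?\n$/; # a lone dot ends the message
      $d =~ s/^\.//; # undo dot-stuffing
      $msg .= $d;
    }
    # LMTP requires one reply per accepted recipient after the data
    foreach my $rcpt (@rcpts)
    {
      if(open(my $pipe, "|-", $deliver, "-f", $from, "-d", $rcpt))
      {
        print {$pipe} $msg;
        if(close($pipe))
        {
          print "250 2.0.0 delivered for <$rcpt>\r\n";
        }
        else
        {
          print "452 4.2.2 delivery failed for <$rcpt>\r\n";
        }
      }
      else
      {
        print "451 4.3.0 could not run deliver\r\n";
      }
    }
    $from = "";
    @rcpts = ();
  }
  elsif($line =~ /^QUIT$/i)
  {
    print "221 2.0.0 Bye\r\n";
    last;
  }
  else
  {
    print "500 5.5.2 Unrecognised command\r\n";
  }
}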

Does anyone know of such a program? If not then I’ll have to write it myself (which shouldn’t be difficult) and GPL it. If I have to do that then I need a suitable name for it. Any suggestions would be appreciated.

The FAIL Meme

One of the recent poor trends in mailing list discussions is to reply to a message with a comment such as “FAIL” or “EPIC FAIL”.

The FAIL meme has been around for a while and actually does some good in some situations; Slate has a good article about it [1]. The first example cited in that article is that ‘when Ben Bernanke and Henry Paulson testified before the Senate banking committee last month about Paulson’s proposed bailout bill, a demonstrator in the audience held up an 8.5-by-11 piece of paper with one word scrawled on it in block letters: “FAIL.”’. This is an effective form of political demonstration: short words generally work well on placards (if only because the letters can be larger and therefore read from a greater distance) and anyone can understand the meaning of “FAIL” in that context.

There are some blogs dedicated to publicising supposed failures, failblog.org and ShipmentOfFail.com are two examples. I cite these as supposed failures because some of the pictures that they contain are obviously staged. It’s basically an Internet equivalent of the “Funniest Home Videos” shows that I never watched because they were not particularly funny.

So using the word “FAIL” on its own can be an effective form of political protest and can be used for mildly amusing web sites. But where it falls down is when it’s applied to a discussion that involves people who are from different cultures or have different levels of background knowledge – which covers most mailing list discussions.

Something that might be obviously wrong to some people is often not obvious at all to others. For example, being forced to reboot a computer for any reason other than a kernel upgrade seems obviously wrong to me (and to most people who use Linux or other Unix systems), but Windows users seem happy to reboot machines after applying patches or upgrades. So writing a message with “FAIL” as the only word in a discussion with Windows users would not be productive. It could, however, be reasonable to forward a link to a page on a Microsoft web site to Linux people for their amusement with “FAIL” as the only comment – anyone who would find the link in question amusing would require no more explanation.

Sometimes in a debate someone will write a message that only says “FAIL”. This is a very unconvincing argument that will not convince the opposition or any onlookers.

Generally it seems that using “FAIL” in a discussion with other like-minded people when talking about someone outside your group for the purpose of amusement can be effective. But any other use is going to be a “FAIL”.

As a more general rule single-word messages seem to have little value apart from certain limited situations. I have identified the following seven scenarios where a single word message is useful. Can anyone think of any others?

  1. Code review – someone posts code (or design for code) and people who like it will write “ACK” or something similar.
  2. Arranging a meeting – the question “who wants to meet for lunch tomorrow” has “me” as a valid answer.
  3. Voting – “yes” and “no” are valid answers for a poll, but a mailing list or forum probably isn’t the best place for it.
  4. Citing an example to refute a claim – often a single word won’t be a great response but may be adequate to prove a point.
  5. Answering a request for a recommendation – if asked to recommend a laptop I might say “Thinkpad” or if asked to recommend a server I might say “HP”. Both those answers are poor (I recommend EeePC for netbooks and Dell for small/cheap servers), so while such an answer would be useful it would be below my usual quality standards for email (I prefer to write at least two paragraphs explaining why I recommend something).
  6. Informing people that something has been done by replying to a request with the word “Done”.
  7. Agreeing to a contract or proposal with “OK” or “Yes”.

Update: I added another two reasonable uses of single-word messages.

Case Sensitivity and Published Passwords

When I first started running a SE Linux Play Machine [1] I used passwords such as “123456”. Then for a while I had “selinux” but when I created a T-shirt design (see the main Play Machine page for details) I changed the password to “SELINUX” because that is easier to read on a shirt.

Unfortunately the last time I rebuilt the Play Machine I used a password of “selinux”. Some people worked this out and still logged in, so I didn’t realise that anything was wrong until a comment was placed on my blog yesterday. So for the past three weeks or so some people have been finding themselves unable to log in. The password is now “SELINUX” again, sorry for any inconvenience.

It’s a pity that I can’t make sshd a little less case sensitive for passwords. A PAM module to implement a caps-lock mode where the opposite case is tried would be useful for this case and some others too.
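
The transformation itself is trivial. A real implementation would have to be a PAM module written in C, but the following Perl fragment shows the idea (the function name is just for illustration):

#!/usr/bin/perl
# sketch of the caps-lock idea: as well as the password that was typed,
# also try the variant with the case of every letter swapped
use strict;
use warnings;

sub candidate_passwords
{
  my ($typed) = @_;
  (my $swapped = $typed) =~ tr/a-zA-Z/A-Za-z/; # swap upper and lower case
  # only return the swapped form if it actually differs
  return $swapped eq $typed ? ($typed) : ($typed, $swapped);
}

# a user with caps-lock on typing "selinux" would also match "SELINUX"
print join(" ", candidate_passwords("selinux")), "\n";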

SE Linux Lenny Status Update

I previously described four levels of SE Linux support on the desktop [1].

Last night I updated my APT repository of SE Linux packages for Lenny (as described on my document about installing SE Linux [2]). I included a new policy package that supports logging in to a graphical session via gdm in either unconfined_t or user_t. This covers all the functionality I described as target 2 (some restricted users). I have tested this to a moderate degree.

Target 3 was having all users restricted and no unconfined_t domain (the policy module unconfined.pp not being linked into the running policy). I had previously done a large part of the work towards that goal in preparation for running a SE Linux Play Machine (with public root password) [3] on Lenny – but until last night I had not published it. The combination of the policy needed to run with no unconfined_t domain and the policy to allow logging in as user_t via gdm should mean that a desktop system with gdm for graphical login that has no unconfined_t domain will work – but I have not tested this. So target 3 is likely to have been achieved; if testing reveals any problems in this regard then I’ll release another policy update.

So now the only remaining target is MLS.

Also I have been setting up a mail server with a MySQL database for user account data and using Courier-Maildrop for delivery, so I’ve written policy for that and also made some other improvements to the policy regarding complex mail servers.

You Have the Right to Not Search My Bag

This afternoon I was in a Safeway/Woolworths store (an Australian supermarket chain) and the lady on the checkout asked to inspect my backpack on the way out. The conversation went as follows:
Checkout Lady: Can I inspect your bag?
Me: Sure. – I put my backpack on the counter
CL: Could you open it for me?
Me: It’s OK, you can do it.
CL: I’m not allowed to open your bag, can you open it?
Me: I don’t mind, you can open it.

We iterated over the last two lines a few times. When it became clear that no progress was going to be made I asked “Can I go now?” and left.

It seems rather pointless to demand to search someone’s bag if you are not permitted to open it. Not that they have any power to search bags anyway. I discussed this with a police officer about 20 years ago and was told that store staff can do nothing other than refuse to allow you into their store in future if you don’t agree to a bag search. Stores claim that it’s a condition of entry that your bag be searched, but apparently that was not enforceable. Of course the law could have changed recently; I guess it would only require a terrorist threat related to supermarket products (baking soda can make your bread rise explosively) to get the law changed.

The last time my bag was searched was when leaving a JB Hi-Fi store. I had a brand new EeePC (purchased from a different store) in one hand and a bag in the other. The EeePC was identical to ones that they had on display and they didn’t even ask about it. It seems hardly worth the effort of searching bags when anyone can carry out expensive gear in their hand without being questioned.

A Police SMS about Fire Risk

My wife and I have each received SMS messages from “Vic.Police” that say:

Extreme weather expected tonight (Monday) & tomorrow. High wind & fire risk. Listen to the ABC local radio for emergency update. Do not reply to this message.

Presumably the police are trying to contact everyone in Victoria. The problem seems to be related to the high wind speed that is forecast; the temperature is only predicted to be 32C (as opposed to the 38C that they were forecasting a few days ago and the temperatures of 46C or more a few weeks ago).

The last reports were that the firefighters were still working on putting out fires, and the unclear news coverage seemed to suggest that some of the fires had been burning since the 7th of February. A day of extreme fire danger that starts without any fires would be bad enough, but starting with some fires that are already out of control is destined to give a very bad result.

Below is the link to my previous post about people trying to take advantage of a tragedy to benefit their own political causes. For anyone who wants to rail against abortion, homosexuality, or the Greens party, please show some decency and do so based on relevant facts and do it at an appropriate time. I suggest that anyone who writes later this week about ways to avoid bushfires should be careful to check their claims for accuracy and scientific evidence (hint – the CSIRO and NASA have published a lot of useful background information).

http://etbe.coker.com.au/2009/02/25/tragedy-and-profit/

Links February 2009

Michael Anissimov writes about the theft of computers from the Los Alamos nuclear weapons lab [1]. He suggests that this incident (and others like it) poses a great risk to our civilisation. He advocates donating towards The Lifeboat Foundation [2] to try and mitigate risks to humanity. They suggest pledging $1000 per year for 25 years.

It’s interesting to note that people in Pakistan pay $8 per month for net access that is better by most objective metrics than what most people in the first world can get [3]. It seems that we need to remove the cartel control of the local loop to get good net access – either deregulate it entirely or make it owned by the local government, who are more directly responsive to the residents.

Bruce Schneier wrote a post about a proposed US law to force all mobile phones with cameras to make a “click” sound when taking a picture [4]. The law is largely irrelevant: as it has been law in Japan for a while, most phones are already designed that way. One interesting comment from MarkH was: But if congress REALLY wishes to benefit the public, I suggest that all guns in the U.S. be required, before each discharge, to make loud sounds (with appropriate time sequencing) simulating the flintlock technology that was common at the beginning of U.S. history, including cocking, use of the ramrod, etc. This would give fair warning of an impending discharge, and would limit firing rates to a few per minute. ROFL

Brief review of a Google Android phone vs an iPhone [5]. The Android G1 is now on sale in Australia! [6].

LWN has an article about the panel discussion at the LCA Security Mini-conf [7]. Jonathan Corbet has quoted me quite a bit in the article, thanks Jonathan!

Peter Ward gave an interesting TED talk about Hydrogen Sulphide and mass extinctions [8]. The best available evidence is that one of the worst extinctions was caused by H2S in the atmosphere which was produced by bacteria. The bacteria in question like a large amount of CO2 in the atmosphere. It’s yet another reason for reducing the CO2 production.

Michael Anissimov has written a good article summarising some of the dangers of space exploration [9]; he suggests colonising the sea, deserts, and Antarctica first (all of which are much easier and safer). “Until we gain the ability to create huge (miles wide or larger) air bubbles in space enclosed by rapidly self-healing transparent membranes, it will be cramped and overwhelmingly boring. You’ll spend even more time on the Internet up there than down here, and your connection will be slow”. A confined space and slow net access, that’s like being on a plane.

Tragedy and Profit

Every time something goes wrong there will be someone who tries to take advantage of the situation. The recent bushfires in Australia that have killed hundreds of people (the count is not known yet) are a good example. Pastor Nalliah of Catch the Fire Ministries [1] claims that it is due to legalising abortion. This is astoundingly wrong.

In a more extreme example representatives of the Westboro Baptist Church were planning to visit Australia to launch a protest in support of the bushfires [2]. I have not yet found any news reports about whether they actually visited Australia or protested – it’s most likely that they decided not to visit due to the Australian laws being very different to US laws regarding the relative importance of freedom of speech and incitement to violence. Apparently the insane Westboro Baptist Church people (who are best known for GodHatesFags.com and GodHatesAmerica.com) believe that God hates Australia and caused the fires (presumably due to Australia not persecuting homosexuals). Danny Nalliah has permanently damaged his own reputation by acting in a similar way to the Westboro Baptist Church. The reputation of Catch The Fire now depends on how quickly they get a new pastor…

Please note well that the vast majority of Christians have nothing in common with Westboro or Catch The Fire. I don’t recall the last time I met an Australian Christian who was strongly opposed to homosexuality or abortion.

Now we do have to try and investigate ways of avoiding future tragedies, and the work to do this needs to begin immediately. John Brumby (the Premier of Victoria) has announced that Victoria will get new strict building codes for fire resistant buildings [3]. There have been many anecdotes of people who claim to have been saved by attaching sprinkler systems to their homes, by building concrete bunkers to hide in while the fire passes, and using other techniques to save their home or save themselves. Some more research on the most effective ways of achieving such goals would be worthwhile, an increase in funding for the CSIRO to investigate the related issues would be a good thing. The article also has an interesting quote “As the fallout from the disaster widened, the union representing the nation’s 13,000 firefighters warned both the federal and state governments to take global warming seriously to prevent a repeat of last weekend’s lethal firestorm“. However given that traditionally Australia and the US have been the two nations most opposed to any efforts to mitigate global warming it seems unlikely that anything will change in this regard in a hurry.

The attempts to link bushfires to abortion and homosexuality are offensive, but can be ignored in any remotely serious debate about politics. However there are some other groups trying to profit from the tragedy that make claims which are not as ridiculous.

On the 9th of February the Australian Green party was compelled to release an official statement from Spokesperson Scott Ludlam, Sarah Hanson-Young, Rachel Siewert, Christine Milne, and Bob Brown following some political discussion about Greens policies [4]. There have been attempts to blame the Greens for the tragedy which were politically motivated, some of which came from groups that traditionally oppose the Greens for other reasons (I’m not going to provide the detail – anyone who is really interested can do google searches on the people in question). On the 16th of February Bob Brown (the leader of the Green party) felt obliged to make another media release reiterating the fact that the Greens support prescribed burn-offs to limit the scope of wild fires [5], he also decried the hate mongering that has been occurring in the wake of the disaster.

One of the strange memes that seems to be spread by opponents to the Greens is that the Greens are all supposedly from the city and know nothing about the country. To avoid being subject to such attack I feel obliged to note that on one of the bad fire days I visited my parents. I spent the morning with my father and some friends at a park that was not far from the fire area, my friends then returned to their home which was not far from the fire area. I then had lunch with my parents and watched the smoke through the dining room window. After that my friends didn’t respond to email for a while and I was concerned that they may have lost their house or maybe suffered injury or death. I didn’t know them well enough to feel it appropriate to try a dozen different ways of contacting them (I’m sure that many other people were doing so), but I was rather concerned until my wife received an email from them.

But I don’t base my political beliefs on what I personally observe or my connections to people on the edge of the fire zone. I believe in the Green principles of “Peace and Non Violence, Grassroots Democracy, Social and Economic Justice, Ecological Sustainability” and the use of science and statistics to determine the best ways of achieving those goals.

Red Hat, Microsoft, and Virtualisation Support

Red Hat has just announced a deal with MS for support of RHEL virtual machines on Windows Server and Windows virtual machines on RHEL [1]. It seems that this deal won’t deliver anything before “calendar H2 2009” so nothing will immediately happen – but the amount of testing to get these things working correctly is significant.

Red Hat has stated that “the agreements contain no patent or open source licensing components” and “the agreements contain no financial clauses, other than industry-standard certification/validation testing fees” so it seems that there is nothing controversial in this. Of course that hasn’t stopped some people from getting worked up about it.

I think that this deal is a good thing. I have some clients who run CentOS and RHEL servers (that I installed and manage) as well as some Windows servers. Some of these clients have made decisions about the Windows servers that concern me (such as not using ECC RAM, RAID, or backups). It seems to me that if I were to use slightly more powerful hardware for the Linux servers I could run Windows virtual machines for those clients and manage all the backups at the block device level (without bothering the Windows sysadmins). This also has the potential to save the client some costs in terms of purchasing hardware and managing it.

When this deal with MS produces some results (maybe in 6 months time) I will recommend that some of my clients convert CentOS machines to RHEL to take advantage of it. If my clients take my advice in this regard then it will result in a small increase in revenue and market share for RHEL. So Red Hat’s action in this regard seems to be a good business decision for them. If my clients take my advice and allow me to use virtualisation to better protect their critical data that is on Windows servers then it will be a significant benefit for the users.

Lenny Play Machine Online

As Debian/Lenny has been released and the temperatures in my part of the world are no longer insanely hot I have put my SE Linux Play Machine [1] online again. It is running Debian/Lenny and is a Xen DomU on a Debian/Lenny Dom0.

To get this working I had to make a few more fixes to the SE Linux policy and will update my Lenny repository (as mentioned in my document on installing SE Linux on Lenny [2]) in the near future.

I have reformatted most of the text from the thanks.txt file on my Play Machine and put it online on my documents blog [3]. I have also graphed the logins to my Play Machine using Webalizer [4], with 1KB of transfer in the graph meaning one minute of login time. Below is the Perl code I used to convert the output of “last -i” to what looks like an Apache log file. The program takes a single command-line parameter which indicates the year that the data is from (which is not included in last output), reads the output of “last -i” on standard input, and gives a web log on standard output.

#!/usr/bin/perl
# convert the output of "last -i" (on stdin) to something that looks like an
# Apache access log (on stdout), the year is the only command-line parameter
use strict;

my @output;

while(<STDIN>)
{
  # only root logins on a pts are of interest
  if(not $_ =~ /^root.*pts/)
  {
    next;
  }
  $_ =~ s/  +/ /g;
  $_ =~ s/^root pts.[0-9]+ //;
  chomp $_;
  # fields are now: IP day month date start - end (duration)
  my @arr = split(' ', $_);
  # sessions that ended in a crash get a different URL so they can be counted
  my $url = "/";
  if($arr[6] =~ /crash/)
  {
    $url = "/crash";
  }
  # convert the "(HH:MM)" or "(days+HH:MM)" duration to minutes
  my $t = $arr[7];
  $t =~ s/[()]//g;
  my @times = split(':', $t);
  if($times[0] =~ /\+/)
  {
    my @hours = split('\+', $times[0]);
    $t = $hours[0] * 24 * 60 + $hours[1] * 60 + $times[1];
  }
  else
  {
    $t = $times[0] * 60 + $times[1];
  }
  # 1KB of "transfer" per minute of login, zero-length sessions count as 1 byte
  $t *= 1024;
  if($t == 0)
  {
    $t = 1;
  }
  # Apache log dates need a two digit day of the month
  if(length($arr[3]) == 1)
  {
    $arr[3] = "0" . $arr[3];
  }
  $output[$#output + 1] = "$arr[0] - - [$arr[3]/$arr[2]/$ARGV[0]:$arr[4]:00 +0000] \"GET $url HTTP/1.0\" 200 $t \"-\"\n";
}

# last(1) prints the most recent login first, a log file needs the oldest first
my $i;
for($i = $#output; $i > -1; $i--)
{
  print $output[$i];
}