When I started blogging I used Blogger [1]. After some time I decided that it did not offer me the freedom I desired: I could not easily make changes (I could have created new themes, but it would have taken an unreasonable amount of work). I currently use WordPress. It’s still a lot of work to change themes, but at least it’s CSS and PHP coding, which can be used for other things. Blogger also offers no statistics on web use (I tried adding support for Google Analytics but couldn’t get it to work properly); what I want is Webalizer or something similar, which is easy to set up when running your own server.
Blogger is a reasonable way of starting blogging, but if you use it then you want to make it easy to change to something that you own. Blogger has a feature that lets you use a DNS name in a domain that you own as the name of your blog (a feature which is much less obvious than it once was). I regret not using it, as I still have my old posts on Blogger and don’t want to break the links.
Blogger has in the past had problems with time-stamps on posts. When I used Blogger I had some complaints that my posts were staying at the top of Planet listings for unreasonable amounts of time (I never tracked this down before switching to my own platform).
Hosting your own blog is not as difficult as you might expect (initially at least). It becomes difficult when you want to install lots of plug-ins, but then any blogging solution would be difficult if you want to do that. The WordPress [2] package in Debian works well and has good support for multiple WordPress blogs. There is a separate product named WordPress-MU [3] which is designed for people who want to run a service in competition with Blogger, and some people recommend that you use WordPress-MU if you want to set up blogs for several people. I disagree. If you are setting up blogs for a small number of people then you can use the standard WordPress package: create a file named /etc/wordpress/config-whatever.example.com.php which contains the configuration for whatever.example.com, then use the web-based interface to do the rest of the setup for the new blog. It would not be difficult to create the configuration file in question with an M4 script if you have a moderate number of blogs to host (maybe a hundred or so). I think that it’s only if you want to host thousands of blogs that you need the features of WordPress-MU. Note that MU is not as well packaged as the base WordPress and has some rough edges; the last time I tried to set up MU I was not successful.
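The per-site configuration files mentioned above are simple enough to generate with a short script instead of M4. Here is a minimal sketch assuming the Debian convention of /etc/wordpress/config-HOSTNAME.php; the hostnames, database names, and credentials are illustrative only, and the output directory defaults to a local one so the sketch can be run without root access:

```shell
#!/bin/sh
# Sketch: generate a per-site WordPress config for each hostname.
# On Debian the real target directory would be /etc/wordpress;
# it defaults to a local directory here so the sketch runs without root.
CONFDIR=${CONFDIR:-./wordpress-conf}
mkdir -p "$CONFDIR"
for host in whatever.example.com other.example.com; do
  # MySQL database names cannot contain dots, so map them to underscores
  db=$(echo "wp_$host" | tr . _)
  cat > "$CONFDIR/config-$host.php" <<EOF
<?php
define('DB_NAME', '$db');
define('DB_USER', '$db');
define('DB_PASSWORD', 'changeme');
define('DB_HOST', 'localhost');
EOF
done
```

With a list of a hundred hostnames this loop does the same job as the M4 approach, and the web-based interface still does the rest of the setup for each blog.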
This is not to say that WordPress is inherently the best program, there are many other free software blogging platforms out there. WordPress is the one that I use and am happy to recommend but if your requirements differ from mine then another platform may be better for you. I also suggest that WordPress be used as the base-line for comparing blogging software.
Blogging does not require significant resources. A virtual host with 256M of RAM should be more than adequate to run WordPress plus MySQL. Such virtual hosts are getting quite cheap nowadays, and one such host could easily be shared by a number of bloggers. My blog uses about 1.2G of data transfer per month. vpsland.com offers virtual hosts starting at 150G per month of data transfer, with 192M of RAM being the minimum. Prices start at $US15 per month. While I can’t compare vpsland.com to other virtual hosting providers (having never used any other such service) I can say that they work reasonably well and I have a client who is happy with them. So it seems that a minimal plan with vpsland.com would host 20 blogs with the same traffic as mine (the bandwidth alone would allow 150 ÷ 1.2 = 125 such blogs, so RAM is the limiting factor) and a slightly larger plan (with more RAM and more bandwidth) that costs $US30 or $US40 per month could handle 100 or more blogs that are similar to mine. If you get together with some friends and share a virtual server then blogging would not be expensive. Incidentally, I had previously read a blog comment about people being hesitant to share servers with their friends (as they apparently would rather grant some unknown people at a faceless corporation the ability to snoop on them than people that they know). The advantage of a blog server in this regard is that everything is public anyway!
If you have good technical skills then I recommend using WordPress as your first blogging platform. If you find that you don’t like it for some reason then you can convert to another platform if you own the domain. If you are setting up a blog for a less technical user then WordPress is also a good choice. My sister uses WordPress – not that she made much of a choice (I had set up a Blogger account for her some time ago which she never used – I guess that could be considered a choice to not use Blogger) – but I set up a WordPress blog for her and she seemed to like using it.
My previous post about the SFWA falsely issuing DMCA take-down notices [1] has received some reactions, many of which indicate a lack of careful reading of my post.
I am not advocating boycotting sci-fi. I am merely changing the priorities for my reading list. There must be at least 10 sci-fi books on my shelf of books to read, so refraining from buying more for a while isn’t going to impact my reading much either.
In regard to whether boycotts are good or bad, it should be considered that the enjoyment of a work of art is highly subjective. If you don’t like the artist then you probably won’t enjoy the art. Boycotts of products for which there are objective criteria of quality are often unsuccessful because the buyers have to make a conscious decision to buy a product of a lower quality or a higher price. I can’t make any objective claim about the relative merits of the work of Cory Doctorow and Jerry Pournelle. But it is impossible for me to enjoy reading Jerry’s work as much as I enjoy reading Cory’s work due to my opinion of the two authors.
I have just read a post by Eva Whitley [2] (the widow of Jack L. Chalker – one of my favourite authors when I was younger). She starts by describing Jack’s attitude towards electronic publishing, which sounds quite enlightened. She then notes that Del Rey has the rights to some of his books which are officially still in print but in practice impossible to get, that Baen is still paying royalties but had paid him a large advance so no further money will be received until the royalties exceed the advance, and that some books are out of print but no publisher wants to buy the rights to do a re-print. So it seems that she is not receiving money from her late husband’s work partly due to book companies being uncooperative (not printing the books and not permitting others to print them without payment) and partly due to his being paid in advance.
Eva states that she is happy that the SFWA issued take-down requests for copies of the work of her late husband. That is good, and in fact it’s a legal requirement. No-one can legally get copyright laws enforced unless they own the copyright or they are acting on behalf of the copyright owner. She is entitled to authorise the SFWA (or any other group) to act on her behalf in regard to violations of her copyrights. Neither the SFWA nor anyone else can take action against copyright violations without such permission.
Jerry Pournelle has written about the situation [3]. He complains that the site owners wanted the items listed individually. The Wikipedia page about the DMCA [4] makes it quite clear that you must provide such detailed information to get something taken offline, and that the DMCA take-down request MUST include a statement claiming ownership of the material UNDER PENALTY OF PERJURY.
If you make a false DMCA claim then you are committing perjury; according to Wikipedia the penalty for perjury in the US is up to five years in prison [5]. Sure, any idiot can write up DMCA take-down requests for random stuff on the net and get them acted on (here’s an example of a 15yo idiot who did just that [6]). But if you want to stay out of jail you have to avoid making such false claims.
Jerry expresses a total lack of sympathy for Cory Doctorow and other people who have been victims of slander of title [7]. I wonder how he would react if someone started making public statements under penalty of perjury claiming that he didn’t own the title to some of his work. I suspect that he would desire something similar to what he desires to be done to people associated with 9-11 [8].
Finally, one thing that I suggest doing to make some additional money from writing sci-fi is to release T-shirts. A basic T from cafepress.com was $7 last time I checked. $11 for a shirt based on a sci-fi book is not overly expensive and gives $4 profit for the author (more than twice what most authors make from a book sale). Shirt sales are unlikely to make as much money as book sales due to lower volumes, but the effort involved in creating a shirt design is not so great. A publishing company may deny an author (or their estate) future revenue by refraining from printing further copies, but unless it also owns the trademarks related to the book and denies the author (or their estate) the right to use them then it should still be possible to make money from merchandise. Making money from merchandise is not as glamorous as making money from book royalties, but it can be effective, as demonstrated by xkcd.com.
I have just converted a Fedora Core 5 server to a CentOS 5 Xen Dom0 with Fedora Core 5 as a DomU.
The process took a little longer than expected because I didn’t have console or network access to the DomU initially. It turned out that /etc/modprobe.conf was configured to use the tg3 driver for Ethernet while I really needed xennet to get networking going.
The console problem was due to the fact that the device /dev/xvc0 is used for the console in DomUs and the Fedora Core 5 image was configured for a non-Xen mode of operation. Incidentally, it seems a little strange that a default install of CentOS as a DomU will get gettys for /dev/tty[1..6] when none of them seem accessible. After I changed the /etc/inittab file to use the correct device name it still didn’t work: it seems that the SE Linux policy in Fedora Core 5 doesn’t have the correct context for the /dev/xvc0 device.
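The two changes described above amount to a couple of lines each. A sketch of what they looked like, assuming FC5-era configuration layout; the getty command, baud rate, and terminal type are illustrative:

```
# /etc/modprobe.conf in the DomU: use the Xen paravirtual network
# driver instead of the tg3 driver that the physical hardware needed
alias eth0 xennet

# /etc/inittab: run a getty on the Xen console device
co:2345:respawn:/sbin/agetty xvc0 9600 vt100-nav
```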
semanage fcontext -a -f -c -s system_u -t tty_device_t /dev/xvc0
So I had to run the above semanage command to change the policy configuration, followed by restorecon /dev/xvc0 to apply the change (once the change is configured it will also apply automatically after the next reboot).
There are many situations where multiple DNS names for a single IP address that runs a single service are useful. One common example is with business web servers that have both www.example.com and example.com being active, so whichever one a customer hits, they will get the right content (the last thing you want is for a potential customer to make some trivial mistake and then give up).
Having both DNS names be equal and separate is common. One example of this is the way http://planet.ubuntulinux.org/ and http://planet.ubuntu.com/ both have the same content; it seems to me that planet.ubuntu.com is the more official name as the wiki for adding yourself to the Planet is wiki.ubuntu.com. Another example of this is the way http://planet.debian.org/ and http://planet.debian.net/ both have the same content. So far this month I have had 337 referrals to my blog from planet.debian.org and 147 from planet.debian.net. So even though I can’t find any official reason for preferring one over the other, the fact that more than 2/3 of the referrals from that Planet come from the planet.debian.org address indicates that most people regard it as the canonical one.
In times past there was no problem with such things; it was quite routine to have web servers with multiple names and no-one cared (until of course one name went away and a portion of the user-base had broken links). Now there are three main problems with having two names visible:
- Confusion for users. When a post on thedebianuser.org referred to my post about Planet Ubuntu it used a different URL to the one I had used. I was briefly worried that I had missed half (or more) of the content by getting my links from the wrong blog – but it turned out that the same content was on both addresses.
- More confusing web stats for the people who run sites that are referenced (primarily the bloggers in the case of a Planet installation). This also means a lower ranking as the counts are split. In my Webalizer logs planet.debian.org is in position #5 and planet.debian.net is in position #14. If they were combined they would get position #3. One thing to keep in mind is that the number of hits that you get has some impact on the content. If someone sees repeated large amounts of traffic coming from planet.debian.org then they are likely to write more content that appeals to those users.
- Problems with sites that have strange security policies. Some bloggers configure their servers to only serve images if the referrer field in the HTTP protocol has an acceptable value (to prevent bandwidth theft by unethical people who link to their pictures). My approach to this problem is reactive (I rename the picture to break the links when it happens) because I have not had it happen often enough to do anything else. But I can understand why some people want to do more. If we assume that an increasing number of bloggers do this, it would be good to not make things difficult for them by having the smallest possible number of referrer URLs. It would suck for the readers to find that planet.debian.org has the pictures but planet.debian.net doesn’t.
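As an aside, the referrer check described in the last point is typically only a few lines of Apache configuration. A sketch, assuming Apache 2.2 syntax; the hostname and the list of image extensions are illustrative:

```
# Serve images only when the Referer is empty or from an allowed site
SetEnvIfNoCase Referer "^$" allowed_ref
SetEnvIfNoCase Referer "^https?://(www\.)?example\.com/" allowed_ref
<FilesMatch "\.(gif|jpe?g|png)$">
    Order Deny,Allow
    Deny from all
    Allow from env=allowed_ref
</FilesMatch>
```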
The solution to this is simple: one name should redirect to the other. Having something like the following in the Apache virtual host configuration (or the .htaccess file) for the least preferred name should redirect all access to the other name:
RewriteEngine On
RewriteRule ^/?(.*)$ http://planet.example.com/$1 [R=301,L]
In my posts last night I omitted the URLs for the Planet searches from the email version (by not making them human-readable). Here they are:
Here is a Google Custom Search for Planet Linux Australia (homepage):
I’ve just been experimenting with Google Custom Search [1]. Below are two custom search engines I created for Planet Debian and Planet Ubuntu (for each Planet it searches all the blogs that are syndicated – not just the category that is syndicated). It’s interesting to compare search terms such as “selinux” to get an idea of how much various topics are being discussed in the two communities. I’m going to set up cron jobs to update these CSEs as the Planet subscription lists change. It would also be quite easy for me to set up a custom search that covers both Debian and Ubuntu, and other Planets as well.
If nothing else this will save me from the problem of finding a blog post that has just scrolled off a Planet that I read.
Planet Debian (homepage):
Planet Ubuntu (homepage):
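The cron job mentioned above is mostly a matter of turning a Planet subscription list into a CSE annotations file. A minimal sketch, assuming a Planet-style config.ini with one [URL] section header per feed; the file names, sample data, and annotation format details are illustrative (a real job would first fetch the config with wget and then upload the result):

```shell
#!/bin/sh
# Create a sample Planet config; a cron job would fetch the real one.
cat > planet-config.ini <<'EOF'
[http://example.com/blog/]
name = Example Blogger
[http://other.example.org/]
name = Other Blogger
EOF
# Turn each [URL] section header into a CSE annotation entry.
{
  echo '<Annotations>'
  sed -n 's/^\[\(http[^]]*\)\]$/  <Annotation about="\1*"\/>/p' planet-config.ini
  echo '</Annotations>'
} > annotations.xml
```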
I have just added two new categories to my blog. One is for the most popular posts [1] (as indicated by the number of hits on the permalink pages); the other is for the best posts [2]. My criteria for adding a post to the best-posts list are that it provides some information that is new or some analysis that others do not appear to have performed, that it doesn’t get refuted by someone else (sometimes an idea seems good but someone points out a flaw), and that there is some level of interest in it from readers (based on page hits, comments, and links from other blogs).
Both of these categories may be added to posts some days or weeks after they are published. So adding the feeds for them to a syndication configuration might not be a good idea as they will always include posts that are old. I expect that a typical Planet configuration would never display posts from those feeds.
I suggest that other people consider adding similar categories to their blogs. It will allow readers who quickly browse your blog to see the posts that you regard as your best content, and other bloggers in the same space to see what gets the most hits (which is worthwhile if you don’t consider blogging to be a zero-sum game).
I expect that someone will suggest that I only write posts that are eligible for the best-posts category. However this is one example of a post which I don’t consider to be eligible but which will still be useful to some people.
On scienceblogs.com there is an interesting article about statistics and “The Surge” in Iraq [1]. It explains how there is not yet enough data to statistically determine whether The Surge is succeeding in improving the situation in Iraq. Some of the comments point out that the “ethnic cleansing” in some parts of Iraq has been mostly completed. When one of the groups in a disputed area is annihilated or driven out you can expect a reduction in violence.
The Guardian reports that Alan Greenspan has stated in his memoirs [2]: “I am saddened that it is politically inconvenient to acknowledge what everyone knows: the Iraq war is largely about oil”. His memoirs are published as The Age of Turbulence: Adventures in a New World. I’ve added it to my Amazon Wish List.
I have just read an interesting post by Ted Ts’O about copyright protection on the net [1]. Ted is well known as a free software programmer, but it’s slightly less well known that he is an avid Science-Fiction fan. In the Free Software community most people seem to be interested in Sci-Fi, but Ted is more interested than most.
Ted’s post concerns the irresponsible actions of the Science Fiction and Fantasy Writers of America [2] (SFWA). To summarise: they issued DMCA [3] take-down notices for any web page that matched a search on the names “Asimov” or “Silverberg”. I don’t approve of the DMCA laws as a whole, but take-down notices are not necessarily bad (I have issued such notices for unauthorised copies of my own work in the past). The problem in this case is that the words in question are extremely common. Not only might they be used as the names of other people (authors or characters), but Asimov in particular is a well-known term when describing the potential development of intelligent computers that operate robots (see the Three Laws of Robotics [4]); the term “not Asimov Compliant” has been used by Alastair Reynolds in Century Rain to refer to a class of military robots that have no compunction about killing humans.
Among the fall-out of the SFWA actions was the removal of a free novel by Cory Doctorow [4]. Incidentally my favourite free to download Cory Doctorow book is Eastern Standard Tribe [5]. craphound.com has Cory’s blog as well as links to other free Sci-Fi that he’s written.
Reading the links from Ted’s post took me to a blog entry by the current SFWA vice-president [6] which describes authors such as Cory Doctorow as “webscabs“. This offends me greatly. My work and that of my friends in writing free software could be described in the same way (and in fact is described in a similar way by some software monopolists). Every blogger could have their work described in a similar way by paid journalists.
The fact that the SFWA VP is not representing the SFWA when writing such comments does little to allay concern about this. It seems to me that people with such ideas are intent on attacking my community, and that it would be wrong of me to give any of them $0.50 by buying one of their books. I resolve to not buy any more Sci-Fi books until I have read all the freely available books that I want to read. After that I will prioritise my book purchases with a significant factor being how well the author gets the concepts of copyright etc. If nothing else, an author who can’t understand how copyright (something that is essential to their own livelihood) interacts with current computer systems will have significant difficulties in predicting how technology and society will develop over the next hundred or thousand years.
My problem in reading Sci-Fi books is not in discovering books that are enjoyable and which contain interesting concepts, but in finding time to read them. Thanks to the SFWA for giving me an extra criterion to cull the list of books to read.
Recently my iRiver [1] H320 had some milk-based drink spilt on it. I’m not sure what the drink was (I discovered it when my iRiver stopped working and the drink was dry) but it smelled like coffee or hot chocolate when I washed it off (I considered tasting it but decided that knowing exactly which drink had damaged my iRiver probably wouldn’t help me fix it).
The initial problem was that no buttons other than the play button (which is also used to turn it on) worked. When I first discovered this I had no way of hitting the reset button, so my iRiver played until the battery ran flat. I tried to disassemble it by removing the five tiny Phillips-head screws from the sides, but that didn’t make any part of it loose. I tried using a small amount of force on the front piece of plastic and broke two of the clips that hold it in place while getting another two loose without breaking them (but there were still at least four clips to go).
Then I realised that the problem was that the keys were physically sticking and that maybe if I washed the keypad out I might get it to work. So I spent some time in a cycle of dripping water into my iRiver, pressing the buttons to get some of the nasty stuff dissolved, and then using a towel to soak up some of the water with milk or whatever. After repeating this for a while the buttons all seemed to work well apart from the play button which kept registering presses when I wasn’t touching it. This meant that it always automatically turned on and then played a song in a stuttering manner as the play button is also the pause button and it paused and played as rapidly as it could.
Finally I left it in the sun to dry for a few hours, which seemed to do some good. The play button mostly works now. Also it seems quite easy to get water between the front layer of protective plastic and the layer behind (which actually houses the keypad). So I have several large drops of water spread out between the layers which move around as I squeeze it. I think that if I get that dried out before algae can grow then everything will be fine.