Photograph Your Work

One thing I should have learned before (but didn’t) and hope I’ve learned now is to photograph sysadmin work.

If you work as a sysadmin you probably have a good phone. If you are going to run ssh from a phone, or use a phone to read docs while in a server room with connectivity problems, then you need a phone with a good screen, and you will also want one that has current security support. Such a phone will have a reasonable amount of storage space; I doubt that you can get a phone with less than 32G of storage that has a decent screen and Android security support. Admittedly Apple has longer security support for iPhones than Google does for Nexus/Pixel phones, so it might be possible to get an older iPhone with a decent screen and hardly any storage (but that’s not the point here).

If you have 32G of storage on your phone then there’s no real possibility of using up your storage space by photographing a day’s work. You could probably take an unreasonable number of photos of a week’s work as well as a few videos and not use up much of that.

The first time I needed photos recently was about 9 months ago when I was replacing some network gear (new DSL modem and switch for a client). The network sockets in the rack weren’t labelled and I found it unreasonably difficult to discover where everything was (the tangle of cables made tracking them impossible). What I should have done is photograph the cables before I started; then I would have known where to connect everything. A 12MP camera allows zooming in on photos to get details, so a couple of quick shots of that rack would have saved me a lot of time – and when everything goes as planned, taking a couple of photos isn’t going to delay things.

Last night there was a power failure in a server room that hosts a couple of my machines. When power came back on the air-conditioner didn’t start up and the end result was a server with one of its disks totally dead (maybe due to heat, maybe the power failures, maybe it just wore out). For unknown reasons BTRFS wouldn’t allow me to replace the disk in the RAID-1 array, so I needed to copy the data to a new disk and create a new mirror (costing a lot of my time and also giving downtime). While I was working on this the filesystem would only mount read-only, so no records of the kernel errors were stored. If I had taken photos of the screen I would have records of this which might allow me to reproduce the problem and file a bug report. Now I have no records, I can’t reproduce it, and I risk hitting the same problem next time a disk dies in a BTRFS RAID-1. Presumably random people all over the world will also suffer needless pain because of this while lacking the skills to file a good bug report, because I didn’t make good enough records to reproduce it.
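
For reference, here is a sketch of the procedure that normally handles a dead disk in a BTRFS RAID-1 (and which failed in this case) – the device names and devid are hypothetical:

# mount degraded if the dead disk prevents a normal mount
mount -o degraded /dev/sda1 /mnt/data
# replace the dead disk (devid 2 here) with the new one
btrfs replace start 2 /dev/sdc1 /mnt/data
btrfs replace status /mnt/data
# on kernels too old for "replace": add the new disk, then remove the missing one
btrfs device add /dev/sdc1 /mnt/data
btrfs device delete missing /mnt/data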

Hopefully next time I’m in a situation like this I’ll think to take some photos instead of just rebooting and wiping the evidence.

As an aside I’ve been finding my phone camera useful for zooming in on serial numbers that I can’t read otherwise. I’ve got new glasses on order that will hopefully address this, but in the meantime it’s the only way I can read the fine print. Another good use of a phone camera is recording error messages that scroll past too quickly to read and aren’t logged. Some phones support slow motion video capture (120fps or more), and even on phones that don’t you can use slow playback (my favourite Android video player MX Player works well at 5% of normal speed) to capture most messages that are too quick to read.

Android Screen Saving

Just over a year ago I bought a Samsung Galaxy Note 2 [1]. About 3 months ago I noticed that some of the Ingress menus had burned into the screen. Back in ancient computer times there were “screen saver” programs that blanked the screen to avoid this, then the “screen saver” programs transitioned to displaying a variety of fancy graphics which didn’t really fulfill the purpose of saving the screen. With LCD screens I have the impression that screen burn wasn’t an issue, but many modern phones (including the Note 2) have AMOLED displays which have the problem again.

Unfortunately there doesn’t seem to be a free screen-saver program for Android in the Google Play store. While I can turn the screen off entirely, there are some apps such as Ingress that I’d like to keep running while the screen is off or greatly dimmed. Now I sometimes pull the notification menu down when I’m going to leave Ingress idle for a while; this doesn’t stop the screen burning, but it causes different parts to burn, which spreads the damage and alleviates the problem.

It would be nice if apps were designed to alleviate this. A long running app should have an option to change the color of its menus, ideally choosing a color randomly on startup. If the common menus such as the “COMM” menu appeared in red, green, or blue (the 3 primary colors of light) in a ratio according to the tendency to burn (blue burns fastest so should be displayed least) then they probably wouldn’t cause noticeable screen burn after 9 months. The next thing they could do is slightly vary the position of the menus: instead of a thin line that’s strongly burned into the screen there would be a fat line lightly burned in, which should be easier to ignore.

It’s good when apps have the option of a “dark” theme; less light coming from the screen should reduce battery use as well as screen burn. A dark theme should be the default, and probably mandatory, for long running apps; fortunately a dark theme is the only option for Ingress.

I am a little disappointed with my phone. I’m not the most intensive Ingress player so I think that the screen should have lasted for more than 9 months before being obviously burned.

Desktop Publishing is Wrong

When I first started using computers a “word processor” was a program that edited text. The most common and affordable printers were dot-matrix and people who wanted good quality printing used daisy wheel printers. Text from a word processor was sent to a printer a letter at a time. The options for fancy printing were bold and italic (for dot-matrix), underlines, and the use of spaces to justify text.

It really wasn’t much good if you wanted to include pictures, graphs, or tables. But if you just wanted to write some text it worked really well.

When you were editing text it was typical that the entire screen (25 rows of 80 columns) would be filled with the text you were writing. Some word processors used 2 or 3 lines at the top or bottom of the screen to display status information.

Some time after that desktop publishing (DTP) programs became available. Initially most people had no interest in them because of the lack of suitable printers: the early LASER printers were very expensive, and the graphics mode of dot matrix printers was slow to print and gave fairly low quality. Printing graphics on a cheap dot matrix printer using thin continuous-feed paper usually damaged the paper – a bad result that wasn’t worth the effort.

When LASER and Inkjet printers started to become common, word processing programs gained many more features and basically took over from desktop publishing programs. This made them slower and more cumbersome to use. For example Star Office/OpenOffice/LibreOffice has distinguished itself by remaining equally slow as it transitioned from running on an OS/2 system with 16M of RAM in the early 90’s to a Linux system with 256M of RAM in the late 90’s to a Linux system with 1G of RAM in more recent times. It’s nice that with the development of PCs that have AMD64 CPUs and 4G+ of RAM we have finally managed to increase PC power faster than LibreOffice can consume it, but it would be nicer if they could optimise for the common cases. LibreOffice isn’t the only culprit; it seems that every word processor that has been in continual development for that period of time has had the same feature bloat.

The DTP features that made word processing programs so much slower also required more menus to control them. So instead of just having text on the screen with maybe a couple of lines for status we have a menu bar at the top followed by a couple of lines of “toolbars”, then a line showing how much width of the screen is used for margins. At the bottom of the screen there’s a search bar and a status bar.

Screen Layout

By definition the operation of a DTP program will be based around the size of the paper to be used. The default for this is A4 (or “Letter” in the US) in a “portrait” layout (higher than it is wide). The cheapest (and therefore most common) monitors in use are designed for displaying wide-screen 16:9 ratio movies. So we have images of A4 paper with a width:height ratio of 0.707:1 displayed on a wide-screen monitor with a 1.777:1 ratio, meaning that only about 40% of the screen space is used if you don’t zoom in (0.707/1.777 ≈ 0.4) – but if you zoom in then you can’t see many rows of text on the screen. One of the stupid consequences is companies that send around word processing documents when plain text files would do, so everyone who reads the document uses a small portion of the screen space and a large portion of the email bandwidth.

Note that this problem of wasted screen space isn’t specific to DTP programs. When I use the Google Keep website [1] to edit notes on my PC they take up a small fraction of the screen space (about 1/3 screen width and 80% screen height) for no good reason. Keep displays about 70 characters per line and 36 lines per page. Really every program that allows editing moderate amounts of text should allow more than 80 characters per line if the screen is large enough and as many lines as fit on the screen.

One way to alleviate the screen waste on DTP programs is to use a “landscape” layout for the paper. This is something that all modern printers support (AFAIK the only printers you can buy nowadays are LASER and ink-jet and it’s just a big image that gets sent to the printer). I tried to do this with LibreOffice but couldn’t figure out how. I’m sure that someone will comment and tell me I’m stupid for missing it, but I think that when someone with my experience of computers can’t easily figure out how to perform what should be a simple task then it’s unreasonably difficult for the vast majority of computer users who just want to print a document.

When trying to work out how to use landscape layout in LibreOffice I discovered the “Web Layout” option in the “View” menu which allows all the screen space to be used for text (apart from the menu bar, tool bars, etc). That also means that there are no page breaks! That means I can use LibreOffice to just write text, take advantage of the spelling and grammar correcting features, and only have screen space wasted by the tool bars and menus etc.

I never worked out how to get Google Docs to use a landscape document or a single webpage view. That’s especially disappointing given that the proportion of documents that get printed is probably much lower for Google Docs than for most word processing or DTP programs.

What I Want

What I’d like to have is a word processing program that’s suitable for writing draft blog posts and magazine articles. For blog posts most of the formatting is done by the blog software and for magazine articles the editorial policy demands plain text in most situations, so there’s no possible benefit of DTP features.

The ability to edit a document on an Android phone and on a Linux PC is a good feature. While the size of a phone screen limits what can be done, it does allow jotting down ideas and correcting mistakes. I previously wrote about using Google Keep on a phone for lecture notes [2]. In practice Keep seems limited to editing something about the size of the notes for a 45 minute lecture. So while Keep works well for that task it won’t do well for anything bigger unless Google make some changes.

Google Docs is quite good for editing medium size documents on a phone if you use the Android app. Given the limitations of the device size and input capabilities it works really well. But it’s not much good for use on a PC.

I’ve seen a positive review of OneNote from Microsoft [3]. But apart from the fact that it’s from Microsoft (with all the issues that involves) there’s the issue of requiring another account. Using an Android phone requires a Gmail account (in practice for almost all possible uses, if not in theory) so there’s no need for an extra account to use Google Keep or Docs.

What would be ideal is an Android editor that could talk to a cloud service that I run (maybe using WebDAV) and which could use the same data as a Linux-X11 application.
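
As a sketch of the server side, Apache’s mod_dav could provide such a service after running “a2enmod dav dav_fs” – the paths and file names here are hypothetical:

# /etc/apache2/conf.d/notes-dav.conf
DavLockDB /var/lock/apache2/DavLock
Alias /notes /srv/notes
<Location /notes>
    Dav On
    AuthType Basic
    AuthName "notes"
    AuthUserFile /etc/apache2/dav.passwd
    Require valid-user
</Location>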

Any suggestions?

Sound Device Order with ALSA

One problem I have had with my new Dell PowerEdge server/workstation [1] is that sound doesn’t work correctly. When I initially installed it things were OK, but sound stopped working after I installed a new monitor – presumably because the monitor connection added an HDMI audio device.

The command “aplay -l” showed the following:
**** List of PLAYBACK Hardware Devices ****
card 0: Generic [HD-Audio Generic], device 3: HDMI 0 [HDMI 0]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: Speaker [Logitech USB Speaker], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

So the HDMI sound hardware (which had no speakers connected) became ALSA card 0 (the default for playback) and the USB speakers became card 1. It should be possible to configure KDE to use card 1 and then have other programs inherit this, but I wasn’t able to get that working with Debian/Wheezy.
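
One thing that should work for ALSA applications in general (though it doesn’t stop the card numbers changing) is to set the default card in ~/.asoundrc; a minimal sketch:

# ~/.asoundrc: make ALSA card 1 (the USB speakers) the default device
defaults.pcm.card 1
defaults.ctl.card 1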

My first attempt at solving this was to blacklist the HDMI and motherboard drivers (as suggested by Lindsay on the LUV mailing list). I added the following to /etc/modprobe.d/hdmi-blacklist.conf:
blacklist snd_hda_codec_hdmi
blacklist snd_hda_intel

Blacklisting the drivers works well enough. But the problem is that I will eventually want to install HDMI speakers to get better quality than the old Logitech portable USB speakers and it would be convenient to have things just work.

Jason White suggested using module options to specify the ALSA card order. The file /etc/modprobe.d/alsa-base.conf in Debian comes with an entry specifying that the USB driver is never to be card 0, which is exactly what I don’t want. So I commented out the previous option for snd-usb-audio and put in the following to replace it:
# make USB 0 and HDMI/Intel anything else
options snd-usb-audio index=0
options snd-hda-codec-hdmi index=-2
options snd-hda-intel index=-2

Now I get the following from “aplay -l” and both KDE and mplayer will play to the desired card by default:
**** List of PLAYBACK Hardware Devices ****
card 0: Speaker [Logitech USB Speaker], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: Generic [HD-Audio Generic], device 3: HDMI 0 [HDMI 0]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

Advice for Web Editing

A client has recently asked for my advice on web editing software. There are lots of programs out there for editing web sites and according to a quick Google search there are free Windows programs to do most things that you would want to do.

The first thing I’m wondering is whether the best option is to just get a Linux PC for web editing. PCs capable of running Linux are almost free nowadays (any system which is too slow for the last couple of Windows versions will do nicely). While some time will have to be spent learning a new OS, someone who uses Linux for such tasks will be able to use fully-featured programs such as the GIMP which are installed as part of the OS. While it is possible to configure a Windows system to run rsync to copy a development site to the production server and to have all the useful tools installed, it’s much easier to run a few apt-get or yum commands to install the software and then copy some scripts to the user’s home directory.
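
For example, a deployment script on Linux doesn’t need to be more than the following sketch (the host and paths are hypothetical):

#!/bin/bash
# push the local development copy of the site to the production server,
# deleting files that were removed locally
set -e
rsync -az --delete ~/websites/example/ user@www.example.com:/var/www/example/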

The next issue is whether web editing is the best idea at all. Sites that are manually edited tend to be very simple, inconsistent, or both. Some sort of CMS seems to be the better option. WordPress is a CMS that I’m very familiar with so it’s easy for me to install it for a client; while I try to resist the temptation to force my favorite software on clients, the fact that I can install WordPress quickly saves money for my client. WordPress supports installing different themes (and has a huge repository of free themes). The content that it manages consists of “pages” and “posts”, two generic types of document. Supporting two types of document with a common look and feel and important common data in a side-bar seems to describe the core functionality used by most web sites for small businesses.

Does anyone have any other ideas for ways of solving this problem? Note that it should be reasonably easy to use for someone who hasn’t had much experience of such things, and it shouldn’t take much sysadmin time to install or much money to run.

Google web sites and Chromium CPU Use

Chromium is the free software build of the Google Chrome web browser. It’s essentially the same as the Google code but will often be an older version, particularly when you get Chromium from Debian/Stable (or any other Linux distribution that doesn’t track the latest versions all the time) and compare it to getting Chrome straight from Google.

My wife is using Chromium on an AMD Opteron 1212 for all the usual web browsing tasks. Recently I’ve noticed that it takes a lot of CPU time whenever she leaves a Google web site open, whether that’s Google+, Gmail, or Youtube.

Web standards are complex and it’s difficult to do everything the way that one might desire. Making a web browser that doesn’t take 100% CPU time when the user is away from their desk may be a difficult technical challenge. Designing a web site that doesn’t trigger such unwanted behavior in common web browsers might also be a challenge.

But when one company produces both a web browser and some web sites that get a lot of traffic it’s rather disappointing that they don’t get this right.

It could be that Google have fixed this in a more recent version of the Chrome source tree, and it could be that they fixed the browser code before rolling out a new version of Google+ etc which causes problems with the old version (which might explain why I’ve never seen this problem). But even if that is the case it’s still disappointing that they aren’t supporting older versions. There is a real need for computers that don’t need to be updated all the time, running a 3 month old Linux distribution such as Debian/Wheezy shouldn’t be a problem.

There’s also a possibility that part of the cause of the problem is that an Opteron 1212 is a relatively slow CPU by today’s standards; it’s the slowest system I’m currently supporting for serious desktop use, and I don’t think it was even one of the fastest CPUs available when it was released 4 years ago. But I think we should be able to expect systems to remain usable for more than 4 years. The Opteron 1212 system is a Dell PowerEdge tower server that is used as a workstation and a file server, so while I can get desktop systems with faster CPUs for free I want to keep using the old PowerEdge server (with its ECC RAM) to avoid data corruption. As an aside, I’ve been storing important data on BTRFS for a year now and the only data loss I’ve suffered has been due to a faulty DIMM. The filesystem checksums built in to modern filesystems such as BTRFS and ZFS mean that RAM corruption accounts for a greater portion of the risk to data integrity, and the greater complexity of the data structures in such filesystems gives the possibility of corruption that can’t be fixed without mkfs (as happened to me twice on the system with a bad DIMM).

The consequences of such wasted CPU use are reduced CPU time for other programs which might be doing something useful, extra electricity use, and more noise from CPU cooling fans (which is particularly annoying for me in this case).
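
As a stopgap, a tool such as cpulimit (packaged in Debian) can cap the CPU share of a browser that’s left idle; a sketch, assuming the process is named “chromium”:

# limit all processes named "chromium" to 20% of one CPU core
cpulimit -e chromium -l 20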

Any suggestions for reducing the CPU use of web browsers, particularly when idle?

Advice on Buying a PC

A common topic of discussion on computer users’ group mailing lists is advice on buying a PC. I think that most of the offered advice isn’t particularly useful with an excessive focus on building or upgrading PCs and on getting the latest and greatest. So I’ll blog about it instead of getting involved in more mailing-list debates.

A Historical Perspective – the PC as an Investment

In the late 80’s a reasonably high-end white-box PC cost a bit over $5,000 in Australia (or about $4,000 without a monitor). That was cheaper than name-brand PCs which cost upwards of $7,000 but was still a lot of money. $5,000 in 1988 would be comparable to $10,000 in today’s money. That made a PC a rather expensive item which needed to be preserved. There weren’t a lot of people who could just discard such an investment so a lot of thought was given to upgrading a PC.

Now a quite powerful desktop PC can be purchased for a bit under $400 (maybe $550 if you include a good monitor) and a nice laptop is about the same price as a desktop PC and monitor. Laptops are almost impossible to upgrade apart from adding more RAM or storage but hardly anyone cares because they are so cheap. Desktop PCs can be upgraded in some ways but most people don’t bother apart from RAM, storage, and sometimes a new video card.

If you have the skill required to successfully replace a CPU or motherboard then your time is probably worth enough that squeezing more value out of a PC that was worth $400 new (and maybe $100 a couple of years later) isn’t a good investment.

Times have changed and PCs just aren’t worth enough to be bothered upgrading. A PC is a disposable item not an investment.

Buying Something Expensive?

There are a range of things that you can buy. You can spend $200 on a second-hand PC that’s a couple of years old, $400 on a new PC that’s OK but not really fast, or you can spend $1000 or more on a very high end PC. The $1000 PC will probably perform poorly when compared to a PC that sells for $400 next year. The $400 PC will probably perform poorly when compared to the second-hand systems that are available next year.

If you spend more money to get a faster PC then you are only getting a faster PC for a year until newer cheaper systems enter the market.

As newer and better hardware is continually being released at prices low enough to make upgrades a bad deal, I recommend just not buying expensive systems. For my own use I find that e-waste is a good source of hardware. If I couldn’t do that then I’d buy from an auction site that specialises in corporate sales; they have some nice name-brand systems in good condition at low prices.

One thing to note is that this is more difficult for Windows users due to “anti-piracy” features. With recent versions of Windows you can’t just put an old hard drive in a new PC and have it work. So the case for buying faster hardware is stronger for Windows than for Linux.

That said, $1,000 isn’t a lot of money, so spending more for a high-end system isn’t necessarily a big deal. But we should keep in mind that it’s just a matter of getting a certain level of performance a year before it is available in cheaper systems. Getting a $1,000 high-end system instead of a $400 cheap system means getting that level of performance maybe a year earlier, at a price premium of under $2 per day ($600 spread over a year). I’m sure that most people spend more than $2 per day on more frivolous things than a faster PC.

Understanding How a Computer Works

As so many things are run by computers I believe that everyone should have some basic knowledge of how computers work. But a basic knowledge of computer architecture isn’t required when selecting parts to assemble into a system: one can know all about selecting a matching CPU and motherboard without understanding what a CPU does (apart from a vague idea that it’s something to do with calculations). Equally, one can have a good knowledge of how computers work without knowing anything about the part numbers that could be assembled to make a working system.

If someone wants to learn about the various parts on sale then sites such as Tom’s Hardware [1] provide a lot of good information that allows people to learn without the risk of damaging expensive parts. In fact the people who work for Tom’s Hardware frequently test parts to destruction for the education and entertainment of readers.

But anyone who wants to understand computers would be better off spending their time using any old PC to read Wikipedia pages on the topic instead of spending their time and money assembling one PC. To learn about the basics of computer operation the Wikipedia page for “CPU” is a good place to start, the Wikipedia page for “hard drive” is a good start for learning about storage, and the Wikipedia page for “Graphics Processing Unit” covers graphics processing. Anyone who reads those three pages as well as a selection of pages that they link to will learn a lot more than they could ever learn by assembling a PC. Of course there’s lots more to learn about computers, but Wikipedia has pages for every topic you can imagine.

I think that the argument that people should assemble PCs to understand how they work was not well supported in 1990 and ceased to be accurate once Wikipedia became popular and well populated.

Getting a Quality System

There are a lot of arguments about quality and reliability, most without any supporting data. I believe that a system designed and manufactured by a company such as HP, Lenovo, NEC, Dell, etc is likely to be more reliable than a collection of parts uniquely assembled by a home user – but I admit to a lack of data to support this belief.

One thing that is clear however is the fact that ECC RAM can make a significant difference to system reliability as many types of error (including power problems) show up as corrupted memory. The cheapest Dell PowerEdge server (which has ECC RAM) is advertised at $699 so it’s not a feature that’s out of reach of regular users.
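
On Linux the kernel’s EDAC subsystem reports ECC events, so you can check whether ECC RAM is actually correcting errors; a sketch, assuming an EDAC-capable chipset with the driver loaded:

# corrected (ce) and uncorrected (ue) error counts per memory controller
grep . /sys/devices/system/edac/mc/mc*/ce_count
grep . /sys/devices/system/edac/mc/mc*/ue_count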

I think that anyone who makes claims about PC reliability and fails to mention the benefits of ECC RAM (as used in Dell PowerEdge tower systems, Dell Precision workstations, and HP XW workstations among others) hasn’t properly considered their advice.

Also when discussing overall reliability the use of RAID storage and a good backup scheme should be considered. Good backups can do more to save your data than anything else.

Conclusion

I think it’s best to use a system with ECC RAM as a file server. Make good backups. Use ZFS (in future BTRFS) for file storage so that data doesn’t get corrupted on disk. Use reasonably cheap systems as workstations and replace them when they become too old.

Update: I find it rather ironic when a discussion about advice on buying a PC gets significant input from people who are well paid for computer work. It doesn’t take long for such a discussion to consume enough time that the people involved could have spent that time working instead, put enough money in a hat to buy a new PC for the user in question, and still have had money left over.

Geographic Sorting – Lessons to Learn from Ingress

I’ve recently been spending a bit of my spare time playing Ingress (see the Wikipedia page if you haven’t heard of it). A quick summary is that Ingress is an Android phone game based on geo-located “portals” that you aim to control; most operations on a portal can only be performed when you are within 40 meters, so you do a lot of travelling to get to portals at various locations. One reasonably common operation that can be performed remotely is recharging a portal by using its key, and after playing for a while you end up with a collection of keys which can be difficult to manage.

Until recently the set of portal keys was ordered alphabetically. This isn’t particularly useful given that portal names are made up by random people who photograph things that they consider to be landmarks. Even if people tried to use a consistent geographic naming system, it would be really difficult to make one that is usable and short enough to fit in large print on a phone display. And as joke names are accepted there’s just no benefit in sorting by name.

A recent update to the Ingress client (the program which runs on the Android phone and is used for all game operations) changed the sort order to be by distance. This makes it really easy to see the portals which are near you (which is really useful) but also means that the order changes whenever you move – which isn’t such a good idea on a mobile phone. It’s quite common for Ingress players to recharge portals while on public transport, but with the new client the list order changes as the train moves, and it’s really difficult to find items in a list which is in a different order each time you look at it.

This problem of ordering by location has a much greater scope than Ingress. One example is collections of GPS tagged photographs: it wouldn’t make any sense to mix two different sets of holiday pictures just because they were taken in countries that are the same distance from my current location (as the current Ingress algorithm would do).

It seems to me that the best way of sorting geo-tagged items (Ingress portals, photos, etc) is by distance from a fixed point which the user can select. It could default to the user’s current location, but in that case the order of the list should remain unchanged at least until the user returns to the main menu, and ideally it would remain unchanged until the user requests a re-sort.
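
A sketch of how simple this is to implement: the following script (the input format of one “latitude longitude name” item per line is hypothetical) computes each item’s distance from a fixed reference point once and sorts on it, so the order doesn’t change as the user moves:

#!/bin/bash
# usage: sort-by-distance REF_LAT REF_LON < items
awk -v rlat="$1" -v rlon="$2" '
  function rad(d) { return d * 3.14159265358979 / 180 }
  {
    # haversine distance in km from the fixed reference point
    dlat = rad($1 - rlat); dlon = rad($2 - rlon)
    a = sin(dlat/2)^2 + cos(rad(rlat)) * cos(rad($1)) * sin(dlon/2)^2
    printf "%10.3f %s\n", 2 * 6371 * atan2(sqrt(a), sqrt(1 - a)), $0
  }' | sort -n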

I think that most Ingress players would agree with me that fixing annoying mis-features of the Ingress client such as this one would be better for the game than adding new features. While most computer games have some degree of make-work (in almost every case a computer could do things better than a person) I don’t think that finding things in a changing list should be part of the make-work.

Also it would be nice if Google released some code for doing this properly, to reduce the incidence of other developers making the same mistakes as the Ingress developers in this regard.

Conversion of Video Files

To convert video files between formats I use Makefiles; this means I can run “make -j2” on my dual-core server to get both cores going at once. avconv uses 8 threads for its computation and I’ve seen it take up to 190% CPU time for brief periods, but overall it seems to average a lot less; if nothing else, running two copies at once allows one to calculate while the other is waiting for disk IO.

Here is a basic Makefile to generate a subdirectory full of mp4 files from a directory full of flv files. I used to use this to convert my Youtube music archive for my Android devices until I installed MX Player which can play every type of video file you can imagine [1]. I’ll probably encounter some situation where this script becomes necessary again so I keep it around. It’s also a very simple example of how to run a batch conversion of video files.

MP4S:=$(shell for n in *.flv ; do echo $$n | sed -e s/^/mp4\\// -e s/flv$$/mp4/ ; done)

all: $(MP4S)

mp4/%.mp4: %.flv
        avconv -i $< -strict experimental -b $$(~/bin/video-encoding-rate $<) $@ > /dev/null
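
The video-encoding-rate script referenced above isn’t shown here. A hypothetical sketch of such a script, assuming it picks a target bitrate from the pixel count of the input (using the same midentify.sh helper as the script further down):

#!/bin/bash
# hypothetical ~/bin/video-encoding-rate: print a target bitrate for avconv
set -e
eval $(/usr/share/mplayer/midentify.sh "$1")
PIXELS=$(($ID_VIDEO_WIDTH * $ID_VIDEO_HEIGHT))
if [ $PIXELS -ge $((1280 * 720)) ]; then
  echo 2000k
elif [ $PIXELS -ge $((854 * 480)) ]; then
  echo 1200k
else
  echo 800k
fi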

Here is a more complex Makefile. I use it on my directory of big videos (more than 1280*720 resolution) to scale them down for my favorite Android devices (Samsung Galaxy S3, Samsung Galaxy S, and Sony Ericsson Xperia X10). My Galaxy S3 can’t play a FullHD version of Gangnam Style without going slow so I need to do this even for the fastest phones. This Makefile generates three subdirectories of mp4 files for the three devices.

S3MP4S:=$(shell for n in *.mp4 ; do echo $$n | sed -e s/^/s3\\// -e s/.mp4$$/-s3.mp4/ -e s/.flv$$/-s3.mp4/; done)
XPERIAMP4S:=$(shell for n in *.mp4 ; do echo $$n | sed -e s/^/xperiax10\\// -e s/.mp4$$/-xperiax10.mp4/ -e s/.flv$$/-xperiax10.mp4/; done)
SMP4S:=$(shell for n in *.mp4 ; do echo $$n | sed -e s/^/galaxys\\// -e s/.mp4$$/-galaxys.mp4/ -e s/.flv$$/-galaxys.mp4/; done)

all: $(S3MP4S) $(XPERIAMP4S) $(SMP4S)

s3/%-s3.mp4: %.mp4
        avconv -i $< -strict experimental -s $(shell ~/bin/video-scale-resolution 1280 720 $<) $@ > /dev/null

galaxys/%-galaxys.mp4: %.mp4
        avconv -i $< -strict experimental -s $(shell ~/bin/video-scale-resolution 800 480 $<) $@ > /dev/null

xperiax10/%-xperiax10.mp4: %.mp4
        avconv -i $< -strict experimental -s $(shell ~/bin/video-scale-resolution 854 480 $<) $@ > /dev/null

The following script is used by the above Makefile to determine the resolution to use. Some Youtube videos have unusual combinations of width and height (Linkin Park seems to like doing this) so I scale them down to fit the phone in one dimension with the other dimension scaled to match. This requires a script from the Mplayer package and expects it to be in the location used by the Debian package; for other distributions a minor change will be required.

#!/bin/bash
set -e
# target resolution as the first 2 arguments, input video file as the third
OUT_VIDEO_WIDTH=$1
OUT_VIDEO_HEIGHT=$2

# midentify.sh sets ID_VIDEO_WIDTH and ID_VIDEO_HEIGHT among other variables
eval $(/usr/share/mplayer/midentify.sh "$3")
# how far each dimension is over the target, in percent
XMULT=$(echo $ID_VIDEO_WIDTH*100/$OUT_VIDEO_WIDTH | bc)
YMULT=$(echo $ID_VIDEO_HEIGHT*100/$OUT_VIDEO_HEIGHT | bc)
if [ $XMULT -gt $YMULT ]; then
  # width is the limiting dimension: use the target width and scale the
  # height to match, rounded down to an even number as codecs require
  NEWX=$OUT_VIDEO_WIDTH
  NEWY=$(echo $OUT_VIDEO_WIDTH*$ID_VIDEO_HEIGHT/$ID_VIDEO_WIDTH/2*2|bc)
else
  # height is the limiting dimension
  NEWX=$(echo $OUT_VIDEO_HEIGHT*$ID_VIDEO_WIDTH/$ID_VIDEO_HEIGHT/2*2|bc)
  NEWY=$OUT_VIDEO_HEIGHT
fi
echo ${NEWX}x${NEWY}

Note that I can’t preserve TAB characters in a blog post. So those Makefiles won’t work until you replace strings of 8 spaces with a TAB character.

The Death of the Netbook

The Age has an interesting article about how Apple supposedly killed the Netbook [1]. It’s one of many articles with a similar spin on the news that the last two companies making Netbooks are going to cease production. The main point of these articles is that Apple decided that Netbooks were crap and killed the market for them by producing tablets and light laptops that squeeze them out of the market.

Is the Macbook Air a Netbook?

According to the Wikipedia page the Macbook Air [2] weighs 1080g for the 11″ version and 1340g for the 13″ version. According to Wikipedia the EeePC 701 (the first EeePC) weighs 922g and the last EeePC weighs 1460g [3]. The last EeePC produced is heavier than ANY Macbook Air while the first (and lightest) EeePC is only 158g lighter than the 11″ Macbook Air.

The 11″ Macbook Air is 300*192*17mm (979cm^3) in size while the EeePC 701 is 225*165*35mm (1299cm^3) and the biggest EeePC was 266*191*38mm (1931cm^3). So the 11″ Macbook Air is 13% wider than the widest EeePC but takes less volume than any EeePC. The 13″ Macbook Air is 325*227*17mm (1254cm^3) which is still less volume than any EeePC. The Wikipedia page about Netbooks defines them as being small, lightweight, legacy-free (in terms of hardware not software) and cheap [4]. The Macbook Air clearly meets all the criteria apart from price.

The Apple US web site offers the version of the 11″ Macbook Air with 64G of storage for $999 with free shipping; for comparison the EeePC 701 was on sale in stores for $500 in 2008. The CPI adjusted price for the EeePC 701 would be at least $550 in today’s money. The Macbook is a bit less than twice as expensive as the EeePC was, but that’s more of an issue of Apple being expensive – a few years ago companies like HP were also selling Netbooks that were more expensive than the EeePC.

Unless having an awful keyboard is a criterion for being a Netbook, I think that the Macbook Air meets the criteria.

As an aside, a relative recently asked me for advice on a device that is like a Macbook Air but cheaper. Does anyone know of a good option?

Is Netbook Production Ceasing?

Officeworks currently sells an ASUS “Notebook” that has an 11.6″ display and weighs 1.3kg for $398; it’s got a metal body that looks a bit like a Macbook Air (which is the latest fashion and is good for heat dissipation). It’s not advertised as a Netbook or an “Eee” product, but it’s cheap, lighter than the heaviest EeePC, and not much bigger than an EeePC.

It seems that the general prices of laptops other than Apple products (which have always had higher prices) have been dropping a lot recently. There are lots of good options if you want a laptop that costs $500 or less. Even Thinkpads (one of the most expensive and best designed ranges of laptops) are well below $1000.

Do the Articles about Netbooks Make Sense?

The claims being made are that Apple skipped Netbooks because they couldn’t make a good profit on them. This disregards the fact that the iPhone and iPad (which are very profitable) are in the high end of the price range that was occupied by Netbooks. While Apple does make a good deal of money from the iPhone App Market, it would be possible to make a Netbook with a lower production price than an iPhone because making things smaller requires more engineering work and often more expensive parts. It also disregards the fact that there are a range of devices which work as an iPad case with keyboard; an iPad with such a keyboard meets most criteria for being a Netbook, so Apple is one iPad keyboard device away from selling Netbooks.

It’s interesting to note that I haven’t yet seen an article about the profits from Netbooks which didn’t make an issue of the MS-Windows license fees. The first Netbooks only ran Linux but later models switched to Windows, which had to make a big impact on profits. An article about Netbooks which just assumes that everyone has to pay a MS license fee is missing too much of the Netbook history to be useful. I wonder if anyone could make products as profitable as the iPhone and Macbook Air if they had to pay MS license fees and design their hardware to work with MS software (as opposed to Apple, who can change their software to allow a cheaper hardware design).

The articles also claim that Netbooks give a bad user experience. When I bought my EeePC 701 it was the fastest system I owned for loading OpenOffice: SSD random read speeds were really good (writes sucked but that didn’t matter so much). The keyboard on an EeePC 701 is not nearly as good as a full size laptop but it is a lot better than using a tablet; I’ve used both a 10″ Android tablet and an EeePC as a ssh client and there is no comparison. When I’m going somewhere that requires random sysadmin work (or other serious typing) and I can’t carry much weight then I still take my EeePC 701 and I don’t consider taking a tablet. The low resolution of the screen is a major issue, but it’s about the same as a Macbook Air so that’s not an advantage for Apple. I knew some people who used an EeePC 701 for the majority of their work; I couldn’t do that, but obviously some people have different requirements.

I now use my phone for many tasks that I used to do on my EeePC (even light sysadmin work) so my EeePC sometimes goes unused for months. But it’s still an important part of my collection of computers. It works well for what it does and I don’t feel any need to buy a replacement. When it wears out I’ll probably buy something similar to an 11″ Macbook Air to replace it unless there’s a good option of a tablet with a detachable keyboard.

My plans for computer ownership for the near future are based on a reasonably large Android phone (currently a Samsung Galaxy S3 but maybe a Galaxy Note 2 or similar next year), a small laptop or large tablet with hardware keyboard (currently an EeePC 701), a large laptop (currently a Thinkpad T61), and a workstation (currently a NEC system with an Intel E4600 CPU and a Dell U2711 27″ monitor). A reasonably small and light system with a hardware keyboard and solid state storage is an important part of my computer needs. If tablet computers with hardware keyboards replace traditional Netbooks that’s not really killing Netbooks but introducing a new version of the same thing.

But a good way of getting web hits on an article is to claim that a once popular product is dead.