Fixing the Correct Network Bottleneck


The latest news in the Australian IT industry is the new National Broadband Network (NBN) plan [1]. It will involve rolling out Fiber To The Home for 90% of the population; the plan is expected to cost the government $43,000,000,000, making it the biggest government project. Kevin Rudd used Twitter to say “Just announced biggest ever investment in Australian broadband – really exciting, infrastructure for the future” [2].

Now whenever someone says that a certain quantity of a resource is enough, you can expect someone else to try to refute that claim by mentioning that Bill Gates supposedly stated that “640K is enough” when referring to the RAM limits of the original IBM PC. As an aside, it’s generally believed that Bill Gates didn’t actually claim that 640K would be enough RAM; Wikiquote has him claiming to have never said any such thing [3]. He did, however, say that he had hoped it would be enough for 10 years. I needed that disclaimer before stating that I think broadband speeds in Australia are high enough at the moment.

In any computer system one or more resources will be limited, and those bottlenecks will constrain overall performance. Adding more of other resources will often make no difference to performance that a user might notice.

On the machine I’m using right now to browse the web, the bottleneck is RAM. A combination of bloated web pages and memory-inefficient web browsers uses lots of memory; I have 1.5G of RAM and currently there is 1.3G of swap in use, and performance suffers because of it. It’s not uncommon for the machine to page so heavily that the mouse cursor is not responsive while browsing the web.
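For the record, a quick way to see these numbers on a Linux system is to read the kernel’s counters from /proc/meminfo. A minimal sketch (Linux-specific; the kernel reports all values in kilobytes):

```python
def meminfo():
    """Return the fields of /proc/meminfo as a dict of kilobyte values."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])  # first token is the kB value
    return info

m = meminfo()
swap_used_mb = (m["SwapTotal"] - m["SwapFree"]) // 1024
print(f"RAM: {m['MemTotal'] // 1024} MB total, "
      f"{m['MemFree'] // 1024} MB free, "
      f"swap in use: {swap_used_mb} MB")
```

When swap in use approaches the size of physical RAM, as on my machine, paging rather than network speed is what the user is waiting on.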

My options for faster web browsing on this machine are to add more RAM (it can’t take more than 2G, so that doesn’t gain much), to use a more memory-efficient web browser and X server, or to simply buy a new machine. Dell is currently selling desktop machines with 2G of RAM, but as they are 64bit systems they will use more memory than 32bit systems for the same tasks, so they will probably give less performance than my 32bit machine with 1.5G of RAM for my usage patterns.
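One reason 64bit systems use more memory for the same tasks is that pointers (and the C long type on LP64 platforms) double in size, which bloats pointer-heavy structures such as a browser’s DOM trees. A quick illustration of what the running build uses, via Python’s ctypes module:

```python
import ctypes

# Pointer and C long sizes for the running build: 4 bytes on a 32bit
# system, 8 bytes on a typical 64bit (LP64) system, so pointer-heavy
# data structures roughly double in size when built for 64bit.
ptr_bytes = ctypes.sizeof(ctypes.c_void_p)
long_bytes = ctypes.sizeof(ctypes.c_long)
print(f"pointer: {ptr_bytes} bytes, C long: {long_bytes} bytes")
```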

Also the latest EeePC [4] ships with 1G of RAM as standard and is limited to a maximum of 2G; I think that this is typical of Netbook class systems. I don’t use my EeePC for any serious work, but I know some people who do.

Does anyone have suggestions on memory efficient web browsers for Linux? I’m currently using Konqueror and Iceweasel (Firefox). Maybe the government could get a better return on its investment by spending a small amount of money sponsoring the development of free web browsers. A million dollars spent on optimising Firefox seems likely to provide good performance benefits for everyone.
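On the question of comparing browsers for memory efficiency: one rough approach is to sum the resident set size (VmRSS) of each browser’s processes from /proc. A sketch, assuming Linux and accepting that shared pages get double-counted across processes, so this is only useful for rough comparisons:

```python
import os

def rss_by_name(name):
    """Sum VmRSS (in MB) across all processes whose command name contains `name`."""
    total_kb = 0
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/status") as f:
                # Each line is "Field:\tvalue"; keep the (field, value) pairs.
                fields = dict(line.partition(":")[::2] for line in f)
        except OSError:
            continue  # process exited while we were scanning
        if name in fields.get("Name", ""):
            # Kernel threads have no VmRSS field; treat them as 0 kB.
            total_kb += int(fields.get("VmRSS", "0 kB").split()[0])
    return total_kb // 1024

for browser in ("firefox", "konqueror"):
    print(browser, rss_by_name(browser), "MB")
```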

My wife’s web browsing experience is bottlenecked by the speed of the video hardware in her machine (built-in video on a Dell PowerEdge T105, which is an ATI ES1000). The recent dramatic price reductions of large TFT monitors seem likely to make video performance more of an issue, and larger displays also increase the RAM used by the X server.

Someone who has reasonably good net access at the moment will have an ADSL2+ connection and a computer equivalent to a low-end new Dell machine (which is more powerful than the majority of systems in use). In that case the bottleneck will be in the PC used for web browsing if you are doing anything serious (e.g. having dozens of windows open, including PDFs and other files that are commonly loaded from the web). If, however, a machine were used simply for downloading web pages with large pictures in a single session, then FTTH would provide a real benefit. Downloading movies over the net would also benefit a lot from FTTH. So it seems to me that browsing the web for research and education (which involves cross-referencing many sites) would gain more of a benefit from new hardware (which will become cheap in a few years), while porn surfing and downloading movies would gain significantly from FTTH.

The NBN will have the potential to offer great bi-directional speeds. The ADSL technology imposes a limit on the combination of upload and download speeds, and due to interference it’s apparently technically easier to get a high download speed. But the upload speeds could be improved a lot by using different DSLAMs. Being able to send out data at a reasonable speed (20Mbit/s or more) has the potential to significantly improve the use of the net in Australia. But if the major ISPs continue to have terms of service prohibiting the running of servers then that won’t make much difference to most users.

Finally there’s the issue of international data transfer, which is slow and expensive. This is going to keep all affordable net access plans limited to a small quota (20G of downloads per month or less).

It seems to me that the best way of spending taxpayer money to improve net access would be to provide better connectivity to the rest of the world through subsidised international links.

Brendan makes an interesting point that the NBN is essentially a subsidy to the entertainment industry and that copyright law reform should be a higher priority [5].

10 thoughts on “Fixing the Correct Network Bottleneck”

  1. Joe Buck says:

    I had a home system that used to run a 64-bit distro, with a 64-bit Firefox and nspluginwrapper. Firefox would regularly go crazy and grow to 2G. I finally switched to a 32-bit distro. It’s a marked improvement.

  2. jaymzjulian says:

    Life is a little more complicated than this, though – we can actually get data in and out of the country relatively efficiently, if we choose – particularly for bulk data (if there is 2 seconds latency on your youtube, bittorrent, or smtp, it doesn’t matter so much…)

    but a _major_ cost for ISPs these days is getting data from the CO to the user – the data-charged(!) ADSL1 backhauls are a prime example of this, which is one of the reasons that ADSL2 ends up being literally half the price. Except that now there is the infrastructure cost of laying down multiple sets of ADSL2 DSLAMs.

    This is actually quite different to the situation in other countries, where international and peering traffic is the #1 cost by far – it is still the #1 cost here for an ADSL2 (or independent ADSL1-DSLAM) ISP, but it does not dwarf the other costs like it does in other countries.

    The point I’m driving at is that the point of the NBN isn’t to get lots of speed now – but rather to lay a replacement for the network we gifted to Telstra, one that isn’t operated on quite as fucked a set of commercial terms. Being future proof while doing it just makes sense… it’s not much cheaper to lay copper, after all.

  3. btmorex says:

    Couldn’t you just close a bunch of windows? Maybe I’m in the minority here, but I rarely have more than, say, 10 tabs open in Firefox. 1GB of memory is easily enough to take care of that, assuming you restart your browser occasionally and Flash doesn’t go crazy (disabling Flash by default is a good way to solve that problem).

    Also, it’s not that hard to find a cheap notebook that will accommodate 4GB of memory. The cheapest general purpose laptop I could find on Dell’s site, the Inspiron 15, can take 4GB of memory and starts at $399.

  4. etbe says:

    Joe: 64bit however makes certain CPU operations more efficient and allows hardware virtualisation. It’s the way of the future.

    Jaymz: I’ve recently read some analysis of the situation by people who run big ISPs, they claim that international data transfer is a major cost. You may be right that the local loop is the major cost for a 20G plan, but if you want to transfer 100G per month then I think that data transfer will be the problem.

    We should just take back that gift from Telstra.

    btmorex: Sometimes I work on something and then go back to it a week later. It’s easiest to leave the windows open.

    As for getting a laptop with 4G of RAM, I doubt that I will be able to buy a cheap Netbook with 4G any time soon, and a laptop to compare with my current machine (Thinkpad T41p with 1400*1050 display) would probably be moderately expensive.

    I think that RAM is increasing faster than the inefficiency of web browsers, so we may be OK in a few years. ;)

  5. glandium says:

    I don’t know what web sites you are browsing, but I usually have epiphany loaded with 20+ tabs *and* iceweasel running, and still don’t use more than a GB of RAM for that plus the rest of the system (1.5 at most). It rarely goes beyond that with browsing alone, and the swap is rarely filled with anything.

    Still, I have 4GB, so I can do a bunch of other things…

  6. mariuz says:

    For browsers I recommend Arora 0.6 on Qt 4.5.
    I think it’s 0.5 in unstable but you can rebuild it easily; it’s my main browser on low end machines, and I have spotted it on Maemo devices too.

    On Ubuntu Jaunty and Intrepid I have created some binaries that can be used.

  7. mariuz says:

    Also, I use the Flash plugin 10 on my 64bit Ubuntu; it’s supported by WebKit and Arora.
    I have also seen that it’s better for you to try Firefox 3.1, it’s a little bit faster.

  8. clayton says:

    I have found that the following plugins greatly increase the responsiveness of Firefox: NoScript and Flashblock. By default they block all scripts and Flash, respectively, and over-riding the block in the (for me uncommon) event that you want to run the script or Flash is a simple click of the mouse. And then there are the security benefits of this arrangement… Highly recommended.

  9. brettly says:

    etbe Says:
    btmorex: Sometimes I work on something and then go back to it a week later. It’s easiest to leave the windows open.

    That’s really interesting – I’ve never even considered working like that. I guess that’s (partially) why I rarely use half of my 2GB of RAM and almost never use any swap space. My processor gets a good workout though. Food for thought.

  10. etbe says:

    glandium: At the moment I have four Iceweasel (Firefox) tabs open and 51 Konqueror windows (I don’t use tabs with Konqueror). This isn’t a lot for my usage patterns.

    mariuz: Arora is interesting, though a bit feature poor. It has no support for exceptions to the web proxy, it doesn’t support browsing directories (e.g. file:///), the full-screen mode (F11) still displays the menus (this might be considered a feature), and the display of the tab bar when there is only one tab open makes it unsuitable for use on a Netbook. It seems to be in the early stages of development, so hopefully they can add some of these features without it getting bloated.

    Using one browser configured for flash and another configured without any support might improve performance.

    clayton: Thanks for the suggestion. I’ll try that noscript one. But for flash, most machines I run don’t have any support for it so it’s already blocked for me.

    brettly: I could close the windows and open them again when I need them. But then I would have to track the URLs. Also closing the windows would lose my position in the page (which is annoying for large web pages and PDF files) and in some situations I have positioned two windows so that I can compare two sets of content. If a web browser could allow me to close windows and then open them in the same way on demand (with the same size, font-size, position in horizontal and vertical scrolling, etc) then I would dramatically reduce the number of windows that I keep open.

Comments are closed.