I just had a conversation with someone who thinks that their office should have no servers.
The office in question has four servers: an Internet gateway/firewall system, the old file server (which is also a Xen server), the new file server, and the VOIP server.
The Internet gateway system could possibly be replaced by a suitably smart ADSL modem type device, but that would reduce control over the network and wouldn’t provide much benefit.
The VOIP server has to be a separate system for low latency IMHO. In theory you could run Asterisk in a Xen DomU, or run Asterisk on the Dom0 of the file/Xen server, but that just makes things difficult. A VOIP server needs to be reliable and is something that you typically don’t want to touch once it’s working; in this case the Asterisk server has gone a few more years without upgrades than the Xen server. An Asterisk system could be replaced by a dedicated telephony device, which some people might consider to be removing a server, but really a dedicated VOIP device is just as much a server as a P4 running Asterisk, only more expensive. A major advantage of a P4 running Asterisk is that you can easily replace the system at no cost if there is a hardware problem.
Having two file servers is excessive for a relatively small office, but running two is common practice when one server is being replaced. The alternative is an immediate cutover, which has the potential for a lot of people to arrive at work on Monday and find multiple things not working as desired. Having two file servers is a temporary problem.
File Servers
The first real problem when trying to remove servers from an office is the file server.
ADSL links with Annex M can theoretically upload data at 3Mb/s, which is almost 400KB/s. So if you have an office with a theoretically perfect ADSL2+ Annex M installation then you could save a 4MB file to a file server on the Internet in not much more than 10 seconds, if no-one else is using the Internet connection. Note that 4MB isn’t THAT big by today’s standards; the organisation in question has many files which are considerably bigger than that. Large files include TIFF and RAW files used for high quality image processing, MS-Office documents, and data files for most accounting programs. Saving a 65MB QuickBooks file in 3 minutes (assuming that your Annex M connection is perfect and no-one else is using the Internet) would have to suck.
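As a rough sanity check on those numbers, the transfer-time arithmetic can be sketched in a few lines (the 3Mb/s figure is the theoretical Annex M upload speed mentioned above; protocol overhead and contention are ignored, so real transfers would be slower):

```python
def transfer_seconds(file_mb, link_mbit):
    """Seconds to move a file of file_mb megabytes over a link of
    link_mbit megabits per second, ignoring protocol overhead."""
    return file_mb / (link_mbit / 8)  # 8 bits per byte

# 4MB file over a perfect 3Mb/s Annex M uplink
print(round(transfer_seconds(4, 3), 1))        # about 10.7 seconds
# 65MB QuickBooks file over the same uplink
print(round(transfer_seconds(65, 3) / 60, 1))  # about 2.9 minutes
```

In practice TCP overhead and other users of the link make the real numbers worse than this.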
Then there’s the issue of reading files. Video files (which are often used for training and promotion) are generally larger than 100MB, which means more than 30 seconds of download time at ADSL2+ speed – but if someone sends an email to everyone in the office saying “please watch this video” then the average time to load it would be a lot more. Quickly examining my collection of YouTube downloads, I found a video which averaged 590KB/s. If an office with a theoretically perfect ADSL2+ connection giving 24Mb/s (3MB/s) download speed had such a file on a remote file server then a maximum of five people could view it at one time, and only if no-one else in the office was using the Internet.
Now when the NBN is connected (which won’t happen in areas like the Melbourne CBD for at least another 3 years) it will be possible to get speeds like 100Mb/s download and 25Mb/s upload. That would allow up to 20 people to view videos at once, and a 65MB QuickBooks file could be saved in a mere 22 seconds if everyone else was idle. Of course that relies on the size of data files remaining the same for another 3 years, which seems unlikely. Currently no YouTube videos use resolutions higher than 1920*1080 (so they don’t take full advantage of a $400 Dell monitor), and there’s always potential for storing more financial data. I expect that by the time we all have 100Mb/25Mb speeds on the NBN it will be as useful to us as 24Mb/3Mb ADSL2+ Annex M speeds are today (great for home use but limited for an office full of people).
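The viewer-count estimates follow from simple division (590KB/s is the stream rate measured from my YouTube download above; overhead is again ignored):

```python
def max_viewers(link_mbit, stream_kb_per_s):
    """How many simultaneous streams a shared downlink can carry,
    ignoring protocol overhead and other traffic."""
    link_kb_per_s = link_mbit * 1000 / 8
    return int(link_kb_per_s // stream_kb_per_s)

print(max_viewers(24, 590))   # ADSL2+: 5 simultaneous viewers
print(max_viewers(100, 590))  # NBN 100Mb/s: about 21
```

Any other use of the Internet connection while the videos play reduces those numbers further.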
There are of course various ways of caching data, but all of them involve something which would be considered to be a “server” and I expect that all of them are more difficult to install and manage than just having a local file server.
Of course instead of crunching the numbers for ADSL speeds etc you could just think for a moment about the way that 100baseT networking to the desktop has been replaced by Gigabit networking. When people expect each workstation to have 1000Mb/s send and receive speed it seems quite obvious that one ADSL connection shared by an entire office isn’t going to work well if all the work that is done depends on it.
Management could dictate that there is to be no server in the office, but if that were to happen then the users would create file shares on their workstations, so you would end up with ad-hoc servers which aren’t correctly managed or backed up. That wouldn’t be an improvement, and technically it wouldn’t even achieve the goal of not having servers.
Home Networking Without Servers
It is becoming increasingly common to have various servers in a home network. Due to a lack of space and power, and the low requirements, a home file server will usually be a workstation with some big disks, but there are cheap NAS devices which some people are installing at home. I don’t recommend the cheap NAS devices; I’m merely noting that they are being used.
Home entertainment is also something that can benefit from a server. A MythTV system for recording TV and playing music has more features than a dedicated PVR box. But even the most basic PVR ($169 for a 1TB device at Aldi now) is still a fairly complex computer which would probably conflict with any aim to have a house free of servers.
The home network design of having a workstation run as a file and print server can work reasonably well as long as the desktop tasks aren’t particularly demanding (IE no games) and the system doesn’t change much (IE don’t track Debian/Testing or otherwise have new versions of software). But this really only works if you have a few workstations.
Running an office without servers seems rather silly when none of my friends manage to have a home without a server.
Running Internet Services
Hypothetically speaking, if one was to run an office without servers then that would require running all the servers in question on the Internet somewhere. For some things this can work better than a local server; for example most of my clients who insist on running a mail server in their office would probably get a better result if they had a mail server running on Linode or Hetzner – or one of the “Hosted Exchange” offerings if they want a Windows mail server. But for a file server, even if you got around the bandwidth issues of accessing files in normal use, there’s still the issue of managing a remote server (which is going to take more effort and expense than managing a server on the LAN).
Then there’s the issue of backups. In my previous post about Hard Drives for Backup [1] I considered some of the issues related to backing data up over the Internet. The big problem however is a complete restore: if you have even a few dozen gigs of data that you want to transfer to a remote server in a hurry it can be a difficult problem, and if you have hundreds of gigs then it becomes a very difficult problem. I’m sure that I could find a Melbourne based Data Center (DC) that gives the option of bringing a USB attached SATA disk for a restore – but even that would involve a significant delay compared to restoring over a LAN. If a server on the office LAN breaks in the afternoon my client can make arrangements to let me work in their office in the evening to fix it, but sometimes DCs don’t allow 24*7 access, and even when they do there can be organisational problems that make access impossible when you want it (EG the people at the client company who are authorised become unavailable).
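To put hypothetical numbers on the restore problem (300GB is an illustrative data size, not a figure from any particular client, and overhead is ignored as before):

```python
def restore_hours(data_gb, link_mbit):
    """Hours to transfer data_gb gigabytes over a link of link_mbit
    megabits per second, ignoring protocol overhead."""
    return (data_gb * 8000 / link_mbit) / 3600  # 1GB = 8000 megabits

print(round(restore_hours(300, 24), 1))   # 24Mb/s ADSL2+ downlink: about 27.8 hours
print(round(restore_hours(300, 100), 1))  # NBN 100Mb/s: about 6.7 hours
```

A USB attached SATA disk carried to the DC beats both of those, but a disk on the office LAN beats everything.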
The Growth of Servers
Generally it’s a really bad idea to build a server that has exactly the hardware you need. The smart thing to do is to install more of every resource (disk, RAM, CPU, etc) than is needed and to allow expansion when possible (EG have some RAM slots and drive bays free). No matter how well you know your environment and its users, you can be surprised by the way that requirements change. Buying a slightly bigger server at the start costs hardly any extra money, but upgrading a server later will cost a lot.
Once you have a server that’s somewhat over-specced you will always find other things to run on it. Many things could be run elsewhere at some cost, but if you have unused hardware then you may as well use it. Xen and other virtualisation systems are really good in this regard as they allow you to add more services without making upgrades difficult. This means that it’s quite common to have a server that is purchased for one task but which ends up being used for many tasks.
Anyone who would aspire to an office without servers would probably regard adding extra features in such a manner to be a problem. But really if you want to allow the workers to do their jobs then it’s best to be able to add new services as needed without going through a budget approval process for each one.
Conclusion
There probably are some offices where no-one does any serious file access and everyone’s work is based around a web browser or some client software that is suited to storing data on the Internet. But for an office where the workers use traditional “Office” software such as MS-Office or LibreOffice a file server is necessary.
Some sort of telephony server is necessary no matter how you do things. If you have a traditional telephone system then you might try not to call the PABX a “server”, but really that’s what it is. Then when the traditional phone service becomes too expensive you have to consider whether to use Asterisk or a proprietary system, in either case it’s really a server.
In almost every case the issue isn’t whether to have a server in the office, but how many servers to have and how to manage them.
Comments

There are other options, though, particularly if you go down the proprietary software route. Dropbox can serve as a replacement for a small file server, and will opportunistically sync between machines on your LAN so your Internet connection doesn’t get hammered by one person uploading a file and 5 other people immediately downloading it. I don’t think there’s an open-source equivalent that does this yet.
Hosted email/exchange has a much lower admin cost than running your own server. I don’t know any non-techy small businesses (and I know a few) who have their own email server — it’s all hosted.
There are also, at least here in NZ, dozens of companies who will fall over each other to sell you hosted Asterisk or similar. Typically for little more than the SIP trunks/DIDs/etc would cost you if you ran your own.
File uploads are a pain, but LAN sync solves some of that. No one does Annex M in NZ yet, but VDSL is available in more and more places, with a whopping 10Mb/s of upstream bandwidth.
As much as I dislike some of the “cloud” hype, for a lot of small businesses who would otherwise be sharing USB sticks or using a cheapie NAS with no or minimal backups, it’s a good choice. And if your NAS blows up or someone’s dog eats their laptop, your files and email are still around.
A handful of my clients have moved to a serverless office by moving to a terminal server based setup. At their office, they have a Cisco router which establishes a VPN connection to a VM cluster on our end, and their PCs simply act as dumb terminals connecting to their terminal server.
That way their file server, Exchange server, SQL server, and whatever else they require, can all remain hosted and not in the office. Without a terminal server based approach, yes, the file server would not have a snowball’s chance in hell of moving out of the office.
Donald: With a cloud storage system you SHOULD consider the possibility that the provider will go away, and keep full backups. While a cloud system can be good for situations where someone deleted the wrong file, it doesn’t solve all backup needs. There have been a few occasions where Internet storage companies have disappeared due to “hardware failure” (possibly malice), bankruptcy, and government action.
I run mail servers for a number of small companies that consider it a security problem to have their mail on the same server as someone else’s.
I guess that for VOIP there shouldn’t be much bandwidth difference between having a local Asterisk box and a remote one; in either case the same amount of bandwidth will be used for active sessions. In fact a hosted Asterisk server should save bandwidth in the case of voice mail. On occasion I’ve suggested to clients that they provide a long voice response message that answers all the FAQ type questions about their business. Instead of having a receptionist answer the same questions every day, the voice response system stores an answer.
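As a sketch of why the location of the Asterisk box makes little difference to call bandwidth: a G.711 call carries 64kb/s of audio regardless of where the server sits, and the per-packet header overhead is fixed. This estimate assumes 20ms packetisation and counts RTP/UDP/IPv4 headers (12+8+20 = 40 bytes per packet) but not Ethernet framing:

```python
def g711_wire_kbit(ptime_ms=20):
    """Approximate one-way wire rate of a G.711 call in kb/s,
    counting RTP/UDP/IPv4 headers (40 bytes) but not Ethernet framing."""
    payload_bytes = 8 * ptime_ms       # G.711 is 64kb/s = 8 bytes per ms
    packets_per_second = 1000 / ptime_ms
    return (payload_bytes + 40) * 8 * packets_per_second / 1000

print(g711_wire_kbit())    # 80.0 kb/s per direction at 20ms packets
print(g711_wire_kbit(30))  # slightly less overhead with bigger packets
```

Whether a call terminates on a local or a hosted server, roughly this much bandwidth crosses the office link per active call either way.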
The LAN sync for Dropbox sounds interesting. It probably has a very different security model to a regular file server though.
Jeremy: If you have low enough latency then that should be OK for “office” applications. But there is still the issue of image editing, videos, and other high bandwidth applications. The real benefit of using a terminal server is that you can easily establish other offices with access to the same data.
Of course that isn’t such a good option for an organisation that has recently transitioned AWAY from a terminal server based system.
Hi!
Just a quick note – in lots of other countries Internet speed is not an issue, so file servers can (not that I think they should) be moved out of the office. I have already seen a couple of cases (in Latvia) where the Wi-Fi access point is the network bottleneck.
In the worst case, then: ‘cloud’ means commercial control and risk; hosting your data and systems there positions you further from them. There might always be some enticing advantage offered, but over time this model converges on an increasingly greedy monopoly. If not, it isn’t profitable, and cuts are made at the expense of security or reliability, leading to one of those catastrophic failures. I don’t see how an equilibrium is possible. This may be just one of those things where you don’t get to say “I told you so” until the damage is already done: work interrupted, money wasted, or data lost.
Going the other way, every person and office would simply need their own servers, of course powered by free software. The aim then would be trying to get ‘closer’ to the rest of the Internet in order to communicate and share that data more easily. Already that would be easier if less data is being synced wholesale to/from the cloud, and if local maildirs and HTTP caches can be used.
The monopoly might just shift down to a lower layer (from application, to network): carrier neutrality, or the price/restrictions of basic Internet access. To counter that I like the idea of there being private (free) WAN links, connecting building tenants and neighbouring premises who might otherwise have been exchanging data across the Pacific and back just to go next door. Even wireless meshes in small communities. IPv6 offers a much neater architecture for all this.