[Image: Dead Computer by Beatrice Murch]
Very few people learn the importance of backups before suffering a stomach-churning loss of data at the most inopportune time (right before a presentation, or the night before a thesis is due). Some, such as yours truly, need multiple hard drive failures before they learn their lesson. Luckily, cloud storage has now almost done away with the cumbersome tape and external hard drive backup systems that most businesses used to deploy. Backing up is now easy enough to start that you have no excuse not to.
If you are running your business off a single computer, the easiest thing to do is start using Box or Dropbox. Both are extremely easy to set up: just install the client, move your business files into the Box or Dropbox folder, and they will be backed up automatically. Google (Drive) and Microsoft (SkyDrive) offer similar services that tie into their office products. These are all also good ways to share large documents with clients.
For businesses that have servers (either on site or in the cloud), backing up is just as important. Many people make the mistake of thinking that a server in the cloud will never fail or lose data. All servers have the potential to crash, and if you keep any client data on them, you had better be backing it up. Telling a client you lost their data is never a conversation you want to have.
For servers it is often easier and more cost effective to back up data to Amazon’s S3 service than to try to use consumer products. The instructions below are for Ubuntu but will work on any Linux distribution with a little modification.
First, sign up for Amazon S3. You get 5 GB of free storage for one year; after that the price is less than ten cents per gigabyte per month. Once you have signed up, create a bucket to store your files in.
Run the following to install the s3 client:
sudo apt-get install s3cmd
Configure s3 with your keys from signing up:
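As a sketch (the exact prompts vary slightly between s3cmd versions), the interactive setup writes your keys to ~/.s3cfg; the bucket name "bayview" below is just the example used later in this post:

```shell
# Walks you through entering your Access Key and Secret Key,
# then writes them to ~/.s3cfg
s3cmd --configure

# Optionally create your bucket from the command line
# instead of the AWS web console
s3cmd mb s3://bayview
```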
On Ubuntu you may want to run “sudo bash” before running “crontab -e”, in case some of the files you want to back up are not readable by your user.
Add something like the following lines to the end of your crontab file:
0 21 * * * /usr/bin/s3cmd -c /home/will/.s3cfg sync -r /samba/share s3://bayview
0 23 * * * /usr/bin/s3cmd -c /home/will/.s3cfg sync -r /samba/users s3://bayview
-c specifies where the .s3cfg configuration file is (usually /home/username/.s3cfg).
-r makes the sync recursive; the path that follows it is the directory you want backed up.
s3:// specifies the name of your destination bucket on S3.
If your editor is nano (the Ubuntu default), hit ctrl+x to exit and save the changes.
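Before trusting cron with the job, it is worth running the sync once by hand and confirming the result (the paths and bucket name here match the crontab example above):

```shell
# Run the same sync cron will run, but interactively,
# so you can watch for permission or credential errors
/usr/bin/s3cmd -c /home/will/.s3cfg sync -r /samba/share s3://bayview

# List the bucket to confirm the files arrived
s3cmd ls s3://bayview
```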
Now your files will be backed up every night to Amazon’s servers, which are much better looked after than your own. Only files that have changed are uploaded each night, so in most cases you won’t use much bandwidth.
If you’ve got a database to back up (you likely do), then you’re going to want to back that up at least daily as well.
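As a sketch, assuming a MySQL database with credentials stored in a file readable only by root (the database name, credentials file, and paths here are hypothetical), a nightly dump into the synced directory might look like this crontab entry:

```shell
# Runs at 8 PM, before the 9 PM S3 sync picks the dump up.
# --single-transaction takes a consistent snapshot without locking
# InnoDB tables; the \% is required because cron treats an
# unescaped % as a newline.
0 20 * * * /usr/bin/mysqldump --defaults-extra-file=/root/.my.cnf --single-transaction mydb | gzip > /samba/share/db-backups/mydb-$(date +\%a).sql.gz
```

Naming the file by weekday gives you a rolling week of dumps that the existing s3cmd job then copies off-site.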
I started out with this but now use a very heavily modified version of it.
This appears to be an all-in-one solution for files and databases. I haven’t used it, but it looks promising.
There are obviously many different ways to achieve the same result. The important thing is to have a backup system in place and to test it periodically. Set a weekly task in Outlook or Asana to remind you to run updates and virus scans and to check your backups.