These days, more and more, I find myself feeling like one of our office IT admins. The only difference is I am doing it at home with various sharewares and trying not to make a career out of it. But it seems to suck up a lot of time any time something needs attention, because one thing leads to another. My approach has evolved through knocks, stumbles, and common-sense knowledge, while trying to keep the time and money investment low, the former more than the latter. I try to use automation tools as much as possible, and someday I may get to manage everything from a single dashboard, like a real management console.
Over the last few years my home network has grown into a number of Win2K PCs, one PC with swappable boot drives (currently running Ubuntu), a couple of laptops that come and go, a couple of NAS drives (one of which is going away), and the usual paraphernalia of routers and access points. I recently replaced the old router and wireless AP with a Linksys WRT54GL running Tomato. I have been happy with the QoS control, I must say, since we use Vonage. I also put in D-Link gigabit switches to improve speed between my wired boxes, even though most of my cabling is older CAT5; it has still made a difference. Then I have my Wi-Fi gadgets like the Nokia E71, N810, and Roku SoundBridge, but these usually don't require much effort beyond setting up network access and keeping my Twonky server up. Because of all this, the admin chores that I, like many other home users I am sure, find myself dealing with are:
- Ensuring backups on a regular basis
- Ensuring virus updates and scans are happening regularly
- Somehow keeping track of *health* of these PCs
- Updating stuff on an almost regular basis
Backup is something that has always bugged me as I added machines to my network, and I am slowly getting to a point where I can put the backup process for all my boxes on autopilot. I might as well state outright that online backup is not my cup of tea, for various reasons.
RAID not for me: When talking about backup with friends, the topic of RAID is almost impossible to avoid, even though RAID is not a backup solution. Without getting esoteric, the mirroring flavor of RAID (RAID 1) gives me fault tolerance against loss of real-time, frequently changing data, but it is not backup. The striping flavor (RAID 0) is useful for those doing video authoring and the like. I don't deal with frequently changing data on my machines, nor do I do much gaming or video editing. For me it is mostly surfing, casual downloading, playing around with interesting apps, and some coding. My valuable data consists of documents, photos, music, purchased sharewares, downloaded emails, etc. For me RAID is an interesting thing to play around with, but it does not solve any of my needs. Some use the comfort of RAID 1 mirroring to avoid backup chores. That approach makes me think of three things right away: recovery hassle, effective disk utilization, and the generally non-trivial nature of a RAID setup in a PC.
Common RAID 0/1 setups seem to be easier with these new NAS boxes. But PCs often require both BIOS- and OS-level tweaking, with a good understanding of what is going on under the hood. Most of my PC mobos support it, but I did not want to spend more time tinkering with the BIOS and OS when I can get by with easier approaches.
I hear both positive and not-so-positive stories about recovering from a failure on a RAID 1 system from home users on a limited budget. Business folks probably have much better luck with this technology. When a drive goes, I like to have that sense of "assurance" that I can take the disk to another machine, treat it as another simple NTFS/FAT/ext2 disk, and try to recover files using a file-recovery shareware. I have done this a few times before and luckily it seemed to work out. With RAID 1 disks that gets more difficult because of the mirroring technology used, and one has to hope the disk rebuilding works as advertised.
Then, when it comes to disk utilization, I wonder if by using RAID 1 mirroring I am wasting disk space on stuff that I don't particularly care about. I guess that could be addressed by selectively choosing which disks to mirror, while remembering what mirroring buys me. Again, I would need to carefully figure out what I want to keep on RAIDed disks or partitions and how to deal with the growing size of my data.
Simple Approach: For my important data I like to have two copies around, preferably on two separate boxes, but at a minimum on two separate disks. The solution involves periodic replication of a snapshot of my worthy data to different targets, assuming that the source and destination do not go at the same time. Even if that happens, as long as the data was written in a common format (NTFS, FAT, ext2, etc.) I can hope to attempt a recovery with one of many sharewares out there.
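The pattern is simple enough to sketch in a few lines of Python. To be clear, this is just an illustration of the snapshot-replication idea (the sync tool does the real work in my setup), and the paths are hypothetical:

```python
import os
import shutil

def replicate(src, dst):
    """One-way snapshot replication: copy files that are missing or
    newer on the source. Files that exist only on the destination are
    never touched, so an accidental deletion on the source does not
    immediately wipe the backup copy."""
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # Copy if missing on the target, or if the source copy is
            # newer (the 1-second slack allows for coarse FAT timestamps).
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d) + 1:
                shutil.copy2(s, d)  # copy2 preserves timestamps

# hypothetical paths: replicate("D:/data", "//nas/backup/data")
```

Run periodically, this gives exactly the two-copies property: only files that changed get copied, and the destination stays a faithful (if slightly stale) snapshot of the source.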
Going along with this periodic-replication approach, I had made one of my machines the backup machine. I loaded it up with a bunch of disks and used synchronization software to back up data from the different PCs to this box. The data specific to this box was backed up on one of the other boxes. Actually, I ended up giving all of my machines some backup responsibility, creating an n-directional cross-replication scheme. I started using SyncBack SE as my tool here.
Opting for a NAS: I had this setup going for a while until the idea of a dedicated NAS, instead of a full Windows box, started to take shape. A NAS would be a small-footprint system optimized for file transfer across different protocols. I would not need to set up all kinds of Windows user accounts and roles on a Windows box to control access for multiple users. Also, this n-directional replication thing felt messy, and I really wanted to move back to a single-backup-target model. Another problem with a Windows box is that in the end it turns into a desktop PC and not just a backup machine, and soon data is getting mixed up all over the drives in spite of my best efforts.
So I was tempted to try out a NAS. Some of my friends have been looking at FreeNAS as something to try out on their spare PCs, but I was not sure how much time I would have to put in to make that happen. I found a good price on a Linksys NAS200 box at Dell and went for it. It turns out this was probably one of the worst purchases I have ever made, because I fell for the price without doing much research. The device is unbelievably slow and the internet is full of bad stories. I should have done my homework.
I finally ended up with a QNAP TS-209 Pro NAS cage. It is expensive for a diskless unit, but after over a year of use I can say it was worth it. It transfers files as fast as any other machine on my network and has a ton of features, most of which I don't use. I like to keep my NAS a NAS and not make it into another PC, but I do run the Twonky server on it, mainly for occasional use with my Roku and N810. It runs embedded Linux, and I like and use the ability to log in via ssh.
The TS-209 supports various RAID levels, but I did not enable them, since in my mostly-static-data scenario it did not make sense. Also, with two 750 GB drives, if I go with RAID 1 mirroring I can protect only 750 GB worth of data. Without RAID I can protect 1.5 TB worth of data, since the other copy of the data already resides on my various PCs. If I did not have those drives in my PCs, the trade-off would be between getting 1.5 TB worth of disks for my PCs or another TS-209 with 750×2 and RAID 0 enabled on it. Again, in terms of cost and my needs, the former works out much better since I already had a bunch of disks in my PCs.
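The capacity arithmetic behind that decision is worth spelling out. A throwaway helper (purely illustrative, covering just the two layouts discussed here) makes the trade-off concrete:

```python
def usable_capacity(disk_gb, n_disks, raid_level):
    """Rough usable capacity for the two layouts discussed here.
    RAID 1 mirrors, so you keep one disk's worth of space;
    RAID 0 / striping gives the full sum, but no redundancy."""
    if raid_level == 1:
        return disk_gb            # the second disk is the mirror
    if raid_level == 0:
        return disk_gb * n_disks  # full capacity, no protection
    raise ValueError("only RAID 0 and RAID 1 sketched here")

# Two 750 GB drives in the TS-209:
usable_capacity(750, 2, 1)  # -> 750  (mirrored, protected in the box)
usable_capacity(750, 2, 0)  # -> 1500 (second copy must live elsewhere)
```

Since my PCs already hold the second copy, the no-RAID (or RAID 0) arrangement gets me twice the protected data for the same two drives.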
I also decided not to use one Linear volume (JBOD mode) on the box, and instead have the two disks as separate Single volumes. My issue was that I already had data on the first disk when I added the second, and creating a Linear volume would require me to wipe the first disk. The same would be true any time one of the disks fails: I would have to recreate the entire volume. Too much hassle. So I went with the independent Single-volumes model instead.
Most of the time the data gets written from the PCs to the NAS. But vacation photos on my laptop after a trip often get loaded directly onto the NAS, simply because that is easy to do; that data is then backed up from the NAS to one of the PCs. To keep the backup process fast I never touch the destination of a backup once I have set it up. All new writes and modifications always go to the source.
Imaging for fast recovery: Even with the sync solution working fine for data files, it still did not address the system-recovery issue for me. When the C: drive went (a couple of times), having a drive image is what got me up and running again quickly without having to install the OS all over again. I decided to try out Acronis a few years ago, and it has saved my behind more than a few times now. Nowadays I use it to generate an image file of the C: partition of each of my machines on one of the other local disks, and then I use the same backup sync task to replicate the image to the NAS.
Getting it all going: Once I had this setup in place, the next thing was to automate it. Both Acronis and SyncBack SE support good scheduling. There is nothing particularly interesting about scheduling except ensuring that I stagger the jobs to avoid disk I/O contention, because both the image-creation and backup processes can create significant disk write load.
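Staggering by hand works, but it is easy to eyeball it wrong once there are several nightly jobs. A little sanity check like the following (a sketch of my own, not something the scheduling tools provide; the start times and durations are made-up examples) flags jobs whose windows would overlap:

```python
def overlaps(tasks):
    """Given (start_minute, duration_minutes) pairs for nightly jobs,
    return the index pairs of jobs whose run windows overlap, so they
    can be staggered apart to avoid disk I/O contention."""
    clashes = []
    # Sort the (start, end, original index) windows by start time;
    # only adjacent windows in that order can be the earliest clash.
    spans = sorted((s, s + d, i) for i, (s, d) in enumerate(tasks))
    for (a_start, a_end, a), (b_start, b_end, b) in zip(spans, spans[1:]):
        if b_start < a_end:
            clashes.append((a, b))
    return clashes

# e.g. an image job at 1:00 AM taking ~45 min, a sync at 1:30 AM: clash
overlaps([(60, 45), (90, 30)])   # -> [(0, 1)]
# push the sync to 2:00 AM and they no longer contend
overlaps([(60, 45), (120, 30)])  # -> []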
So far things have been hanging in there ok. Keeping my fingers crossed. I am still in the process of setting up backup for the Ubuntu box. Next time I will describe how I have started using Nagios to actually monitor the successful execution of all these tasks.