People who know me in real-life know that I’m kind of OCD about backups. That comes from many years of trying to help clients who weren’t OCD about backups recover their data after catastrophic crashes and other data loss events. I’ve yet to hear a client utter the phrase, “Gee, I wish we didn’t have such a good backup plan.”
I’ve also come to accept that computers make fun mistresses, but terrible wives. My years in I.T. have taught me to never trust a computer because they’re heartlessly fickle. At some point in the course of the love affair, they’re likely to turn on you in very sudden and cruel ways that leave you helpless, and if you don’t have good backups, hopeless as well.
Before I get into my own backup plan, let me mention what I recommend for more average users, whose livelihoods and those of their clients aren’t dependent on their computers being up and running all the time. It’s a two-pronged approach that involves an external hard drive, a copy of whichever version of Macrium Reflect best suits their needs, a high-speed Internet connection, and a subscription to Backblaze Online Backup. Simply install Macrium Reflect and schedule it to make (or update) an image backup every day; then install Backblaze using its default settings. Presto. You’re done.
The reason I suggest both local and online backup is that recovering all your data from an online backup provider can take days or weeks, depending on how much data you have. The online backup should therefore be considered your “doomsday” backup, to be invoked only when your local backup has been lost, stolen, or destroyed. Having both gives you quick local access in the case of something like a hard drive failure or other local event; but it also provides redundancy in the cloud if your local backup should fail, get washed away in a flood, or go up in smoke in a fire along with the rest of your office.
My own situation is a bit different, both because I depend on computers for a living and because I have to back up different kinds of computers, including Linux servers. I also know a bit more than the average user about what I actually need backed up; and because I have a data cap on my Internet connection, I want more granular control over what gets uploaded to the cloud. I want only my most important data uploaded to the cloud. For the rest of it (the stuff that would just be a pain in the ass rather than a disaster if I lost it), I use a redundant set of local backups.
For my online backups of this particular machine, I use Backblaze B2, which is a business-oriented cloud storage service offered by the same company that provides Backblaze Online Backup. The basic concept will work just as well with Amazon S3, Amazon Cloud Drive, or most other cloud storage providers, however. Here are the details and the sequence of the backup plan I use on this Windows 10 Professional machine.
1. Every morning, Macrium Reflect creates or updates an image backup and saves it to a folder on an ioSafe Solo G3 4TB Fireproof & Waterproof External Hard Drive.
2. Every afternoon, Macrium creates a clone of the hard drive to another external hard drive, in this case one housed in a more conventional USB 3.0 external enclosure. If you choose to do this, the most important thing is to make sure that the drive inside the enclosure is compatible with the one inside your computer, so it can be swapped right in. In the event that you have to use it, you’ll just swap the clone into the computer and boot it up. Sometimes a CHKDSK will also be required; other times not.
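If Windows does balk on the clone’s first boot, a manual disk check usually sorts it out. A minimal example, assuming the clone boots as C:, run from an elevated Command Prompt:

chkdsk C: /f

The /f switch tells CHKDSK to fix any file system errors it finds; if the volume is in use (as a boot volume will be), it will offer to schedule the check for the next restart.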
The advantage to having a clone is downtime reduction. It usually takes less time to swap a hard drive than it does to restore from an image, especially if the reason is a hard drive crash. You have to open the machine up to replace the failed drive anyway, so the drive you replace it with may as well be one that already has your OS, programs, and data, rather than a blank drive. The clone will be current as of the last time it was refreshed, which is why I suggest that you refresh it at least daily.
The reason for also making an image copy in addition to a clone is that the image copy can be incremental, so you can restore to a point previous to your last backup. This is helpful in cases such as viruses, accidentally deleted files, or changed files for which you find you need the older versions. You can’t do that with a clone, but you can with an image.
The problem for me bandwidth-wise is that both the clone and the image contain literally all the data on the computer, which is fine for local backup; but it’s more data than I’d want to upload to the cloud because of the bandwidth cap. Hence the next step:
3. Late at night, I use a home-made batch file to copy my most critical data to the ioSafe, and to then copy it from there to Backblaze B2. This script uses three command-line utilities: a built-in Windows utility called Robocopy, an open-source shadowing utility called ShadowSpawn, and another open-source utility called rclone.
The sequence in which the batch file makes things happen is as follows:
- It exports up-to-date backups of my Dreamweaver site settings and my Mailwasher Pro settings from the Windows Registry and copies them to my documents folder. Having these settings backed up would save me hours of time if I ever had to rebuild my system. I say that from experience. (Restoring them is just as easy; see the example after this list.)
- It shadows, in succession, five of my most important data sets: my Thunderbird mail, my Mailwasher Pro data, my Downloads folder, my Desktop (where data I’m currently working on often resides), and my documents folder; and copies the new or changed data to a folder called “Files and Folders” on the ioSafe drive, which happens to be the D: drive on my system.
- It uploads the new and changed data from the ioSafe to Backblaze B2, syncing to an rclone remote called “Doomsday” and a bucket also called “Doomsday.”
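As promised above, getting those exported registry settings back into a rebuilt system is a one-liner. A minimal sketch, assuming the same file path my batch file writes to, run from an elevated Command Prompt:

reg import "C:\Users\Geek On The Hill\Documents\Mailwasher Pro Registry Backup\mw-pro-reg.reg"

That restores the Mailwasher Pro settings exactly as they were exported; the Dreamweaver file works the same way.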
The batch file therefore creates a handy local file backup in a disaster-proof enclosure, and then backs up that backup to the cloud using Backblaze’s B2 service. So I’m literally making a backup of my backup. (I did mention that I was OCD about backups, didn’t I?) The other advantage is that the ioSafe serves as a cache, of sorts, to reduce the I/O load on the system drive while the upload is happening. This works fine if the backup drive is USB 3.0 or eSATA. For slower interfaces, maybe not so much.
For those who are interested, here’s the batch file in all its glory:
REM This step backs up the Dreamweaver site definitions
reg export HKEY_CURRENT_USER\SOFTWARE\Adobe\Common\16.0\Sites "C:\Users\Geek On The Hill\Documents\Dreamweaver Registry Backups\16.0\site-backup-dw-16.reg" /y

REM This step backs up the Mailwasher Pro registry settings
reg export HKEY_CURRENT_USER\SOFTWARE\FireTrust "C:\Users\Geek On The Hill\Documents\Mailwasher Pro Registry Backup\mw-pro-reg.reg" /y

REM The next steps shadow and back up the most important files
shadowspawn "C:\Users\Geek On The Hill\AppData\Roaming\Thunderbird" Q: robocopy Q:\ "D:\Files and Folders\AppData\Thunderbird" /s /xa:st /xj /xd "cache" "temp" "tmp" /xo /fft /purge /tee

shadowspawn "C:\Users\Geek On The Hill\AppData\Roaming\MailWasherPro" R: robocopy R:\ "D:\Files and Folders\AppData\MailWasherPro" /s /xa:st /xj /xd "cache" "temp" "tmp" /xo /fft /purge /tee

shadowspawn "C:\Users\Geek On The Hill\Downloads" S: robocopy S:\ "D:\Files and Folders\Downloads" /s /xo /xj /fft /purge /tee

shadowspawn "C:\Users\Geek On The Hill\Desktop" T: robocopy T:\ "D:\Files and Folders\Desktop" /s /xo /xj /purge /fft /tee

shadowspawn "C:\Users\Geek On The Hill\Documents" U: robocopy U:\ "D:\Files and Folders\Documents" /s /xo /xj /purge /fft /tee

REM This step uploads the files to Backblaze B2
cd C:\"Program Files"\rclone
rclone sync -v --transfers 16 "D:\Files and Folders" Doomsday:Doomsday
rclone cleanup Doomsday:Doomsday

pause
The batch file must be run under the profile of the user whose data is being backed up, and requires Administrator privileges.
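If you’d rather not kick a script like this off by hand, Task Scheduler can run it late at night automatically. Here’s one way to set that up from an elevated Command Prompt; the task name, script path, and start time are placeholders for illustration, not necessarily what I use:

schtasks /create /tn "Doomsday Backup" /tr "C:\Scripts\doomsday.bat" /sc daily /st 23:30 /rl highest

Leaving off the /ru switch makes the task run under your own profile, and /rl highest runs it with the highest privileges your account allows, which satisfies both requirements above.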
Of course, you’d also need to actually have a Backblaze B2 account (or an account on another provider’s cloud backup service, such as Amazon Cloud Drive or Amazon S3), and to configure rclone to connect to that provider’s service. But that’s very simple and intuitive to do if you’re comfortable in the command line. The details can be found in the rclone documentation.
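For the B2 case, the whole configuration can be done either interactively with rclone config, or non-interactively in one line. A sketch, assuming a remote named “Doomsday” like mine; the account ID and application key shown are placeholders:

cd C:\"Program Files"\rclone
rclone config create Doomsday b2 account YOUR_ACCOUNT_ID key YOUR_APPLICATION_KEY
rclone lsd Doomsday:

The lsd command simply lists the buckets on the remote, which makes it a handy sanity check that the credentials took.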
I also use Amazon S3 to back up some of my servers, by the way. It’s also an excellent service and will work just fine with this system. Amazon’s infrastructure is also much more extensive, well-established, and redundant; but Backblaze is less expensive. Personally, I like and use both companies’ services. My experience has taught me that having multiple reliable providers is always a good thing.
I consider the above backup plan to be near-perfect. For me to completely lose all my data, all of the following things would have to happen nearly simultaneously:
- My computer crashes or is lost, stolen, or destroyed
- My clone backup fails or is lost, stolen, or destroyed
- My image backup fails or is lost, stolen, or destroyed
- The file backup on the ioSafe disappears, or the ioSafe itself is lost or stolen, or the drive inside it doesn’t survive a disaster
- In the latter case, ioSafe is unable to recover the data from the drive under their data-recovery warranty
- Backblaze suddenly goes out of business or suffers a catastrophe that completely destroys their datacenter
I think the chances of all those things happening simultaneously are pretty slim; and if they ever do, I reckon I’ll have bigger problems on my mind than my data loss. A scenario like that would take a catastrophe of biblical proportions to become reality.
The reason my plan is only near-perfect (aside from nothing ever being truly perfect) is that both Robocopy and rclone are limited in their deduplication abilities. If a file is moved, renamed, or copied to another location on the drive, chances are that it will be re-uploaded. Compressed archives will also be re-uploaded in their entirety if files are added to or removed from them, even if the archives aren’t encrypted.
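One possible improvement on that front, for what it’s worth: newer versions of rclone have a --track-renames option that tries to detect renamed files and move them server-side rather than re-uploading them, provided the storage backend supports server-side copy. Assuming a B2 setup like mine, the change would be one line in the batch file:

rclone sync -v --transfers 16 --track-renames "D:\Files and Folders" Doomsday:Doomsday

It wouldn’t help with modified archives, but it would cut down on re-uploads caused by moves and renames.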
So no, my backup plan is not perfect. It’s only nearly perfect. It will waste some bandwidth from time to time. Eventually, if I have the time, maybe I’ll figure out a way to improve on the deduplication, and then it will be even more nearly perfect.