My photography backup strategy
Over the years, I’ve promised multiple times to describe my digital photography backup strategy. I have written this article several times before but never published it because, ironically, I kept losing it. My photographs, though, I always kept. Well, except for the time a card died in the camera, which wasn’t the fault of my backup strategy but of faulty hardware, and it happened before I managed to copy the files off the camera.
If you don’t care about the specifics of my workflow and just want a generally good strategy, check the 3-2-1 backup rule by Peter Krogh:
3: Keep three copies of your data, the primary copy and two backups. 2: Store the copies on two different types of media. 1: Keep at least one copy offsite.
But if you do care about the workflow I had developed by the time Peter published the rule above, please read on. A warning, though: I’m an IT nerd, and some of the details might be overly technical 😉
1) Since the incident mentioned earlier, I try to use cameras that write files to two cards simultaneously. I’m not sure whether this counts as part of a backup strategy, but it’s good practice when your camera supports it: the two cards are unlikely to die at the same time.
2a) If I’m running out of space on the cards while travelling, I copy the data to my notebook and an external drive before removing the files from the cards. I use Rapid Photo Downloader, which downloads all the media files and sorts them into directories in both locations simultaneously. To be safer, the notebook and the external disk aren’t stored in the same bag. I also try to upload all the files to my home server, but uploading sixty-four gigabytes over the internet isn’t always possible. When high-speed internet isn’t available, I delete the data from only one card of each pair, which keeps a third copy on the remaining cards while still giving me two empty cards for the camera.
2b) Usually, the cards don’t fill up before I get home, so I make just one extra copy, to my notebook.
3) After getting home, I download all the pictures to my desktop and to the server (again with Rapid Photo Downloader). Once again, I keep everything on the cards until all the copies are created.
4) All of my devices are connected to the same local network, powered by the same power source, and kept in the same physical location, so they are vulnerable to the same risks at the same time. To address this, I’ve reused an ancient Raspberry Pi 1 with some external disk drives attached to it as a remote, online backup location. This microcomputer periodically checks my home server’s photo backup directory and downloads any new files. Connectivity between the machines is provided and encrypted by the Tailscale VPN, and synchronization is done by a simple rsync periodically executed from crontab.
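The Raspberry Pi side of this step boils down to a single crontab entry. The sketch below is illustrative, not my actual configuration: the user, hostname, and paths are assumptions.

```shell
# /etc/crontab entry on the Raspberry Pi (illustrative): every night at
# 03:00, pull any new photos from the home server over the Tailscale
# tunnel. "homeserver" is the server's Tailscale hostname (an assumption).
# -a preserves attributes; --partial resumes interrupted transfers.
0 3 * * *  pi  rsync -a --partial homeserver:/srv/photos/ /mnt/backup/photos/
```

Pulling from the Pi (rather than pushing from the server) has a nice side effect: even if the home server is compromised, it has no credentials to overwrite the remote copy.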
5) Offline backups: I burn the pictures to Blu-ray discs (in the past, DVDs). Always two copies, each on media from a different manufacturer and burned in a different drive. One copy is stored at home, the second at a remote location.
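For the burning itself, a tool like growisofs can write a directory straight to a blank disc. This is a sketch under assumptions (device paths and directory are illustrative, and I’m naming growisofs as one option, not claiming it’s what I use):

```shell
# Burn a photo directory to the blank disc in /dev/sr0 (illustrative path).
# -Z starts a new session; -r (Rock Ridge) and -J (Joliet) keep long file
# names readable on both Unix-like systems and Windows.
growisofs -Z /dev/sr0 -r -J /srv/photos/2024
# Repeat with the second drive and the second brand of media, e.g.:
# growisofs -Z /dev/sr1 -r -J /srv/photos/2024
```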
6) After depositing the offline backup at the remote location, I’m finally willing to delete the data from my workstation, keeping them live only on my home server and in the remote copy on the old Raspberry Pi.
Yes, it sounds overcomplicated and a bit as if I were more paranoid than Fox Mulder, but once I’d set this workflow up, it does most of the work itself without my direct involvement. My pictures would survive even a nuclear blast and the related EMP (at least in one of the locations). Nobody is saying you should do the same; even if you skipped half of the operations I’ve listed, you would still fit the 3-2-1 rule. However, I would add one thing to the 3-2-1 rule: test your backups periodically. Manufacturers claim their CD/DVD/Blu-ray discs should hold data for a hundred years, but it’s common for discs to be unreadable just two years later.
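One simple way to make backups testable is to keep a checksum manifest next to the photos and re-verify each copy against it later. A minimal sketch (the helper names and manifest file name are my illustration, not a standard):

```shell
#!/bin/sh
# Record a SHA-256 checksum for every file under a directory into a
# manifest stored alongside the data.
make_manifest() {
  ( cd "$1" && find . -type f ! -name MANIFEST.sha256 \
      -exec sha256sum {} + > MANIFEST.sha256 )
}

# Re-hash any copy (burned disc, external drive, remote mirror) and
# compare against the manifest; non-zero exit means silent corruption.
verify_copy() {
  ( cd "$1" && sha256sum --check --quiet MANIFEST.sha256 ) \
    && echo "copy OK: $1"
}
```

Running `verify_copy` against each location once or twice a year catches dying discs while there are still enough healthy copies to recover from.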
Most of this workflow can be reused for any other data worth backing up. For example, anything placed in the Nextcloud directory is automatically backed up to the server, and the Raspberry Pi then creates a secondary offsite copy of the Nextcloud data. Since Nextcloud has clients for mobile devices, this also means all your documents are accessible from your phone. At the same time, it’s important to say that the same data can be accessed by anyone who grabs your phone and manages to unlock it.
BTW, all the software I use is free and open source :)