I’ve returned after taking into consideration everyone’s advice on what I should do with my basic VPS and…listened to none of it for reasons. Well, I had a good reason. I wanted to diversify my internet consumption with all this Reddit API mess and have gone back in part to RSS. In that vein I have stood up Miniflux and Wallabag on said VPS. Both are excellent if you need an RSS reader and a read-it-later app, and they integrate tightly with one another, which is rad.
So now that I’ve got it set up the way I want, what is the recommended method for backing it up in case of failure or data loss? It’s running Ubuntu 20.04, if that helps. I have Google Drive space as well as Backblaze B2 I could leverage; I just need to know which direction to look for solutions. The VPS is rented through RackNerd, and I’ve confirmed they don’t have a snapshot function, unfortunately.
I use Borg(matic) and rclone.
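Roughly: borgmatic runs the Borg backups on a schedule, and rclone mirrors the repo to a B2 bucket. A minimal sketch, with the repo path and rclone remote name as placeholders:

```bash
#!/bin/sh
# Sketch of a nightly borgmatic + rclone job; paths and remote names are examples only.
set -e

# Run whatever backups are defined in /etc/borgmatic/config.yaml
borgmatic --verbosity 1

# Mirror the local Borg repository to an rclone remote named "b2"
# (created beforehand with `rclone config`).
rclone sync /var/backups/borg b2:my-vps-borg-repo
```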
Personally I would use rsync for config files and schedule database dumps, then rsync those too. In fact, that’s exactly how I do it.
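A minimal sketch of that approach, assuming Miniflux on Postgres and a remote box you can rsync to over SSH (database name, paths, and host are made up):

```bash
#!/bin/sh
# Nightly sketch: dump the database, then rsync configs and dumps off the VPS.
# Run from cron, e.g.: 0 3 * * * /usr/local/bin/backup.sh
set -e

STAMP=$(date +%F)

# Database dump (assumes a local "miniflux" Postgres database you can access).
pg_dump miniflux > /var/backups/miniflux-"$STAMP".sql

# Push config files and dumps to another machine over SSH.
rsync -az /etc/ user@backup-host:/srv/vps-backup/etc/
rsync -az /var/backups/ user@backup-host:/srv/vps-backup/dumps/
```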
On my VPS, every night I shut down the Docker containers, back everything up (including the Postgres and MariaDB databases) with Borg via borgmatic, upload to Backblaze B2, then restart the containers.
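The shape of that routine is roughly the following; paths, repo location, and remote names are swapped for placeholders:

```bash
#!/bin/sh
# Nightly: stop containers so the databases are quiescent, back up with borgmatic,
# push the repo to B2, then bring everything back up. Paths/names are placeholders.
set -e

cd /opt/stacks                      # wherever the docker-compose.yml lives
docker compose stop

# Make sure the containers come back up even if the backup step fails.
trap 'docker compose start' EXIT

borgmatic --verbosity 1             # backups defined in /etc/borgmatic/config.yaml
rclone sync /var/backups/borg b2:my-vps-borg-repo
```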
You might look at what your hosting provider will do for you too. I’m at Linode; I pay them $2/month, just turn on backups, and they handle it. Plus I can take one of my own snapshots any time. Like someone else said, if state matters, think about that too, i.e. dumping databases, or shutting the VM or services down and snapshotting it yourself.
I like the Backblaze idea too but have not done that yet.
A general strategy is 3-2-1: three copies of your data, on two different types of media, with one copy off-site.
Since you have Backblaze B2, you can just regularly back up your config files and database exports to a B2 bucket, either with rclone and a cron job or with something like Duplicati for a more integrated solution.
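For the rclone route, it can be as little as a couple of crontab lines once a B2 remote is configured (the remote, bucket, and directory names below are just examples):

```bash
# crontab -e
# Assumes an rclone remote named "b2" created earlier with `rclone config`,
# and that config files / DB dumps already land in these directories.
30 2 * * * rclone copy /etc/miniflux      b2:my-vps-backups/config   --log-file /var/log/rclone-backup.log
45 2 * * * rclone copy /var/backups/dumps b2:my-vps-backups/db-dumps --log-file /var/log/rclone-backup.log
```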
You should do application-level backups and put those in Backblaze B2 (there’s a rough sketch after this list):
- for Postgres, look here.
- look at all the software you’re running and check what their docs say about making backups.
- for files that don’t change often, making an archive (with tar) is probably good enough. But if a file changes while the archive is being made, the backup will be inconsistent.
- think about your RPO: how much data are you willing to lose in case of a crash? 1 day? 2 hours? 15 minutes? Schedule your backups to be at least that frequent.
- Don’t forget to test your backups! Otherwise you’ll only find out that the backup is unusable when you need it most…
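To make that concrete, here is a rough application-level backup sketch, assuming Miniflux on Postgres and Wallabag on MySQL (database names, config paths, and the B2 bucket are all placeholders; schedule it with cron at least as often as your RPO requires):

```bash
#!/bin/sh
# Application-level backup sketch: dump databases, archive configs, ship to B2.
set -e

STAMP=$(date +%F-%H%M)
WORKDIR=/var/backups/app
mkdir -p "$WORKDIR"

# Consistent database exports (adjust to whatever backends you actually run).
pg_dump miniflux   > "$WORKDIR/miniflux-$STAMP.sql"
mysqldump wallabag > "$WORKDIR/wallabag-$STAMP.sql"

# Archive config files that rarely change; tar is fine as long as nothing
# is writing to them while the archive is being made.
tar -czf "$WORKDIR/configs-$STAMP.tar.gz" /etc/miniflux.conf /var/www/wallabag/app/config

# Push everything to a B2 bucket via an rclone remote named "b2".
rclone copy "$WORKDIR" b2:my-vps-backups/app

# And actually test restores now and then, e.g. load a dump into a scratch DB:
#   createdb restore_test && psql restore_test < miniflux-<date>.sql
```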