Last night I was writing a script and it accidentally created a directory literally named “~”. It being 3am, I ran rm -rf ~ without thinking and destroyed my home directory. Luckily some of the files were mounted in Docker containers which my user didn’t have permission to delete, so I was able to get back to an OK state, but I still lost a bit of data.
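For anyone who ends up in the same spot: the dangerous part is the unquoted ~, which the shell expands to $HOME. A directory literally named “~” can be removed safely by quoting it or prefixing it with ./ (a sketch, using a temp directory as a stand-in):

```shell
# Work in a scratch directory so nothing real is at risk.
tmp=$(mktemp -d) && cd "$tmp"

mkdir -- '~'        # oops: creates a directory literally called "~"
rm -rf -- './~'     # safe: the ./ prefix (and quotes) stop tilde expansion
# rm -rf ~          # NEVER: unquoted ~ expands to your home directory
```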

I now realize I really should be making backups, because shit happens. I self-host a PyPI repository and a Docker registry, both in containers, plus some game servers in and out of containers. What would be the simplest tool to back up to Google Drive and easily restore?

  • schizo@forum.uncomfortable.business · English · 16 days ago

    I just uh, wrote a bash script that does it.

    It dumps databases as needed, and then makes a single tarball of each service. Or a couple depending on what needs doing to ensure a full backup of the data.

    Once all the services are backed up, I just push all the data to an S3 bucket, but you could use rclone or whatever instead.
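    The approach above can be sketched roughly like this. The service names, paths, database container, and bucket are all made-up placeholders; the dump and upload steps are commented out so the sketch runs end to end on fake data:

```shell
#!/usr/bin/env bash
# Sketch of the tarball-per-service backup described above.
set -euo pipefail

SRC_ROOT=$(mktemp -d)      # stand-in for something like /srv
BACKUP_DIR=$(mktemp -d)
DATE=$(date +%F)

# Fake service data so the sketch is self-contained:
mkdir -p "$SRC_ROOT/pypi" "$SRC_ROOT/registry"
echo "example" > "$SRC_ROOT/pypi/package.txt"

# 1. Dump databases first so each tarball holds a consistent snapshot
#    (uncomment for a real postgres container named pypi-db):
# docker exec pypi-db pg_dumpall -U postgres > "$SRC_ROOT/pypi/db-$DATE.sql"

# 2. One gzipped tarball per service directory.
for svc in pypi registry; do
    tar -czf "$BACKUP_DIR/$svc-$DATE.tar.gz" -C "$SRC_ROOT" "$svc"
done

# 3. Push off-site (uncomment one; rclone also covers Google Drive):
# aws s3 cp --recursive "$BACKUP_DIR" "s3://my-backup-bucket/$DATE/"
# rclone copy "$BACKUP_DIR" "gdrive:backups/$DATE/"

ls "$BACKUP_DIR"
```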

    It’s not some fancy cool toy the kids these days love, like any of the dozens of other backup options, but I’m a fan of simple, and a couple of tarballs in an S3 bucket is about as simple as it gets, since restoring doesn’t require any tools or configuration: just snag the tarballs you need, unarchive them, done.

    I also use a couple of tools for monitoring the progress, plus a separate script that can do a full restore to make sure shit actually works, but that’s mostly just doing what you did to make and upload the tarballs, in reverse.
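    And the restore really is just the backup run backwards. A sketch, using a locally built tarball as a stand-in for one fetched from the bucket (paths and names are placeholders):

```shell
#!/usr/bin/env bash
# Sketch of a restore: fetch the tarball, unpack it, done.
set -euo pipefail

# Stand-in for "download from the bucket": build a sample tarball.
WORK=$(mktemp -d)
mkdir -p "$WORK/src/pypi"
echo "restored-data" > "$WORK/src/pypi/package.txt"
tar -czf "$WORK/pypi-2024-01-01.tar.gz" -C "$WORK/src" pypi

# A real restore would start by pulling the tarball down, e.g.:
# aws s3 cp "s3://my-backup-bucket/2024-01-01/pypi-2024-01-01.tar.gz" "$WORK/"
# rclone copy "gdrive:backups/2024-01-01/pypi-2024-01-01.tar.gz" "$WORK/"

RESTORE_ROOT=$(mktemp -d)   # stand-in for something like /srv
tar -xzf "$WORK/pypi-2024-01-01.tar.gz" -C "$RESTORE_ROOT"

cat "$RESTORE_ROOT/pypi/package.txt"
```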