How I do daily database backups for free

I really like free stuff. I remember how happy I was when I could host a few containers on OpenShift for free, which is where I hosted digrin.com at the time. So recently I wanted to set up a daily backup for digrin in case something bad happens. It’s as simple as exporting the PostgreSQL DB to a file and storing it somewhere. I still remember megaupload.com as free file storage, so I just ship it to mega.nz now (15 GB of storage for free) 🙂

So I started with the DB export. Postgres has a command for that, same as other database engines, I assume. This is how I exported my database at first (a small bash script):

#!/bin/bash

DBPATH="/var/lib/postgresql/data/"
# Dump all databases (connecting through the digrin database) into a dated .sql file
pg_dumpall -l digrin -c --no-owner --no-privileges -U postgres -h postgres > "${DBPATH}digrin_dump_$(date +%Y-%m-%d).sql"

This created a DB backup about 150 MB in size. As Mega.nz gives you 15 GB of free storage, that would suffice for roughly 100 daily backups. A simple way to stretch that number a bit is compression. I didn’t want to spend much time choosing the best compression algorithm; the most common and easiest to use should be fine. Tar with gzip it is:

#!/bin/bash

DBPATH="/var/lib/postgresql/data/"
DUMPFILE="digrin_dump_$(date +%Y-%m-%d).sql"

# Dump all databases into a dated .sql file
pg_dumpall -l digrin -c --no-owner --no-privileges -U postgres -h postgres > "${DBPATH}${DUMPFILE}"
# Compress the dump (-C keeps the absolute path out of the archive)
tar -czvf "${DBPATH}${DUMPFILE}.tar.gz" -C "${DBPATH}" "${DUMPFILE}"
# Delete the uncompressed .sql file, only the .tar.gz is kept
rm "${DBPATH}${DUMPFILE}"

The code is the same, but the tar command compresses the .sql file to .tar.gz, lowering the size from 150 MB to about 50 MB, which makes it possible to store three times as many exports: around 300. The rm command on the last line just deletes the .sql file we don’t need anymore. A scheduler creates the DB backup daily, say an hour before I want to export it to cloud storage.
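The schedule could look something like this (a sketch: db_dump.sh is whatever you named the dump script above, and 04:10 is an hour before the 05:10 upload shown below):

10 4 * * * /root/digrin/db_dump.sh >> /root/digrin/log-output.txt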

So we have a file we want to upload to some storage provider. I registered at mega.nz and installed megatools (the package that provides megaput) on the server. In cron I just run a short bash script to upload the compressed DB backup:

10 5 * * * /root/digrin/db_backup.sh >> /root/digrin/log-output.txt
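megaput needs the mega.nz credentials; megatools reads them from a ~/.megarc file, something like this (placeholder values):

[Login]
Username = you@example.com
Password = your-mega-password

Keep that file readable only by its owner (chmod 600 ~/.megarc), since anyone who can read it can reach the backups.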

And this is the script (/root/digrin/db_backup.sh) that exports the file to mega.nz:

echo "starting push to mega.nz"
DBPATH="/opt/fish/digrin/postgres/var/lib/postgresql/data/"
megaput --path /Root/backups/digrin ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "probably uploaded :)"
rm ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "deleted backup file from server"

So this is what I get in my mega.nz account: a folder of dated digrin_dump_*.sql.tar.gz files. I plan to prune older backups every six months or so; for the most recent few months I would keep weekly backups, and for anything older, monthly ones.
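The pruning isn’t automated yet, but a rough sketch could look like this (assuming GNU date, and megals printing one remote path per line; this simplified variant keeps everything for 90 days and only month-start backups after that):

#!/bin/bash
# Keep all backups newer than 90 days, and only those
# from the 1st of the month for anything older.
CUTOFF=$(date -d "90 days ago" +%Y-%m-%d)

megals /Root/backups/digrin | while read -r REMOTE; do
    # Pull the YYYY-MM-DD stamp out of names like digrin_dump_2019-05-04.sql.tar.gz
    STAMP=$(basename "$REMOTE" | grep -oE '[0-9]{4}-[0-9]{2}-[0-9]{2}')
    [ -z "$STAMP" ] && continue
    DAY=${STAMP:8:2}
    # ISO dates compare correctly as plain strings
    if [[ "$STAMP" < "$CUTOFF" && "$DAY" != "01" ]]; then
        megarm "$REMOTE"
    fi
done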

That’s it. I’ve never yet needed a backup file to save my ass, but with a few lines of code, I can sleep a bit better.

One thought on “How I do daily database backups for free”

  1. Hi there,

    nice article.

    I will add just a few notes – IMHO tar does not do the actual compression – it joins multiple files into one archive, and then gzip (the -z param in the tar command) does the actual compression.
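    Since the dump here is a single file anyway, plain gzip on its own would do, something like:

    gzip "${DBPATH}digrin_dump_$(date +%Y-%m-%d).sql"   # replaces the .sql file with a .sql.gz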

    The second and more important thing is that you should never store access keys for the backup storage on the DB server, because if somebody gets access to it, they can delete your DB together with all your backups (this actually happened to one web hosting company). You should instead ssh to the DB server from some remote worker, which downloads the DB backup and uploads it somewhere…
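    A rough sketch of that pull-based setup (the hostname dbserver is made up), run from the worker’s cron instead of the DB server’s:

    #!/bin/bash
    # Runs on a separate backup worker; the DB server never holds the mega.nz credentials.
    DUMP="digrin_dump_$(date +%Y-%m-%d).sql.tar.gz"

    # Fetch today's archive from the DB server over ssh
    scp dbserver:/opt/fish/digrin/postgres/var/lib/postgresql/data/"${DUMP}" "/tmp/${DUMP}"

    # Upload from the worker, where the ~/.megarc credentials live
    megaput --path /Root/backups/digrin "/tmp/${DUMP}"
    rm "/tmp/${DUMP}"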

    But overall it looks really good – and most importantly, it is nice that you think about things like DB backups 🙂

    Honza
