How I do daily database backups for free

August 09, 2020

I really like free stuff. I remember how happy I was when I could host a few containers on OpenShift for free, which is where I hosted digrin.com at the time. So recently I wanted to create a daily backup for digrin in case something bad happens. It's as simple as exporting the PostgreSQL DB to a file and storing it somewhere. I still remember megaupload.com as free file storage, so these days I just ship it to mega.nz (15 GB of storage for free) 🙂

I started with the DB export; Postgres has a command for that, as do other database engines, I assume. This is how I exported my database at first (a small bash file):

#!/bin/bash

DBPATH="/var/lib/postgresql/data/"
pg_dumpall -l digrin -c --no-owner --no-privileges -U postgres -h postgres > ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql

This created a DB backup about 150 MB in size. As mega.nz offers 15 GB of free storage, that would suffice for 100 daily backups. A simple way to stretch that a bit is compression. I didn't want to spend much time choosing the best compression algorithm; the most common and easiest to use should be fine. Tar (with gzip) it is:

#!/bin/bash

DBPATH="/var/lib/postgresql/data/"
pg_dumpall -l digrin -c --no-owner --no-privileges -U postgres -h postgres > ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql
tar -czvf ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql
rm ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql

The code is the same, but on line 5 I compress the .sql file into a tar.gz, lowering the size from 150 MB to 50 MB -> making it possible to store three times as much -> 300 exports. The command on the last line just deletes the .sql file we don't need anymore. The scheduler creates a DB backup daily, let's say one hour before I want to export it to cloud storage.
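One thing I'd consider adding before deleting the .sql file is an integrity check on the archive, so a truncated tar.gz never becomes the only copy. This is a sketch, not from my actual setup; it uses a temp dir and a fake dump so it runs anywhere:

```shell
#!/bin/sh
# Sketch: only delete the raw .sql once the tar.gz verifiably round-trips.
# A temp dir and a fake dump stand in for a real pg_dumpall run.
DBPATH="$(mktemp -d)/"
DUMP="${DBPATH}digrin_dump_$(date +%Y-%m-%d).sql"

echo "-- fake dump standing in for pg_dumpall output" > "$DUMP"
tar -czf "${DUMP}.tar.gz" "$DUMP"

# gzip -t validates the compressed stream, tar -tzf the archive structure.
if gzip -t "${DUMP}.tar.gz" && tar -tzf "${DUMP}.tar.gz" > /dev/null 2>&1; then
    rm "$DUMP"
    echo "archive OK, raw dump removed"
else
    echo "archive broken, keeping ${DUMP}" >&2
fi
```

In the real script, you'd put the `rm` inside the `if` the same way, so a failed compression leaves the .sql on disk instead of silently losing the day's backup.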

So we have a file we want to upload to some storage provider. I registered at mega.nz and installed the megaput tool (part of megatools) on the server. From cron I just run a short bash file to upload the compressed DB backup:

10 5 * * * /root/digrin/db_backup.sh >> /root/digrin/log-output.txt
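One small gotcha here: the redirect above only captures stdout, so error messages from a failed dump or upload never reach the log. Appending `2>&1` sends stderr to the same file (a suggested tweak, not part of the original setup):

```shell
# crontab entry: stderr merged into the log so failed uploads are visible
10 5 * * * /root/digrin/db_backup.sh >> /root/digrin/log-output.txt 2>&1
```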

And this is the script (/root/digrin/db_backup.sh) that exports the file to mega.nz:

#!/bin/bash

echo "starting push to mega.nz"
DBPATH="/opt/fish/digrin/postgres/var/lib/postgresql/data/"
megaput --path /Root/backups/digrin ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "probably uploaded :)"
rm ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "deleted backup file from server"

So this is what I get in my mega.nz account. I plan to delete older backups every 6 months or so: basically, for the most recent few months I would keep weekly backups, for older ones monthly, etc. And this is how it looks: [screenshot of the backup files in the mega.nz folder]
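The pruning plan above can be sketched as a keep/drop rule on the date embedded in each filename. The exact cutoffs (30/180 days) are my assumptions, not settled policy, and the script requires GNU `date` for the `-d` flag:

```shell
#!/bin/sh
# Sketch of a retention policy (cutoffs are assumptions): keep everything
# from the last 30 days, Mondays for the last 180 days, and only the 1st
# of each month before that. Requires GNU date (-d flag).
keep_backup() {
    backup_date=$1; today=$2
    age_days=$(( ($(date -d "$today" +%s) - $(date -d "$backup_date" +%s)) / 86400 ))
    dow=$(date -d "$backup_date" +%u)   # day of week, 1 = Monday
    dom=$(date -d "$backup_date" +%d)   # day of month, zero-padded
    [ "$age_days" -le 30 ] && return 0
    [ "$age_days" -le 180 ] && [ "$dow" -eq 1 ] && return 0
    [ "$dom" = "01" ] && return 0
    return 1
}

# Applying it to the remote folder could look like this (megals and megarm
# are real megatools commands; left commented out so the sketch stays dry):
# megals /Root/backups/digrin | while read -r f; do
#     d=${f##*digrin_dump_}; d=${d%.sql.tar.gz}
#     keep_backup "$d" "$(date +%Y-%m-%d)" || megarm "$f"
# done
```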

That is it. I have never yet needed a backup file to save my ass, but with a few lines of code, I can sleep a bit better.

Update in 2021

The previous version was unnecessarily complicated, so I moved it all into one file:

#!/bin/sh

DBPATH="/var/lib/postgresql/data13/"

pg_dump digrin -h postgres13 -U postgres --no-password --no-owner --no-privileges > ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql
tar -czvf ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql
rm ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql

# push to mega.nz
megaput --username [email protected] --password ${MEGA_PASSWORD} --path /Root/backups/digrin/ ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "probably uploaded :)"
rm ${DBPATH}digrin_dump_`date +%Y-%m-%d`.sql.tar.gz
echo "deleted backup file from server"
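A note on `--no-password`: it tells `pg_dump` to never prompt, so credentials have to come from a non-interactive source, typically a `~/.pgpass` file read by libpq. A sketch of what that file would look like for this setup (host and database match the script above; the password is a placeholder):

```shell
# ~/.pgpass — one line per server, format hostname:port:database:username:password
# Must be chmod 600, otherwise libpq ignores the file.
postgres13:5432:digrin:postgres:your-password-here
```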

Written by Lucas03, who uses this as a diary. Contact at admin[a]lucas03.com