
Website Backup

Why?

I’ve tried several backup solutions for Linux, and I wasn’t able to find one that did what I wanted.  Bacula was way too much work to set up.  I actually spent several hours installing and configuring a web interface for Bacula to make it easier, but once I got it all up and running I found it was just a reporting tool.  Bacula is really aimed at an enterprise with hundreds of servers to back up.  BackupPC wasn’t flexible enough.  I read through various tar backup scripts and finally decided to install PowerShell on the server and write my own.  Then I realized that I need to conceptualize what I want to accomplish before I decide what tools to use…  So this document.

What?

The major components are going to be the website, the database, and all of /etc at the very least.  I also think a backup of the installed packages would be good.  I don’t want a large backup; I don’t really think I should have to dedicate 30 GB to it.  One hiccup is that I want it stored on my Win 10 PC, so it can be synced with a Google cloud account.  One of mine has a TB, so there should be plenty of room.
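
For that to work, the server needs somewhere to write that actually lands on the Windows box.  The sketch below mounts an SMB share from the Win 10 PC at /media/backup, which is where the script further down writes; the share name, host, and credentials file are placeholders for whatever I actually set up, not something already configured.

    # Hypothetical share //win10pc/backup exported from the Windows machine
    sudo mkdir -p /media/backup
    sudo mount -t cifs //win10pc/backup /media/backup -o credentials=/root/.smbcreds,vers=3.0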

Specifically?

  • DB – I’m guessing the output of this will be small, but testing will tell.
    mysqldump --all-databases > dump.sql

Only 2.8 MB, so about 84 MB for a month’s worth.

  • WordPress – everything appears to be under /var/www, so I’ll just tar that whole bitch up.
    sudo tar -cpvzf website.tar.gz /var/www/
    142 MB – this might require going incremental; it works out to 4.3 GB per month.
  • Packages – apt-clone doesn’t appear to be a base package, but it seems to be exactly what I want, so the command below will do the job.  I don’t really see needing more than one of these, but the files are small enough that keeping 30 days would be fine.
    sudo apt-clone clone <filename>

The backup is 22 KB, which is not worth my time to do the math on.

  • /etc – super important, because I don’t know how many hours I’ve dumped into this server.  Way too many to not have a backup.  Restore commands for everything in this list are sketched just below.
    sudo tar -cpvzf etc.tar.gz /etc
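
A backup I can’t restore is just wasted disk, so here’s a rough sketch of the restore side.  These are the standard counterparts to the commands above, run from pwsh (which has no “<” input redirection, hence the pipe); the paths are whatever a given backup run produced.

    # DB – pipe the dump back in (pwsh doesn’t support “mysql < dump.sql”)
    Get-Content /tmp/db.sql -Raw | /usr/bin/mysql
    # Website and /etc – tar stripped the leading “/”, so extract from the root
    sudo tar -xpzf website.tar.gz -C /
    sudo tar -xpzf etc.tar.gz -C /
    # Packages – apt-clone reinstalls the recorded package set
    sudo apt-clone restore apt-clone-state-thecweb.com.tar.gz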

When?

  • Hourly
    • In the “hourly” folder we’ll have tar archives labeled 1-24
  • Daily
    • In the “daily” folder we’ll have tar archives labeled 1-31
  • Monthly
    • In the “monthly” folder we’ll have tar archives labeled 1-12
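
The script below only fills the hourly tier, so the daily and monthly folders need a second, once-a-day job.  A minimal sketch, assuming the hourly script has already written the midnight (00) archives and the folder layout above exists:

#!/snap/bin/pwsh
# Promote the midnight hourly archives into today's daily slot,
# and the first daily of the month into the monthly slot.
$Day   = Get-Date -Format "dd"
$Month = Get-Date -Format "MM"
foreach ($Suffix in "sql", "website.tar.gz", "apt-clone-state-thecweb.com.tar.gz", "etc.tar.gz") {
    Copy-Item -Force -Path "/media/backup/hourly/00.$Suffix" -Destination "/media/backup/daily/$Day.$Suffix"
    if ($Day -eq "01") {
        Copy-Item -Force -Path "/media/backup/daily/$Day.$Suffix" -Destination "/media/backup/monthly/$Month.$Suffix"
    }
}

Note that Get-Date’s dd/MM formats are zero-padded, so the labels come out 01-31 rather than 1-31, which at least keeps them sorting correctly.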

Code example

#!/snap/bin/pwsh
#
# Chris Weber
# 4/16/2021
# Updated
#
# Backup solution for web server
#


# Get hour of day (24-hour format, so the archives cycle 00-23)
$Hour = Get-Date -Format "HH"
# backup storage
#$backupStore = "/media/backup/hourly/$Hour"
# Generate backups and put in tmp
Set-Location -Path "/tmp"

# DB
/usr/bin/mysqldump --all-databases > db.sql

# Website
/usr/bin/tar -cpzf website.tar.gz /var/www/

# Packages (given a directory, apt-clone writes apt-clone-state-<hostname>.tar.gz into it)
/usr/bin/apt-clone clone /tmp

# etc
/usr/bin/tar -cpzf etc.tar.gz /etc


# Move stuff to the place
Move-Item -force -path /tmp/db.sql -destination /media/backup/hourly/$Hour.sql
Move-Item -force -path /tmp/website.tar.gz -destination /media/backup/hourly/$Hour.website.tar.gz
Move-Item -force -path /tmp/apt-clone-state-thecweb.com.tar.gz -destination /media/backup/hourly/$Hour.apt-clone-state-thecweb.com.tar.gz
Move-Item -force -path /tmp/etc.tar.gz -destination /media/backup/hourly/$Hour.etc.tar.gz
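
To make this actually run every hour, something like the following entry in root’s crontab (sudo crontab -e) would do it; the script path is a placeholder for wherever this file ends up living.

    0 * * * * /snap/bin/pwsh /usr/local/bin/backup.ps1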

Issues to fix!!!1!

  • Hourly backup was only doing 12 because I didn’t specify the 24-hour time format when setting $Hour – fixed above by using “HH” instead of “hh”.
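
For the record, the two format strings side by side:

    Get-Date -Format "hh"   # 12-hour clock – 01-12, so 1 AM and 1 PM collide
    Get-Date -Format "HH"   # 24-hour clock – 00-23, one slot per hour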
