April 2021

Sneaker Net™ from Portland to KC

I volunteer to do some travel for work. It’s more secure and much easier to just ship out a NAS, fill it up, and then fly someone out to get the NAS and bring it to the data center. So I got to go to Portland. It worked out so that I landed around noon, but my coworker didn’t land until 9 pm, so I got to explore. I saw some big-ass trees. Walked around the riverfront. Sooo many tent cities. It made me wonder if there’s really a bigger homeless problem in Portland than in most cities, or if the homeless there actually live better because at least they have tents.

The schedule was a little hectic. I took a 6 am flight to Dallas and then on to Portland. Got the rental car. Checked in. Had a four-cheese grilled cheese and Cajun tater tots at some bar. Checked out the Douglas firs. Saw a big-ole metal syringe-disposal trash can. Then went and picked up my coworker at the airport. We had dinner at a different bar. The fish and chips were pretty good. Actually, the chips were the closest thing I’ve ever had to the way the English do it; at least, they looked like what I’ve seen on Food Network. Found a little café in an office building near the hotel and had breakfast burritos.

The actual NAS pickup took a couple hours. Longer than you’d think it would take, but it sounds like organization on this project was severely lacking 6-12 months ago, and they’re still rebuilding. From there it was to the airport at 2 pm and takeoff at 3:45 pm. Landed in Phoenix and then on to KC, touching down at 11:30 pm. Got to the data center around 1 am, and then dropped my coworker off at his hotel near the airport.

It was funny: when we were boarding one of the planes, one of the flight attendants yelled, “How the heck did you guys get that this far?!” I replied, “We bought it a ticket,” and that was that.

I will volunteer for another trip, because I enjoyed it. Although, I’m not sure how much of my enjoyment was because nothing was crowded. Like, what if I hate it next time because there will be thousands more people in my way? Time will tell, rookie beotch.

(Photos: on the plane; big-ass trees.)

Website Backup

Why?

I’ve tried several backup solutions for Linux, and I wasn’t able to find anything that did what I wanted. Bacula was way too much work to set up. I actually spent several hours installing and configuring a web interface for Bacula to make it easier, but once I got it all up and running I found it was just a reporting tool. Bacula is more for an enterprise with hundreds of servers to back up. BackupPC wasn’t really flexible enough. I read through various tar backup scripts and finally decided to install PowerShell on the server and see about writing my own. Then I realized that I need to conceptualize what I want to accomplish before I decide what tools to use… so, this document.

What?

The major components are going to be the website, the database, and at least all of /etc. I also think a backup of the installed packages would be good. I don’t want a large backup; I don’t really think I should have to dedicate 30 GB to it. One hiccup is that I want it stored on my Win 10 PC so it can be synced with a Google account. One of mine has a terabyte, so there should be plenty of room.
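
Nothing is set up on the Win 10 side yet, but Windows 10 ships an OpenSSH scp client, so a scheduled task could just pull the backup tree into whatever folder Google’s sync client watches. A rough sketch, where the user and both paths are assumptions (the hostname comes from the script further down):

    scp -r chris@thecweb.com:/media/backup "C:\Users\chris\Google Drive\server-backup"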

Specifically?

  • DB – I’m guessing the output of this will be small, but testing will tell.
    mysqldump --all-databases > dump.sql

Only 2.8 MB, so ~84 MB per month.

  • WordPress – everything appears to be under /var/www, so I’ll just tar that whole bitch up. (Incremental sketch after this list.)
    sudo tar -cpvzf website.tar.gz /var/www/
    142 MB – this might require incremental backups. 4.3 GB per month.
  • Packages – apt-clone doesn’t appear to be a base package, but it seems to be exactly what I want, so the command below will work. I don’t really see needing more than one of these, but the files are small enough that keeping 30 days would be fine. (Restore sketch after this list.)
    sudo apt-clone clone <filename>

The backup is 22 KB, which is not worth my time to do the math on.

  • /etc – super important, because I don’t know how many hours I’ve dumped into this server. Way too many to not have a backup.
    sudo tar -cpvzf etc.tar.gz /etc
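
Two follow-ups on the list above, neither tested yet. If the website tarball does end up needing incrementals, GNU tar can do that with a snapshot file; the .snar path here is just a placeholder. The first run against a fresh .snar is a full backup, and every run after that only grabs what changed:

    sudo tar --listed-incremental=/var/backups/website.snar -cpvzf website.tar.gz /var/www/

And a note for future me on the packages: apt-clone can restore from its archive on a fresh box, which is the whole point of keeping one:

    sudo apt-clone restore apt-clone-state-thecweb.com.tar.gz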

When?

  • Hourly
    • In the “hourly” folder we’ll have archives labeled 1-24
  • Daily
    • In the “daily” folder we’ll have archives labeled 1-31
  • Monthly
    • In the “monthly” folder we’ll have archives labeled 1-12
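
None of the rotation logic exists yet, but the slot names could come straight out of Get-Date. A sketch only (this gives zero-padded 00-23 / 01-31 / 01-12 rather than 1-24 and so on, which works the same way, since each slot just overwrites itself when it comes back around):

    #!/snap/bin/pwsh
    # Sketch: slot names for the three rotation tiers.
    $hourSlot  = Get-Date -Format "HH"   # 00-23, cycles daily
    $daySlot   = Get-Date -Format "dd"   # 01-31, cycles monthly
    $monthSlot = Get-Date -Format "MM"   # 01-12, cycles yearly
    "/media/backup/hourly/$hourSlot", "/media/backup/daily/$daySlot", "/media/backup/monthly/$monthSlot"

Cron would fire the real script, e.g. 0 * * * * /snap/bin/pwsh /usr/local/bin/backup.ps1 for the hourly tier (script path assumed).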

Code example

#!/snap/bin/pwsh
#
# Chris Weber
# 4/16/2021
# Updated
#
# Backup solution for web server
#


# Get hour of day (24-hour format so all 24 slots get used)
$Hour = Get-Date -Format "HH"
# backup storage
#$backupStore = "/media/backup/hourly/$Hour"
# Generate backups and put in tmp
Set-Location -Path "/tmp"

# DB
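# (no credentials passed, so this assumes running as root with socket auth or a .my.cnf)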
/usr/bin/mysqldump --all-databases > db.sql

# Website
/usr/bin/tar -cpzf website.tar.gz /var/www/

# Packages
/usr/bin/apt-clone clone /tmp
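# apt-clone drops its archive as apt-clone-state-<hostname>.tar.gz, hence the filename in the move below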

# etc
/usr/bin/tar -cpzf etc.tar.gz /etc


# Move stuff to the place
Move-Item -Force -Path /tmp/db.sql -Destination /media/backup/hourly/$Hour.sql
Move-Item -Force -Path /tmp/website.tar.gz -Destination /media/backup/hourly/$Hour.website.tar.gz
Move-Item -Force -Path /tmp/apt-clone-state-thecweb.com.tar.gz -Destination /media/backup/hourly/$Hour.apt-clone-state-thecweb.com.tar.gz
Move-Item -Force -Path /tmp/etc.tar.gz -Destination /media/backup/hourly/$Hour.etc.tar.gz

Issues to fix!!!1!

  • Hourly backup was only doing 12 slots because I didn’t specify the 24-hour time format when setting $Hour – fixed above by switching the Get-Date format from “hh” to “HH”.

Update to the camera

The wire came off before I could get anything connected, and the pad came off with it, so I gave up and bought a Nest video doorbell instead. I like it a lot more.

Vaxxed and waxed, and ready for the beach.

I volunteered at a vaccine drive to push people around in wheelchairs. It was cool. Most people were nice. There was the person on oxygen who reeked of cigarettes. The one so large that there was no chance I could stop in time if a small child ran in front of us. I saw a person have a reaction and get taken away. Later on I spoke with one of the medics, and he said he’s been giving shots since the beginning and has only seen a few reactions. This person, like most, was just having an anxiety attack. Not surprising. It’s been stressful for everyone.