Oracle Linux 8.4 Webserver Virtualization Project


I’d like to move my webserver over to Oracle Linux so I can get more familiar with it for work, and I’d like to virtualize it. I’ve got a 1 TB HD in a 5th/6th-gen i5. I’m also planning on restoring from my backups instead of migrating straight from the running system, so I can test them at the same time.

Getting it set up was fairly simple. I was able to install 8.4 with just the virtualization features, and I’ve got a bare install of 8.4 running in a VM called thecwebVirt. So with the backup of my package list from thecweb, and the tar of /var/www, it should be fairly quick to spin this bitch back up. Oh, and I got a tar of /etc too.
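Before touching the real server, the tar part of the restore can be sanity-checked with a self-contained round-trip in scratch directories (the paths and content here are stand-ins, not the actual backup files):

```shell
# Round-trip demo of the tar flags used for the website backup.
# Everything happens under mktemp dirs - no real paths are touched.
workdir=$(mktemp -d)
mkdir -p "$workdir/var/www/html"
echo "hello from thecweb" > "$workdir/var/www/html/index.html"

# Create the archive the same way the backup does (-c create, -p perms, -z gzip)
tar -cpzf "$workdir/website.tar.gz" -C "$workdir" var/www/

# Restore into a staging dir and verify the content survived
restore=$(mktemp -d)
tar -xpzf "$workdir/website.tar.gz" -C "$restore"
cat "$restore/var/www/html/index.html"
```

On the real box the extract step would just point at / instead of a staging dir.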

Now I’m planning on setting up a caching DNS server to see if I can noticeably speed up my web browsing. I’ve never run my own DNS server, but I should probably get some practice. Also, it’ll give me something to do while I’m in Zion National Park next week.
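I haven’t picked a server yet, but for reference, a cache-only resolver can be a handful of lines of dnsmasq config (dnsmasq is just one option – unbound would also work; the upstream addresses below are placeholders, not a recommendation):

```
# /etc/dnsmasq.conf - cache-only resolver sketch
# Only answer queries from this machine
listen-address=127.0.0.1
# Cache up to 10000 names
cache-size=10000
# Ignore /etc/resolv.conf and use only the servers listed here
no-resolv
server=1.1.1.1
server=8.8.8.8
```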

Simple, yet functional interface.

I bought a few more domain names on a whim.

Listing them here as a reminder, and also to get that sweet, sweet SEO, baby!

Currently they all point here, but I’ll figure out what to do with them at some point. I was thinking for the .lol one, I could slap together something that would turn all the body text of this site into something funny, and output it to

Website Backup


I’ve tried several backup solutions for Linux, and I wasn’t able to find something that did what I wanted. Bacula was way too much work to set up. I actually spent several hours installing and configuring a web interface for Bacula to make it easier, but once I got it all up and running I found it was just a reporting tool. Bacula is more for an enterprise with hundreds of servers to back up. BackupPC wasn’t really flexible enough. I read through various tar backup scripts and finally decided to install PowerShell on the server and see about writing my own. Then I realized that I need to conceptualize what I want to accomplish before I decide what tools to use… so, this document.


The major components are going to be the website, the database, and all of /etc at least. I also think a backup of the list of installed packages would be good. I don’t want a large backup; I don’t really think I should have to dedicate 30 GB to it. One hiccup is that I want it stored on my Win 10 PC, so it can be synced with a Google cloud account. One of my accounts has a TB of storage, so there should be plenty.


  • DB – I’m guessing the output of this will be small, but I’ll test to confirm.
    mysqldump --all-databases > dump.sql

Only 2.8 MB – roughly 84 MB per month.

  • WordPress – everything appears to be under /var/www, so I’ll just tar that whole bitch up.
    sudo tar -cpvzf website.tar.gz /var/www/
    142 MB – this might require incrementals; otherwise that’s 4.3 GB per month.
  • Packages – apt-clone doesn’t appear to be a base package, but it seems to be exactly what I want, so after installing it the command below will work. I don’t really see needing more than one of these, but the files are small enough that keeping 30 days would be fine.
    sudo apt-clone clone <filename>  

The backup is 22 KB, which is not worth my time to do the math on.

  • /etc – super important, because I don’t know how many hours I’ve dumped into this server. Way too many to not have a backup.
    sudo tar -cpvzf etc.tar.gz /etc


  • Hourly
    • In “hourly” folder we’ll have tar archives labeled 1-24
  • Daily
    • In “daily” folder we’ll have tar archives labeled 1-31
  • Monthly
    • In “monthly” folder we’ll have tar archives labeled 1-12
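The slot number for each tier falls straight out of `date` (note that `%H` gives 00–23 and the others are zero-padded, rather than the 1–24 style labels above – close enough for naming the archives):

```shell
# Compute the rotation slot for each tier from the current time.
# %H: hour 00-23, %d: day of month 01-31, %m: month 01-12
hour_slot=$(date +%H)
day_slot=$(date +%d)
month_slot=$(date +%m)
echo "hourly/$hour_slot daily/$day_slot monthly/$month_slot"
```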

Code example

# Chris Weber
# 4/16/2021
# Updated
# Backup solution for web server

# Get hour (used as the hourly slot number)
$Hour = get-date -Format "hh"
# backup storage
#$backupStore = "/media/backup/hourly/$Hour"
# Generate backups and put in tmp
Set-Location -Path "/tmp"

# DB
/usr/bin/mysqldump --all-databases > db.sql

# Website
/usr/bin/tar -cpzf website.tar.gz /var/www/

# Packages
/usr/bin/apt-clone clone /tmp

# etc
/usr/bin/tar -cpzf etc.tar.gz /etc

# Move stuff to the place
Move-Item -force -path /tmp/db.sql -destination /media/backup/hourly/$Hour.sql
Move-Item -force -path /tmp/website.tar.gz -destination /media/backup/hourly/$Hour.website.tar.gz
# apt-clone names its output apt-clone-state-<hostname>.tar.gz, so glob for it
Move-Item -force -path /tmp/apt-clone-state-*.tar.gz -destination /media/backup/hourly/$Hour.apt-clone.tar.gz
Move-Item -force -path /tmp/etc.tar.gz -destination /media/backup/hourly/$Hour.etc.tar.gz

Issues to fix!!!1!

  • Hourly backup is only filling 12 slots because I used 12-hour time format ("hh" instead of "HH") when setting $Hour
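The same distinction exists in plain `date`: `%I` is 12-hour like PowerShell’s "hh", while `%H` is 24-hour like "HH", which is what $Hour should use:

```shell
# 12-hour format: 01-12 - two wall-clock hours share each backup slot
twelve=$(date +%I)
# 24-hour format: 00-23 - one slot per hour, which is what we want
twentyfour=$(date +%H)
echo "12h=$twelve 24h=$twentyfour"
```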

Update to the camera

The wire came off before I could get anything connected, and the pad came off with it, so I gave up. I bought a Nest video doorbell instead. I like it a lot more.

$30 camera project

Got this cheap-ass camera on Amazon. It does 360° pan and about 120° tilt. The software sucks ass, but it’s got the same chip as much nicer systems – an Ingenic T20. I’m going to attempt to load custom firmware to take it off the cloud, and I’ll run stuff internally.

Following info from here:

Monster Insights installed

I’ve had quite a few comments left on my site, the vast majority are obvious spam. Some don’t really seem to be though, or at least I can’t figure out what the end game is of comments like “great read, keep it up!” and crap like that. Maybe those are there to check if they can leave comments, and then they start the spam? I don’t know.

So anyway, I installed this plug-in and set up Google Analytics. Just the free monitoring on both systems, but this should help me decide whether I’ve had any real human visitors. I’m assuming not.

Clueless and Postfix

Clueless is on Netflix, so I watched it tonight. Quite a few jokes went over my head when I was 13, but that didn’t stop me from watching it over and over again. I’d seen it too many times to actually sit and watch it, so I started setting up Postfix using this tutorial from LinuxBabe.

Installed Postfix and attempted to send a message to my Gmail. Postfix sent me back an NDR with error code 550-5.7.1. Looks like one of the reasons is I don’t have an IPv6 PTR record… which is weird. I don’t know why Google would assign an IPv4 and not an IPv6. Oh well, I’m sick of screwing with it for today.
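For reference, a PTR record is just a reverse-mapped name; the name a PTR lookup queries for an IPv4 address is built like this (the address below is a documentation placeholder, not my server – against the live box you’d run `dig +short -x <ip>`, which needs network):

```shell
# Build the in-addr.arpa name a PTR lookup queries for an IPv4 address.
# 203.0.113.7 is a documentation placeholder, not a real server.
ip="203.0.113.7"
rev=$(echo "$ip" | awk -F. '{print $4"."$3"."$2"."$1".in-addr.arpa"}')
echo "$rev"
```

Gmail checks that this name resolves back to the sending host; IPv6 works the same way but under ip6.arpa with one label per hex nibble.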

I’m typing this from my Dell XPS 13 ultrabook. Ron Swanson knocked a glass of water on it a few weeks ago, and I finally got around to completely dismantling it to find the damage. After removing the wee little motherboard by the pen, I found some corrosion on the underside. I was able to use a damp paper towel to clean some of it off. Then I used my phone as a magnifying glass, and a scraper to get in between the leads on the IC. I followed it up with a brush from a sneaker-cleaning kit I had laying around. To my surprise it powered back on and booted to Windows without issue. I thought for sure I was writing off a $1,200 laptop.