Tinkering

Invention

Decaffeinated Re-Caffeinated Coffee. The purpose is to enable super-accurate dosing of caffeine, the idea being that you can find your optimum balance of good versus negative effects. Companion app, of course.

WireGuard VPN

I’ve just been spinning up VMs left and right since I set up that Oracle VM. I decided that if I’m going to be out of town for a week, then I’d like to have a VPN into the home network so that I can get some work done. It is my vacation, so I’d like to do some hobbies.

So far the config is pretty simple. I like the approach they are using with WireGuard too. Very “unixy” in that it is just a network interface that encrypts traffic using your private key and the peer’s public key. There is almost no CPU usage. It does one thing and it does it well. Use whatever key management or authentication scheme you want.

New VM on Universe: 2 GB RAM and 25 GB storage. Its pool is 100 GB.
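
For reference, spinning up a VM like that with virt-install would look roughly like this. This is purely a sketch: I’m assuming KVM/libvirt on Universe, and the ISO path, bridge name, and OS variant are placeholders.

## Hypothetical VM creation on the host (values are placeholders)
$ sudo virt-install \
    --name wireguard \
    --memory 2048 \
    --vcpus 2 \
    --disk size=25 \
    --cdrom /var/lib/libvirt/images/ubuntu-20.04-live-server-amd64.iso \
    --network bridge=br0 \
    --os-variant generic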

The package is just called wireguard.
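
Installing it would be something like this; I’m assuming an Ubuntu guest here (same family as the webserver), and on an EL-style box the userspace tools are packaged as wireguard-tools instead.

## Install the WireGuard tools (assumed Ubuntu guest)
$ sudo apt install wireguard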

## Create the Wireguard virtual network adapter
$ sudo ip link add dev wg0 type wireguard

## Set proper umask for key files, and generate private and public key files
$ umask 077
$ wg genkey > privatekey
$ wg pubkey < privatekey > publickey

## Setup network
$ sudo ip addr add 10.0.0.1/24 dev wg0

## attach key to interface
$ sudo wg set wg0 private-key ./privatekey
## up
$ sudo ip link set wg0 up

## create /etc/wireguard/wg0.conf
$ sudo vi /etc/wireguard/wg0.conf

Contents of the new file:

[Interface]
SaveConfig = true
PostUp = iptables -A FORWARD -i wg0 -j ACCEPT; iptables -t nat -A POSTROUTING -o enp1s0 -j MASQUERADE;
PostDown = iptables -D FORWARD -i wg0 -j ACCEPT; iptables -t nat -D POSTROUTING -o enp1s0 -j MASQUERADE;
ListenPort = 56990
PrivateKey = QETsE2fXOXC81R/MRYDYjHTyjZxfSlF2vuiCgK5nv0U=

[Peer]
PublicKey = L/VrqKjC5/harAftr+2w0I0hs0MPy0QgXGvvAKqYZlA=
AllowedIPs = 10.0.0.2/32
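
The other half is the client on my laptop. Here’s a sketch of what a matching client-side config might look like; the keys and endpoint are placeholders, and AllowedIPs just routes the VPN subnet through the tunnel.

[Interface]
Address = 10.0.0.2/24
PrivateKey = <client private key>

[Peer]
PublicKey = <server public key>
Endpoint = <home public IP or dynamic DNS name>:56990
AllowedIPs = 10.0.0.0/24
PersistentKeepalive = 25

## On the client, bring the tunnel up (and start it at boot if wanted)
$ sudo wg-quick up wg0
$ sudo systemctl enable wg-quick@wg0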

Oracle Linux 8.4 Webserver Virtualization Project

Plan

I’d like to move my webserver over to Oracle Linux, so that I can get more familiar with it for work, and I’d like to virtualize it. So I’ve got a 1 TB HDD in a 5th/6th-gen i5. I’m also planning on restoring from my backups, to actually test them, instead of copying straight from the running system.

Getting it set up was fairly simple. I was able to get 8.4 installed with just the virtualization features. I’ve got a bare install of 8.4 in a VM called thecwebVirt. So with the backup of my package list from thecweb, and the tar of /var/www, it should be fairly quick to spin this bitch back up. Oh, and I got a tar of /etc too.
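
A rough sketch of what that restore could look like on thecwebVirt. The archive names are the ones from the backup write-up below, and since 8.4 is dnf-based, the apt-clone package list would really just be a reference for what to reinstall.

## Restore the web root (the tar was made from absolute paths, minus the leading /)
$ sudo tar -xpzf website.tar.gz -C /
## Unpack /etc somewhere safe and cherry-pick configs from it
$ mkdir etc-restore && sudo tar -xpzf etc.tar.gz -C etc-restore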

Now I’m planning on setting up a caching DNS server to see if I can noticeably speed up my web browsing. I’ve never run my own DNS server, but I should probably get some practice. Also, it’ll give me something to do while I’m in Zion National Park next week.

Simple, yet functional interface.
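
For the caching resolver, here’s a minimal sketch of the kind of config I have in mind, assuming unbound; nothing is set up yet, and the LAN subnet is a placeholder for whatever the home network actually uses.

Contents of /etc/unbound/unbound.conf:

server:
    interface: 0.0.0.0
    access-control: 192.168.1.0/24 allow
    prefetch: yes
    cache-max-ttl: 86400

## Start it and enable at boot
$ sudo systemctl enable --now unbound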

I bought a few more domain names on a whim

Listing them here as a reminder, and also to get that sweet, sweet SEO, baby!

http://chrisweber.online

http://chrisweber.design

http://chrisweber.lol

Currently they all point here, but I’ll figure out what to do with them at some point. I was thinking for the .lol one, I could slap together something that would turn all the body text of this site into something funny, and output it to chrisweber.lol.

Cut down the dead tree in the front yard

It was an evergreen tree that got some sort of blight. The blight has been going around the Midwest for probably ten years now. Colorado got hit hard.

You can faintly hear me yell “woohoo!”.

Website Backup

Why?

I’ve tried several backup solutions for Linux, and I wasn’t able to find something that did what I wanted. Bacula was way too much work to set up. I actually spent several hours installing and configuring a web interface for Bacula to make it easier, but once I got it all up and running I found it was just a reporting tool. Bacula is more for an enterprise with hundreds of servers to back up. BackupPC wasn’t really flexible enough. I read through various tar backup scripts and finally decided to install PowerShell on the server and see about writing my own. Then I realized that I need to conceptualize what I want to accomplish before I decide what tools to use… so, this document.

What?

The major components are going to be the website, the database, and all of /etc at least. I also think a backup of the installed packages would be good. I don’t want a large backup; I don’t really think I should have to dedicate 30 GB to it. One hiccup is that I want it stored on my Win 10 PC, so it can be synced with a Google cloud account. One of mine has a TB, so there should be plenty.

Specifically?

  • DB – I’m guessing the output of this will be small, but testing will tell.
    mysqldump --all-databases > dump.sql

Only 2.8 MB, so about 84 MB per month.

  • WordPress – everything appears to be under /var/www, so I’ll just tar that whole bitch up.
    sudo tar -cpvzf website.tar.gz /var/www/
    142 MB – this might require incremental backups. 4.3 GB per month.
  • Packages – apt-clone doesn’t appear to be a base package, but seems to be exactly what I want, so the command below will work.  I don’t really see needing more than one of these, but the files are small enough that 30 days would be fine. 
    sudo apt-clone clone <filename>  

The backup is 22 KB, which is not worth my time to do the math on.

  • /etc – super important because I don’t know how many hours I’ve dumped into this server. Way too long to not have a backup.
    sudo tar -cpvzf etc.tar.gz /etc

When?

  • Hourly
    • In “hourly” folder we’ll have tar archives labeled 1-24
  • Daily
    • In “daily” folder we’ll have tar archives labeled 1-31
  • Monthly
    • In “monthly” folder we’ll have tar archives labeled 1-12
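
Here’s a cron sketch of how that rotation might be driven; the script names and paths are placeholders, since only the hourly script below actually exists so far.

# Hypothetical crontab entries for the hourly/daily/monthly rotation
0 * * * *   /usr/local/bin/backup-hourly.ps1
15 0 * * *  /usr/local/bin/backup-daily.ps1
30 0 1 * *  /usr/local/bin/backup-monthly.ps1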

Code example

#!/snap/bin/pwsh
#
# Chris Weber
# 4/16/2021
# Updated
#
# Backup solution for web server
#


# Get the current hour (24-hour format, so we keep 24 hourly copies)
$Hour = Get-Date -Format "HH"
# backup storage
#$backupStore = "/media/backup/hourly/$Hour"
# Generate backups and put in tmp
Set-Location -Path "/tmp"

# DB
/usr/bin/mysqldump --all-databases > db.sql

# Website
/usr/bin/tar -cpzf website.tar.gz /var/www/

# Packages
/usr/bin/apt-clone clone /tmp

# etc
/usr/bin/tar -cpzf etc.tar.gz /etc


# Move stuff to the place
Move-Item -force -path /tmp/db.sql -destination /media/backup/hourly/$Hour.sql
Move-Item -force -path /tmp/website.tar.gz -destination /media/backup/hourly/$Hour.website.tar.gz
Move-Item -force -path /tmp/apt-clone-state-thecweb.com.tar.gz -destination /media/backup/hourly/$Hour.apt-clone-state-thecweb.com.tar.gz
Move-Item -force -path /tmp/etc.tar.gz -destination /media/backup/hourly/$Hour.etc.tar.gz

Issues to fix!!!1!

  • Hourly backup was only doing 12 because I didn’t specify the 24-hour time format ("HH") when setting $Hour; fixed in the script above.

Update to the camera

The wire came off before I could get anything connected, and the pad came off with it, so I gave up. I bought a Nest video doorbell instead. I like it a lot more.

$30 camera project

Got this cheap-ass camera on Amazon. It does 360° rotation and about 120° of tilt. The software sucks ass. It’s got the same chip as much nicer systems: the Ingenic T20. I’m going to attempt to load custom firmware to take it off the cloud, and I’ll run everything internally.

Following info from here: https://medium.com/hackernoon/hacking-a-25-iot-camera-to-do-more-than-its-worth-41a8d4dc805c