
Labcraft

Files for the provisioning and maintenance of my personal Proxmox cluster, which hosts my self-hosted services, serves as an application deployment environment, and acts as my playhouse :)

Architecture

The main purpose of the server is to expose the web interfaces of the Docker containers running some services that I use every day

---
title: torterra
---
flowchart LR
subgraph web_services
direction TB
A[(wailord)]
B{staraptor}
B --http requests--> A
end
subgraph dns_servers
direction TB
C[espeon]
D[umbreon]
C ~~~ D
end
web_services --dns queries--> dns_servers

Networking

Some services are exposed to the internet via an HTTPS reverse proxy implemented with nginx

flowchart LR
A((Internet))
B{staraptor}
C[nextcloud]
D[...]
C & D --> B
B --> A
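
As a rough sketch, a single proxied service could be wired up on the reverse proxy host along these lines; the site name, upstream address, and certificate paths are placeholders rather than values taken from this repository:

# hypothetical nginx site definition for one proxied service
cat > /etc/nginx/sites-available/nextcloud.conf <<'EOF'
server {
    listen 443 ssl;
    server_name nextcloud.example.com;

    ssl_certificate     /etc/letsencrypt/live/nextcloud.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/nextcloud.example.com/privkey.pem;

    location / {
        # forward requests to the container's web interface on the internal network
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
EOF
ln -fs /etc/nginx/sites-available/nextcloud.conf /etc/nginx/sites-enabled/
nginx -t && systemctl reload nginx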

Some other services are exposed through port forwarding on the router

flowchart LR
A((Internet))
B{router <br>port forwarding}
C[wireguard]
C --> B
B --> A

Storage

The Proxmox host has several disks installed, all managed through LVM. One of them is an NVMe drive that holds the volumes for VMs and containers; the others are used for backing up data.

flowchart
	subgraph data_volume_group
		direction TB
		subgraph nvme
				A[container rootfs]
		end
	end
	subgraph backup_volume_group
	    direction TB
		C[backup volume]
	end
	A -- backup on --> C
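
As a sketch, the two volume groups in the diagram could be laid out with standard LVM commands; the device names below are placeholders, not the disks actually installed in the host:

# hypothetical device names
pvcreate /dev/nvme0n1 /dev/sda
vgcreate data_volume_group /dev/nvme0n1    # fast NVMe storage for vm/container volumes
vgcreate backup_volume_group /dev/sda      # remaining disks hold backup data
lvcreate -n backup_volume -l 100%FREE backup_volume_group
# inspect the resulting layout
pvs && vgs && lvs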

Backup management

This infrastructure manages all of my backups. The backup centralizer is an LXC container that runs Proxmox Backup Server (PBS)

flowchart
    subgraph ditto
		subgraph data_volume_group
        A[rootfs]
		end
		subgraph backup_volume_group
        B[backup disk]
		end
    end
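
A minimal sketch of how the backup volume could be registered as a PBS datastore inside ditto; the datastore name and mount point are assumptions:

# run inside the ditto container, assuming the backup volume is mounted at /mnt/backup
proxmox-backup-manager datastore create backups /mnt/backup
proxmox-backup-manager datastore list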

All of my personal PCs use Borg to manage backups locally and then copy the content to the centralizer machine using rsync. The backup is performed by a script that runs as a systemd timer.

sequenceDiagram
participant laptop
participant ditto
laptop ->> laptop: creates backup
laptop ->> ditto: sync changes
Note over laptop,ditto: connection secured trough vpn
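
A minimal sketch of such a client-side script, assuming a local Borg repository under /var/backups/borg and a backup user reachable on ditto over the VPN; the repository path, passphrase handling, and destination are placeholders:

#!/usr/bin/env bash
# create a local borg archive, then mirror the whole repository to the centralizer
set -euo pipefail

export BORG_REPO=/var/backups/borg
export BORG_PASSCOMMAND='cat /root/.borg-passphrase'

# snapshot the home directory into a timestamped archive
borg create --stats --compression lz4 "::home-$(date +%Y-%m-%d)" /home

# keep a bounded local history
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6

# copy the repository to ditto over the vpn
rsync -a --delete "$BORG_REPO/" backup@ditto:/mnt/backup/laptop/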

VM and container backups are managed through Proxmox Backup Server, installed on the centralizer

Installation

  • clone the repository inside the Proxmox host
cd /usr/local
git clone https://github.com/carnivuth/labcraft
  • create venv and install dependencies
cd labcraft
python -m venv env
source env/bin/activate
pip install -r requirements.txt
  • install ansible collections and roles
source env/bin/activate
ansible-galaxy collection install -r collections/requirements.yml
ansible-galaxy role install -r roles/requirements.yml
  • add secrets following this guide

  • create a Terraform vars file following the variable declarations in terraform/variables.tf

  • create a Proxmox admin token for Terraform (a hedged sketch follows this list)

  • create templates for VMs and containers

  • install terraform

apt install terraform
  • run terraform to deploy vms
cd terraform && terraform init && terraform plan -out=/tmp/plan && terraform apply /tmp/plan
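
For the token step above, a hedged sketch of the Proxmox side; the user name, token id, and role are placeholders and the privilege level should be tightened to taste:

# create a dedicated user, grant it a role, and issue an api token for terraform
pveum user add terraform@pve
pveum acl modify / --users terraform@pve --roles Administrator
pveum user token add terraform@pve terraform-token --privsep 0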

Handle secrets

Sensitive information is stored inside an encrypted vault file generated with ansible-vault. To create it, do the following:

  • create a sample with the following command:
grep -e 'vault_[a-z_]*' playbooks/group_vars/all/vars.yml inventory/inventory.proxmox.yml  -ho > sample.yml
  • create a file to store the vault password
pwgen -N 1 64 > passfile && chmod 600 passfile
  • set vault pass file in ansible.cfg
[defaults]
host_key_checking = False
vault_password_file=/usr/local/labcraft/passfile
  • add variables and encrypt the file with ansible vault
ansible-vault encrypt sample.yml
  • move the file to the group_vars folder
mv sample.yml playbooks/group_vars/all/vault.yml
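
Once encrypted, the file can be inspected or updated in place with the standard ansible-vault subcommands, which pick up the password from the passfile configured in ansible.cfg:

ansible-vault view playbooks/group_vars/all/vault.yml
ansible-vault edit playbooks/group_vars/all/vault.yml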

Update management and provisioning

To avoid having to run Ansible manually every time there is an update, do the following

  • add scripts/update_labcraft.sh to cron:
# path variable is needed
PATH=/usr/local/labcraft/env/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
* * * * * /usr/local/labcraft/scripts/update_labcraft.sh > /dev/null 2>&1

Then link workflows/middleware.sh to the git hooks dir (more on the topic here) as follows

cd .git/hooks
ln -fs ../../workflows/middleware.sh post-merge

This way, every time a commit is pushed to the remote, cron pulls the updates from the remote repo and the git hook runs the correct workflow based on the files that were modified
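
The actual dispatch logic lives in workflows/middleware.sh; purely as a hypothetical sketch, a post-merge hook of this kind could look like the following, with the workflow script names being placeholders:

#!/usr/bin/env bash
# post-merge hook: inspect which files changed in the pull and pick a workflow
set -euo pipefail

# ORIG_HEAD points at the commit that was checked out before the merge
changed="$(git diff --name-only ORIG_HEAD HEAD)"

if echo "$changed" | grep -q '^terraform/'; then
  ./workflows/terraform.sh   # placeholder workflow name
elif echo "$changed" | grep -q '^playbooks/'; then
  ./workflows/ansible.sh     # placeholder workflow name
fi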

---
title: UPDATE WORKFLOW
---
sequenceDiagram
participant dev_machine
participant github_repo
participant torterra

dev_machine ->> github_repo: push changes
loop every x minutes
torterra ->> github_repo: fetch changes
alt changes
torterra ->> torterra: run middleware
torterra ->> torterra: run workflow based on the file that was modified
end
end