# Claude Based Knowledge: Backup & Roadmap
## Backup Priorities

### Critical (data loss = catastrophe)
| What | Where | Size (approx.) | Backup Method |
|---|---|---|---|
| PostgreSQL Job Tracker | jobtracker-db (:5432) | ~50 MB | pg_dump via Docker |
| PostgreSQL Intel Platform | intel-platform-postgres-1 (:5433) | ~100 MB | pg_dump via Docker |
| PostgreSQL Authentik | authentik-db | ~20 MB | pg_dump via Docker |
| PostgreSQL Outline | outline-db | ~30 MB | pg_dump via Docker |
| PostgreSQL Miniflux | miniflux-db | ~50 MB | pg_dump via Docker |
| MariaDB BookStack | bookstack-db | ~20 MB | mysqldump via Docker |
| Vaultwarden | ~/projects/vaultwarden/data/ | ~10 MB | File copy |
| Forgejo repos | ~/projects/forgejo/data/ | ~500 MB | File copy + git clone |
| SSL certificates | /etc/letsencrypt/ | ~5 MB | File copy |
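A backup is only as good as its restore path. A restore mirrors the pg_dump: stream the compressed dump back into psql inside the container. A minimal sketch — `restore_pg` is a hypothetical helper, not part of the backup tooling, and the container/user/database names must match the table above:

```shell
# Hypothetical restore helper: pipes a gzipped dump into psql inside the
# target container. Assumes the target database already exists (and is
# empty, or you accept "already exists" errors on re-created objects).
restore_pg() {
  local dump="$1" container="$2" user="$3" db="$4"
  [ -f "$dump" ] || { echo "dump not found: $dump" >&2; return 1; }
  gunzip -c "$dump" | docker exec -i "$container" psql -U "$user" "$db"
}

# Example (date is illustrative): restore the job tracker from a daily dump
# restore_pg /home/nsa/backups/2025-01-01/jobtracker.sql.gz jobtracker-db jobtracker jobtracker
```

Worth a dry run once per database — an untested restore procedure tends to fail exactly when it matters.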
### Important (loss = several hours of work)

| What | Where | Backup Method |
|---|---|---|
| n8n workflows | ~/projects/job-tracker/n8n-data/ | File copy |
| Uptime Kuma DB | ~/projects/uptime-kuma/data/kuma.db | File copy |
| Outline Minio (S3 files) | outline-minio volume | File copy |
| Nginx configs | /etc/nginx/sites-available/ | File copy |
| Docker Compose files | ~/projects/*/docker-compose.yml | Git |
| .env files | ~/projects/*/.env | File copy (encrypted!) |
| MkDocs content | ~/mkdocs/docs/ | Git |
### Low (quickly recoverable)

| What | Why low priority |
|---|---|
| Docker images | Rebuilt from Dockerfiles |
| node_modules | npm ci restores them |
| .next build | npm run build regenerates it |
| Let's Encrypt certs (renewal) | Certbot fetches new certificates |
## Backup Script

```bash
#!/bin/bash
# /home/nsa/bin/backup.sh
# Daily backup of all critical data

set -euo pipefail

# Must be exported before the script runs (e.g. sourced from a root-only
# file); with set -u an unset variable would otherwise abort the run
# halfway through, at the mysqldump step.
: "${BOOKSTACK_DB_PASS:?BOOKSTACK_DB_PASS must be set}"

BACKUP_DIR="/home/nsa/backups/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"

echo "=== Backup started: $(date) ==="

# 1. PostgreSQL dumps
echo "[1/7] PostgreSQL dumps..."
docker exec jobtracker-db pg_dump -U jobtracker jobtracker | gzip > "$BACKUP_DIR/jobtracker.sql.gz"
docker exec intel-platform-postgres-1 pg_dump -U intel intel | gzip > "$BACKUP_DIR/intel-platform.sql.gz"
docker exec authentik-db pg_dump -U authentik authentik | gzip > "$BACKUP_DIR/authentik.sql.gz"
docker exec outline-db pg_dump -U outline outline | gzip > "$BACKUP_DIR/outline.sql.gz"
docker exec miniflux-db pg_dump -U miniflux miniflux | gzip > "$BACKUP_DIR/miniflux.sql.gz"

# 2. MariaDB dump
echo "[2/7] MariaDB dump..."
docker exec bookstack-db mysqldump -u bookstack --password="$BOOKSTACK_DB_PASS" bookstackapp | gzip > "$BACKUP_DIR/bookstack.sql.gz"

# 3. Vaultwarden (copying a live SQLite DB can race with writes; stop the
#    container or use sqlite3 .backup for a guaranteed-consistent snapshot)
echo "[3/7] Vaultwarden..."
cp -r ~/projects/vaultwarden/data/ "$BACKUP_DIR/vaultwarden-data/"

# 4. Forgejo
echo "[4/7] Forgejo..."
tar czf "$BACKUP_DIR/forgejo-data.tar.gz" -C ~/projects/forgejo data/

# 5. Config files
echo "[5/7] Config files..."
tar czf "$BACKUP_DIR/nginx-configs.tar.gz" /etc/nginx/sites-available/
tar czf "$BACKUP_DIR/env-files.tar.gz" ~/projects/job-tracker/.env ~/projects/intel-platform/.env 2>/dev/null || true

# 6. Smaller services
echo "[6/7] Service data..."
cp ~/projects/uptime-kuma/data/kuma.db "$BACKUP_DIR/kuma.db" 2>/dev/null || true
tar czf "$BACKUP_DIR/n8n-data.tar.gz" -C ~/projects/job-tracker n8n-data/

# 7. Cleanup (backups older than 7 days; -mindepth 1 ensures the parent
#    backups directory itself can never match and be deleted)
echo "[7/7] Cleanup..."
find /home/nsa/backups/ -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +

echo "=== Backup finished: $(date) ==="
echo "Size: $(du -sh "$BACKUP_DIR" | cut -f1)"
```
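Dumps can be silently truncated, e.g. when a container is down while its pipeline's gzip still succeeds. A small verification sketch that could run after the backup (`verify_dumps` is a hypothetical helper; note `gzip -t` checks archive integrity only, not the SQL inside):

```shell
# Verify every gzipped dump in a backup dir: non-empty and a valid gzip.
verify_dumps() {
  local dir="$1" status=0 f
  for f in "$dir"/*.sql.gz; do
    [ -e "$f" ] || continue                   # glob matched nothing
    if [ ! -s "$f" ] || ! gzip -t "$f" 2>/dev/null; then
      echo "CORRUPT OR EMPTY: $f" >&2
      status=1
    fi
  done
  return $status
}

# Example: verify_dumps "/home/nsa/backups/$(date +%Y-%m-%d)"
```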
## Setting up cron

```bash
# Daily at 03:00. Cron runs with a minimal environment, so export
# BOOKSTACK_DB_PASS in the crontab or source it inside the script.
0 3 * * * /home/nsa/bin/backup.sh >> /home/nsa/backups/backup.log 2>&1
```
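Cron failures are easy to miss, since errors only land in the log. A quick check of the log's tail shows whether the most recent run completed — a sketch, assuming the marker string is adjusted to match whatever the script's final echo actually prints:

```shell
# Hypothetical check: did the last backup run reach its completion line?
backup_ok() {
  local log="$1" marker="${2:-Backup finished}"
  tail -n 20 "$log" 2>/dev/null | grep -q "$marker"
}

# Example: backup_ok /home/nsa/backups/backup.log && echo "last run OK"
```

This pairs well with an Uptime Kuma push monitor or a daily mail so a silent failure surfaces within a day.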
## Offsite Backup Options

| Option | Cost | Encryption | Automation |
|---|---|---|---|
| Hetzner Storage Box | from €3.81/mo (1 TB) | via borgbackup | simple via SSH/rsync |
| Backblaze B2 | $6/TB/mo | S3-compatible, client-side | via rclone |
| rsync.net | $0.02/GB/mo | via borgbackup | native SSH |
| Second VPS | ~€5/mo | via rsync + GPG | full control |

Recommendation: Hetzner Storage Box + borgbackup for incremental, encrypted backups.
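The recommended setup can be sketched roughly as follows — the Storage Box username `u123456`, hostname, and repo path are placeholders, and borg reads the passphrase from `BORG_PASSPHRASE` or prompts interactively:

```shell
# One-time setup: create an encrypted repo on the Storage Box
# (Hetzner Storage Boxes accept SSH on port 23):
#   borg init --encryption=repokey-blake2 \
#     ssh://u123456@u123456.your-storagebox.de:23/./borg-repo

# Push the local backup dir as a new dated archive, then thin out old ones.
offsite_backup() {
  local repo="$1" src="${2:-/home/nsa/backups}"
  command -v borg >/dev/null || { echo "borg not installed" >&2; return 1; }
  borg create --stats --compression zstd "$repo::{now:%Y-%m-%d}" "$src"
  borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$repo"
}
```

Because borg deduplicates across archives, daily runs cost little extra space even though each archive looks like a full backup.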
## Roadmap

### Phase 1: Stability (current)

### Phase 2: Automation

### Phase 3: Optimization

### Phase 4: Expansion