Compare commits


10 Commits

SHA1 Message Date
2833f12b1a make log file destination configurable 2025-01-12 12:01:45 +01:00
a75604c097 fix expansion 2023-12-13 16:53:15 +01:00
2c643c73c6 delete temporary directories before starting host backup 2023-12-13 16:47:52 +01:00
8faea3fef1 Run two-stages backups (#1)
1. Download to a local directory, archive/encrypt there
1. Push encrypted archive to remote folder

This helps when the backup destination is e.g. a NFS drive.

Also, introduce harder checks and fix some flaws.

Reviewed-on: #1
Co-authored-by: Max Mehl <mail@mehl.mx>
Co-committed-by: Max Mehl <mail@mehl.mx>
2023-12-13 12:22:54 +01:00
6da9f1fabc allow to backup a specific host 2021-08-30 19:45:32 +02:00
46903a4038 support non-standard SSH ports, fix some shellchecks 2021-08-30 19:27:47 +02:00
4c8127c388 fix behaviour with UTF-8 file names; make mysql backup source depend on uberspace version 2020-11-06 15:13:37 +01:00
31105d3875 add REUSE badge 2019-09-04 12:01:45 +02:00
96be32af8b remove unused CIs 2019-08-07 10:57:12 +02:00
c046b4d0ca SPDX-Copyright -> SPDX-FileCopyrightText 2019-08-07 10:56:34 +02:00
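Commit 8faea3fef1 above introduces the two-stage flow: download to a local directory, archive there, then push only the finished archive to the (possibly slow, e.g. NFS) destination. A minimal sketch of that idea — encryption omitted, `cp -a` standing in for rsync, and all names/paths illustrative rather than taken from the script:

```shell
#!/usr/bin/env bash
# Sketch of the two-stage backup flow from commit 8faea3fef1.
# The real script runs gpg --encrypt between tar and the final copy;
# that step is omitted here because it needs an imported public key.
set -euo pipefail

two_stage_backup() {
  local src="$1" tempdir="$2" final="$3"
  local name
  name=$(basename "$src")

  # Stage 1: copy to a fast local directory and archive there
  mkdir -p "$tempdir" "$final"
  cp -a "$src" "$tempdir/$name"
  tar cf "$tempdir/$name.tar" -C "$tempdir" "$name"
  rm -rf "${tempdir:?}/$name"

  # Stage 2: push only the finished archive across the slow link
  cp "$tempdir/$name.tar" "$final/"
  rm "$tempdir/$name.tar"
}
```

Archiving locally first means the remote mount sees one large sequential write instead of many small files, which is exactly the pain point the commit message describes.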
9 changed files with 212 additions and 105 deletions


@@ -1,4 +1,4 @@
-# SPDX-Copyright: 2019 Free Software Foundation Europe e.V.
+# SPDX-FileCopyrightText: 2019 Free Software Foundation Europe e.V.
 # SPDX-License-Identifier: CC0-1.0
 pipeline:

.gitignore

@@ -1,4 +1,4 @@
-# SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+# SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 # SPDX-License-Identifier: CC0-1.0
 config.cfg


@@ -1,7 +0,0 @@
-# SPDX-Copyright: 2019 Free Software Foundation Europe e.V.
-# SPDX-License-Identifier: CC0-1.0
-
-reuse:
-  image: fsfe/reuse:latest
-  script:
-    - reuse lint


@@ -1,11 +0,0 @@
-# SPDX-Copyright: 2019 Free Software Foundation Europe e.V.
-# SPDX-License-Identifier: CC0-1.0
-
-language: minimal
-
-services:
-  - docker
-
-before_install:
-  - docker pull fsfe/reuse:latest
-  - docker run --name reuse -v ${TRAVIS_BUILD_DIR}:/repo fsfe/reuse /bin/sh -c "cd /repo; reuse lint"


@@ -1,21 +1,33 @@
 <!--
-SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 SPDX-License-Identifier: GPL-3.0-or-later
 -->
 
 # Uberspace Backup
 
-This Bash script is able to backup directories from Uberspace users (and also other SSH resources). For Uberspace hosts it can also backup MySQL databases by copying the backups Uberspace cleverly created for you.
+[![REUSE compliant](https://api.reuse.software/badge/src.mehl.mx/mxmehl/uberspace-backup)](https://api.reuse.software/info/src.mehl.mx/mxmehl/uberspace-backup)
+
+This Bash script is able to backup directories from Uberspace users (and also
+other SSH resources). For Uberspace hosts it can also backup MySQL databases by
+copying the backups Uberspace cleverly created for you.
 
-It is designed to work automatically on another server with enough harddisk space.
+It is designed to work automatically on another server with enough harddisk
+space.
 
 ## Features
 
 - Transfers files securely via rsync over SSH.
-- Encrypts backups with GnuPG, using a public key only. Once a backup is encrypted it can only be decrypted with a private key. Make sure to delete the private key from the backupping server (after saving it on a more secure space of course) to keep your backups even safer.
-- If desired, it can delete older backups and only retain a configurable amount of backups.
+- Encrypts backups with GnuPG, using a public key only. Once a backup is
+  encrypted it can only be decrypted with a private key. Make sure to delete the
+  private key from the backupping server (after saving it on a more secure space
+  of course) to keep your backups even safer.
+- If desired, it can delete older backups and only retain a configurable amount
+  of backups.
 - Rather verbose logs will be written to backup.log.
-- With the helper script `ssh-checker.sh` one can automatically test whether the hosts provided in the hosts file can be accessed. If not, the little helper is trying to put your public SSH key to the remote hosts' authorized_keys files by letting you type in the password manually once.
+- With the helper script `ssh-checker.sh` one can automatically test whether the
+  hosts provided in the hosts file can be accessed. If not, the little helper is
+  trying to put your public SSH key to the remote hosts' authorized_keys files
+  by letting you type in the password manually once.
 
 ## Configuration
@@ -23,24 +35,60 @@ Configuration happens in two files: config.cfg and hosts.csv.
 
 ### config.cfg
 
-Everything should be self-explanatory with the comments. Make sure to use the correct GPG fingerprint, and make sure to have its public key imported by the user executing the script. No private key has to be installed on the backupping system (but on the decrypting one of course).
+Everything should be self-explanatory with the comments. Make sure to use the
+correct GPG fingerprint, and make sure to have its public key imported by the
+user executing the script. No private key has to be installed on the backupping
+system (but on the decrypting one of course).
 
 ### hosts.csv
 
-This file contains the hosts and its directories that shall be saved. It consists of two rows separated by `;`. The first one contains a `username@hostname` combination that will be used to sync files via SSH, and also as the backup destination directory name.
+This file contains the hosts and its directories that shall be saved. It
+consists of two rows separated by `;`. The first one contains a
+`username@hostname` combination that will be used to sync files via SSH, and
+also as the backup destination directory name.
 
-The latter one contains all source directories that shall be transferred. This can be absolute file paths, or if it's a Uberspace host some special shortcuts:
+The latter one contains all source directories that shall be transferred. This
+can be absolute file paths, or if it's a Uberspace host some special
+shortcuts:
 
-- `%virtual` backups the virtual folder of your uberspace host (`/var/www/virtual/username/`) where for example the `html` folder is located in.
-- `%mysql` downloads the latest backup of your MySQL databases that have been created by Uberspace themselves (their backup system is quite sophisticated).
-- `%mails` downloads the directory `users` in the home directory which contains all email files of virtual mail users.
+- `%virtual` backups the virtual folder of your uberspace host
+  (`/var/www/virtual/username/`) where for example the `html` folder is located
+  in.
+- `%mysql` downloads the latest backup of your MySQL databases that have been
+  created by Uberspace themselves (their backup system is quite sophisticated).
+- `%mails` downloads the directory `users` in the home directory which contains
+  all email files of virtual mail users.
 - `%home` simply downloads the whole user's home directory.
 
-You can give multiple locations that shall be backed up. Just separate them by `|` characters. See the example file for more.
+You can give multiple locations that shall be backed up. Just separate them by
+`|` characters. See the example file for more.
 
+## Process
+
+The script runs the following most important steps:
+
+1. For each host in `hosts.csv`
+   1. Check SSH connection
+   1. Compose SSH host settings
+   1. For each backup source
+      1. Resolve special backup sources
+      1. Create backup destination
+      1. rsync source to destination
+      1. tar the destination
+      1. gpg-encrypt the destination
+   1. Delete older backups
+   1. Output completion info
+
 ## Manual run
 
 You can run `ssh-checker.sh` and `uberspace-backup.sh` manually. Without any arguments given, both will check/backup all hosts.
+
+You can provide an argument to check/backup a specific host. This argument has
+to fully match a server's `user@hostname[:port]` declaration as on `hosts.csv`.
 
 ## Automatic runs
 
-In order to let the script run regularily, simply put the script's absolute path in a cron file. For example, run `crontab -e` and insert at the bottom:
+In order to let the script run regularily, simply put the script's absolute path
+in a cron file. For example, run `crontab -e` and insert at the bottom:
 
 ```
 10 3 * * * /home/archiver/uberspace-backup/uberspace-backup.sh
@@ -50,5 +98,8 @@ This will run the backups every night at 3:10.
 
 ## Known limitations
 
-- Please note that paths like `~` or `$HOME` haven't been tested yet. Use absolute paths instead.
-- At the moment, the backups don't follow symbolic links. That's why for example error logs aren't downloaded when using `%virtual`. Make sure to regularily check your backups to make sure all important files are saved.
+- Please note that paths like `~` or `$HOME` haven't been tested yet. Use
+  absolute paths instead.
+- At the moment, the backups don't follow symbolic links. That's why for example
+  error logs aren't downloaded when using `%virtual`. Make sure to regularly
+  check your backups to make sure all important files are saved.
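The `%`-shortcuts documented in the README diff above map each hosts.csv entry to a concrete remote path. A stand-alone sketch of that mapping — `resolve_source` is an illustrative helper name, not a function from the scripts, and the Uberspace-6/7 MySQL paths are taken from the main script's diff further down:

```shell
#!/usr/bin/env bash
# Map the hosts.csv %-shortcuts to remote source paths, following the
# README. resolve_source is an illustrative helper, not part of the
# actual scripts.
resolve_source() {
  local rdir="$1" ruser="$2" us_version="${3:-7}"
  case "$rdir" in
    %virtual) echo "/var/www/virtual/${ruser}" ;;
    %mails)   echo "/home/${ruser}/users" ;;
    %home)    echo "/home/${ruser}" ;;
    %mysql)
      # Uberspace 6 and 7 store the nightly MySQL backups differently
      if [ "$us_version" = 6 ]; then
        echo "/mysqlbackup/latest/${ruser}"
      else
        echo "/mysql_backup/current/${ruser}"
      fi
      ;;
    *)        echo "$rdir" ;;  # absolute paths pass through unchanged
  esac
}
```

For example, `resolve_source %virtual alice` prints `/var/www/virtual/alice`, while a plain absolute path is returned untouched.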


@@ -1,11 +1,14 @@
-# SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+# SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 # SPDX-License-Identifier: CC0-1.0
 
 # File with hosts and their backup source paths
 HOSTS="$CURDIR"/hosts.csv
 
+# Temporary download destination for backups
+TEMPDIR=/tmp/uberspace-backup
+
 # root dir where backups shall be saved to
-BACKUPDIR=/var/backups/uberspace
+BACKUPDIR=/mnt/remotesrv/uberspace
 
 # GPG fingerprint of key used for encryption
 GPG=6775E8DDD8CEABCC83E38CEHE6334BCA29DF8192
@@ -15,3 +18,6 @@ MAXBAK=3
 
 # SSH key
 #SSH_KEY="~/.ssh/mykey_rsa"
+
+# Logfile. Default: $CURDIR/backup.log
+# LOG_FILE=/var/log/uberspace-backup.log


@@ -1,6 +1,6 @@
-# SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+# SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 # SPDX-License-Identifier: CC0-1.0
 
-# Username@Hostname; Path1 | Path2 | Path3
-root@server; /home
+# Username@Hostname[:Port]; Path1 | Path2 | Path3; Uberspace version (default = 7)
+user@host.uberspace.de; %virtual | %mysql | /home/user/service
+root@server:2222; /home
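A hosts.csv line like the samples above splits into host, optional port, path list, and Uberspace version. A hedged sketch of that parsing, mirroring the `cut` and trim calls in the scripts (the `parse_hosts_line` wrapper itself is illustrative; the variable names match the scripts):

```shell
#!/usr/bin/env bash
# Split one hosts.csv line into its parts, the way the scripts do it
# with cut and a sed-based trim. parse_hosts_line is a sketch, not a
# function from the repository.
parse_hosts_line() {
  local line="$1"
  local trim='s/^[[:space:]]*//;s/[[:space:]]*$//'
  RHOST=$(echo "$line" | cut -d";" -f1 | sed "$trim")
  ALLRDIR=$(echo "$line" | cut -d";" -f2 | sed "$trim")
  US_VERSION=$(echo "$line" | cut -d";" -f3 | sed "$trim")
  # Optional :port suffix on the host part
  if echo "$RHOST" | grep -q ":"; then
    RPORT=$(echo "$RHOST" | cut -d":" -f2)
    RHOST=$(echo "$RHOST" | cut -d":" -f1)
  else
    RPORT=""
  fi
}
```

Running it on `root@server:2222; /home` yields `RHOST=root@server`, `RPORT=2222`, and `ALLRDIR=/home`.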


@@ -1,5 +1,5 @@
 #!/usr/bin/env bash
-# SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+# SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 # SPDX-License-Identifier: GPL-3.0-or-later
 ########################################################################
 #
@@ -15,7 +15,7 @@ source "$CURDIR"/config.cfg
 
 if [ ! -e "${HOSTS}" ]; then echo "Missing hosts file. Please set a correct value of HOSTS= in your config file. Current value: ${HOSTS}"; exit 1; fi
 
-if [ ! -z "${SSH_KEY}" ]; then
+if [ -n "${SSH_KEY}" ]; then
   SSH_KEY_ARG="-i ${SSH_KEY}"
 else
   # defaults
@@ -29,19 +29,31 @@ function trim {
   sed -r -e 's/^\s*//g' -e 's/\s*$//g'
 }
 
-while read line; do
+while read -r line; do
 
   # if line is a comment, go to next line
-  if $(echo "$line" | grep -qE "^\s*#"); then continue; fi
+  if echo "$line" | grep -qE "^\s*#"; then continue; fi
 
   RHOST=$(echo "$line" | cut -d";" -f1 | trim)
 
+  # Jump to next line if this line's host does not match host of ARG1 (if given)
+  if [[ "${ARG1}" != "" ]] && [[ "${ARG1}" != "${RHOST}" ]]; then
+    continue
+  fi
+
+  # Get SSH port if needed
+  if echo "$RHOST" | grep -q ":"; then
+    RPORT=$(echo "$RHOST" | cut -d":" -f2)
+    RHOST=$(echo "$RHOST" | cut -d":" -f1)
+    RPORT_ARG="-p ${RPORT}"
+  else
+    # defaults
+    RPORT=""
+    RPORT_ARG=""
+  fi
+
   echo "[INFO] Trying ${RHOST}"
 
-  STATUS=$(ssh -n -o StrictHostKeyChecking=no -o BatchMode=yes -o ConnectTimeout=5 ${SSH_KEY_ARG} ${RHOST} "echo -n"; echo $?)
+  STATUS=$(ssh -n -o StrictHostKeyChecking=no -o BatchMode=yes -o ConnectTimeout=5 ${RPORT_ARG} ${SSH_KEY_ARG} "${RHOST}" "echo -n"; echo $?)
 
   if [ $STATUS != 0 ]; then
     echo -n "[ERROR] No SSH login possible for ${RHOST}. "
@@ -50,12 +62,10 @@ while read line; do
       exit 1
     else
       echo "Adding public key with password: "
-      cat "${SSH_KEY}".pub | ssh ${RHOST} 'cat >> ~/.ssh/authorized_keys'
+      cat "${SSH_KEY}".pub | ssh -o StrictHostKeyChecking=no ${RPORT_ARG} ${SSH_KEY_ARG} "${RHOST}" 'cat >> ~/.ssh/authorized_keys'
     fi
   else
     echo "[SUCCESS] SSH login possible for ${RHOST}."
   fi
 
   echo
 
 done < "$HOSTS"


@@ -1,5 +1,5 @@
 #!/usr/bin/env bash
-# SPDX-Copyright: 2019 Max Mehl <mail [at] mehl [dot] mx>
+# SPDX-FileCopyrightText: 2019 Max Mehl <mail [at] mehl [dot] mx>
 # SPDX-License-Identifier: GPL-3.0-or-later
 ########################################################################
 #
@@ -9,13 +9,22 @@
 #
 ########################################################################
 
+# Fail fast on errors
+set -Eeuo pipefail
+
+# Set correct UTF-8 encoding (for FreeBSD jail)
+export LC_ALL=en_US.UTF-8
+
+# Initialise variables
+LOG_FILE=
+
 CURDIR=$(dirname "$(readlink -f "$0")")
 
 if [ ! -e "$CURDIR"/config.cfg ]; then echo "Missing config.cfg file. Edit and rename config.cfg.sample"; exit 1; fi
 source "$CURDIR"/config.cfg
 if [ ! -e "${HOSTS}" ]; then echo "Missing hosts file. Please set a correct value of HOSTS= in your config file. Current value: ${HOSTS}"; exit 1; fi
 
-if [ ! -z "${SSH_KEY}" ]; then
+if [ -n "${SSH_KEY}" ]; then
   SSH_KEY_ARG="-i ${SSH_KEY}"
 else
   # defaults
@@ -23,9 +32,13 @@ else
   SSH_KEY=~/.ssh/id_rsa
 fi
 
+if [ -z "${LOG_FILE}" ]; then
+  # defaults
+  LOG_FILE="$CURDIR"/backup.log
+fi
+
 # Get current date
 DATE=$(date +"%Y-%m-%d_%H-%M")
-LOG="$CURDIR"/backup.log
 
 function trim {
   sed -r -e 's/^[[:space:]]*//g' -e 's/[[:space:]]*$//g'
@@ -36,75 +49,120 @@ function pdate {
 }
 
 function logecho {
   # Echo string and copy it to log while attaching the current date
-  echo "$(pdate) $@"
-  echo "$(pdate) $@" >> "$LOG"
+  echo "$(pdate) $*"
+  echo "$(pdate) $*" >> "$LOG_FILE"
 }
 
-while read line; do
-
-  # if line is a comment, go to next line
-  if $(echo "$line" | grep -qE "^\s*#"); then continue; fi
+# Loop over all hosts
+while read -r line; do
+
+  # if line is a comment or blank, go to next line
+  if echo "$line" | grep -qE "^\s*(#|$)"; then continue; fi
 
   RHOST=$(echo "$line" | cut -d";" -f1 | trim)
-  RUSER=$(echo "$RHOST" | cut -d"@" -f1)
-  ALLRDIR=$(echo "$line" | cut -d";" -f2 | trim)
-
-  logecho "${RHOST}: Starting backups"
+
+  # Jump to next line if this line's host does not match host of first argument (if given)
+  if [[ "${1-}" != "" ]] && [[ "${1-}" != "${RHOST}" ]]; then
+    continue
+  fi
 
   # Task ssh-checker.sh to check this host
   if ! "${CURDIR}"/ssh-checker.sh "${RHOST}"; then
-    logecho "${RHOST}: ERROR when connecting via SSH. Please run ssh-checker.sh to debug."
+    logecho "${RHOST}: Aborting backup after an error."
     continue
  fi
 
-  NORDIR=$(echo $ALLRDIR | grep -o "|" | wc -l)
-  NORDIR=$[$NORDIR + 1]
+  RUSER=$(echo "$RHOST" | cut -d"@" -f1)
+  ALLRDIR=$(echo "$line" | cut -d";" -f2 | trim)
+  US_VERSION=$(echo "$line" | cut -d";" -f3 | trim)
+
+  # Get SSH port if needed
+  if echo "$RHOST" | grep -q ":"; then
+    RPORT=$(echo "$RHOST" | cut -d":" -f2)
+    RHOST=$(echo "$RHOST" | cut -d":" -f1)
+    RPORT_ARG="-p ${RPORT}"
+  else
+    # defaults
+    RPORT=""
+    RPORT_ARG=""
+  fi
+
+  logecho "${RHOST}: Starting backups"
+
+  logecho "${RHOST}: Deleting host's temporary directories in ${TEMPDIR}"
+  rm -rf "${TEMPDIR:?}/${RHOST:?}/"*
+
+  NORDIR=$(echo "$ALLRDIR" | grep -o "|" | wc -l || true)
+  NORDIR=$(($NORDIR + 1))
 
+  # Loop through all backup sources
   for ((i = 1; i <= $NORDIR; i++)); do
 
     RDIR=$(echo "$ALLRDIR" | cut -d"|" -f${i} | trim)
 
+    # Set a relative destination directory
     if [ "${RDIR}" == "%virtual" ]; then
       RDIR=/var/www/virtual/${RUSER}
-      DEST="$BACKUPDIR/$RHOST/$DATE/virtual"
+      DEST_REL="$RHOST/$DATE/virtual"
     elif [ "${RDIR}" == "%mysql" ]; then
       RDIR=mysql
-      DEST="$BACKUPDIR/$RHOST/$DATE/$(basename "${RDIR}")"
+      DEST_REL="$RHOST/$DATE/$(basename "${RDIR}")"
     elif [ "${RDIR}" == "%mails" ]; then
       RDIR=/home/${RUSER}/users
-      DEST="$BACKUPDIR/$RHOST/$DATE/mails"
+      DEST_REL="$RHOST/$DATE/mails"
     elif [ "${RDIR}" == "%home" ]; then
       RDIR=/home/${RUSER}
-      DEST="$BACKUPDIR/$RHOST/$DATE/home"
+      DEST_REL="$RHOST/$DATE/home"
     else
-      DEST="$BACKUPDIR/$RHOST/$DATE/$(basename "${RDIR}")"
+      DEST_REL="$RHOST/$DATE/$(basename "${RDIR}")"
     fi
 
+    # Define absolute temporary and final backup destination paths
+    # Example:
+    #   DEST=/tmp/uberspace-backup/user@example.com/2019-01-01/virtual
+    #   DEST_FINAL=/media/Uberspace/user@example.com/2019-01-01/
+    DEST="${TEMPDIR}/${DEST_REL}"
+    DEST_FINAL="$(dirname "${BACKUPDIR}/${DEST_REL}")"
+
     # Set Source directory, and make exception for %mysql
     SOURCE="${RDIR}"
-    if [ "${RDIR}" == "mysql" ]; then SOURCE=/mysqlbackup/latest/${RUSER}; fi
+    if [ "${RDIR}" == "mysql" ]; then
+      if [[ $US_VERSION == 6 ]]; then
+        SOURCE=/mysqlbackup/latest/${RUSER}
+      else
+        SOURCE=/mysql_backup/current/${RUSER}
+      fi
+    fi
 
-    # Create backup destination if necessary
+    # Create temporary and final backup destination if necessary
     if [ ! -e "${DEST}" ]; then mkdir -p "${DEST}"; fi
+    if [ ! -e "${DEST_FINAL}" ]; then mkdir -p "${DEST_FINAL}"; fi
 
     # RSYNC
     logecho "${RHOST}: Downloading ${SOURCE} to ${DEST}"
-    rsync -a -e "ssh -q -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o Compression=no -T -x ${SSH_KEY_ARG}" ${RHOST}:${SOURCE}/ "${DEST}"/
+    rsync -a -e "ssh -q -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o Compression=no -T -x ${RPORT_ARG} ${SSH_KEY_ARG}" "${RHOST}:${SOURCE}"/ "${DEST}"/
 
     # Pack backup directory, and delete uncompressed one
-    logecho "${RHOST}: Archiving $(basename ${DEST})"
-    tar cf ${DEST}.tar -C $(echo ${DEST} | sed "s|$(basename ${DEST})$||") $(basename ${DEST}) # TODO: avoid absolute paths
-    rm -rf ${DEST}
+    logecho "${RHOST}: Archiving $(basename "${DEST}")"
+    tar cf "${DEST}".tar -C $(echo ${DEST} | sed "s|$(basename ${DEST})$||") $(basename ${DEST}) # TODO: avoid absolute paths
+    rm -rf "${DEST}"
 
     # Encrypt archive with GPG (it compresses at the same time)
-    logecho "${RHOST}: Encrypting and compressing $(basename ${DEST})"
-    gpg --output ${DEST}.tar.gpg --encrypt --recipient ${GPG} ${DEST}.tar
-    rm ${DEST}.tar
+    logecho "${RHOST}: Encrypting and compressing $(basename "${DEST}")"
+    gpg --output "${DEST}".tar.gpg --encrypt --recipient ${GPG} "${DEST}".tar
+    rm "${DEST}".tar
 
+    # Push encrypted backup to final backup destination
+    logecho "${RHOST}: Moving $(basename "${DEST}") to ${DEST_FINAL}"
+    cp "${DEST}".tar.gpg "${DEST_FINAL}/"
+    rm "${DEST}".tar.gpg
+
   done # End of loop through all backup sources
 
   # Delete all old directories except the $MAXBAK most recent
   if [ $(ls -tp "${BACKUPDIR}"/"${RHOST}"/ | grep '/$' | wc -l | tr -d ' ') -gt $MAXBAK ]; then
-    logecho "${RHOST}: Removing older backups of $(basename ${DEST})"
-    ls -tpd "${BACKUPDIR}"/"${RHOST}"/* | grep '/$' | tail -n +$[$MAXBAK + 1] | xargs -0 | xargs rm -r --
+    oldbackups=$(ls -tp "${BACKUPDIR}"/"${RHOST}"/ | grep '/$' | tail -n +$(($MAXBAK + 1)))
+    logecho "${RHOST}: Removing older backup directories: ${oldbackups}"
+    ls -tpd "${BACKUPDIR}"/"${RHOST}"/* | grep '/$' | tail -n +$(($MAXBAK + 1)) | xargs -0 | xargs rm -r --
  fi
 
-done
+done < "$HOSTS"
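The retention step at the end of the script keeps only the `$MAXBAK` most recent backup directories per host. The same idea can be isolated into a small sketch; `prune_backups` is an illustrative name, and it sorts by directory name rather than mtime, which works because the backup directories carry `YYYY-MM-DD_HH-MM` date stamps:

```shell
#!/usr/bin/env bash
# Keep only the N most recent date-stamped backup directories under a
# host's backup root. Unlike the script's mtime-based ls -tp, this
# sketch sorts by name (the date stamps sort chronologically).
prune_backups() {
  local dir="$1" keep="$2"
  local d
  # sort -r puts the newest date stamp first; everything past
  # position $keep gets removed
  ls -1 "$dir" | sort -r | tail -n +"$((keep + 1))" | while read -r d; do
    rm -r -- "${dir:?}/${d}"
  done
}
```

Sorting by name sidesteps the fragility of `ls -t` output parsing, though it assumes every entry under the host directory follows the date-stamp naming scheme.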