Direct to S3 Glacier backup and remote sync

After setting up Mystic BBS on a VM, one soon needs to back up user-generated content and remotely sync configs. To do this we leveraged git, acl, cron, and s3fuse.

Goal

  • Create an autonomous, direct-to-Glacier archive of key paths
  • Leverage git-archive and gitignore
  • Leverage git remote

Setup

Prereq: a preconfigured VM running Linux, access to AWS S3 (or a deployed MinIO), and a configured git.

sudo apt install -y s3fuse acl

Once installed, configure the fstab entry and /etc/s3fuse.conf, then set up the /etc/passwd-s3fs credentials file.

/etc/fstab entry

s3fuse  /mnt/backup fuse defaults,noauto,user,_netdev,allow_other,nonempty,umask=277,uid=1000,gid=1000      0 0
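The umask=277 in those options clears the owner-write bit and all group/other bits, so everything under the mount surfaces read-only to uid/gid 1000. A quick sanity check of the resulting modes:

```shell
# umask 277 masks off write for the owner and everything for group/other:
# files start from 0666, directories from 0777.
printf 'file mode: %04o\n' $(( 0666 & ~0277 ))   # 0400 (r--------)
printf 'dir mode:  %04o\n' $(( 0777 & ~0277 ))   # 0500 (r-x------)
```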

shell

/etc$ sudo mkdir /mnt/backup

/etc$ # request the Glacier storage class (optional; skip for plain S3)

/etc$ [ -z "$USE_GLACIER" ] || sudo setfattr -n user.s3fuse_request_restore -v 4 /mnt/backup
/etc$ sudo touch /etc/passwd-s3fs && sudo chmod 0600 /etc/passwd-s3fs && sudo chown root:root /etc/passwd-s3fs
/etc$ echo "$AWS_ACCESS_KEY_ID $AWS_SECRET_ACCESS_KEY" | sudo tee -a /etc/passwd-s3fs >/dev/null

/etc$ cat /etc/s3fuse.conf

## S3 bucket s3://... {xmcore-backup}

bucket_name=... #xmcore-backup
service=aws
aws_secret_file=/etc/passwd-s3fs

Next, create the bucket and mount it. Once that is complete one can set up git:

cd /srv/bbs

touch .gitignore

git init .
git add -A
git commit -m 'initial commit'
git tag $(date +%s)
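The setup above can be verified end to end in a scratch directory (the temp path and user identity below are illustrative stand-ins):

```shell
set -e
cd "$(mktemp -d)"                         # stand-in for /srv/bbs
git init -q .
git config user.email sysop@example.com   # a fresh VM needs an identity set
git config user.name  sysop
touch .gitignore
git add -A                                # stage everything not ignored
git commit -q -m 'initial commit'
git tag "$(date +%s)"                     # epoch-stamped tag, as above
git describe --tags                       # prints the tag just created
```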

Operations

When one is ready to perform a backup, the live delta is visible with git status.
Inspecting changes is just as easy with git diff, and a log of committed changes with
git log.
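That inspection loop can be tried out in a scratch repo (file names here are made up for illustration):

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email sysop@example.com && git config user.name sysop
echo 'hello' > msgs.dat
git add -A && git commit -q -m 'baseline'
echo 'world' >> msgs.dat        # simulate fresh user-generated content
git status --short              # the live delta:  M msgs.dat
git diff --stat                 # per-file change summary
git log --oneline               # history of committed changes
```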

Automation

So long as one is dutiful about maintaining a healthy .gitignore, automating is easy from here:

sync-with-live () {
  git commit -a -m "sync $(date +%s)"
  git tag $(date +%s)
}

send-to-s3 () {
  LATEST=$(git describe --abbrev=0 --tags)
  git archive --format=tar --prefix=${HOSTNAME}/ ${LATEST} | gzip > /mnt/backup/${HOSTNAME}-backup.${LATEST}.tgz
}


# Example
cd /srv/bbs; sync-with-live && send-to-s3
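The two functions can be exercised end to end without touching S3. Here BACKUP_DIR is a hypothetical stand-in for the /mnt/backup mount, and the new file is staged explicitly, since commit -a only picks up already-tracked paths:

```shell
set -e
: "${HOSTNAME:=$(hostname)}"
BACKUP_DIR=$(mktemp -d)            # hypothetical stand-in for /mnt/backup

sync-with-live () {
  git commit -a -m "sync $(date +%s)"
  git tag "$(date +%s)"
}

send-to-s3 () {
  LATEST=$(git describe --abbrev=0 --tags)
  git archive --format=tar --prefix="${HOSTNAME}/" "${LATEST}" \
    | gzip > "${BACKUP_DIR}/${HOSTNAME}-backup.${LATEST}.tgz"
}

cd "$(mktemp -d)"                  # stand-in for /srv/bbs
git init -q .
git config user.email sysop@example.com && git config user.name sysop
echo 'data' > users.dat
git add -A                         # commit -a alone skips untracked files
sync-with-live && send-to-s3
ls "$BACKUP_DIR"                   # one ${HOSTNAME}-backup.<epoch>.tgz
```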

Assuming one has placed the above bash functions into their ~/.bashrc, the following cron job creates a weekly backup.

crontab

@weekly bash -lic "cd /srv/bbs; sync-with-live && send-to-s3"

Next steps

One can take this further by signing commits and by encrypting on the fly with git archive ... | gzip | gpg --encrypt > ...$target to ensure data protection, or by using a git remote repository for storing configurations and syncing with other nodes.
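Here is a sketch of the encrypt-on-the-fly pipeline, using gpg's symmetric mode so it runs without a keyring (a real setup would use --encrypt -r <recipient>); the tag, prefix, and file names are made up for the demo. Note that gzip comes before gpg, since ciphertext is high-entropy and compresses poorly:

```shell
set -e
cd "$(mktemp -d)"                  # throwaway repo for the demo
git init -q .
git config user.email sysop@example.com && git config user.name sysop
echo 'secret' > node.cfg
git add -A && git commit -q -m 'cfg' && git tag 1700000000

# compress first, then encrypt
git archive --format=tar --prefix=cfg/ 1700000000 \
  | gzip \
  | gpg --batch --yes --pinentry-mode loopback --passphrase demo --symmetric \
  > backup.tgz.gpg

# round-trip: decrypt and list the archive contents
gpg --batch --yes --pinentry-mode loopback --passphrase demo --decrypt backup.tgz.gpg 2>/dev/null \
  | tar -tzf - | grep cfg/node.cfg
```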

There are also ways to vary the target filename and prefix, which I'll leave as an exercise for the reader.