My Nextcloud to FreeNAS backup strategy

I’m running a Nextcloud instance on a cloud server for easy remote work and file sharing. Sadly, at the time of writing, Nextcloud still has no official built-in backup system, so you have to build your own if you don’t want to risk losing your data in a server crash. In this post I’ll show my setup for making regular backups of my Nextcloud instance to the FreeNAS VM that I’m running at home.

The server-side backup script

This is the script responsible for creating the file and database backups of the Nextcloud instance. You need to edit NEXTCLOUD_DIR so that it points to your Nextcloud installation and BACKUP_DIR so that it points to the directory where the Nextcloud backups should be stored.

Important: The backup directory must contain the Nextcloud backup files and nothing else, because the script deletes old files from it.

#! /usr/bin/env bash

NEXTCLOUD_DIR=/var/www/nextcloud
BACKUP_DIR=/var/backups/nextcloud
BAK_FILE_DIR=nextcloud-dirbak_`date +"%Y%m%d"`.tar.gz
BAK_FILE_SQL=nextcloud-sqlbak_`date +"%Y%m%d"`.sql.gz

# put nextcloud in maintenance mode
sudo -u www-data php $NEXTCLOUD_DIR/occ maintenance:mode --on
# compress nextcloud directory
tar -cf - $NEXTCLOUD_DIR | gzip > $BACKUP_DIR/$BAK_FILE_DIR
# dump nextcloud db
mysqldump --default-character-set=utf8mb4 --single-transaction nextcloud | gzip > $BACKUP_DIR/$BAK_FILE_SQL
# disable maintenance mode
sudo -u www-data php $NEXTCLOUD_DIR/occ maintenance:mode --off
# delete backups older than 3 days
find $BACKUP_DIR -type f -mtime +3 -delete

Explanation:

Define the file names for the data directory and database backup files:

BAK_FILE_DIR=nextcloud-dirbak_`date +"%Y%m%d"`.tar.gz
BAK_FILE_SQL=nextcloud-sqlbak_`date +"%Y%m%d"`.sql.gz

(date +"%Y%m%d" will insert the current date like 20200524)

At the beginning we put Nextcloud into maintenance mode so that no one can modify any files while the backup process is running:

sudo -u www-data php $NEXTCLOUD_DIR/occ maintenance:mode --on

The next line creates a compressed archive of the complete Nextcloud installation directory, including the data directory (assuming it lives inside the Nextcloud directory, which is the default):

tar -cf - $NEXTCLOUD_DIR | gzip > $BACKUP_DIR/$BAK_FILE_DIR

Then we dump the Nextcloud database and also compress the SQL file. This assumes the user running the script can log in to MySQL/MariaDB without a password (for example root via socket authentication); otherwise you need to pass credentials to mysqldump:

mysqldump --default-character-set=utf8mb4 --single-transaction nextcloud | gzip > $BACKUP_DIR/$BAK_FILE_SQL

Now we disable maintenance mode:

sudo -u www-data php $NEXTCLOUD_DIR/occ maintenance:mode --off

And at last we delete all backups older than 3 days:

find $BACKUP_DIR -type f -mtime +3 -delete

Three days is pretty short, but since the plan is to back up these backups again on the NAS, we don’t need to keep many of them directly on the server, where disk space is more valuable.

Copy the shell script to a location of your choice and make it executable. For my setup that would be chmod +x /root/backup_nextcloud.sh

Now is a good time to test the script and make sure everything works.
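
For example, you can run the script once by hand and check that both archives show up in the backup directory (using the paths from the script above):

sudo /root/backup_nextcloud.sh
ls -lh /var/backups/nextcloud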

Finally, configure a Cronjob to execute this script daily. Make sure that the user executing the Cronjob has access to the Nextcloud directory and can run mysqldump. I’m running the backup as root at 1am, so we edit the cron file with crontab -e and add a new line:

0 1 * * * /root/backup_nextcloud.sh &> /dev/null
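
You can double-check that the entry was saved by listing root’s crontab:

crontab -l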

Setting up an rsync task in FreeNAS

To copy the files to our FreeNAS machine we make use of the built-in rsync task management GUI.

In the navigation select “Tasks” -> “Rsync Tasks” and click on “ADD” in the top right corner to add a new rsync task.

You can now configure the settings:

[Screenshot: FreeNAS Rsync Task GUI]

  • As Path select the path on your NAS where you want to store the backups.
  • As User select root.
  • As Remote Host enter the hostname/IP of your server, e.g. my.server.com. If you want to log in as a user other than root, prepend the user’s name with an @ like so: customuser@my.server.com.
  • As Mode select SSH, and update the port if you have SSH running on another port.
  • As Remote Path enter the path on your server where the backups are stored.
  • As Direction choose PULL because we want to download the data from the remote host; the default (PUSH) would upload data to the remote host.
  • Under Schedule the Rsync Task modify the default schedule to run daily, always a few hours after the Cronjob on the server, to make sure the backup has finished before we download it. Here I scheduled it for 6am while my Cronjob on the server runs at 1am, giving the backup 5 hours to finish, which should be plenty.

Enable the Task and click SAVE.
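
For reference, with the settings above the task essentially boils down to something like the following rsync command run on the FreeNAS box (hostname and paths are the examples used in this post, and the exact flags FreeNAS generates may differ):

rsync -avz -e "ssh -p 22" root@my.server.com:/var/backups/nextcloud/ /mnt/exos-6tb-pool/backup/Nextcloud/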

For rsync over SSH to work, the root user on your FreeNAS machine needs an SSH key whose public part is added to the remote user’s authorized_keys. Also make sure to connect once manually via SSH from your FreeNAS shell before running the rsync task (so that the remote host key gets accepted), otherwise the Rsync Task will not work!
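
If root does not have a key pair yet, the setup looks roughly like this in the FreeNAS shell (assuming root as the remote user and the default key location; adjust to your setup):

# create the key directory and a key pair without a passphrase so the task can run unattended
mkdir -p /root/.ssh
ssh-keygen -t rsa -f /root/.ssh/id_rsa -N ""
# append the public key to the remote user's authorized_keys
cat /root/.ssh/id_rsa.pub | ssh root@my.server.com "cat >> ~/.ssh/authorized_keys"
# connect once so the remote host key gets accepted
ssh root@my.server.com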

Of course we don’t want our whole disk to gradually fill up with old backups, so we can create a Cronjob on FreeNAS that deletes them:

[Screenshot: FreeNAS Cronjob GUI]

Delete all files older than 10 days:

find /mnt/exos-6tb-pool/backup/Nextcloud -type f -mtime +10 -delete

(of course you have to modify it to match your directory)

I have some ideas on how to improve the Nextcloud backup process in the future, but this is how it has worked for me for the past few years.

