Secure & Automated Cloud Backups

An Interactive Guide to Borg, rclone, Docker, and Google Drive

The Modern Backup Strategy

This guide details a powerful, private, and efficient backup solution. We combine four key technologies to create an automated workflow that encrypts your data on your machine before sending it to the cloud. This keeps your data private and significantly reduces the storage space required through intelligent deduplication. The entire process is containerized with Docker for portability and consistency.

Workflow at a Glance

📂

Your Data

The source files you want to protect.

📦

BorgBackup

Encrypts, compresses, and deduplicates data into a secure repository.

🔄

rclone

Syncs the encrypted repository to your cloud provider.

☁️

Google Drive

Your secure, encrypted off-site storage.

Core Concepts

Client-Side Encryption

Your data is encrypted with your password on your computer before it ever leaves for the cloud. Google only stores scrambled, unreadable data blocks.
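In Borg's terms, the encryption mode is fixed when the repository is created. A typical invocation looks like the following (the repository path is illustrative):

```shell
# repokey mode stores the key, protected by your passphrase,
# inside the repository itself.
borg init --encryption=repokey /borg-repo
```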

Deduplication

Borg intelligently breaks files into chunks. It only uploads new or changed chunks, making daily backups extremely fast and space-efficient, even for large files.
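The chunk idea can be illustrated with a toy shell example. This is not Borg's actual algorithm (Borg uses content-defined chunking with rolling hashes); fixed 4-byte "chunks" stand in purely to show why duplicates cost nothing:

```shell
# Four "chunks", two of which are identical. A deduplicating store
# keeps each unique chunk exactly once, so only 3 are stored.
chunks="AAAA BBBB AAAA CCCC"
unique=$(printf '%s\n' $chunks | sort -u | wc -l | tr -d ' ')
echo "stored $unique of 4 chunks"   # → stored 3 of 4 chunks
```

The same principle is why a daily backup of a large, mostly unchanged dataset uploads only the handful of chunks that actually changed.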

Initial Setup: Connecting rclone & Google Drive

This is a one-time process to authorize rclone to access your Google Drive securely. We'll create a dedicated API project in Google Cloud, which gives you full control over rclone's permissions and your own API quota instead of rclone's shared one.
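The wizard is started with `rclone config`, and its prompts vary slightly between rclone versions. When finished, `~/.config/rclone/rclone.conf` contains two remotes: a plain Google Drive remote and a `crypt` remote layered on top of it. The remote names and the `borg` subfolder below are assumptions of this guide (the crypt remote name must match what your scripts use); the field names are rclone's own, with secrets redacted:

```ini
[gdrive_backup]
type = drive
client_id = YOUR_CLIENT_ID.apps.googleusercontent.com
client_secret = YOUR_CLIENT_SECRET
scope = drive
token = {"access_token":"...","expiry":"..."}

[gdrive_backup_crypt]
type = crypt
remote = gdrive_backup:borg
password = *** OBSCURED ***
password2 = *** OBSCURED ***
```

Anything written to `gdrive_backup_crypt:` is encrypted by rclone before it reaches Google Drive; the `password2` salt is optional but recommended.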

Dockerization: Packaging Your Backup Tool

We'll use Docker to create a self-contained, portable environment with Borg and rclone installed. This ensures your backup script runs consistently anywhere Docker is available. You'll need to create a project directory with the files below.

This `Dockerfile` defines the container image. It starts from a minimal Alpine Linux base image and installs BorgBackup and rclone.

```dockerfile
FROM alpine:latest
RUN apk add --no-cache borgbackup rclone
RUN mkdir -p /data /borg-repo /config /logs
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
```
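The `entrypoint.sh` the Dockerfile copies in is not shown above; a minimal sketch follows. The repository location matches the Dockerfile's directories, but the compression choice, retention policy, and archive naming are assumptions of this sketch, and the remote name `gdrive_backup_crypt` must match your `rclone.conf`:

```shell
#!/bin/sh
# Back up /data into the local Borg repository, prune old
# archives, then sync the encrypted repository to the cloud.
set -eu

# Borg reads the passphrase via this command instead of prompting.
export BORG_PASSCOMMAND="cat /config/borg_passphrase"
export RCLONE_CONFIG=/config/rclone.conf

REPO=/borg-repo

# Create the repository on the first run; ignore "already exists".
borg init --encryption=repokey "$REPO" 2>/dev/null || true

# New archive named by date; only new or changed chunks are stored.
borg create --stats --compression zstd "$REPO::backup-{now:%Y-%m-%d}" /data

# Thin out old archives to keep the repository bounded.
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6 "$REPO"

# Push the (already Borg-encrypted) repository to Google Drive.
rclone sync "$REPO" gdrive_backup_crypt:
```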

Host Directory Structure

Before running `docker-compose up`, your project folder on your computer (the "host") should look like this. You need to create these directories and files.

```text
backup-project/
├── my_data/             # <== Your actual data goes here
│   ├── document1.txt
│   └── important_folder/
├── config/
│   ├── rclone.conf      # <== Copy from ~/.config/rclone/rclone.conf
│   └── borg_passphrase  # <== File containing your Borg password
├── Dockerfile
├── docker-compose.yml
└── entrypoint.sh
```
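The guide's `docker-compose.yml` is not shown above; a minimal sketch might look like this. The service name `backup` is what the scheduled `docker-compose run --rm backup` command invokes; the volume layout mirrors the directory structure, but the use of a named volume for the repository is an assumption of this sketch:

```yaml
services:
  backup:
    build: .
    volumes:
      - ./my_data:/data:ro     # your files, mounted read-only
      - ./config:/config:ro    # rclone.conf and borg_passphrase
      - borg-repo:/borg-repo   # persistent local Borg repository

volumes:
  borg-repo:
```

Mounting `my_data` read-only is a cheap safety net: the backup job can never modify the files it is protecting.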

Automation & File Access

With the containerized setup complete, you can automate the backup process and even access your cloud storage as if it were a local drive on your machine.

Automating with Cron

Cron is a time-based job scheduler in Unix-like operating systems. You can use it to run your Docker backup automatically.

  1. Open your crontab file by running `crontab -e`.
  2. Add a line to schedule the backup. This example runs it daily at 3:00 AM:
```shell
# Run backup job every day at 3:00 AM
0 3 * * * cd /path/to/backup-project && docker-compose run --rm backup
```

Ensure you use the absolute path to your project directory: cron jobs run with a minimal environment and no notion of your working directory.

Mounting Your Remote Drive

rclone can mount your cloud storage, making the remote's files accessible in your local file system. This is useful for verifying the backup or restoring without re-downloading everything.

On Docker Host (Linux/macOS):

```shell
# Note: --allow-other and --daemon are key flags
rclone mount gdrive_backup_crypt: /path/to/mount/point --allow-other --daemon
```
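When you are done, the background mount started with `--daemon` can be released. On Linux this is done through FUSE; on macOS, plain `umount /path/to/mount/point` does the same job:

```shell
# Detach the rclone FUSE mount (Linux)
fusermount -u /path/to/mount/point
```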

On Windows:

First, install rclone for Windows and WinFsp. Then run in Command Prompt or PowerShell:

```shell
rclone mount gdrive_backup_crypt: X:
```

This will create a new `X:` drive in "This PC". Keep in mind that the remote stores a Borg repository, so the mounted drive shows the (rclone-decrypted) repository files rather than your original documents; Borg is still needed to list and extract individual files from it.
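To pull individual files back out, point Borg at the repository, either a local copy or the mounted remote path (the latter works but is slow over a cloud mount). The archive and file names below are placeholders:

```shell
# Borg will read the passphrase from your passphrase file.
export BORG_PASSCOMMAND="cat /path/to/backup-project/config/borg_passphrase"

borg list /path/to/borg-repo                          # show available archives
borg extract /path/to/borg-repo::ARCHIVE_NAME path/inside/archive
```

`borg extract` restores into the current working directory, so run it from wherever you want the files to land.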