
Archiving to the Cloud: Automating Rclone with CaptureGem



High-fidelity 4K recording eats disk space for breakfast. If you are recording multiple streams, even a 4TB NVMe will fill up in days. The solution is Cloud Archiving: keep your active recordings on fast local storage and automatically move them to cheap cloud object storage once they are finished.
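The pressure is easy to quantify before automating anything. Here is a minimal sketch of a disk-usage guard you could drop into your archiving workflow; the `usage_pct` helper and the 80% threshold are illustrative assumptions, not part of CaptureGem:

```shell
#!/bin/sh
# Report the used-space percentage of the filesystem holding a given path.
usage_pct() {
    df -P "$1" | awk 'NR==2 { sub("%", "", $5); print $5 }'
}

# Hypothetical guard: only bother archiving once the volume is 80% full.
THRESHOLD=80
USED=$(usage_pct "/path/to/capturegem/recordings" 2>/dev/null || usage_pct "/")
if [ "$USED" -ge "$THRESHOLD" ]; then
    echo "Disk ${USED}% full - archiving recommended"
fi
```

A check like this can wrap the rclone script further down so the move only runs when the disk is actually under pressure.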

Why Rclone?

Rclone is the “Swiss army knife of cloud storage.” It is a command-line tool that syncs, moves, and manages files between your local drive and dozens of cloud providers. It is lightweight, fast, and extremely stable.

1. Choose Your Provider

For cam archiving, we recommend providers with zero egress fees (meaning it’s free to download your files back):

  • Cloudflare R2: $0.015/GB/month, zero egress fees. Best for high-performance retrieval.
  • Backblaze B2: $0.006/GB/month, free egress up to 3x your stored amount. Best value.

For a sense of scale, archiving 1TB (1,000GB) costs roughly $6/month on B2 and $15/month on R2.

2. Configure Rclone

Install Rclone and run the interactive setup:

rclone config

Follow the prompts to add a new “Remote” (e.g., name it r2-remote).
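If you prefer to see what the wizard produces, the resulting config file (typically `~/.config/rclone/rclone.conf`) looks roughly like this for Cloudflare R2, which uses Rclone’s S3 backend. The remote name `r2-remote` matches the scripts below; the keys and account ID are placeholders from your R2 dashboard:

```ini
[r2-remote]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
acl = private
```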

3. The Automation Script

Instead of manually moving files, we can use a simple script. Create a file named archive.sh (Linux/Mac) or archive.ps1 (Windows).

Bash Template (Linux/Mac):

#!/bin/bash
set -euo pipefail  # abort on errors and unset variables

# Move finished recordings to the cloud.
# --min-age 24h skips files still being written by an active recording.
SOURCE="/path/to/capturegem/recordings"
REMOTE="r2-remote:my-bucket/archives"

rclone move "$SOURCE" "$REMOTE" \
    --min-age 24h \
    --include "*.mp4" \
    --transfers 4 \
    --checkers 8 \
    --log-file="/var/log/rclone-archive.log" \
    --delete-empty-src-dirs

PowerShell Template (Windows):

# Move files older than 1 day
$Source = "C:\Recordings"
$Remote = "r2-remote:my-bucket/archives"

rclone move $Source $Remote `
    --min-age 1d `
    --include "*.mp4" `
    --transfers 4 `
    --log-file="C:\Logs\rclone.log" `
    --delete-empty-src-dirs

4. Integration with CaptureGem

You can run this script as a Cron job (Linux) or Scheduled Task (Windows) every night at 3 AM.

Crontab example:

0 3 * * * /home/user/scripts/archive.sh
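On Windows, the equivalent nightly trigger can be registered with the built-in `schtasks` command. The task name and script path here are placeholders:

```shell
schtasks /Create /TN "RcloneArchive" /SC DAILY /ST 03:00 ^
    /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\archive.ps1"
```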

5. Verification and Safety

  • Dry Run First: Always run with --dry-run the first time to confirm exactly which files will move, and that nothing unexpected gets deleted.
  • Checksums: Rclone verifies each transferred file against its MD5/SHA-1 checksum on the remote before deleting the local copy.
  • Encryption: If you are storing sensitive content, use Rclone’s Crypt remote type to encrypt files before they leave your PC.
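In practice, the first two checks boil down to a couple of commands; the paths and remote name here follow the bash template above:

```shell
# Preview exactly which files would move, without touching anything
rclone move "/path/to/capturegem/recordings" "r2-remote:my-bucket/archives" \
    --min-age 24h --include "*.mp4" --dry-run

# Compare any remaining local files against the remote copies
rclone check "/path/to/capturegem/recordings" "r2-remote:my-bucket/archives" --one-way
```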

Conclusion

By offloading finished recordings to the cloud, you can maintain a lean, high-performance recording rig without ever worrying about “Disk Full” errors stopping your CaptureGem instance.
