
rclone — Scripted Sync and Transfer from the Command Line

rclone is a swiss-army-knife command-line tool for cloud storage: scripted sync, copy, and mount operations against any S3-compatible backend. It runs on Linux, macOS, and Windows.

1. Install rclone

On Debian/Ubuntu:

sudo apt install rclone

On macOS:

brew install rclone

On Windows, download the .zip from rclone.org/downloads and extract rclone.exe to a folder on your PATH (e.g. C:\Program Files\rclone).

The apt package may lag behind upstream. For the latest version on any platform, use the official installer:

curl https://rclone.org/install.sh | sudo bash

Verify the install:

rclone version

2. Get your S3 credentials

Log in to your HummingTribe dashboard → S3 Storage tab. Copy your Access Key ID and reveal your Secret Access Key (shown once — save it now). Note your bucket name.

3. Configure the remote

Run the interactive config wizard:

rclone config

Answer the prompts:

n) New remote
name> hummingtribe
Storage> s3
provider> Other
env_auth> false
access_key_id> your-access-key-id
secret_access_key> your-secret-access-key
region> (leave blank, press Enter)
endpoint> storage.hummingtribe.com
location_constraint> (leave blank)
acl> private
Edit advanced config? n
Keep this "hummingtribe" remote? y
q) Quit config

This writes the remote to ~/.config/rclone/rclone.conf (Linux/macOS) or %APPDATA%\rclone\rclone.conf (Windows). Lock down permissions on Linux/macOS — the file contains your S3 credentials in plaintext:

chmod 600 ~/.config/rclone/rclone.conf
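If you provision machines from scripts, the same remote can be created without the wizard using rclone config create. The sketch below mirrors the wizard answers above; the ALL-CAPS key values are placeholders you must replace with your own credentials:

```shell
# Non-interactive equivalent of the step 3 wizard.
# REMOTE_NAME and ENDPOINT follow this guide; the key values are placeholders.
REMOTE_NAME="hummingtribe"
ENDPOINT="storage.hummingtribe.com"
if command -v rclone >/dev/null 2>&1; then
  rclone config create "$REMOTE_NAME" s3 \
    provider Other \
    env_auth false \
    access_key_id "YOUR-ACCESS-KEY-ID" \
    secret_access_key "YOUR-SECRET-ACCESS-KEY" \
    endpoint "$ENDPOINT" \
    acl private
fi
```

You can confirm the result with rclone config show hummingtribe. Note that this writes your keys to the same plaintext config file, so the chmod advice above still applies.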

4. Test the connection

List buckets on the remote:

rclone lsd hummingtribe:

You should see your bucket name in the output. List the contents of your bucket:

rclone ls hummingtribe:your-bucket-name

An empty bucket returns no output and exit code 0.
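Because rclone exits non-zero when the endpoint is unreachable or the credentials are rejected, the connection test is easy to script. A minimal health-check sketch, assuming the bucket name used throughout this guide:

```shell
# Minimal health check: rclone's exit code signals success or failure.
# BUCKET is a placeholder for your real bucket name.
BUCKET="your-bucket-name"
STATUS="unknown"
if command -v rclone >/dev/null 2>&1; then
  if rclone lsf "hummingtribe:$BUCKET" --max-depth 1 >/dev/null 2>&1; then
    STATUS="ok"
  else
    STATUS="failed"
  fi
  echo "remote check: $STATUS"
fi
```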

5. Copy or sync data

rclone copy uploads new and changed files without deleting anything on the destination:

rclone copy /path/to/data hummingtribe:your-bucket-name/data --progress

rclone sync makes the destination match the source — files deleted locally will be deleted on the remote:

rclone sync /path/to/data hummingtribe:your-bucket-name/data --progress

Always run sync with --dry-run first to confirm what will change:

rclone sync /path/to/data hummingtribe:your-bucket-name/data --dry-run
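Beyond --dry-run, rclone's --backup-dir flag can act as a safety net: instead of discarding files that sync would delete or overwrite, it moves them into an archive path on the same remote. The dated "archive" prefix below is illustrative, not something this guide's workflow requires:

```shell
# Keep files that sync would delete or overwrite, under archive/<date>.
# "archive" is an illustrative prefix; --backup-dir must be on the same remote.
STAMP="$(date +%Y-%m-%d)"
if command -v rclone >/dev/null 2>&1; then
  rclone sync /path/to/data hummingtribe:your-bucket-name/data \
    --backup-dir "hummingtribe:your-bucket-name/archive/$STAMP" --progress
fi
```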

6. Useful flags

Flag                              Purpose
--progress                        Live transfer stats in the terminal
--dry-run                         Show what would happen without transferring anything
--transfers 8                     Run 8 parallel file transfers (default 4)
--checkers 16                     Run 16 parallel hash checks (default 8)
--bwlimit 10M                     Cap bandwidth at 10 MiB/s
--exclude "*.tmp"                 Skip files matching a pattern
--log-file /var/log/rclone.log    Write a log file
--log-level INFO                  Log verbosity (DEBUG, INFO, NOTICE, ERROR)

For large datasets, tune parallelism:

rclone sync /data hummingtribe:your-bucket-name/data \
  --transfers 16 --checkers 32 --progress

7. Automate with cron or Task Scheduler

Linux (cron): Create /etc/cron.d/rclone-sync:

0 2 * * * root /usr/bin/rclone sync /path/to/data hummingtribe:your-bucket-name/data --log-file /var/log/rclone.log --log-level INFO

The job above runs as root (the user field after the schedule), so rclone reads /root/.config/rclone/rclone.conf. To run it as a specific user instead, replace root with that username — cron then sets HOME accordingly and rclone picks up that user's config file.
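If a nightly sync can run longer than the interval between jobs, overlapping runs will fight over the same files. A common guard — an addition to this guide, using flock from util-linux — is to take a lock before syncing:

```shell
#!/usr/bin/env bash
# Cron wrapper: flock -n fails fast if a previous sync still holds the lock,
# so a new run skips instead of stacking up behind the old one.
LOCK="${TMPDIR:-/tmp}/rclone-sync.lock"
if command -v rclone >/dev/null 2>&1; then
  flock -n "$LOCK" rclone sync /path/to/data hummingtribe:your-bucket-name/data \
    --log-file /var/log/rclone.log --log-level INFO \
    || echo "sync skipped or failed (lock held or rclone error)" >&2
fi
```

Save this as a script (the path is up to you) and point the cron entry at it instead of calling rclone directly.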

macOS (launchd): Create ~/Library/LaunchAgents/com.user.rclone.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.user.rclone</string>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/rclone</string>
    <string>sync</string>
    <string>/path/to/data</string>
    <string>hummingtribe:your-bucket-name/data</string>
    <string>--log-file</string>
    <string>/tmp/rclone.log</string>
  </array>
  <key>StartCalendarInterval</key>
  <dict>
    <key>Hour</key>
    <integer>2</integer>
    <key>Minute</key>
    <integer>0</integer>
  </dict>
</dict>
</plist>

Load the agent:

launchctl load ~/Library/LaunchAgents/com.user.rclone.plist

Windows (Task Scheduler): Open Task Scheduler → Create Basic Task → Trigger: Daily at 02:00 → Action: Start a program → Program: C:\Program Files\rclone\rclone.exe → Arguments: sync C:\path\to\data hummingtribe:your-bucket-name/data --log-file C:\rclone.log

8. Verify and restore

Compare source and destination to confirm integrity:

rclone check /path/to/data hummingtribe:your-bucket-name/data

For a stricter check that downloads and hashes every file:

rclone check /path/to/data hummingtribe:your-bucket-name/data --download
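check exits 0 when source and destination match and non-zero on any difference or error, which makes verification scriptable. A sketch, assuming this guide's paths; the --one-way flag (my addition here) only asserts that every local file exists remotely, which suits backup verification:

```shell
# Scriptable verification: alert when the remote no longer matches the source.
SRC="/path/to/data"
DEST="hummingtribe:your-bucket-name/data"
if command -v rclone >/dev/null 2>&1; then
  if rclone check "$SRC" "$DEST" --one-way; then
    echo "backup verified"
  else
    echo "backup differs from source" >&2
  fi
fi
```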

Restore files by copying from the remote back to local disk:

rclone copy hummingtribe:your-bucket-name/data /path/to/restore --progress

To restore a single file:

rclone copy hummingtribe:your-bucket-name/data/report.pdf /path/to/restore
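copy places the file inside the destination directory under its original name. If you want to restore to an exact path (renaming on the way), rclone also provides copyto; the restored filename below is purely illustrative:

```shell
# copyto writes to an exact destination path instead of into a directory.
# The renamed target is an example, not a requirement.
DEST="/path/to/restore/report-restored.pdf"
if command -v rclone >/dev/null 2>&1; then
  rclone copyto hummingtribe:your-bucket-name/data/report.pdf "$DEST"
fi
```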

Manage your bucket and credentials from your HummingTribe dashboard.
