Rclone Backblaze B2



  1. Here are the standard options specific to b2 (Backblaze B2):
     • --b2-account: Account ID or Application Key ID. Config: account; Env Var: RCLONE_B2_ACCOUNT; Type: string; Default: ''
     • --b2-key: Application Key. Config: key; Env Var: RCLONE_B2_KEY; Type: string; Default: ''
     • --b2-hard-delete: Permanently delete files on remote removal, otherwise hide files. Config: hard_delete; Env Var: RCLONE_B2_HARD_DELETE; Type: bool; Default: false
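These options map onto keys in the rclone config file. As an illustrative sketch (the placeholder values are assumptions, not credentials from this post; the remote name matches the "Backblaze" remote used later in the thread), a b2 section in rclone.conf looks like:

```
[Backblaze]
type = b2
account = <application key ID>
key = <application key>
hard_delete = true
```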
  2. Rclone is mature, open source software originally inspired by rsync and written in Go. The friendly support community is familiar with varied use cases. There are official Ubuntu, Debian, Fedora, Brew and Chocolatey repos, but for the latest version, downloading from rclone.org is recommended. Rclone is widely used on Linux, Windows and Mac.
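If you do grab it from rclone.org, the project ships an install script for Linux and macOS; a typical one-liner (inspect the script first if piping it to a shell bothers you) is:

```
# Fetch and run the official rclone install script (installs the latest stable release)
curl https://rclone.org/install.sh | sudo bash
```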



What is the problem you are having with rclone?

Mounting Backblaze B2 and copying files to it
Mount seems to work but copying files fails

What is your rclone version (output from rclone version)

rclone v1.53.1

  • os/arch: windows/amd64
  • go version: go1.15

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 x64

Which cloud storage system are you using? (eg Google Drive)

Backblaze B2

The command you were trying to run (eg rclone copy /tmp remote:tmp)

The rclone config contents with secrets removed.

A log from the command with the -vv flag

https://gist.github.com/UsefulVid/9f35dcc41bf71796d3f0a9e45e6297ea


Looks like it can be solved with:
rclone.exe mount --vfs-cache-mode writes Backblaze Y: -o volname=local

edit2:
Still does not work
I used
rclone.exe mount --vfs-cache-mode writes Backblaze:usefulviddemo1 Y: -o volname=local
But I cannot delete files:

Even with
rclone.exe mount --vfs-cache-mode full Backblaze:usefulviddemo1 Y: -o volname=local

It looks like it works, but no files arrive at my bucket.

Martin Aspeli

Junior Member
Many of us used Crashplan Home to back up a FreeNAS volume to the cloud. That service is being discontinued, and to be honest, Crashplan was always fairly painful to set up. We now have better alternatives, and here's one of them:
  • Backblaze B2 as cloud storage. It's faster, cheaper and easier to manage.
  • rclone in a cron job to sync files nightly.
The basic steps to get this set up are:
1) Create a jail.
2) Mount a dataset on /mnt/Backup or similar that contains the data you want to back up.
3) Install postfix via ports and configure it as an MTA (I used Gmail's SMTP server). This is only needed if you want email alerts for successful/failed backups.
4) Sign up to Backblaze B2 and create a 'bucket' for your backups. I think the bucket name has to be globally unique on B2 (waa?).
5) Download rclone from https://rclone.org/downloads/. You likely want the FreeBSD AMD64 binary. Copy it into your jail (or download it via curl in your jail itself). Copy the rclone binary to /usr/local/bin if you prefer to avoid typing the path each time.
6) Configure rclone to talk to your B2 bucket. See https://rclone.org/b2.
7) Run the initial backup. I did about 350 GB in 4 days. You likely want to use something like 'screen' to run this in a terminal you can later detach from and come back to. I used the following command to do this:

You should obviously adjust the backup path and bucket name according to your setup. If you have a large data set, running this command can take a very long time. If you want to get more information about what it is doing, add the option '--log-level INFO'.
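The command itself was lost from the post; judging from the flags described in step 8, the initial run was presumably something along these lines (the source path and bucket name are placeholders, not the author's actual values):

```
# One-off initial sync to B2; long-running, so run it inside 'screen'
rclone sync /mnt/Backup Backblaze:mybucket --fast-list --copy-links --log-level INFO
```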
8) Create a script to run nightly backups. Here's mine:
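The script body didn't survive the page scrape; from the description in the following paragraph (the sync flags, INFO-level logging to /var/log/rclone.log, and the if/else/fi email check), a sketch along the same lines would be (the paths, bucket name, --min-age value and email address are all assumptions):

```
#!/bin/sh
# Nightly B2 backup: a reconstruction sketch, not the author's exact script.
SRC=/mnt/Backup               # hypothetical dataset mount point
DEST=Backblaze:mybucket       # hypothetical remote:bucket
LOG=/var/log/rclone.log

if rclone sync "$SRC" "$DEST" \
    --fast-list \
    --copy-links \
    --b2-hard-delete \
    --min-age 15m \
    --log-file "$LOG" --log-level INFO
then
    echo "rclone backup succeeded" | mail -s "Backup OK" admin@example.com
else
    echo "rclone backup failed; see $LOG" | mail -s "Backup FAILED" admin@example.com
fi
```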


The if/else/fi bit at the end is sending an email telling me if the backup succeeded or failed (the absence of an email overnight means the backup didn't run). For that to work, you have to have postfix or another MTA set up in the jail. Google how to do this if you don't know already. You could also just ignore this part and check the logs periodically, or find some other alerting system.
The other thing to note is that I've chosen to use the --fast-list option (saves money and time, but uses more memory), --copy-links (follows symlinks in the backup directory) and --b2-hard-delete (by default, rclone only hides files you delete locally, which means they still cost you money; with this option they are gone from the remote backup. That may or may not be what you want!). The --min-age flag is used to let rclone ignore files that have been very recently modified, e.g. they are partially downloaded or transferred to the NAS. I write all the logs to /var/log/rclone.log (for which ideally you'd set up log rotation) and log at INFO level (reasonably chatty if you have a lot of changes, so you may want to dial it down to NOTICE level to avoid filling up your disk with log files).
9) Create a cron job to run this script nightly, e.g.
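The cron line was also lost; given the 5am schedule mentioned below, an /etc/crontab entry inside the jail would look roughly like this (the script path is a placeholder):

```
# minute hour mday month wday who  command
0        5    *    *     *    root /root/rclone-backup.sh
```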

.. and then install with

Adjust the path and timings (5am every night in this case) as required.

That's it. It looks like it'll cost me maybe $20/yr to back up 350 GB of photos, which is pretty good, and there's no weird Java/ssh-tunnel/config-file mangling like there was with Crashplan.
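As a sanity check on that figure: B2 storage was priced at $0.005/GB/month at the time (the rate is an assumption here, and the one-liner is mine, not the author's):

```shell
# 350 GB stored for 12 months at $0.005/GB/month
awk 'BEGIN { printf "$%.2f/yr\n", 350 * 0.005 * 12 }'
```

which comes out at $21.00/yr, in line with the "maybe $20/yr" estimate.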