Backblaze B2 and S3 compatible API

Mon May 04 2020

Backblaze started as an affordable cloud backup service, but over the last few years, the company has also taken its storage expertise and launched the developer-centric B2 Cloud Storage service, which promises to be significantly cheaper than similar offerings from the large cloud vendors. Pricing for B2 starts at $0.005 per GB/month. AWS S3 starts at $0.023 per GB/month.

The storage price alone isn’t going to make developers switch providers, though. There are some costs involved in supporting multiple heterogeneous systems, too.

By making B2 compatible with the S3 API, developers can now simply redirect their storage to Backblaze without the need for any extensive rewrites.
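Since the endpoint is the only thing that changes, pointing the AWS CLI at B2 is mostly a matter of configuration. A minimal sketch, assuming a profile named `b2` and an example region hostname (check your own bucket's endpoint):

```shell
# ~/.aws/credentials -- B2's keyID/applicationKey pair stands in for the
# usual AWS access key / secret key (the profile name "b2" is an assumption):
#
#   [b2]
#   aws_access_key_id = <your-b2-keyID>
#   aws_secret_access_key = <your-b2-applicationKey>

# Any S3 command then works by pointing it at B2's S3-compatible endpoint;
# the us-west-002 hostname below is only an example region.
aws s3 ls --profile b2 --endpoint-url https://s3.us-west-002.backblazeb2.com
```

No code changes are needed beyond the endpoint and credentials, which is the whole appeal of the compatibility layer.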

Read more about it here and here.

EDIT: I reworked my backups to use the AWS S3 CLI instead of restic. Why? For some reason restic had been uploading at around 10 kB/s lately, and I wanted to rework the setup anyway. Also, having all my data encrypted means I need the password on hand (or memorized) every time I want to restore a single file from a remote location, which is a bit of a hassle.

So currently I'm just using the following script to back up whatever I need from my server, using `aws s3 sync` to sync files with B2.

```bash
#!/bin/bash

AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
S3_BUCKET=
S3_ENDPOINT=
BASE_OPTIONS="--endpoint-url ${S3_ENDPOINT}"

DIRECTORIES=(
  /foo/bar
  /lorem
)

FILES=(
  /ipsum/dolor.sh
)

function cli() {
  # No arguments: back up everything.
  if [[ "$#" -eq 0 ]]; then
    backup_dirs; backup_files; exit 0
  fi

  while [[ "$#" -gt 0 ]]; do
    key="$1"
    case $key in
      -f | --files) backup_files; exit 0 ;;
      -d | --directories | --dirs) backup_dirs; exit 0 ;;
    esac
    shift
  done
}

function backup_dirs() {
  for dir in "${DIRECTORIES[@]}"
  do
    directory=$(/usr/bin/basename "$dir")
    echo "[LOG] Creating backup of directory: ${dir}"
    /usr/local/bin/aws $BASE_OPTIONS s3 sync "$dir" "${S3_BUCKET}/${directory}" --exclude "*.DS_Store*"
  done
}

function backup_files() {
  # Also snapshot the current crontab alongside the file backups.
  /usr/bin/crontab -l > /tmp/crontab
  for file in "${FILES[@]}"
  do
    dirname=$(/usr/bin/dirname "$file")
    filename=$(/usr/bin/basename "$file")
    echo "[LOG] Creating backup of file: ${file}"
    # Exclude everything, then include only this one file from its directory.
    /usr/local/bin/aws $BASE_OPTIONS s3 sync "$dirname" "${S3_BUCKET}/" --exclude "*" --include "$filename"
  done
}

cli "$@"
```
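A script like this presumably runs on a schedule rather than by hand. A hedged example crontab entry, assuming the script is saved as `/opt/backup.sh` and made executable (both the path and the log location are assumptions, not part of the script above):

```shell
# Hypothetical crontab entry: run the full backup (directories + files)
# nightly at 03:00, appending output to a log for later inspection.
0 3 * * * /opt/backup.sh >> /var/log/backup.log 2>&1
```

With no arguments the script backs up everything; `-f`/`--files` or `-d`/`--dirs` would restrict a manual run to one half.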