niels segers

Backblaze B2 and S3 compatible API


May 4, 2020

Backblaze started as an affordable cloud backup service, but over the last few years, the company has also taken its storage expertise and launched the developer-centric B2 Cloud Storage service, which promises to be significantly cheaper than similar offerings from the large cloud vendors. Pricing for B2 starts at $0.005 per GB/month. AWS S3 starts at $0.023 per GB/month.
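To put those per-GB numbers in perspective, here is a quick back-of-the-envelope calculation for 1 TB of stored data. This is only a sketch of the storage line item; a real bill also includes download and API request charges, which differ between the two providers.

```python
# Worked example: monthly storage cost for 1 TB (1,000 GB) at the
# per-GB prices quoted above.
B2_PER_GB_MONTH = 0.005  # USD per GB per month (Backblaze B2)
S3_PER_GB_MONTH = 0.023  # USD per GB per month (AWS S3, first tier)

size_gb = 1000
b2_cost = size_gb * B2_PER_GB_MONTH
s3_cost = size_gb * S3_PER_GB_MONTH

print(f"B2: ${b2_cost:.2f}/month, S3: ${s3_cost:.2f}/month")
```

For a terabyte, that works out to roughly $5 versus $23 per month on the storage price alone.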

The storage price alone isn’t going to make developers switch providers, though. There are some costs involved in supporting multiple heterogeneous systems, too.

By making B2 compatible with the S3 API, developers can now simply redirect their storage to Backblaze without the need for any extensive rewrites.
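In practice that mostly means pointing an existing S3 client at B2's endpoint. A minimal sketch with the aws CLI (the bucket name is a placeholder, and the endpoint hostname depends on the region your B2 bucket lives in; check the bucket details page for the exact value):

```shell
# A B2 application key ID/secret stands in for the usual S3 credentials.
export AWS_ACCESS_KEY_ID=<b2-application-key-id>
export AWS_SECRET_ACCESS_KEY=<b2-application-key>

# The same s3 subcommands work unchanged; only the endpoint differs.
aws s3 ls --endpoint-url https://s3.us-west-002.backblazeb2.com
aws s3 sync ./photos s3://my-bucket/photos --endpoint-url https://s3.us-west-002.backblazeb2.com
```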

Read more about it here and here.

EDIT: I reworked my restic backups to use the AWS S3 CLI instead. Why? For some reason restic had lately been uploading at around 10 KB/s, and I wanted to rework the setup anyway. Besides, with all my data encrypted I need to have the password on hand (or memorized) every time I want to restore a single file from a remote location, which is a bit of a hassle.

So for now I'm just using the following script to back up whatever I need from my server, using aws s3 sync to sync files with B2.

```bash
#!/bin/bash

# B2 application key credentials (intentionally left blank here).
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
S3_BUCKET=
S3_ENDPOINT=

BASE_OPTIONS="--endpoint-url ${S3_ENDPOINT}"

# Directories and individual files to back up.
DIRECTORIES=(
  /foo/bar
  /lorem
)

FILES=(
  /ipsum/dolor.sh
)

function cli() {
  # No arguments: back up everything.
  if [[ "$#" -eq 0 ]]; then
    backup_dirs
    backup_files
    exit 0
  fi

  while [[ "$#" -gt 0 ]]; do
    key="$1"
    case $key in
      -f | --files)
        backup_files
        exit 0
        ;;
      -d | --directories | --dirs)
        backup_dirs
        exit 0
        ;;
    esac
    shift
  done
}

function backup_dirs() {
  for dir in "${DIRECTORIES[@]}"; do
    directory=$(/usr/bin/basename "$dir")
    echo "[LOG] Creating backup of directory: ${dir}"
    /usr/local/bin/aws $BASE_OPTIONS s3 sync "$dir" "${S3_BUCKET}/${directory}" --exclude "*.DS_Store*"
  done
}

function backup_files() {
  # Save a copy of the current crontab as well.
  /usr/bin/crontab -l > /tmp/crontab

  for file in "${FILES[@]}"; do
    dirname=$(/usr/bin/dirname "$file")
    filename=$(/usr/bin/basename "$file")
    echo "[LOG] Creating backup of file: ${file}"
    # Sync only this one file out of its parent directory.
    /usr/local/bin/aws $BASE_OPTIONS s3 sync "$dirname" "${S3_BUCKET}/" --exclude "*" --include "${filename}"
  done
}

cli "$@"
```

Running it with no arguments backs up both lists; `-d`/`--dirs` and `-f`/`--files` limit it to directories or files respectively.