GitHub With Cloudflare R2 Storage
By MzFit
- 7 minutes read - 1301 words

GitHub is an absolutely fantastic service. Really. Microsoft deserves real kudos for taking it over and keeping it running as a world-class service to the community. It drives open source development, and the free tier for startups is generous enough that it’s pretty easy to run free until you start making a profit and need the enterprisey features. I’m such a fan I recently wrote a post about how I performance-tuned my GitHub CI/CD pipeline.
That being said, the one annoyance I had with the free tier was the 500MB a month limit on package storage. Even with my package weighing in at 11MB, this meant I only got about 45 builds a month before running out.
In other words, I was limited to about 1.5 deploys a day. During productive streaks where I would check in my code to run my CI/CD pipeline multiple times per day, I was often running out of the ability to build and deploy packages by mid-month, sometimes sooner.
Now, the next tier up is only $4/month, and maybe I should have just bitten the bullet and upgraded. But… that’s $48 a year and I wasn’t convinced getting 2GB of storage for packages would be enough. Also, my lady would rather I spend that $48 on a Paramount+ subscription so she can watch NCIS re-runs. Explaining the intricacies of GitHub packaging to her is a non-starter. But I digress.
So I took a look at using Cloudflare’s free R2 offering, where I could get 10GB of free storage and free egress without committing to a GitHub subscription.
Also, since R2 is an AWS S3 compatible service, I figured it was also a good opportunity to learn about AWS S3.
Doing this ended up being fairly simple.
The aws CLI is pre-installed on GitHub ubuntu runners (yay Microsoft! Supporting a competitor!) and since R2 is AWS compatible, you can use Amazon’s client with Cloudflare’s service. As an older developer who remembers vendor lock-in being the rule rather than the exception… whew. What a world we live in now: Microsoft’s cloud supporting an Amazon cloud client to talk to a Cloudflare cloud knock-off of Amazon’s cloud service!
Here’s a rough overview of the steps I used to set this up:
- create a cloudflare account
- create an R2 bucket
- set up the secrets
- add an upload script to your CI repo
- add a download script to your deploy repo.
NOTE: if you need to have immutable artifacts for whatever reason (e.g. you are operating in a regulated industry…) don’t do this. You’ll need to use a real artifact repository.
Create a Cloudflare Account
This is free and pretty straightforward. Go to Cloudflare/plans and choose the free option.
Create an R2 Bucket
To create an R2 bucket, log in and look at the left-hand side of the Dashboard. If you expand “R2 Object Storage” and click “Overview” you should see a dashboard that looks like this:
Click the blue “Create bucket” button and fill in sensible answers on the next page.
Set up the secrets
Create the secrets
To create your tokens you need to click “Manage R2 Tokens”:
On the next page there will be a big blue “Create API Token”. Click that.
Answer the questions on that page.
- Permissions: Object Read & Write
- Specify Buckets: Apply to Specific Buckets Only
If you know what you’re doing you can change the TTL and IP address settings. Otherwise, leave them alone.
(You can also create a read token for your deploy pipeline and a separate write token for your CI pipeline if you really want to lock that down.)
Then at the bottom of the page click the blue button that says “Create API Token”.
Scroll down on the next page to the section that gives you your S3 credentials:
Keep this open in a window and navigate over to GitHub in another window.
Store tokens in GitHub
To use the Cloudflare access tokens in your pipelines, you need to store them securely as GitHub Actions secrets.
Within your GitHub repo, go to Settings -> Secrets and Variables -> Actions:
You’ll need to create three secrets:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- CLOUDFLARE_ACCOUNT_ID
Grab each of the values from the Cloudflare window you left open in the previous step.
The CLOUDFLARE_ACCOUNT_ID will come from the Cloudflare endpoint, e.g. https://THIS_IS_YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
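In other words, the account ID is the leading subdomain of the S3 endpoint. You can sanity-check the value you stored by rebuilding the endpoint from it (the account ID below is a made-up placeholder):

```shell
# hypothetical account ID -- yours is the subdomain of your R2 S3 endpoint
CLOUDFLARE_ACCOUNT_ID="0123456789abcdef"
ENDPOINT="https://${CLOUDFLARE_ACCOUNT_ID}.r2.cloudflarestorage.com"
echo "$ENDPOINT"
# https://0123456789abcdef.r2.cloudflarestorage.com
```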
Add An Upload Script to Your CI Repo
Here’s the quick script I hacked together to upload a single file to Cloudflare (I use AWS naming conventions because it’s the AWS client even though it’s Cloudflare):
Filename: scripts/upload_to_aws.sh
#!/bin/bash
set -e

FILE=$1
if [[ ! -f "$FILE" ]]; then
    echo "$FILE does not exist"
    exit 1
fi

# Install the aws CLI if it is missing. With set -e in effect, a bare
# `which aws` would abort the script, so do the check inside the if.
if ! command -v aws > /dev/null; then
    SYSTEM=$(uname)
    if [[ $SYSTEM == "Linux" ]]; then
        sudo apt-get install -y awscli
    elif [[ $SYSTEM == "Darwin" ]]; then
        brew install awscli
    else
        echo "Unable to determine operating system"
        exit 1
    fi
fi

aws configure set aws_access_key_id "$AWS_ACCESS_KEY_ID"
aws configure set aws_secret_access_key "$AWS_SECRET_ACCESS_KEY"
aws configure set region auto

aws s3 cp "$FILE" s3://YOUR_BUCKET_NAME/ --endpoint-url "https://$CLOUDFLARE_ACCOUNT_ID.r2.cloudflarestorage.com"
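One detail worth knowing about aws s3 cp: when the destination ends in a slash, the object key is just the basename of the local file. So the script above produces keys like this (the file path and bucket name here are hypothetical):

```shell
# hypothetical artifact path -- aws s3 cp uses the basename as the object key
FILE=build/package-abc123.zip
KEY=$(basename "$FILE")
echo "s3://my-bucket/$KEY"
# s3://my-bucket/package-abc123.zip
```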
My GitHub Actions job looks like this, where I have set up my GitHub Actions secrets and export them to be available to the CLI.
build:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: actions/cache@v4
      with:
        path: |
          ~/.cache/go-build
          ~/go/pkg/mod
        key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
        restore-keys: |
          ${{ runner.os }}-go-
    - name: Run make package
      run: |
        export AWS_ACCESS_KEY_ID="${{ secrets.AWS_ACCESS_KEY_ID }}"
        export AWS_SECRET_ACCESS_KEY="${{ secrets.AWS_SECRET_ACCESS_KEY }}"
        export CLOUDFLARE_ACCOUNT_ID="${{ secrets.CLOUDFLARE_ACCOUNT_ID }}"
        make -j2 package
And my Makefile has this as the package target:
VERSION=`git log | head -1 | awk '{print $$2}'`
GOPATH=`go env GOPATH`/bin
export PATH := /usr/local/go/bin:$(HOME)/go/bin:/usr/local/bin:$(PATH)

arm64: GOARCH := arm64
arm64: clean
	mkdir -p build
	GOARCH=${GOARCH} go build -o build/meezy.${GOARCH}.${VERSION}

package: arm64
	rm -f package-${VERSION}.zip
	mkdir -p build
	cd build && zip -r ../package-${VERSION}.zip ./*.*
	scripts/upload_to_aws.sh package-${VERSION}.zip
	rm -f package-${VERSION}.zip
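A note on that VERSION variable: the first line of git log output is “commit &lt;hash&gt;”, so awk’s second field is the latest commit hash. `git rev-parse HEAD` is an equivalent, more direct way to get the same value; this sketch demonstrates that the two agree in a throwaway repo:

```shell
# create a throwaway repo with one empty commit, then compare the two approaches
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init

VERSION=$(git log | head -1 | awk '{print $2}')   # the Makefile's approach
ALT=$(git rev-parse HEAD)                         # the direct equivalent
[ "$VERSION" = "$ALT" ] && echo "match"
```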
If you want to see my full Makefile, including linting and scanning, and some reasons behind why I set things up the way I did, you can take a look at this article which includes a link to a GitHub repo at the end.
Generally my philosophy is to make all logic work locally. Running locally is 4x faster than running in a GitHub actions runner, and generally I feel like my time is valuable. YMMV.
So I make my GitHub Actions run one-line scripts that I’ve already tested out locally. Hence my “make package” that runs locally.
Obviously to run locally you need to set your environment variables locally via .bashrc or .zshrc or what have you.
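For example, something like this in your shell rc file (the values here are placeholders; substitute your own credentials):

```shell
# ~/.bashrc or ~/.zshrc -- placeholder values, substitute your own
export AWS_ACCESS_KEY_ID="<your R2 access key id>"
export AWS_SECRET_ACCESS_KEY="<your R2 secret access key>"
export CLOUDFLARE_ACCOUNT_ID="<your Cloudflare account id>"
```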
Add A Download Script to Your CD Repo
Filename: scripts/download_from_aws.sh
#!/bin/bash
set -e

VERSION=$1
if [[ -z "$VERSION" ]]; then
    echo "usage: $0 VERSION"
    exit 1
fi

# Install the aws CLI if it is missing (same guard as the upload script).
if ! command -v aws > /dev/null; then
    SYSTEM=$(uname)
    if [[ $SYSTEM == "Linux" ]]; then
        sudo apt-get install -y awscli
    elif [[ $SYSTEM == "Darwin" ]]; then
        brew install awscli
    else
        echo "Unable to determine operating system"
        exit 1
    fi
fi

if [[ -z "$AWS_ACCESS_KEY_ID" ]]; then
    echo "AWS_ACCESS_KEY_ID is not set"
    exit 1
fi
if [[ -z "$AWS_SECRET_ACCESS_KEY" ]]; then
    echo "AWS_SECRET_ACCESS_KEY is not set"
    exit 1
fi

aws configure set aws_access_key_id "$AWS_ACCESS_KEY_ID"
aws configure set aws_secret_access_key "$AWS_SECRET_ACCESS_KEY"
aws configure set region auto

aws s3 cp "s3://YOUR_BUCKET_NAME/package-${VERSION}.zip" ./file.zip --endpoint-url "https://$CLOUDFLARE_ACCOUNT_ID.r2.cloudflarestorage.com"
And my deploy pipeline looks something like:
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: deploy
        run: |
          export AWS_ACCESS_KEY_ID="${{ secrets.AWS_ACCESS_KEY_ID }}"
          export AWS_SECRET_ACCESS_KEY="${{ secrets.AWS_SECRET_ACCESS_KEY }}"
          export CLOUDFLARE_ACCOUNT_ID="${{ secrets.CLOUDFLARE_ACCOUNT_ID }}"
          scripts/download_from_aws.sh $VERSION
          scripts/deploy_package.sh
Obviously, your deploy process will be different for your app and infrastructure, so you’ll need to write your own version of “deploy_package.sh”, whatever that looks like.
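Since these artifacts are mutable (see the note near the top), one cheap integrity guard is to upload a checksum alongside the package in CI and verify it after download. A sketch, with hypothetical filenames (the echo just stands in for a real build artifact):

```shell
# CI side: record a checksum next to the artifact (filename is hypothetical)
echo "demo artifact" > package-demo.zip
sha256sum package-demo.zip > package-demo.zip.sha256

# Deploy side: verify the downloaded file matches what CI produced
sha256sum -c package-demo.zip.sha256
# prints: package-demo.zip: OK
```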
Conclusion
Cloudflare provides a great free service that lets you work around limitations in the free version of GitHub Actions for package storage.
I am not a lawyer, but I don’t think this breaks any of the Terms of Use from GitHub or Cloudflare. That could change at any time, though, and I certainly could be wrong, so use at your own risk.