Overcoming Storage Limits: Migrating from Supabase Storage to Cloudflare R2
As developers and indie hackers, we often start our projects with free-tier services that offer generous limits. However, as our projects grow, we may find ourselves bumping up against these limits. Recently, I faced this exact situation with one of my Supabase projects. In this blog post, I'll share my experience of migrating from Supabase Storage to Cloudflare R2 to overcome storage limitations, and provide some insights that might help you in similar situations.
The Challenge: Outgrowing Supabase's Free Tier
One of my indie hacking projects was happily running on Supabase's free tier, which offers 1 GB of free storage. However, as the project evolved, I found myself using 2.9 GB of storage, nearly three times the free limit!
The Solution: Cloudflare R2
After researching various options, I decided to migrate my data to Cloudflare R2. Here's why:
Generous Free Tier: Cloudflare R2 offers a whopping 10 GB of storage per month on its free plan.
Free Egress: Unlike many cloud storage providers, Cloudflare doesn't charge for data transfer out of their network.
Ample Request Limits: The free tier includes 1 million Class A operations and 10 million Class B operations per month.
Cost-Effective Scaling: Even if my usage grows beyond the free tier, Cloudflare R2 remains one of the most cost-effective options available.
The Migration Process
To migrate my data, I used a Python script leveraging the boto3 library. Here's a high-level overview of the process:
Supabase to Local: First, I downloaded all the files from Supabase to my local machine.
Local Processing: I performed some necessary computations on the data locally.
Local to Cloudflare R2: Finally, I uploaded the processed data to Cloudflare R2.
You might wonder why I didn't migrate directly from Supabase to Cloudflare R2. The reason is that I needed to perform some local computations on the data anyway, so having it on my local machine was a necessary step.
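Before any of that, I needed to know exactly which files were sitting in the Supabase bucket. Here's a minimal sketch of that listing step, assuming the supabase-py client and placeholder credentials; depending on your client version, list() may need pagination options for very large buckets, so treat this as a starting point rather than a drop-in solution:

from supabase import create_client

# Placeholders: substitute your own project URL and service role key
supabase = create_client("<your-supabase-url>", "<your-supabase-key>")

def list_bucket_files(bucket_name):
    # list() returns metadata dicts; we only need the object names here
    objects = supabase.storage.from_(bucket_name).list()
    return [obj["name"] for obj in objects]

files_to_migrate = list_bucket_files("supabase-bucket-name")
print(f"Found {len(files_to_migrate)} files to migrate")

With the file list in hand, the actual migration loop becomes straightforward.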
The Migration Script
To automate this process, I used Claude (an AI assistant) to help me write a Python script using boto3. This script handled the migration from Supabase to my local machine and then to Cloudflare R2.
Here's a simplified version of what the script might look like:
import os
import boto3
from supabase import create_client, Client

# Supabase setup (replace the placeholders with your project URL and key)
SUPABASE_URL = "<your-supabase-url>"
SUPABASE_KEY = "<your-supabase-key>"
supabase: Client = create_client(SUPABASE_URL, SUPABASE_KEY)

# Cloudflare R2 setup (R2 exposes an S3-compatible API, so boto3 works as-is)
s3 = boto3.client(
    's3',
    endpoint_url='https://<accountid>.r2.cloudflarestorage.com',
    aws_access_key_id='<access_key_id>',
    aws_secret_access_key='<access_key_secret>'
)

# Download a single file from Supabase Storage to a local path
def download_from_supabase(bucket_name, file_name, local_path):
    res = supabase.storage.from_(bucket_name).download(file_name)
    with open(local_path, 'wb') as f:
        f.write(res)

# Upload a local file to Cloudflare R2
def upload_to_r2(local_path, bucket_name, file_name):
    s3.upload_file(local_path, bucket_name, file_name)

# Main migration process: download, process locally, then upload to R2
def migrate_file(supabase_bucket, r2_bucket, file_name):
    local_path = f"./temp/{file_name}"
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    download_from_supabase(supabase_bucket, file_name, local_path)
    # Perform any necessary local computations here
    upload_to_r2(local_path, r2_bucket, file_name)

# Run the migration for each file
files_to_migrate = [...]  # List of files to migrate
for file in files_to_migrate:
    migrate_file('supabase-bucket-name', 'r2-bucket-name', file)
This script provides a basic framework for the migration process. You'd need to customize it with your specific bucket names, file lists, and any local processing you need to perform.
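After the loop completes, it's also worth confirming that everything actually made it into R2 before removing anything from Supabase. Here's a rough sketch of one way to do that check, reusing the boto3 client and file list from above; the paginator call is standard boto3, but the rest is an illustration rather than part of my actual script:

# Sanity check: compare the objects now in R2 against the list we migrated
def verify_migration(r2_bucket, expected_files):
    paginator = s3.get_paginator('list_objects_v2')
    uploaded = set()
    for page in paginator.paginate(Bucket=r2_bucket):
        for obj in page.get('Contents', []):
            uploaded.add(obj['Key'])
    missing = [name for name in expected_files if name not in uploaded]
    if missing:
        print(f"{len(missing)} files are missing from R2: {missing}")
    else:
        print(f"All {len(expected_files)} files are present in R2")

verify_migration('r2-bucket-name', files_to_migrate)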
Conclusion
Migrating from Supabase to Cloudflare R2 allowed me to overcome the storage limitations I was facing, while keeping costs low and maintaining high performance. If you're in a similar situation, consider exploring Cloudflare R2 as a potential solution.