The Django backend defaults to local filesystem storage (media/). Supabase Storage is the recommended cloud replacement for production: it provides a global CDN, Row Level Security (RLS), and a REST API that can be wired into Django's Storage interface through a custom backend.

What uses storage

The following Django model fields write files to storage:
Model                 Field            Upload path             Visibility
UserProfile           profile_picture  profile_pics/           Private
Species               img              species/                Public
ExhibicionImage       image            exhibitions/            Public
PurchaseOrders        qr_image         qr_codes/               Public
Documents             document         documentos/             Private
ServiciosEducativos   image            servicios-educativos/   Public
ProgramaEducativo     image            programas/              Public

Environment variables

.env (backend)
SUPABASE_URL=https://<project-ref>.supabase.co
SUPABASE_API_KEY=<anon-or-service-key>
SUPABASE_SERVICE_KEY=<service-role-key>
SUPABASE_STORAGE_BUCKET=media-files
ENVIRONMENT=production
Use the service role key (SUPABASE_SERVICE_KEY) only on the server — never expose it to the browser or commit it to source control. The anon key is safe for public bucket reads.

Installing dependencies

pip install supabase==2.3.4 requests python-magic Pillow
# or use the requirements file included in the backend
pip install -r requirements_supabase.txt

Django settings

Use local storage in development and Supabase in production:
config/settings.py
import os

if os.getenv('ENVIRONMENT') == 'production':
    DEFAULT_FILE_STORAGE = 'config.supabase_storage.SupabaseStorage'
    MEDIA_URL = (
        f"{os.getenv('SUPABASE_URL')}/storage/v1/object/public/"
        f"{os.getenv('SUPABASE_STORAGE_BUCKET', 'media-files')}/"
    )
else:
    DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
    MEDIA_URL = '/media/'
    MEDIA_ROOT = BASE_DIR / 'media'

SUPABASE_URL = os.getenv('SUPABASE_URL')
SUPABASE_API_KEY = os.getenv('SUPABASE_API_KEY')
SUPABASE_STORAGE_BUCKET = os.getenv('SUPABASE_STORAGE_BUCKET', 'media-files')

django-storage-supabase (alternative)

The django-storage-supabase package provides a drop-in DEFAULT_FILE_STORAGE backend:
pip install django-storage-supabase
config/settings.py
DEFAULT_FILE_STORAGE = 'django_storage_supabase.supabase'
SUPABASE_API_KEY = os.environ.get('SUPABASE_API_KEY')
SUPABASE_URL = os.environ.get('SUPABASE_URL')
SUPABASE_ROOT_PATH = '/media/'
django-storage-supabase is currently in alpha. For production use, the custom SupabaseStorage class at config/supabase_storage.py is more reliable.

Custom SupabaseStorage class

The backend includes a hand-rolled SupabaseStorage class that implements the full Django Storage interface:
config/supabase_storage.py
from supabase import create_client, Client
from django.conf import settings
from django.core.files.storage import Storage
from django.core.files.base import ContentFile

class SupabaseStorage(Storage):
    def __init__(self):
        self.supabase: Client = create_client(
            settings.SUPABASE_URL,
            settings.SUPABASE_API_KEY
        )
        self.bucket_name = settings.SUPABASE_STORAGE_BUCKET

    def _save(self, name, content):
        try:
            self.supabase.storage.from_(self.bucket_name).upload(
                name, content.read()
            )
            return name
        except Exception as e:
            raise IOError(f"Error uploading file: {e}")

    def _open(self, name, mode='rb'):
        try:
            response = self.supabase.storage.from_(self.bucket_name).download(name)
            return ContentFile(response)
        except Exception as e:
            raise IOError(f"Error downloading file: {e}")

    def delete(self, name):
        self.supabase.storage.from_(self.bucket_name).remove([name])

    def exists(self, name):
        try:
            # list() only returns direct children, so list the file's own
            # folder and compare against the bare filename
            folder, _, filename = name.rpartition('/')
            files = self.supabase.storage.from_(self.bucket_name).list(folder)
            return any(f['name'] == filename for f in files)
        except Exception:
            return False

    def url(self, name):
        # In supabase-py v2, get_public_url returns the URL as a plain string
        return self.supabase.storage.from_(self.bucket_name).get_public_url(name)

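Public object URLs follow a fixed convention, {SUPABASE_URL}/storage/v1/object/public/{bucket}/{path} (the same layout MEDIA_URL uses in the settings above). A sketch of building one without the client, useful for templates or debugging (the helper name is illustrative):

```python
def public_object_url(supabase_url: str, bucket: str, name: str) -> str:
    """Build the public URL for an object, mirroring the MEDIA_URL layout."""
    return (
        f"{supabase_url.rstrip('/')}/storage/v1/object/public/"
        f"{bucket}/{name.lstrip('/')}"
    )
```

This only yields a working link for objects in buckets marked Public; private buckets require a signed URL instead.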
Creating buckets

1. Open the Storage section

   In your Supabase dashboard, go to Storage → Buckets and click New bucket.

2. Create the required buckets

   Create the following buckets. Mark public buckets as Public so files are accessible without a signed URL:

   Bucket name           Access
   user-profiles         Private
   species               Public
   exhibitions           Public
   qr-codes              Public
   documents             Private
   educational-services  Public
   educational-programs  Public

3. Apply RLS policies

   Set Row Level Security policies (see Security policies below).

Uploading and retrieving files

Automatic — via Django models

No code changes are required in the models. Once DEFAULT_FILE_STORAGE is set, all ImageField and FileField saves route through Supabase automatically:
models.py
class Species(models.Model):
    name = models.CharField(max_length=100)
    img = models.ImageField(upload_to='species/')  # uploads to Supabase automatically

Manual — using the storage class directly

from config.supabase_storage import SupabaseStorage
from django.core.files.base import ContentFile

storage = SupabaseStorage()

# Upload
with open('photo.jpg', 'rb') as f:
    saved_name = storage.save('species/photo.jpg', ContentFile(f.read(), name='photo.jpg'))

# Get public URL
url = storage.url(saved_name)

# Check existence
if storage.exists(saved_name):
    print('File exists')

# Delete
storage.delete(saved_name)

Type-specific storage instances

from config.supabase_storage import (
    profile_pics_storage,
    species_storage,
    documents_storage
)

species_storage.save('species/new-fish.jpg', file_content)
documents_storage.save('docs/report.pdf', file_content)  # private bucket

Migrating existing local files

The management command migrate_to_supabase uploads all local media files and updates database references:
# Dry run — no changes made
python manage.py migrate_to_supabase --dry-run

# Migrate all models with a local backup
python manage.py migrate_to_supabase --backup

# Migrate a single model
python manage.py migrate_to_supabase --model Species --backup
The migration process:
  1. Creates a local backup in media_backup/
  2. Scans all models with file fields
  3. Uploads each file to Supabase, preserving directory structure
  4. Verifies each upload
  5. Updates the database reference to the new Supabase path
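Step 3 hinges on keeping each file's path relative to MEDIA_ROOT so the object key in the bucket matches the database reference. A sketch of that scan, under the assumption that object keys always use forward slashes (the helper name is illustrative):

```python
import os

def iter_media_files(media_root):
    """Yield (absolute_path, object_key) pairs for every file under media_root.

    object_key preserves the directory structure relative to media_root and
    uses forward slashes, matching the keys stored in the bucket.
    """
    for dirpath, _dirnames, filenames in os.walk(media_root):
        for filename in filenames:
            abs_path = os.path.join(dirpath, filename)
            key = os.path.relpath(abs_path, media_root).replace(os.sep, "/")
            yield abs_path, key
```

A file at media/species/fish.jpg thus uploads under the key species/fish.jpg, which is exactly what an ImageField with upload_to='species/' already stores in the database.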

Security policies

Apply these RLS policies in the Supabase SQL editor (SQL Editor → New query):
-- Profile pictures: only the owning user can read or upload
CREATE POLICY "Users can view own profile pictures" ON storage.objects
FOR SELECT USING (
    bucket_id = 'user-profiles'
    AND auth.uid()::text = (storage.foldername(name))[1]
);

CREATE POLICY "Users can upload own profile pictures" ON storage.objects
FOR INSERT WITH CHECK (
    bucket_id = 'user-profiles'
    AND auth.uid()::text = (storage.foldername(name))[1]
);

-- Public content: species, exhibitions, educational services
CREATE POLICY "Public content is viewable by everyone" ON storage.objects
FOR SELECT USING (
    bucket_id IN ('species', 'exhibitions', 'educational-services')
);

-- Documents: authenticated users only
CREATE POLICY "Authenticated users can view documents" ON storage.objects
FOR SELECT USING (
    bucket_id = 'documents'
    AND auth.role() = 'authenticated'
);

-- Private per-user documents
CREATE POLICY "Users can access own documents" ON storage.objects
FOR ALL USING (
    bucket_id = 'documents'
    AND auth.uid()::text = (storage.foldername(name))[1]
);
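The per-user policies compare auth.uid() to the first folder of the object path, so uploads must be keyed as <user-id>/<filename>. A sketch of enforcing that convention on the Django side (both helper names are illustrative):

```python
def user_object_path(user_id, filename):
    """Prefix the object key with the user's id so that
    (storage.foldername(name))[1] in the RLS policy resolves to that id."""
    return f"{user_id}/{filename}"

def owns_object(user_id, name):
    """Mirror the policy check client-side: the first folder must equal the id."""
    return name.split("/", 1)[0] == str(user_id)
```

If files are uploaded under any other layout (for example a flat profile_pics/ folder), the per-user SELECT and INSERT policies above will silently deny access.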

File validation

The SupabaseStorage class validates uploads before sending them to Supabase:
  • MIME type check — rejects files that do not match the expected content type
  • Size limit — enforces a maximum file size per type
  • Filename sanitisation — strips dangerous characters from file names
  • Path traversal prevention — blocks ../ sequences in upload paths
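The last two checks are pure string logic and can be sketched independently of Supabase. The allow-list, size limit, and function names below are illustrative assumptions, not the backend's exact rules:

```python
import posixpath
import re

ALLOWED_CHARS = re.compile(r"[^A-Za-z0-9._-]")

def sanitize_filename(filename):
    """Drop any path components, then replace characters outside a safe allow-list."""
    basename = posixpath.basename(filename.replace("\\", "/"))
    return ALLOWED_CHARS.sub("_", basename)

def is_safe_path(name):
    """Reject absolute paths and any '..' traversal segments."""
    if name.startswith("/"):
        return False
    return ".." not in name.split("/")
```

Sanitising the filename and validating the full upload path are separate steps: the first neutralises a hostile client-supplied name, the second guards the key the storage class is about to write.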

Debugging

Enable verbose storage logs in settings.py:
config/settings.py
LOGGING = {
    'version': 1,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'loggers': {
        'supabase_storage': {
            'handlers': ['console'],
            'level': 'DEBUG',
        },
    },
}

Common errors

  • Invalid API key — Verify that SUPABASE_API_KEY in .env is correct. For private bucket operations you may need the service role key (SUPABASE_SERVICE_KEY) instead of the anon key.
  • Bucket not found — Check that the bucket name in SUPABASE_STORAGE_BUCKET matches exactly what you created in the dashboard (case-sensitive).
  • Access denied — The bucket is set to private or an RLS policy is blocking access. Either make the bucket public or add a SELECT policy for the relevant role.
  • Migration is slow or stalls — Use --model to migrate one model at a time, or check your network connection. Large MEDIA_ROOT directories with thousands of files will take several minutes.

Running the storage tests

# Run the full Supabase storage test suite
python manage.py test tests.test_supabase_storage

# Run a specific test case
python manage.py test tests.test_supabase_storage.SupabaseStorageTestCase.test_save_file_success
Test coverage includes upload, download, delete, URL generation, file existence checks, path traversal prevention, and bulk upload performance.
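The suite does not need a live Supabase project: a MagicMock can stand in for the client, since every storage call goes through the client.storage.from_(bucket) chain used above. A minimal sketch of that mocking pattern:

```python
from unittest import mock

# Fake client whose call chain mirrors client.storage.from_(bucket).upload(...)
fake_client = mock.MagicMock()

# Code under test would do this via SupabaseStorage._save
bucket = fake_client.storage.from_("media-files")
bucket.upload("species/fish.jpg", b"image-bytes")

# Assert the expected bucket and upload call were made
fake_client.storage.from_.assert_called_with("media-files")
bucket.upload.assert_called_once_with("species/fish.jpg", b"image-bytes")
```

In the real tests you would patch create_client (e.g. with mock.patch) so that instantiating SupabaseStorage receives the fake client instead of opening a network connection.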
