
Storage Overview

ThinkEx supports two storage backends for handling file uploads (PDFs, images, and other media):

  • Local File Storage: store files directly on your server’s filesystem in the ./uploads directory
  • Supabase Storage: store files in Supabase cloud storage with CDN delivery

Local File Storage

Recommended for self-hosting: simple setup with complete data control.

Features

  • Files stored in ./uploads/ directory relative to the application root
  • No external dependencies or API keys required
  • Full control over your data and storage location
  • Simple backup and migration (just copy the directory)
  • No bandwidth or storage limits beyond your server
  • Ideal for single-server deployments

Configuration

1. Set Storage Type

In your .env file, set:
STORAGE_TYPE=local
The setup script automatically configures this for local development.
2. Verify Upload Directory

The ./uploads/ directory is created automatically when the first file is uploaded.

Permissions: ensure the Next.js process has write access:
mkdir -p ./uploads
chmod 755 ./uploads
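A quick way to confirm the permissions are correct is to test writability as the user that runs Next.js (a minimal sketch; adjust the path if you relocate the uploads directory):

```shell
# Sanity check: confirm the uploads directory accepts writes
# (run this as the same user as the Next.js process)
mkdir -p ./uploads
if [ -w ./uploads ]; then
  echo "uploads directory is writable"
else
  echo "fix permissions on ./uploads" >&2
fi
```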
3. Configure File Serving

Files are served via Next.js API routes at /api/uploads/[filename]. No additional web server configuration is needed.

File Organization

Files are stored with unique identifiers to prevent conflicts:
thinkex/
└── uploads/
    ├── abc123-document.pdf
    ├── def456-image.png
    └── ghi789-screenshot.jpg
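The exact prefix scheme is internal to the application; purely as an illustration, a short random hex prefix like the ones in the listing above can be produced with:

```shell
# Illustration only: a 6-character random hex prefix plus the original name
# (the application generates these internally; this scheme is an assumption)
prefix=$(openssl rand -hex 3)
echo "${prefix}-document.pdf"
```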

Backup and Migration

# Create timestamped backup
tar -czf uploads-backup-$(date +%Y%m%d).tar.gz ./uploads/

# Or copy to backup location
rsync -av ./uploads/ /backup/thinkex-uploads/
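To automate this, a cron entry can run the timestamped backup nightly. The paths below are example assumptions; adjust them for your deployment:

```shell
# Example crontab entry: nightly backup at 03:00
# (assumes the app lives in /srv/thinkex and backups go to /backup)
0 3 * * * tar -czf /backup/uploads-$(date +\%Y\%m\%d).tar.gz -C /srv/thinkex uploads
```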

Disk Space Management

# Check upload directory size
du -sh ./uploads/

# List largest files
du -h ./uploads/* | sort -rh | head -20

# Check available disk space
df -h .
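These checks can be combined into a simple size-budget alert; the 1024 MB limit below is an example value, not a ThinkEx default:

```shell
# Sketch: warn when uploads/ grows past a size budget
LIMIT_MB=1024   # example threshold; pick a value that fits your server
mkdir -p ./uploads
used_mb=$(du -sm ./uploads | cut -f1)
if [ "$used_mb" -gt "$LIMIT_MB" ]; then
  echo "uploads/ is ${used_mb}MB, over the ${LIMIT_MB}MB budget" >&2
fi
```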
Files that are no longer referenced in the database can be identified and removed:
# This is a conceptual example - implement based on your needs
# Compare files in uploads/ against database records
# Remove files not found in database
Always back up before deleting files, and test cleanup scripts thoroughly.
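One way to sketch that comparison is with `comm` over two sorted file lists. The demo data below is hard-coded; in practice the referenced list would be exported from your database (the table and export method are assumptions):

```shell
# Sketch: list files on disk that no database record references.
# Demo data is created inline; in practice, export referenced filenames
# from your database (one per line) instead.
mkdir -p demo-uploads
touch demo-uploads/abc123-document.pdf demo-uploads/def456-image.png demo-uploads/zzz999-orphan.tmp
printf 'abc123-document.pdf\ndef456-image.png\n' | sort > referenced.txt
ls -1 demo-uploads | sort > on-disk.txt
comm -23 on-disk.txt referenced.txt   # on disk but not referenced
```

Review the resulting list carefully before removing anything.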

Supabase Storage

Recommended for distributed deployments: cloud-based storage with CDN delivery.

Features

  • Cloud storage with global CDN
  • Automatic scaling and redundancy
  • No local disk space requirements
  • Suitable for multi-server deployments
  • Built-in image transformations and optimizations
  • Pay-as-you-go pricing

Prerequisites

  1. A Supabase account (free tier available)
  2. A Supabase project created
  3. Storage bucket configured

Configuration

1. Create Supabase Project

  1. Sign up at supabase.com
  2. Create a new project
  3. Wait for project initialization (1-2 minutes)
2. Create Storage Bucket

In your Supabase project dashboard:
  1. Navigate to Storage in the sidebar
  2. Click New bucket
  3. Set bucket name: file-upload
  4. Set bucket to Public
  5. Click Create bucket
The bucket name must be file-upload and it must be set to Public for ThinkEx to access files correctly.
3. Get API Credentials

In your Supabase dashboard, go to Project Settings → API:
  1. Copy Project URL (e.g., https://yourproject.supabase.co)
  2. Copy anon public key from “Project API keys”
  3. Copy service_role secret key
The service_role key grants full admin access to your project. Store it securely.
4. Configure Environment Variables

In your .env file:
# Storage Type
STORAGE_TYPE=supabase

# Supabase Configuration
NEXT_PUBLIC_SUPABASE_URL=https://yourproject.supabase.co
NEXT_PUBLIC_SUPABASE_PUBLISHABLE_OR_ANON_KEY=eyJ...your-anon-key...
SUPABASE_SERVICE_ROLE_KEY=eyJ...your-service-role-key...
Keep SUPABASE_SERVICE_ROLE_KEY secret. Never commit to version control or expose in client code.
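One small guard against committing credentials is to make sure .env is ignored by git (a sketch; assumes you run it at the repository root):

```shell
# Ensure .env is listed in .gitignore (append it if missing)
touch .gitignore
grep -qxF '.env' .gitignore || echo '.env' >> .gitignore
```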
5. Restart Application

# Development
pnpm dev

# Production
pnpm build && pnpm start

Bucket Configuration

The storage bucket settings in Supabase:
| Setting            | Value           | Description                       |
|--------------------|-----------------|-----------------------------------|
| Name               | file-upload     | Required bucket name              |
| Public             | ✅ Enabled      | Files must be publicly accessible |
| File size limit    | 50 MB (default) | Adjust based on needs             |
| Allowed MIME types | All (default)   | Or restrict to specific types     |

Storage Policies (RLS)

For production, configure Row Level Security policies in Supabase:
-- Allow anyone to read files
CREATE POLICY "Public Access"
ON storage.objects FOR SELECT
USING (bucket_id = 'file-upload');

-- Optionally allow authenticated client-side uploads
-- (uploads made with the service_role key bypass RLS)
CREATE POLICY "Authenticated Upload"
ON storage.objects FOR INSERT
WITH CHECK (bucket_id = 'file-upload' AND auth.role() = 'authenticated');

Migration Between Storage Types

Migrate existing local files to Supabase:
# Run the Supabase CLI via npx (or install it with Homebrew:
# brew install supabase/tap/supabase)
npx supabase login
npx supabase link --project-ref your-project-ref

# Copy local files into the bucket (storage commands are experimental;
# check `npx supabase storage --help` for the current syntax)
npx supabase storage cp -r ./uploads ss:///file-upload --experimental
Update database records to reference new Supabase URLs after migration.

Comparison

When to Use Local Storage

Best for:
  • Single-server deployments
  • Full data ownership required
  • Predictable storage costs
  • Offline or air-gapped environments
  • Simple backup requirements
Considerations:
  • Server disk space limits
  • No CDN (slower global access)
  • Manual backup management

When to Use Supabase

Best for:
  • Multi-server deployments
  • Global user base (CDN benefits)
  • Scalable storage needs
  • Managed backups and redundancy
Considerations:
  • Requires internet connectivity
  • Third-party dependency
  • Usage-based costs
  • API rate limits

Troubleshooting

Problem: Cannot write to ./uploads/ directory

Solution:
# Check directory permissions
ls -ld ./uploads/

# Fix permissions
chmod 755 ./uploads/

# Ensure ownership (replace 'user' with your app user)
chown -R user:user ./uploads/
Problem: Uploaded files return 404 errors

Solution:
  • Verify files exist: ls ./uploads/
  • Check STORAGE_TYPE=local in .env
  • Restart the Next.js server
  • Check file permissions: chmod 644 ./uploads/*
Problem: “Bucket not found” or “Access denied” errors

Solution:
  • Verify bucket name is exactly file-upload
  • Confirm bucket is set to Public
  • Check API credentials are correct
  • Verify NEXT_PUBLIC_SUPABASE_URL format: https://yourproject.supabase.co
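A quick way to rule out misconfiguration is to check that the Supabase variables are set and the URL has the expected shape. Demo values are used below; in practice they come from your .env:

```shell
# Sketch: fail fast when Supabase configuration is missing or malformed
# (demo values shown; real values come from your .env)
export NEXT_PUBLIC_SUPABASE_URL="https://yourproject.supabase.co"
export SUPABASE_SERVICE_ROLE_KEY="demo-key"
for v in NEXT_PUBLIC_SUPABASE_URL SUPABASE_SERVICE_ROLE_KEY; do
  [ -n "$(printenv "$v")" ] || { echo "missing: $v" >&2; exit 1; }
done
echo "$NEXT_PUBLIC_SUPABASE_URL" | grep -Eq '^https://[a-z0-9-]+\.supabase\.co$' \
  && echo "Supabase configuration looks complete"
```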
Problem: File uploads fail silently or with errors

Solution:
  • Check Supabase project is active (not paused)
  • Verify SUPABASE_SERVICE_ROLE_KEY is correct
  • Check file size is under bucket limit (default 50MB)
  • Review Supabase Storage logs in dashboard
  • Ensure RLS policies allow uploads (or disable RLS for testing)
Problem: Files lost after changing STORAGE_TYPE

Solution:
  • Files are not automatically migrated
  • Use migration scripts (see “Migration Between Storage Types” above)
  • Database records still reference old storage locations
  • Consider keeping both active during migration period

Security Best Practices

Storage Security Checklist:
  • ✅ Never commit .env with Supabase credentials
  • ✅ Restrict SUPABASE_SERVICE_ROLE_KEY to server-side code only
  • ✅ Set appropriate file size limits (prevent abuse)
  • ✅ Validate file types before upload (prevent malicious files)
  • ✅ Use HTTPS for all file URLs in production
  • ✅ Regularly audit uploaded files for suspicious content
  • ✅ Implement rate limiting on upload endpoints
  • ✅ Back up storage regularly (local or Supabase)

Next Steps

  • Configuration: review all environment variables
  • Installation: back to the installation guide
