Deployment options

Typeset can be deployed to any platform that supports Next.js applications:
  • Vercel (recommended) - Zero-config deployment with automatic scaling
  • Docker - Containerized deployment for any cloud provider
  • Node.js server - Traditional server deployment
  • AWS, GCP, Azure - Cloud platform deployments
This guide covers Vercel (recommended), Docker, and traditional Node.js server deployments.

Deploy to Vercel

Vercel is the recommended deployment platform for Typeset. It provides:
  • Automatic HTTPS certificates
  • Global CDN for static assets
  • Serverless API routes with auto-scaling
  • Zero-config deployment
  • Built-in analytics and monitoring
Step 1: Install Vercel CLI

npm install -g vercel
Or run the copy installed as a project dependency:
pnpm vercel
Step 2: Log in to Vercel

vercel login
Follow the prompts to authenticate with your Vercel account.
Step 3: Deploy

From your project directory:
vercel
This will:
  1. Build your Next.js application
  2. Upload the build to Vercel
  3. Provide a preview URL
For production deployment:
vercel --prod
Step 4: Configure environment variables

Add environment variables in the Vercel dashboard:
  1. Go to your project settings
  2. Navigate to Environment Variables
  3. Add each variable:
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_live_...
CLERK_SECRET_KEY=sk_live_...
LIVEBLOCKS_SECRET_KEY=sk_...
NEXT_PUBLIC_LIVEBLOCKS_PUBLIC_KEY=pk_...
OPENAI_API_KEY=sk-proj-...
GOOGLE_API_KEY=AIza...
Use production API keys (not test keys) for your production deployment.
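The dashboard flow above can also be driven from the terminal. As a sketch (assuming a local .env.production file; `vercel env add` is the real CLI command and prompts for each value), this helper prints one command per key for review:

```shell
# Sketch: print a `vercel env add` command for each variable in an env file.
# Nothing is uploaded; the loop only generates commands to review and run.
list_env_cmds() {
  grep -E '^[A-Za-z_][A-Za-z0-9_]*=' "$1" | cut -d= -f1 \
    | while read -r key; do echo "vercel env add $key production"; done
}

if [ -f .env.production ]; then
  list_env_cmds .env.production
fi
```

Each printed command, when run, prompts interactively for the secret value, so secrets never land in your shell history.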
Step 5: Configure custom domain

In the Vercel dashboard:
  1. Go to Settings → Domains
  2. Add your custom domain
  3. Configure DNS records as instructed
  4. Wait for SSL certificate provisioning (automatic)

Vercel deployment via GitHub

For continuous deployment:
Step 1: Push to GitHub

Push your Typeset repository to GitHub:
git remote add origin https://github.com/yourusername/typeset.git
git push -u origin main
Step 2: Import to Vercel

  1. Go to vercel.com/new
  2. Click Import Git Repository
  3. Select your repository
  4. Configure project settings (automatic for Next.js)
  5. Add environment variables
  6. Click Deploy
Step 3: Automatic deployments

Vercel will automatically deploy:
  • Production: Commits to main branch
  • Preview: Commits to other branches and pull requests

Vercel configuration

Create vercel.json in your project root for advanced configuration:
vercel.json
{
  "buildCommand": "pnpm build",
  "devCommand": "pnpm dev",
  "installCommand": "pnpm install",
  "framework": "nextjs",
  "regions": ["iad1"],
  "env": {
    "NEXT_PUBLIC_APP_URL": "https://your-domain.com"
  }
}
Configuration options:
  • regions - Deployment regions (e.g., iad1 for US East)
  • buildCommand - Custom build command
  • installCommand - Custom package installation
  • functions - Serverless function configuration
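For example, `functions` can raise the execution limit for a long-running route such as LaTeX compilation. A sketch (the route path is a guess — point it at your actual compile route; durations above 10 seconds require a Pro plan or higher, per the limits below):

```json
{
  "functions": {
    "app/api/compile/route.ts": {
      "maxDuration": 60
    }
  }
}
```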

Tectonic binary on Vercel

Vercel’s serverless functions have a 50 MB deployment size limit. The Tectonic binary (~35 MB) fits within this limit.
The Tectonic binary must be included in your deployment:
  1. Ensure bin/tectonic is committed to your repository
  2. Verify the binary is executable:
    chmod +x bin/tectonic
    git add bin/tectonic
    git commit -m "Add Tectonic binary"
    
  3. The binary will be included in the deployment automatically
Vercel limits:
  • Max deployment size: 50 MB (Typeset with Tectonic: ~40 MB)
  • Max function size: 50 MB
  • /tmp storage: 512 MB (for LaTeX package cache)
  • Function timeout: 10 seconds (Hobby), 60 seconds (Pro), 900 seconds (Enterprise)
LaTeX compilation may time out on the Vercel Hobby plan (10 seconds). Upgrade to Pro for 60-second function execution.
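A quick way to verify the size constraint before deploying (a sketch; assumes the bin/tectonic path used above):

```shell
# Sketch: check a file against Vercel's 50 MB serverless function size limit.
check_size() {
  limit=$((50 * 1024 * 1024))
  size=$(wc -c < "$1")
  if [ "$size" -lt "$limit" ]; then
    echo "ok: $1 is $size bytes"
  else
    echo "warning: $1 is $size bytes (over 50 MB)"
  fi
}

if [ -f bin/tectonic ]; then
  check_size bin/tectonic
fi
```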

Deploy with Docker

Deploy Typeset as a containerized application:
Step 1: Create Dockerfile

Create Dockerfile in your project root:
Dockerfile
FROM node:20-alpine AS base

# Install dependencies only when needed
FROM base AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app

# Install pnpm
RUN npm install -g pnpm

# Copy package files
COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
RUN pnpm install --frozen-lockfile

# Builder stage
FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .

# Build Next.js application
RUN npm install -g pnpm && pnpm build

# Production stage
FROM base AS runner
WORKDIR /app

ENV NODE_ENV=production

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

# Copy built application
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
COPY --from=builder /app/bin ./bin

# Ensure Tectonic is executable
RUN chmod +x ./bin/tectonic

USER nextjs

EXPOSE 3000

ENV PORT=3000
ENV HOSTNAME="0.0.0.0"

CMD ["node", "server.js"]
Step 2: Update next.config.ts

Enable standalone output for Docker:
next.config.ts
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  output: 'standalone',
  pageExtensions: ["js", "jsx", "md", "mdx", "ts", "tsx"],
  // ... rest of config
};

export default nextConfig;
Step 3: Create .dockerignore

.dockerignore
node_modules
.next
.git
.env.local
.vercel
npm-debug.log
yarn-error.log
pnpm-debug.log
Step 4: Build Docker image

docker build -t typeset .
Step 5: Run container

docker run -p 3000:3000 \
  -e NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_live_... \
  -e CLERK_SECRET_KEY=sk_live_... \
  -e LIVEBLOCKS_SECRET_KEY=sk_... \
  -e NEXT_PUBLIC_LIVEBLOCKS_PUBLIC_KEY=pk_... \
  -e OPENAI_API_KEY=sk-proj-... \
  typeset
Or use an environment file:
docker run -p 3000:3000 --env-file .env.production typeset

Docker Compose

For easier container management:
docker-compose.yml
version: '3.8'

services:
  typeset:
    build: .
    ports:
      - "3000:3000"
    env_file:
      - .env.production
    restart: unless-stopped
    volumes:
      - /tmp/tectonic-cache:/tmp/cache
    environment:
      - NODE_ENV=production
Run with:
docker-compose up -d

Deploy Docker to cloud platforms

# Push to Amazon ECR
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin <account>.dkr.ecr.us-east-1.amazonaws.com

docker tag typeset:latest <account>.dkr.ecr.us-east-1.amazonaws.com/typeset:latest
docker push <account>.dkr.ecr.us-east-1.amazonaws.com/typeset:latest

# Deploy to ECS (configure task definition with environment variables)

Deploy to traditional Node.js server

For deployment on a VPS or dedicated server:
Step 1: Build the application

pnpm build
Step 2: Copy files to server

Transfer these files to your server:
  • .next/ - Built application
  • public/ - Static assets
  • bin/ - Tectonic binary
  • package.json - Dependencies
  • node_modules/ - Or run pnpm install --prod on the server
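One way to stage the transfer is to pack those paths into a single archive (a sketch; run after `pnpm build`, and the server host and path in the comments are placeholders):

```shell
# Sketch: bundle the build artifacts listed above into one archive.
bundle_build() {
  tar czf "$1" .next public bin package.json
}

# Example (placeholder host and path):
# bundle_build typeset-build.tar.gz
# scp typeset-build.tar.gz deploy@your-server:/var/www/typeset/
```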
Step 3: Set environment variables

Create /etc/typeset/.env.production:
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_live_...
CLERK_SECRET_KEY=sk_live_...
LIVEBLOCKS_SECRET_KEY=sk_...
NEXT_PUBLIC_LIVEBLOCKS_PUBLIC_KEY=pk_...
OPENAI_API_KEY=sk-proj-...
Step 4: Create systemd service

Create /etc/systemd/system/typeset.service:
[Unit]
Description=Typeset LaTeX Editor
After=network.target

[Service]
Type=simple
User=typeset
WorkingDirectory=/var/www/typeset
EnvironmentFile=/etc/typeset/.env.production
ExecStart=/usr/bin/node /var/www/typeset/.next/standalone/server.js
Restart=always

[Install]
WantedBy=multi-user.target
Enable and start:
sudo systemctl enable typeset
sudo systemctl start typeset
Step 5: Configure Nginx reverse proxy

Create /etc/nginx/sites-available/typeset:
server {
    listen 80;
    server_name your-domain.com;
    
    # Redirect to HTTPS
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl http2;
    server_name your-domain.com;
    
    ssl_certificate /etc/letsencrypt/live/your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-domain.com/privkey.pem;
    
    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
    
    # WebSocket support for Liveblocks
    location /_liveblocks/ {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_set_header Host $host;
    }
}
Enable the site:
sudo ln -s /etc/nginx/sites-available/typeset /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx

Production checklist

Before going live, verify:
1. Security

  • HTTPS enabled with valid SSL certificate
  • Environment variables are secured (not in source control)
  • Production API keys configured (not test keys)
  • Clerk webhook signing secrets configured
  • CORS properly configured if using separate API domain
2. Performance

  • Next.js production build completed successfully
  • Static assets served from CDN
  • Tectonic binary is executable and cached
  • LaTeX package cache directory has sufficient space
  • Database connection pooling configured (if applicable)
3. Monitoring

  • Error tracking configured (e.g., Sentry)
  • Log aggregation set up
  • Uptime monitoring enabled
  • API usage monitoring for third-party services
  • Disk space alerts for LaTeX cache
4. Testing

  • Authentication flow works end-to-end
  • LaTeX compilation succeeds
  • Real-time collaboration works
  • AI assistant responds correctly
  • PDF rendering works in all browsers
  • Mobile responsiveness verified
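A minimal smoke test can automate the first of these checks. This sketch assumes `curl` is available; the domain and endpoint list in the example are illustrative:

```shell
# Sketch: report the HTTP status of key endpoints after a deploy.
smoke() {
  base="$1"; shift
  for path in "$@"; do
    code=$(curl -s -o /dev/null -w '%{http_code}' "$base$path")
    echo "$path $code"
  done
}

# Example (hypothetical domain and routes):
# smoke https://your-domain.com / /api/liveblocks-auth
```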

Scaling considerations

Horizontal scaling

For high-traffic deployments:
  • Vercel - Serverless functions scale automatically; no additional configuration is needed.
  • Docker/Kubernetes - Scale horizontally by running multiple containers:
# Kubernetes deployment
apiVersion: apps/v1
kind: Deployment
metadata:
  name: typeset
spec:
  replicas: 3
  selector:
    matchLabels:
      app: typeset
  template:
    metadata:
      labels:
        app: typeset
    spec:
      containers:
      - name: typeset
        image: typeset:latest
        ports:
        - containerPort: 3000
        env:
        - name: CLERK_SECRET_KEY
          valueFrom:
            secretKeyRef:
              name: typeset-secrets
              key: clerk-secret

Performance optimization

LaTeX package caching

Share the Tectonic package cache across instances:
# Mount shared volume for cache
docker run -v /mnt/shared-cache:/tmp/cache typeset
Static asset CDN

Configure a CDN for public/ and .next/static/:
next.config.ts
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  assetPrefix: process.env.CDN_URL || '',
  // ...
};
Database optimization

If using a database (Neon, Supabase, etc.):
  • Use connection pooling
  • Enable read replicas for queries
  • Implement caching with Redis

Troubleshooting

Build fails on Vercel

  1. Check build logs in Vercel dashboard
  2. Verify all dependencies are in package.json
  3. Ensure Node.js version matches (20.x)
  4. Check for TypeScript errors: pnpm tsc --noEmit

Function timeout errors

  1. Upgrade to Vercel Pro for longer function execution
  2. Optimize LaTeX compilation (remove unnecessary packages)
  3. Implement compilation caching
  4. Pre-warm Tectonic cache with common packages

Tectonic binary not found

  1. Verify binary exists in bin/tectonic
  2. Check file permissions: chmod +x bin/tectonic
  3. Ensure binary is committed to Git
  4. Verify platform matches (Linux x86_64)
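The first two checks can be scripted (a sketch; run from the project root):

```shell
# Sketch: verify the Tectonic binary exists and is executable.
check_tectonic() {
  if [ -x "$1" ]; then
    echo "ok: $1 is executable"
  elif [ -f "$1" ]; then
    echo "fix: run chmod +x $1"
  else
    echo "missing: $1"
  fi
}

check_tectonic bin/tectonic
```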

Collaboration not working

  1. Check Liveblocks API key is correct
  2. Verify WebSocket connections are allowed (check firewall)
  3. Test /api/liveblocks-auth endpoint
  4. Check browser console for errors

High memory usage

  1. Monitor LaTeX package cache size
  2. Implement cache cleanup for old packages
  3. Limit concurrent compilations
  4. Use Vercel Pro for larger memory limits
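Cache cleanup (point 2) can be a simple scheduled job. A sketch, assuming the cache lives under /tmp/cache as in the Docker Compose example:

```shell
# Sketch: delete cached files not modified within the given number of days.
prune_cache() {
  if [ -d "$1" ]; then
    find "$1" -type f -mtime "+$2" -delete
  fi
}

prune_cache /tmp/cache 30
```

Run it from cron (e.g. daily) so the cache never grows unbounded.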

Next steps

After deployment:
  1. Monitor application performance and errors
  2. Set up automated backups (if using database)
  3. Configure CI/CD for automatic deployments
  4. Implement rate limiting for API routes
  5. Set up staging environment for testing
Join the Typeset community for deployment support and best practices.
