Sandboxed processors run your job-processing code in separate Node.js child processes, isolated from the main queue process. This architecture offers significant advantages for stability, responsiveness, and resource utilization.

Why Use Sandboxed Processors?

From the Bull README, sandboxed processors provide several key advantages:
  1. Process isolation: the processor is sandboxed, so if it crashes it does not affect the worker.
  2. Non-blocking code: you can run blocking code without affecting the queue (jobs will not stall).
  3. Multi-core utilization: much better use of multi-core CPUs.
  4. Fewer Redis connections: fewer connections to Redis overall.
Sandboxed processors are especially valuable for CPU-intensive tasks, unreliable code, or when you need to isolate job processing from queue management.

Basic Usage

Create a separate file for your processor:
// processor.js
module.exports = function (job) {
  // Do some heavy work
  console.log('Processing job:', job.id);
  
  // Perform CPU-intensive operation
  const result = heavyComputation(job.data);
  
  return Promise.resolve(result);
};
Reference the processor file when calling queue.process():
// main.js
const Queue = require('bull');
const queue = new Queue('my-queue');

// Single sandboxed process
queue.process('/path/to/processor.js');

// Add jobs as usual
await queue.add({ data: 'to-process' });

With Concurrency

Run multiple sandboxed processors concurrently:
// Process 5 jobs concurrently in separate processes
queue.process(5, '/path/to/processor.js');
Each concurrent slot runs in its own separate process, so concurrency of 5 means 5 child processes.

Named Processors

Use sandboxed processors for specific job types:
// main.js
const queue = new Queue('multi-task');

// Different processors for different job types
queue.process('encode-video', 2, '/path/to/video-processor.js');
queue.process('send-email', 10, '/path/to/email-processor.js');
queue.process('resize-image', 5, '/path/to/image-processor.js');

// Add named jobs
await queue.add('encode-video', { video: 'input.mp4' });
await queue.add('send-email', { to: 'user@example.com' });
await queue.add('resize-image', { image: 'photo.jpg' });

Processor File Structure

// processor.js
module.exports = function (job) {
  return new Promise((resolve, reject) => {
    // Process the job
    processData(job.data)
      .then(result => resolve(result))
      .catch(err => reject(err));
  });
};

Complete Example

Here’s a full example from the README:
// processor.js
module.exports = function (job) {
  // Do some heavy work
  return Promise.resolve(result);
};
// main.js
const Queue = require('bull');
const queue = new Queue('processing');

// Single process
queue.process('/path/to/my/processor.js');

// With concurrency
queue.process(5, '/path/to/my/processor.js');

// Named processors
queue.process('my processor', 5, '/path/to/my/processor.js');

// Add jobs
await queue.add({ data: 'value' });
await queue.add('my processor', { data: 'value' });

CPU-Intensive Tasks

Sandboxed processors shine with CPU-intensive operations:
// video-encoder.js
const ffmpeg = require('fluent-ffmpeg');

module.exports = async function (job) {
  const { inputPath, outputPath, format } = job.data;
  
  return new Promise((resolve, reject) => {
    ffmpeg(inputPath)
      .format(format)
      .on('progress', (progress) => {
        job.progress(progress.percent);
      })
      .on('end', () => resolve({ outputPath }))
      .on('error', (err) => reject(err))
      .save(outputPath);
  });
};
// main.js
const videoQueue = new Queue('video-encoding');

// Process 2 videos at a time in separate processes
videoQueue.process(2, '/path/to/video-encoder.js');

await videoQueue.add({
  inputPath: './videos/input.mp4',
  outputPath: './videos/output.webm',
  format: 'webm'
});

Error Handling

Sandboxed processors handle errors gracefully:
// processor.js
module.exports = async function (job) {
  try {
    const result = await riskyOperation(job.data);
    return result;
  } catch (error) {
    // Log error details
    await job.log(`Error: ${error.message}`);
    
    // Re-throw to mark job as failed
    throw error;
  }
};
If a sandboxed processor crashes completely (process exit), Bull will automatically mark the job as failed and restart the processor for the next job.
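In the main process, those failures surface through the queue's 'failed' event, which Bull emits with a `(job, err)` signature. The helper below is a sketch that works with any EventEmitter-style queue:
```javascript
// Sketch: log failures (including sandboxed-processor crashes)
// from the main process. `log` defaults to console.error.
function attachFailureLogging(queue, log = console.error) {
  queue.on('failed', (job, err) => {
    log(`Job ${job.id} failed: ${err.message}`);
  });
  return queue;
}

module.exports = { attachFailureLogging };
```
Calling `attachFailureLogging(queue)` in main.js gives you one place to record every failure, regardless of which child process produced it.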

Shared Dependencies

You can require shared modules in processor files:
// processor.js
const axios = require('axios');
const { processData } = require('./utils');
const config = require('./config');

module.exports = async function (job) {
  const response = await axios.get(job.data.url);
  return processData(response.data, config);
};
Each sandboxed processor loads its own copy of dependencies. This increases memory usage but ensures complete isolation.

Communication Between Processes

Sandboxed processors communicate through the job object:
// processor.js
module.exports = async function (job) {
  // Update job data
  await job.update({
    ...job.data,
    processedAt: new Date()
  });
  
  // Report progress
  await job.progress(50);
  
  // Add logs
  await job.log('Halfway done');
  
  // Return the updated job data as the result
  return job.data;
};
// main.js
queue.on('progress', (job, progress) => {
  console.log(`Job ${job.id} is ${progress}% complete`);
});

queue.on('completed', async (job, result) => {
  const logs = await queue.getJobLogs(job.id);
  console.log('Job logs:', logs);
});

Mixing Sandboxed and In-Process

You can mix sandboxed and in-process processors:
const queue = new Queue('mixed');

// Light tasks: in-process
queue.process('light', 10, async (job) => {
  return simpleOperation(job.data);
});

// Heavy tasks: sandboxed
queue.process('heavy', 2, '/path/to/heavy-processor.js');

Best Practices

Use sandboxed processors for:
  • CPU-intensive operations (video encoding, image processing)
  • Unreliable third-party code
  • Memory-intensive tasks
  • Long-running operations
Use in-process processors for:
  • Simple, quick operations
  • Jobs requiring shared state
  • High-throughput, low-CPU tasks
Memory considerations: Each sandboxed process has its own memory space. Monitor memory usage when running many concurrent sandboxed processors.
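One way to budget that memory is to cap concurrency by an estimated per-process footprint. `maxSandboxes` is a hypothetical helper for this back-of-the-envelope calculation, not a Bull feature:
```javascript
// Hypothetical helper: derive a concurrency cap from a memory budget.
// perProcessMB is your estimate of one child process's footprint.
function maxSandboxes(perProcessMB, budgetMB) {
  return Math.max(1, Math.floor(budgetMB / perProcessMB));
}

module.exports = { maxSandboxes };
```
For example, with roughly 150 MB per child and a 1 GB budget you would pass `maxSandboxes(150, 1024)` as the concurrency argument to `queue.process()`.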

Debugging Sandboxed Processors

Debugging child processes requires special setup:
// Launch the main process with the NODE_DEBUG environment
// variable already set (it is read at startup), e.g.:
//   NODE_DEBUG=bull node main.js

const queue = new Queue('debug');
queue.process('/path/to/processor.js');

// Child process logs will appear in the console
For more advanced debugging:
// processor.js
module.exports = async function (job) {
  // Console logs appear in parent process output
  console.log('Processing job:', job.id);
  
  // Use job.log for persistent logs
  await job.log('Debug info: ' + JSON.stringify(job.data));
  
  return job.data;
};

Performance Comparison

// In-process (blocking)
queue.process(async (job) => {
  // Blocks event loop for 5 seconds
  const result = cpuIntensiveSync(job.data);
  return result;
});
// Other jobs must wait!

// Sandboxed (non-blocking)
queue.process('/path/to/cpu-intensive.js');
// Queue continues processing other operations
Sandboxed processors allow the main queue process to continue managing jobs, scheduling, and handling events while heavy processing happens in child processes.

Graceful Shutdown

const queue = new Queue('graceful');
queue.process(5, '/path/to/processor.js');

process.on('SIGTERM', async () => {
  console.log('Shutting down...');
  
  // close() stops accepting new jobs and waits for active jobs,
  // including those running in child processes, to finish
  await queue.close();
  
  process.exit(0);
});
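If you also want a hard deadline on shutdown, you can race `queue.close()` against a timer. This is a sketch under the assumption that `close()` resolves once active jobs have finished:
```javascript
// Sketch: close the queue, but give up after `ms` milliseconds.
function closeWithTimeout(queue, ms) {
  const timeout = new Promise((resolve) => {
    const t = setTimeout(() => resolve('timed out'), ms);
    // unref() so the timer alone does not keep the process alive
    if (t.unref) t.unref();
  });
  return Promise.race([
    queue.close().then(() => 'closed'),
    timeout,
  ]);
}

module.exports = { closeWithTimeout };
```
In the SIGTERM handler above you would `await closeWithTimeout(queue, 30000)` before calling `process.exit(0)`, so a stuck job cannot block shutdown indefinitely.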
