Overview

@apisr/logger provides a powerful, flexible logging system with:
  • Transport-based architecture for multiple output destinations
  • Structured logging with JSON and pretty-print modes
  • Auto-flush support for reliable log delivery
  • Multiple log levels (debug, info, warn, error)
  • Extensible transports for custom logging targets
  • Type-safe logging with TypeScript support

Installation

npm install @apisr/logger

Quick Start

Create a Logger

import { createLogger } from "@apisr/logger";
import { createConsole } from "@apisr/logger/console";

const logger = createLogger({
  name: "MyApp",
  transports: {
    console: createConsole({ mode: "pretty" }),
  },
});

// Use the logger
logger.info("Application started");
logger.warn("This is a warning", { userId: "123" });
logger.error("An error occurred", { error: "Database connection failed" });
logger.debug("Debug information", { query: "SELECT * FROM users" });

Console Transport Modes

The console transport supports two modes: "pretty" for human-readable development output and "json" for machine-readable structured output.
const logger = createLogger({
  name: "MyApp",
  transports: {
    console: createConsole({ mode: "pretty" }),
  },
});

logger.info("User logged in", { userId: "123", email: "[email protected]" });
Output:
[src/index.ts:10] INFO  User logged in
│ {
│   "userId": "123",
│   "email": "[email protected]"
│ }
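In "json" mode, each entry is presumably emitted as a single JSON document rather than the pretty layout above; the exact field names below are an assumption, not the library's documented output. A sketch of what building such a line involves:

```typescript
// Sketch of a "json"-mode log line. Field names (timestamp, level,
// message, data) are assumed, not taken from @apisr/logger's output.
function toJsonLine(
  level: string,
  message: string,
  data?: Record<string, unknown>,
): string {
  return JSON.stringify({ timestamp: Date.now(), level, message, data });
}

console.log(toJsonLine("info", "User logged in", { userId: "123" }));
```

Single-line JSON like this is what log collectors typically expect, which is why "json" mode is the usual choice in production.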

Core Concepts

Log Levels

  • debug: (message: string, data?: Record<string, unknown>) => void
    Debug-level logging for detailed diagnostic information
  • info: (message: string, data?: Record<string, unknown>) => void
    Info-level logging for general application events
  • warn: (message: string, data?: Record<string, unknown>) => void
    Warning-level logging for potentially problematic conditions
  • error: (message: string | Error, data?: Record<string, unknown>) => void
    Error-level logging; accepts a message string or an Error object

Structured Logging

Attach structured data to your logs:
logger.info("User created", {
  userId: user.id,
  email: user.email,
  role: user.role,
  createdAt: new Date().toISOString(),
});

logger.error("Database query failed", {
  query: "SELECT * FROM users",
  duration: 1234,
  error: err.message,
});

Error Logging

Log Error objects with full stack traces:
try {
  await riskyOperation();
} catch (err) {
  // Pass Error object directly
  logger.error(err, { context: "riskyOperation" });
}
The Error’s stack trace and name are automatically included in the log data.
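One way such serialization could work is a sketch like the following; the exact field names @apisr/logger uses internally are an assumption.

```typescript
// Sketch of folding an Error into structured log data. The shape of the
// "error" field is assumed, not @apisr/logger's actual internal format.
function serializeError(
  err: Error,
  data: Record<string, unknown> = {},
): Record<string, unknown> {
  return {
    ...data,
    error: { name: err.name, message: err.message, stack: err.stack },
  };
}

const entry = serializeError(new TypeError("bad input"), {
  context: "riskyOperation",
});
```

Capturing name, message, and stack explicitly matters because JSON.stringify on a raw Error produces `{}`: those properties are non-enumerable.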

Transports

Console Transport

Log to the console with customizable formatting:
import { createConsole } from "@apisr/logger/console";

const consoleTransport = createConsole({
  mode: "pretty", // or "json"
  format: () => {
    // Custom format function
    return "[CUSTOM] Custom log format";
  },
});

const logger = createLogger({
  name: "MyApp",
  transports: {
    console: consoleTransport,
  },
});

Custom Transports

Create your own transport for custom logging destinations:
import { createTransport } from "@apisr/logger";

const fileTransport = createTransport({
  log: ({ timestamp, level, message, data, file }) => {
    const logEntry = JSON.stringify({
      timestamp,
      level,
      message,
      data,
      file,
    });

    // Write to file
    Bun.write("./logs/app.log", logEntry + "\n", { append: true });
  },
  flush: async ({ logs, store }) => {
    // Optional: batch write logs on flush
    const content = logs
      .map((log) => JSON.stringify(log))
      .join("\n");

    await Bun.write("./logs/batch.log", content + "\n", { append: true });
  },
});

const logger = createLogger({
  name: "MyApp",
  transports: {
    console: createConsole({ mode: "pretty" }),
    file: fileTransport,
  },
});

Transport Store

Each transport has an isolated store for state management:
const customTransport = createTransport({
  log: ({ store, message, data }) => {
    // Read from store
    const count = store.get("logCount") || 0;

    // Write to store
    store.set("logCount", count + 1);
    store.set("lastMessage", message);

    console.log(`[${count + 1}] ${message}`, data);
  },
});
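The store behaves like a small key-value map scoped to a single transport. A minimal sketch of that idea, assuming only the get/set interface shown above and independent of the library:

```typescript
// Map-backed store with the get/set shape the docs show.
function createStore() {
  const map = new Map<string, unknown>();
  return {
    get: (key: string) => map.get(key),
    set: (key: string, value: unknown) => {
      map.set(key, value);
    },
  };
}

const store = createStore();

// Same counting pattern as the transport above.
function logWithCount(message: string): string {
  const count = (store.get("logCount") as number | undefined) ?? 0;
  store.set("logCount", count + 1);
  return `[${count + 1}] ${message}`;
}

console.log(logWithCount("first"));  // → [1] first
console.log(logWithCount("second")); // → [2] second
```

Because each transport gets its own store, two transports counting logs this way never interfere with each other.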

Auto-Flush

Automatically flush logs at intervals or on process events:
const logger = createLogger({
  name: "MyApp",
  transports: {
    console: createConsole({ mode: "json" }),
  },
  autoFlush: {
    intervalMs: 5000, // Flush every 5 seconds
    on: ["beforeExit", "SIGINT", "SIGTERM"], // Flush on process exit
  },
});
  • autoFlush.intervalMs: number
    Interval in milliseconds between automatic flushes
  • autoFlush.on: Array<'beforeExit' | 'SIGINT' | 'SIGTERM'>
    Process events that trigger a flush

Manual Flush

Manually flush logs when needed:
await logger.flush();
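Presumably flush() runs every transport's optional flush hook and awaits them all; a sketch of that fan-out under that assumption, not the library's actual code:

```typescript
// Sketch: run each transport's optional flush hook concurrently and
// await all of them, mirroring what logger.flush() presumably does.
type FlushHook = () => void | Promise<void>;

async function flushAll(
  hooks: Record<string, FlushHook | undefined>,
): Promise<string[]> {
  const flushed: string[] = [];
  await Promise.all(
    Object.entries(hooks).map(async ([name, hook]) => {
      if (hook) {
        await hook(); // transports without a flush hook are skipped
        flushed.push(name);
      }
    }),
  );
  return flushed;
}
```

Running the hooks concurrently rather than sequentially keeps one slow transport (e.g. a remote ingest endpoint) from delaying the others.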

Targeted Logging

Log to specific transports:
const logger = createLogger({
  name: "MyApp",
  transports: {
    console: createConsole({ mode: "pretty" }),
    file: fileTransport,
    remote: remoteTransport,
  },
});

// Log to all transports
logger.info("General message");

// Log only to console
logger.to("console").info("Console only");

// Log to console and file
logger.to(["console", "file"]).info("Console and file");

Extending Loggers

Create child loggers with modified transports:
const logger = createLogger({
  name: "ParentLogger",
  transports: {
    console: createConsole({ mode: "pretty" }),
    file: fileTransport,
  },
});

const childLogger = logger.extend({
  name: "ChildLogger",
  excludeTransport: ["file"], // Exclude file transport
  mapTransport: ({ name, transport }) => {
    if (name === "console") {
      // Override console transport
      return createConsole({ mode: "json" });
    }
    return transport;
  },
});

childLogger.info("This goes to console only in JSON mode");
  • extend.name: string
    Name for the child logger
  • extend.excludeTransport: string[]
    Transport names to exclude from the child logger
  • extend.mapTransport: (ctx) => Transport
    Function to transform inherited transports

Advanced Examples

Multi-Transport Logger

import { createLogger, createTransport } from "@apisr/logger";
import { createConsole } from "@apisr/logger/console";

// Console transport for development
const consoleTransport = createConsole({ mode: "pretty" });

// File transport for persistent logs
const fileTransport = createTransport({
  log: ({ timestamp, level, message, data }) => {
    const entry = {
      timestamp: new Date(timestamp).toISOString(),
      level,
      message,
      data,
    };
    Bun.write("./logs/app.log", JSON.stringify(entry) + "\n", {
      append: true,
    });
  },
});

// Remote transport for centralized logging
const remoteTransport = createTransport({
  log: ({ timestamp, level, message, data, store }) => {
    // Buffer logs
    const buffer = store.get("buffer") || [];
    buffer.push({ timestamp, level, message, data });
    store.set("buffer", buffer);
  },
  flush: async ({ logs, store }) => {
    // Send buffered logs to remote server
    const buffer = store.get("buffer") || [];
    if (buffer.length > 0) {
      await fetch("https://logs.example.com/ingest", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ logs: buffer }),
      });
      store.set("buffer", []);
    }
  },
});

const logger = createLogger({
  name: "ProductionApp",
  transports: {
    console: consoleTransport,
    file: fileTransport,
    remote: remoteTransport,
  },
  autoFlush: {
    intervalMs: 10000, // Flush every 10 seconds
    on: ["beforeExit", "SIGTERM"],
  },
});

// Production usage
logger.info("Server started", { port: 3000 });
logger.error("Request failed", { statusCode: 500, path: "/api/users" });

// Ensure logs are flushed before exit
process.on("beforeExit", async () => {
  await logger.flush();
});

Filtered Transport

const errorOnlyTransport = createTransport({
  log: ({ level, message, data }) => {
    // Only log errors
    if (level === "error") {
      console.error(`ERROR: ${message}`, data);
    }
  },
});

const logger = createLogger({
  name: "FilteredLogger",
  transports: {
    all: createConsole({ mode: "pretty" }),
    errors: errorOnlyTransport,
  },
});

logger.info("This goes to 'all' transport only");
logger.error("This goes to both transports");

Performance Monitoring Transport

const performanceTransport = createTransport({
  log: ({ level, message, data, store }) => {
    const stats = store.get("stats") || {
      debug: 0,
      info: 0,
      warn: 0,
      error: 0,
    };

    stats[level]++;
    store.set("stats", stats);

    // Log stats every 100 messages
    const total = Object.values(stats).reduce((a, b) => a + b, 0);
    if (total % 100 === 0) {
      console.log("Log statistics:", stats);
    }
  },
});

API Reference

createLogger

  • name: string
    Logger name for identification
  • transports: Record<string, Transport>
    Map of transport names to transport instances
  • autoFlush: AutoFlushOptions
    Auto-flush configuration

Logger Methods

  • debug: (message: string, data?: object) => void
    Log a debug message
  • info: (message: string, data?: object) => void
    Log an info message
  • warn: (message: string, data?: object) => void
    Log a warning message
  • error: (message: string | Error, data?: object) => void
    Log an error message or an Error object
  • to: (transport: string | string[]) => Logger
    Return a logger targeting specific transports
  • flush: () => Promise<void>
    Manually flush all transports
  • extend: (options: ExtendOptions) => Logger
    Create a child logger with modified configuration

createTransport

  • log: (ctx: LogContext) => void
    Function called for each log entry
  • flush: (ctx: FlushContext) => void | Promise<void>
    Optional function called on flush

createConsole

  • mode: 'json' | 'pretty'
    Output format mode
  • format: () => string
    Custom format function that overrides the default output

Type Safety

@apisr/logger is fully type-safe:
const logger = createLogger({
  name: "TypedLogger",
  transports: {
    console: createConsole({ mode: "pretty" }),
    file: fileTransport,
  },
});

// ✅ Type-safe transport targeting
logger.to("console").info("Message");
logger.to(["console", "file"]).info("Message");

// ❌ Type error: unknown transport
logger.to("unknown").info("Message");

Performance Tips

Use JSON Mode in Production

JSON mode is faster than pretty mode for production logging.

Batch Flush

Use flush intervals to batch writes and reduce I/O operations.
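The buffering behind that tip can be sketched independently of the library: accumulate entries in memory and hand them to a single writer call on flush, so N logs cost one I/O operation instead of N.

```typescript
// Sketch of batched writes. The writer callback stands in for whatever
// actually performs I/O (file append, network request, etc.).
function createBatcher(write: (chunk: string) => void) {
  const buffer: string[] = [];
  return {
    add: (line: string) => {
      buffer.push(line);
    },
    flush: (): number => {
      if (buffer.length === 0) return 0;
      write(buffer.join("\n") + "\n"); // one write for the whole batch
      const n = buffer.length;
      buffer.length = 0;
      return n;
    },
  };
}
```

Pairing a batcher like this with autoFlush.intervalMs bounds how long an entry can sit in memory before it is persisted.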

Async Transports

Make transport flush functions async for non-blocking I/O.

Filter Logs

Filter log levels in transports to reduce processing overhead.
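A minimum-level check is the cheapest filter because it drops an entry before any formatting or I/O happens. A sketch of the idea; the numeric ordering here is an assumption for illustration, not a library API:

```typescript
// Sketch of level filtering: drop entries below a minimum severity
// before doing any work. The ordering table is assumed, not from the API.
const LEVEL_ORDER = { debug: 0, info: 1, warn: 2, error: 3 } as const;
type Level = keyof typeof LEVEL_ORDER;

function shouldLog(entry: Level, minimum: Level): boolean {
  return LEVEL_ORDER[entry] >= LEVEL_ORDER[minimum];
}
```

Inside a custom transport's log function, returning early when shouldLog(level, "warn") is false skips serialization entirely, which is the overhead this tip is about.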