Learn how to use Semola’s type-safe Cache module to improve application performance with Redis.
## Overview

The Cache module provides:

- Type-safe Redis caching
- Configurable TTL (time-to-live)
- Custom serialization/deserialization
- Result-based error handling
- Cache-aside pattern support
- Optional key prefixing
## Basic usage

```ts
import { Cache } from "semola/cache";

type User = {
  id: number;
  name: string;
  email: string;
};

const userCache = new Cache<User>({
  redis: new Bun.RedisClient("redis://localhost:6379"),
  ttl: 300000, // 5 minutes
});
```
## Cache operations

### Setting values

```ts
const [error, data] = await userCache.set("user:123", {
  id: 123,
  name: "John",
  email: "john@example.com",
});

if (error) {
  console.error("Failed to cache:", error.message);
} else {
  console.log("Cached successfully");
}
```
### Getting values

```ts
const [error, user] = await userCache.get("user:123");

if (error) {
  if (error.type === "NotFoundError") {
    console.log("Cache miss");
  } else {
    console.error("Cache error:", error.message);
  }
} else {
  console.log("Cache hit:", user);
}
```
### Deleting values

```ts
const [error, count] = await userCache.delete("user:123");

if (error) {
  console.error("Failed to delete:", error.message);
} else {
  console.log(`Deleted ${count} keys`);
}
```
## Cache-aside pattern

The most common caching pattern: check the cache first, fetch from the database on a miss, then cache the result.
```ts
import { Cache } from "semola/cache";
import { ok, err } from "semola/errors";

type User = {
  id: number;
  name: string;
  email: string;
};

const userCache = new Cache<User>({
  redis: new Bun.RedisClient("redis://localhost:6379"),
  ttl: 300000, // 5 minutes
});

async function getUser(id: string) {
  // Try cache first
  const [cacheError, cachedUser] = await userCache.get(`user:${id}`);
  if (!cacheError) {
    return ok(cachedUser);
  }

  // Cache miss - fetch from database
  const [dbError, user] = await fetchUserFromDB(id);
  if (dbError) {
    return err("NotFoundError", "User not found");
  }

  // Store in cache for next time
  await userCache.set(`user:${id}`, user);
  return ok(user);
}
```
## TTL strategies

### Fixed TTL

```ts
const cache = new Cache<User>({
  redis: redisClient,
  ttl: 60000, // 1 minute for all entries
});
```
### Dynamic TTL based on data

```ts
type Session = {
  userId: string;
  rememberMe: boolean;
};

const sessionCache = new Cache<Session>({
  redis: redisClient,
  ttl: (_key, session) => {
    // Long TTL for "remember me", short for regular sessions
    return session.rememberMe ? 86400000 : 3600000;
  },
});
```
### Dynamic TTL based on key

```ts
type CachedData = {
  type: string;
  value: string;
};

const cache = new Cache<CachedData>({
  redis: redisClient,
  ttl: (key, _value) => {
    if (key.startsWith("hot:")) {
      return 300000; // 5 minutes for hot data
    }
    return 3600000; // 1 hour for regular data
  },
});
```
### No expiration

```ts
const cache = new Cache<User>({
  redis: redisClient,
  // No TTL - data persists until manually deleted
});
```

A TTL must be a non-negative integer (in milliseconds). Invalid TTL values (negative, `NaN`, `Infinity`, or non-integer) will return an `InvalidTTLError`.
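The validation rule described above can be expressed in a few lines of plain TypeScript. This is an illustrative sketch of the rule, not Semola's actual implementation; `isValidTTL` is a hypothetical helper:

```ts
// A TTL is valid when it is a non-negative integer (milliseconds).
// Number.isInteger already rejects NaN, Infinity, and fractional values,
// so only the lower bound needs an explicit check.
function isValidTTL(ttl: number): boolean {
  return Number.isInteger(ttl) && ttl >= 0;
}
```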
## Key prefixing

Organize your cache keys with automatic prefixing:

```ts
const usersCache = new Cache<User>({
  redis: redisClient,
  prefix: "users",
});

await usersCache.set("123", user); // Stored as "users:123"
await usersCache.get("123"); // Reads from "users:123"
await usersCache.delete("123"); // Deletes "users:123"
```
### Benefits of prefixing

- Namespace isolation between different data types
- Easier cache invalidation
- Better Redis key organization
- Prevents key collisions
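The key layout behind these benefits is plain string composition, as the `users:123` comments above suggest. A sketch of how a prefixed key is built (`prefixedKey` is a hypothetical helper, not part of Semola's API):

```ts
// Compose the Redis key: "<prefix>:<key>" when a prefix is set,
// or the bare key when no prefix is configured.
function prefixedKey(prefix: string | undefined, key: string): string {
  return prefix ? `${prefix}:${key}` : key;
}
```

Because every key for a cache instance shares one prefix, a pattern like `users:*` matches exactly that instance's keys, which is what makes selective invalidation straightforward.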
## Custom serialization

Provide your own `serializer` and `deserializer` to control how values are stored in Redis, for example using MessagePack:

```ts
import { encode, decode } from "@msgpack/msgpack";

const cache = new Cache<User>({
  redis: redisClient,
  serializer: (user) => {
    const buffer = encode(user);
    return Buffer.from(buffer).toString("base64");
  },
  deserializer: (raw) => {
    const buffer = Buffer.from(raw, "base64");
    return decode(buffer) as User;
  },
});
```
Or a simple delimited-string format:

```ts
const cache = new Cache<User>({
  redis: redisClient,
  serializer: (user) => `${user.id}:${user.name}:${user.email}`,
  deserializer: (raw) => {
    const [id, name, email] = raw.split(":");
    return { id: Number(id), name, email };
  },
});
```
## Conditional caching

### Disable caching based on environment

```ts
const cache = new Cache<User>({
  redis: redisClient,
  enabled: process.env.CACHE_ENABLED !== "false",
});
```
When `enabled` is `false`:

- `get()` returns `NotFoundError` (a cache miss)
- `set()` returns the value without storing it
- `delete()` returns `0` without deleting
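The disabled-mode semantics can be modeled in a few lines. This is an illustrative sketch, not Semola's source; `CacheResult` is a hypothetical alias for the `[error, value]` tuples used throughout these docs:

```ts
// Hypothetical model of the [error, value] result tuple used in these docs
type CacheResult<T> = [{ type: string; message: string } | null, T | null];

// When disabled, get() always reports a miss...
function disabledGet<T>(key: string): CacheResult<T> {
  return [{ type: "NotFoundError", message: `key not found: ${key}` }, null];
}

// ...set() echoes the value back without storing it...
function disabledSet<T>(key: string, value: T): CacheResult<T> {
  return [null, value];
}

// ...and delete() reports zero deleted keys.
function disabledDelete(key: string): CacheResult<number> {
  return [null, 0];
}
```

This is why a disabled cache degrades cleanly: callers follow the same miss-then-fetch path they already handle, with no special-casing.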
### Feature flags

```ts
const cache = new Cache<Product>({
  redis: redisClient,
  enabled: await featureFlags.isEnabled("product-caching"),
});
```
## Error handling

### Using error callbacks

```ts
const cache = new Cache<User>({
  redis: redisClient,
  onError: (error) => {
    // Log to monitoring service
    logger.warn(`Cache ${error.type}: ${error.message}`);
  },
});
```
### Error types

- `CacheError`: Redis connection or operation failed
- `InvalidTTLError`: TTL validation failed
- `NotFoundError`: key not found (a normal cache miss)

`onError` is not called for `NotFoundError`; a cache miss is a normal outcome, not an error condition.
## Real-world patterns

### User profile caching

```ts
type UserProfile = {
  id: string;
  name: string;
  email: string;
  avatar: string;
  role: string;
};

const profileCache = new Cache<UserProfile>({
  redis: new Bun.RedisClient("redis://localhost:6379"),
  prefix: "profiles",
  ttl: 600000, // 10 minutes
  onError: (error) => {
    logger.error("Profile cache error", error);
  },
});

async function getUserProfile(userId: string) {
  // Check cache
  const [cacheErr, cached] = await profileCache.get(userId);
  if (!cacheErr) {
    return ok(cached);
  }

  // Fetch from database
  const [dbErr, profile] = await db.users.findById(userId);
  if (dbErr) {
    return err("NotFoundError", "User not found");
  }

  // Cache the result
  await profileCache.set(userId, profile);
  return ok(profile);
}

async function updateUserProfile(userId: string, updates: Partial<UserProfile>) {
  // Update database
  const [dbErr, updated] = await db.users.update(userId, updates);
  if (dbErr) {
    return err("UpdateError", "Failed to update profile");
  }

  // Invalidate cache
  await profileCache.delete(userId);
  return ok(updated);
}
```
### API response caching

```ts
type ApiResponse = {
  data: unknown;
  timestamp: number;
};

const apiCache = new Cache<ApiResponse>({
  redis: redisClient,
  prefix: "api",
  ttl: (key, response) => {
    // Fresh data gets longer TTL
    const age = Date.now() - response.timestamp;
    if (age < 60000) return 300000; // 5 minutes
    return 60000; // 1 minute for older data
  },
});

// In your API handler
api.defineRoute({
  path: "/users",
  method: "GET",
  handler: async (c) => {
    const cacheKey = `users:${c.req.query.page || 1}`;

    // Try cache
    const [cacheErr, cached] = await apiCache.get(cacheKey);
    if (!cacheErr) {
      return c.json(200, cached.data);
    }

    // Fetch data
    const users = await db.users.list();

    // Cache response
    await apiCache.set(cacheKey, {
      data: users,
      timestamp: Date.now(),
    });

    return c.json(200, users);
  },
});
```
### Session storage

```ts
type Session = {
  userId: string;
  email: string;
  role: string;
  createdAt: number;
};

const sessionCache = new Cache<Session>({
  redis: redisClient,
  prefix: "sessions",
  ttl: 3600000, // 1 hour
});

// Create session
async function createSession(user: User) {
  const sessionId = crypto.randomUUID();
  const session: Session = {
    userId: user.id,
    email: user.email,
    role: user.role,
    createdAt: Date.now(),
  };
  await sessionCache.set(sessionId, session);
  return sessionId;
}

// Validate session
async function getSession(sessionId: string) {
  const [error, session] = await sessionCache.get(sessionId);
  if (error) {
    return null;
  }
  return session;
}

// Logout
async function destroySession(sessionId: string) {
  await sessionCache.delete(sessionId);
}
```
## Multi-level caching

Combine in-memory and Redis caching:

```ts
const memoryCache = new Map<string, { value: User; expires: number }>();

const redisCache = new Cache<User>({
  redis: redisClient,
  ttl: 600000,
});

async function getUser(id: string): Promise<User | null> {
  // Level 1: Memory cache
  const memoryCached = memoryCache.get(id);
  if (memoryCached && memoryCached.expires > Date.now()) {
    return memoryCached.value;
  }

  // Level 2: Redis cache
  const [redisErr, redisCached] = await redisCache.get(id);
  if (!redisErr) {
    // Populate memory cache
    memoryCache.set(id, {
      value: redisCached,
      expires: Date.now() + 60000, // 1 minute in memory
    });
    return redisCached;
  }

  // Level 3: Database
  const user = await db.users.findById(id);
  if (!user) return null;

  // Populate both caches
  await redisCache.set(id, user);
  memoryCache.set(id, {
    value: user,
    expires: Date.now() + 60000,
  });
  return user;
}
```
## Best practices

- **Use appropriate TTLs**: set TTLs based on data volatility; frequently changing data needs a shorter TTL.
- **Invalidate on writes**: always invalidate the cache when the underlying data changes to prevent serving stale data.
- **Handle cache failures gracefully**: never let cache errors break your application; fall back to the database on cache failures.
- **Monitor cache hit rates**: track cache hits vs. misses to optimize your caching strategy.
- **Use key prefixes**: organize keys by type for easier management and selective invalidation.
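Semola does not expose hit-rate metrics in anything shown above, but a small counter around your `get()` calls is enough to track them. `CacheStats` below is a hypothetical helper, sketched for illustration:

```ts
// Tracks cache hits vs. misses so the hit rate can be exported to metrics.
class CacheStats {
  private hits = 0;
  private misses = 0;

  // Record one lookup; pass true on a hit, false on a miss.
  record(hit: boolean): void {
    if (hit) this.hits++;
    else this.misses++;
  }

  // Fraction of lookups that were hits (0 when nothing recorded yet).
  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }
}
```

In the result-tuple style used throughout these docs, you would call `stats.record(!cacheError)` after each `get()` and periodically report `hitRate()` to your monitoring system.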
## Next steps