RLaaS Users Service is a lightweight Spring Boot microservice that sits in front of your APIs and enforces per-user HTTP rate limits. It solves the problem of runaway clients exhausting shared resources by tracking each user’s request count in Redis and rejecting requests that exceed the configured threshold — returning a 429 response with the exact number of seconds the client must wait before retrying.

What it does

Every inbound HTTP request passes through a servlet filter (RateLimiterFilter) before it reaches your application logic. The filter extracts the userId query parameter, then runs an atomic Lua script against Redis to increment and inspect a per-user counter. If the counter is within the allowed window, the request proceeds. If the user has exceeded the limit, the service immediately returns 429 Too Many Requests with a human-readable retry message — no request ever reaches your upstream service. Key behaviors:
  • Atomic counters: Redis Lua scripts ensure increment-and-check is race-condition-free, even across multiple service instances.
  • Two algorithms: choose between a fixed-window counter or a sliding-window counter depending on your tolerance for burst traffic at window boundaries.
  • Stateless service: all state lives in Redis, so you can scale horizontally behind a load balancer without coordination.
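The increment-and-check that the service runs as one atomic Lua script inside Redis can be sketched in plain Java. The class below is an illustrative in-memory fixed-window counter, not the service's actual code; the class name, the in-memory map, and the clock handling are assumptions made for the sketch — in the real service this state lives in Redis so that every instance shares it.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative in-memory fixed-window counter. The real service performs the
// equivalent increment-and-check as a single Lua script inside Redis, which is
// what makes it race-free across multiple service instances.
public class FixedWindowCounter {
    private static class Window {
        final long windowStartMillis;
        int count;
        Window(long start) { this.windowStartMillis = start; }
    }

    private final int maxRequests;
    private final long windowMillis;
    private final Map<String, Window> windows = new ConcurrentHashMap<>();

    public FixedWindowCounter(int maxRequests, long windowMillis) {
        this.maxRequests = maxRequests;
        this.windowMillis = windowMillis;
    }

    /** Returns true if the user's request fits in the current window. */
    public synchronized boolean allowRequest(String userId, long nowMillis) {
        Window w = windows.get(userId);
        if (w == null || nowMillis - w.windowStartMillis >= windowMillis) {
            w = new Window(nowMillis);   // fresh window (in Redis: key created with a TTL)
            windows.put(userId, w);
        }
        w.count++;                       // in Redis: INCR on the per-user key
        return w.count <= maxRequests;   // compare against the configured limit
    }

    /** Whole seconds until the current window expires (in Redis: read the key's TTL). */
    public synchronized long retryAfterSeconds(String userId, long nowMillis) {
        Window w = windows.get(userId);
        if (w == null) return 0;
        long remainingMillis = w.windowStartMillis + windowMillis - nowMillis;
        return Math.max(0, (remainingMillis + 999) / 1000); // round up to full seconds
    }
}
```

The `retryAfterSeconds` value corresponds to the "Try after X seconds" figure the service puts in its 429 response body.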

Request flow

1. Request arrives: A client sends GET /check?userId={userId} to the service.
2. Filter intercepts: RateLimiterFilter intercepts the request before it reaches the controller and calls RateLimitingAlgorithm.allowRequest(userId).
3. Redis Lua script runs: An atomic Lua script increments the user’s counter in Redis and reads back the current count and TTL.
4. Allow or deny: If allowed is true, the request continues and the service returns 200 with the body “User is allowed”. If allowed is false, the filter short-circuits and writes “User is not allowed...Try after X seconds.” with HTTP 429.
RLaaS Users Service is stateless except for Redis — you can run multiple instances behind a load balancer and share the same Redis instance.
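The practical difference between the two algorithms shows up at window boundaries: a fixed window can admit up to twice the limit in a burst that straddles a boundary, which a sliding window prevents. Below is a minimal in-memory sliding-window-log sketch of that idea; it is illustrative only (the class name and data structure are assumptions), since the real service keeps this state in Redis.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Illustrative sliding-window log: remember each user's recent request
// timestamps and count only those inside the trailing window, so the limit
// holds over ANY interval of the window's length, not just aligned windows.
public class SlidingWindowCounter {
    private final int maxRequests;
    private final long windowMillis;
    private final Map<String, Deque<Long>> requestLog = new HashMap<>();

    public SlidingWindowCounter(int maxRequests, long windowMillis) {
        this.maxRequests = maxRequests;
        this.windowMillis = windowMillis;
    }

    public synchronized boolean allowRequest(String userId, long nowMillis) {
        Deque<Long> stamps = requestLog.computeIfAbsent(userId, k -> new ArrayDeque<>());
        // Evict timestamps that have slid out of the trailing window.
        while (!stamps.isEmpty() && nowMillis - stamps.peekFirst() >= windowMillis) {
            stamps.pollFirst();
        }
        if (stamps.size() >= maxRequests) {
            return false; // over the limit: the filter would answer 429 here
        }
        stamps.addLast(nowMillis);
        return true;
    }
}
```

With a limit of 2 per second, a burst at the end of one fixed window followed immediately by a burst at the start of the next could admit 4 requests in well under a second; the sliding window above admits at most 2 in any trailing one-second span.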

Explore the docs

Quickstart

Deploy the service and make your first rate-limited request in under five minutes.

Configuration

Set the algorithm, window size, max requests, and Redis connection URL.
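As an illustration of what such a configuration might look like, here is a hypothetical application.yml fragment. The `ratelimiter.*` keys below are invented for this sketch and are not the service's documented property names; the `spring.data.redis.*` keys are standard Spring Boot (3.x) properties for the Redis connection.

```yaml
# Hypothetical property names -- check the Configuration page for the real keys.
ratelimiter:
  algorithm: FIXED_WINDOW   # or SLIDING_WINDOW
  max-requests: 100         # allowed requests per user per window
  window-seconds: 60        # window size

spring:
  data:
    redis:                  # standard Spring Boot 3.x Redis connection settings
      host: localhost
      port: 6379
```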

Rate limiting concepts

Understand how per-user counters, windows, and TTLs work together.

API reference

Full reference for the GET /check endpoint, parameters, and response codes.
