The MCP (Model Context Protocol) presentation layer lets AI assistants — such as Claude Desktop — interact directly with the coffee shop backend. An AI client can browse the menu, place orders, drive the barista workflow, and query the order queue, all through a standardised protocol without writing any HTTP calls.

What is MCP?

Model Context Protocol is an open standard that defines how AI models communicate with external tools and data sources. An MCP server exposes three primitives:
  • Tools — callable actions that the AI can invoke (equivalent to function calls).
  • Resources — named data endpoints the AI can read at any time.
  • Prompts — parameterised prompt templates that guide the AI toward specific tasks.
The coffee shop MCP server name is Coffee Orders MCP at version 0.1.0.

Transport modes

The server supports two transport modes. Choose stdio for local development and desktop AI clients; use HTTP when you need a networked or Cloudflare Workers-hosted endpoint.

stdio

Classic MCP over standard input/output. This is the transport used by Claude Desktop and most local MCP clients.
bun run mcp:stdio
The process reads JSON-RPC messages from stdin and writes responses to stdout. There is no port to configure.

HTTP

MCP over HTTP, served at the /mcp path.
bun run mcp:http
The bound port depends on your server configuration; check the command's output for the address.
Both modes expose the same set of tools, resources, and prompts — only the wire transport differs.

Tools

Tools are the primary way an AI model takes action. The coffee shop registers eight tools covering the full order lifecycle.

list_menu

List the current coffee menu. Takes no parameters.

place_order

Create a new coffee order. Accepts the full PlaceOrderRequest schema: customerName, drinkId, size, and optional milk, temperature, shots, notes.
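As a sketch, a tools/call request for this tool might look like the following. The argument values here are illustrative only; real drink IDs and size options come from list_menu:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "place_order",
    "arguments": {
      "customerName": "Ada",
      "drinkId": "latte",
      "size": "medium",
      "milk": "oat",
      "shots": 2,
      "notes": "extra hot"
    }
  }
}
```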

get_order

Fetch one order by orderId.

list_orders

List orders, optionally filtered by status.

start_brewing

Move an order from pending to brewing. Requires orderId.

mark_ready

Move an order from brewing to ready. Requires orderId.

pick_up_order

Move an order from ready to picked-up. Requires orderId.

cancel_order

Cancel a pending or brewing order. Requires orderId.
All tools are defined in actions.ts and wired up in action-tools.ts using Effect’s Tool.make and Toolkit.make APIs:
const CoffeeActionToolkit = Toolkit.make(
  ListMenuTool,
  PlaceOrderTool,
  GetOrderTool,
  ListOrdersTool,
  StartBrewingTool,
  MarkReadyTool,
  PickUpOrderTool,
  CancelOrderTool,
);
Each tool specifies its description, parameters (an Effect Schema), success schema, and failure schema. This gives AI clients a precise, machine-readable description of what every tool does and what it returns.
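For illustration, a single tool definition might look roughly like this. The schema names OrderId, Order, and OrderNotFound are placeholders; consult actions.ts for the real definitions:

```typescript
// Sketch only — mirrors the shape described above, not the project's exact code.
const StartBrewingTool = Tool.make("start_brewing", {
  description: "Move an order from pending to brewing",
  parameters: { orderId: OrderId }, // Effect Schema for the input
  success: Order,                   // schema of the successful result
  failure: OrderNotFound,           // schema of the typed error
});
```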

Resources

Resources are read-only data endpoints identified by URI. An AI client can fetch a resource at any time to ground its context.

coffee://menu

The full coffee menu as a JSON array of MenuItem objects.
  • Name: Coffee Menu
  • MIME type: application/json
  • Description: The current coffee menu

coffee://orders/open

All orders that have not yet been picked up or cancelled.
  • Name: Open Orders
  • MIME type: application/json
  • Description: Orders that have not been picked up or cancelled
The resource filters out orders with status picked-up or cancelled before returning.
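An MCP client fetches a resource with a standard resources/read request naming the URI:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "resources/read",
  "params": { "uri": "coffee://orders/open" }
}
```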

coffee://orders/:orderId

A single order identified by its ID. The server provides completions for the orderId parameter — the client receives a list of all known order IDs to suggest as the user types.
  • Name: Coffee Order
  • MIME type: application/json
  • Description: One coffee order by id
  • Completion: orderId — resolved dynamically from listOrders
export const OrderResource = McpServer.resource`coffee://orders/${orderIdParam}`({
  name: "Coffee Order",
  description: "One coffee order by id",
  mimeType: "application/json",
  completion: {
    orderId: () =>
      CoffeeOrderApp.use((app) => app.listOrders({})).pipe(
        Effect.map((orders) => orders.map((order) => order.id)),
      ),
  },
  // ...
});
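The completion wiring surfaces through MCP's completion/complete request. A client asking for orderId suggestions sends something like this (the exact template form in the ref depends on how the server advertises the resource):

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "completion/complete",
  "params": {
    "ref": { "type": "ref/resource", "uri": "coffee://orders/{orderId}" },
    "argument": { "name": "orderId", "value": "" }
  }
}
```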

Prompts

Prompts are parameterised templates that guide the AI toward a specific task. Both prompts include completion lists so clients can offer suggested values as the user types the parameter.

recommend-drink

Ask the AI to recommend a drink from the live menu for a given occasion.
  • Description: Suggest a drink from the available menu
  • Parameter: occasion (string)
  • Completions for occasion: morning rush, afternoon break, late night, decaf
The prompt fetches the current menu and injects it into the template:
Recommend one drink for "<occasion>" from this menu:
<menu JSON>
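A client triggers the prompt with a prompts/get request; the occasion value here is one of the advertised completions:

```json
{
  "jsonrpc": "2.0",
  "id": 5,
  "method": "prompts/get",
  "params": {
    "name": "recommend-drink",
    "arguments": { "occasion": "morning rush" }
  }
}
```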

summarize-open-orders

Ask the AI to summarise the current open order queue from a particular operational perspective.
  • Description: Summarize the current open order queue
  • Parameter: focus (string)
  • Completions for focus: kitchen, pickup, operations
The prompt fetches all non-terminal orders and builds:
Summarize the open order queue for <focus>:
<open orders JSON>

Cloudflare Workers / Miniflare deployment

The MCP HTTP surface can also run inside a Cloudflare Worker. The repository includes a Miniflare-based test that exercises the Worker entrypoint end-to-end:
bun run --cwd backend test src/presentation/mcp/miniflare.worker.test.ts
This covers prompts, resources, and all action tools through the same /mcp HTTP path, giving you a local contract test before deploying to Workers.

Connecting an AI client

1. Start the MCP server

Pick the transport that matches your client.
bun run mcp:stdio
2. Configure Claude Desktop (stdio)

Add an entry to your Claude Desktop MCP configuration file. The exact path depends on your OS; on macOS it is ~/Library/Application Support/Claude/claude_desktop_config.json.
{
  "mcpServers": {
    "coffee-shop": {
      "command": "bun",
      "args": [
        "run",
        "--cwd",
        "/path/to/effect-coffee-shop",
        "mcp:stdio"
      ]
    }
  }
}
Replace /path/to/effect-coffee-shop with the absolute path to your local clone.
3. Restart Claude Desktop

Quit and relaunch Claude Desktop. The Coffee Orders MCP server will appear in the tool list.
4. Try a prompt

In a new Claude conversation, ask: “Use the recommend-drink prompt for a morning rush.” Claude will call the recommend-drink prompt, read the live menu resource, and reply with a personalised suggestion.
For the HTTP transport, point your MCP client at http://localhost:<port>/mcp. The port depends on your server configuration; check the output of bun run mcp:http for the bound address.
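As a quick smoke test of the HTTP transport, you can post a JSON-RPC request with curl. This assumes the server accepts plain JSON-RPC at /mcp; the Accept header pair is required by MCP's streamable HTTP transport:

```shell
# Replace <port> with the port printed by `bun run mcp:http`.
curl -X POST "http://localhost:<port>/mcp" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_menu","arguments":{}}}'
```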
