Overview
Motia supports both TypeScript and Python, and you can use them together in the same project. This lets you leverage the best tools and libraries from each ecosystem.
Why use multiple languages?
Different languages excel at different tasks:
TypeScript: Great for APIs, real-time features, and type-safe business logic
Python: Ideal for data processing, ML/AI, scientific computing, and leveraging Python’s rich ecosystem
With Motia, you don’t have to choose. Use both.
Project structure
A multi-language Motia project looks like this:
my-motia-project/
├── steps/
│   ├── api/
│   │   ├── orders.step.ts        # TypeScript API
│   │   └── users.step.ts
│   ├── processing/
│   │   ├── process_order.step.py # Python processing
│   │   └── analyze_data.step.py
│   └── notifications/
│       └── notify.step.ts
├── package.json                  # Node dependencies
├── requirements.txt              # Python dependencies
└── motia.config.ts
Steps can be written in any language; Motia handles communication between them automatically via queues.
Creating your first mixed project
Initialize the project
mkdir my-mixed-project
cd my-mixed-project
npm init -y
npm install motia
pip install motia
Create a TypeScript API step
Create steps/api.step.ts:

import type { Handlers, StepConfig } from 'motia'
import { z } from 'zod'

export const config = {
  name: 'DataIngestionAPI',
  description: 'Receive data and send to Python for processing',
  flows: ['data-pipeline'],
  triggers: [
    {
      type: 'http',
      method: 'POST',
      path: '/ingest',
      bodySchema: z.object({
        data: z.array(z.number()),
        operation: z.enum(['analyze', 'transform', 'summarize']),
      }),
    },
  ],
  enqueues: ['process-data'],
} as const satisfies StepConfig

export const handler: Handlers<typeof config> = async (
  { request },
  { logger, enqueue }
) => {
  const { data, operation } = request.body

  logger.info('Data received', {
    dataPoints: data.length,
    operation,
  })

  // Send to Python for processing
  await enqueue({
    topic: 'process-data',
    data: {
      data,
      operation,
      requestId: `req-${Date.now()}`,
    },
  })

  return {
    status: 202,
    body: {
      message: 'Data accepted for processing',
      status: 'processing',
    },
  }
}
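The request body the /ingest trigger accepts must match the Zod schema above. As a quick local sketch (plain Python, nothing Motia-specific), here is a valid payload being built and serialized the way an HTTP client would send it:

```python
import json

# A body that satisfies the /ingest schema above:
# 'data' is an array of numbers, 'operation' is one of
# 'analyze' | 'transform' | 'summarize'.
payload = {
    "data": [12.0, 7.5, 3.25, 9.0],
    "operation": "analyze",
}
body = json.dumps(payload)
# POST this as application/json to /ingest; the step responds with
# 202 and enqueues a 'process-data' message for the Python step.
```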
Create a Python processing step
Create steps/process_data.step.py (the .step.py suffix matters so Motia’s step glob picks the file up):
"""Process data using Python's scientific libraries."""
import numpy as np
from typing import Any
from motia import FlowContext, queue
config = {
"name" : "DataProcessor" ,
"description" : "Process data with NumPy and Python" ,
"flows" : [ "data-pipeline" ],
"triggers" : [
queue( "process-data" ),
],
"enqueues" : [ "processing-complete" ],
}
async def handler ( input_data : dict[ str , Any], ctx : FlowContext[Any]) -> None :
"""Process data using NumPy."""
data = np.array(input_data[ "data" ])
operation = input_data[ "operation" ]
request_id = input_data[ "requestId" ]
ctx.logger.info(
"Processing data with Python" ,
{ "operation" : operation, "requestId" : request_id},
)
# Perform operations using NumPy
result = {}
if operation == "analyze" :
result = {
"mean" : float (np.mean(data)),
"median" : float (np.median(data)),
"std" : float (np.std(data)),
"min" : float (np.min(data)),
"max" : float (np.max(data)),
}
elif operation == "transform" :
# Normalize data to 0-1 range
normalized = (data - np.min(data)) / (np.max(data) - np.min(data))
result = { "transformed" : normalized.tolist()}
elif operation == "summarize" :
result = {
"count" : len (data),
"sum" : float (np.sum(data)),
"mean" : float (np.mean(data)),
}
# Store result
await ctx.state.set( "results" , request_id, {
"operation" : operation,
"result" : result,
"processedAt" : ctx.timestamp,
})
# Notify completion
await ctx.enqueue({
"topic" : "processing-complete" ,
"data" : {
"requestId" : request_id,
"operation" : operation,
"result" : result,
},
})
ctx.logger.info(
"Processing complete" ,
{ "requestId" : request_id},
)
Create a TypeScript notification step
Create steps/notify.step.ts:

import { queue, step } from 'motia'
import { z } from 'zod'

const resultSchema = z.object({
  requestId: z.string(),
  operation: z.string(),
  result: z.record(z.any()),
})

export const stepConfig = {
  name: 'NotifyComplete',
  description: 'Send completion notifications',
  flows: ['data-pipeline'],
  triggers: [
    queue('processing-complete', { input: resultSchema }),
  ],
}

export const { config, handler } = step(stepConfig, async (_input, ctx) => {
  const { requestId, operation, result } = ctx.getData()

  ctx.logger.info('Sending completion notification', {
    requestId,
    operation,
  })

  // Send notification (email, webhook, etc.)
  // await sendNotification(...)

  ctx.logger.info('Notification sent', { requestId })
})
Python SSE streaming
You can build streaming endpoints in Python too:
import asyncio
import json
import random
from typing import Any

from motia import MotiaHttpArgs, FlowContext, http

config = {
    "name": "Python Stream",
    "description": "Stream data using Python",
    "flows": ["streaming"],
    "triggers": [http("GET", "/python-stream")],
    "enqueues": [],
}

async def handler(
    args: MotiaHttpArgs[Any],
    ctx: FlowContext[Any],
) -> None:
    """Stream data to clients."""
    response = args.response

    # Set up SSE
    await response.status(200)
    await response.headers({
        "content-type": "text/event-stream",
        "cache-control": "no-cache",
        "connection": "keep-alive",
    })

    ctx.logger.info("Starting Python stream")

    # Stream 10 items
    for i in range(10):
        item = {
            "id": i,
            "value": random.randint(1, 100),
            "timestamp": ctx.timestamp,
        }
        data = json.dumps(item)
        response.writer.stream.write(
            f"event: data\ndata: {data}\n\n".encode("utf-8")
        )
        await asyncio.sleep(0.5)

    # Send completion
    done = json.dumps({"total": 10})
    response.writer.stream.write(
        f"event: done\ndata: {done}\n\n".encode("utf-8")
    )
    response.close()

    ctx.logger.info("Stream complete")
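The frames written above follow the standard Server-Sent Events wire format: an event: line, a data: line, and a blank line terminating each frame. A small helper (hypothetical, not part of Motia) makes that framing explicit:

```python
import json

def sse_frame(event: str, data: dict) -> bytes:
    """Encode one Server-Sent Events frame: an event name, a data
    line, and the blank line that terminates the frame."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n".encode("utf-8")

frame = sse_frame("data", {"id": 0, "value": 42})
# b'event: data\ndata: {"id": 0, "value": 42}\n\n'
```

On the browser side, an EventSource listener for "data" events would receive each JSON payload as the event's data field.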
Using Python ML libraries
Python excels at machine learning. Here’s an example using scikit-learn:
"""ML prediction using scikit-learn."""
import pickle
import numpy as np
from typing import Any
from motia import FlowContext, queue
# Load your trained model (trained separately)
with open ( 'model.pkl' , 'rb' ) as f:
model = pickle.load(f)
config = {
"name" : "MLPredictor" ,
"description" : "Make predictions using trained ML model" ,
"flows" : [ "ml-pipeline" ],
"triggers" : [queue( "predict" )],
"enqueues" : [ "prediction-complete" ],
}
async def handler ( input_data : dict[ str , Any], ctx : FlowContext[Any]) -> None :
"""Make ML predictions."""
features = np.array(input_data[ "features" ]).reshape( 1 , - 1 )
request_id = input_data[ "requestId" ]
ctx.logger.info( "Making prediction" , { "requestId" : request_id})
# Make prediction
prediction = model.predict(features)[ 0 ]
confidence = model.predict_proba(features)[ 0 ]
result = {
"prediction" : int (prediction),
"confidence" : float ( max (confidence)),
"requestId" : request_id,
}
# Store result
await ctx.state.set( "predictions" , request_id, result)
# Notify completion
await ctx.enqueue({
"topic" : "prediction-complete" ,
"data" : result,
})
ctx.logger.info(
"Prediction complete" ,
{ "requestId" : request_id, "prediction" : prediction},
)
Trigger predictions from TypeScript:
steps/predict-api.step.ts
export const handler: Handlers<typeof config> = async (
  { request },
  { enqueue }
) => {
  const { features } = request.body
  const requestId = `pred-${Date.now()}`

  await enqueue({
    topic: 'predict',
    data: { features, requestId },
  })

  return {
    status: 202,
    body: {
      message: 'Prediction in progress',
      requestId,
    },
  }
}
Sharing data between languages
Use Motia’s state and queues to share data:
Via queues (recommended)
// TypeScript: Send data to Python
await ctx.enqueue({
  topic: 'process-data',
  data: { items: [1, 2, 3], operation: 'transform' },
})

# Python: Receive and process
async def handler(input_data: dict, ctx: FlowContext) -> None:
    items = input_data["items"]
    operation = input_data["operation"]
    # Process...
Via state
// TypeScript: Write to state
await ctx.state.set('shared', 'config', {
  model: 'gpt-4',
  temperature: 0.7,
})

# Python: Read from state
config = await ctx.state.get("shared", "config")
model = config["model"]
temperature = config["temperature"]
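Because these payloads cross a language boundary, they are serialized in transit, so it is safest to keep them JSON-compatible. A round trip through the json module approximates that constraint (Motia's actual serialization format is an assumption here):

```python
import json

shared_config = {"model": "gpt-4", "temperature": 0.7}

# If a value survives a JSON round trip, both the TypeScript and the
# Python side can read it. Values like sets, datetimes, or NumPy
# arrays would raise TypeError here and should be converted first.
restored = json.loads(json.dumps(shared_config))
assert restored == shared_config
```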
Language-specific best practices
TypeScript
Type safety: Use Zod schemas for all inputs and outputs. TypeScript ensures type safety at compile time.
const dataSchema = z.object({
  values: z.array(z.number()),
  operation: z.enum(['analyze', 'transform']),
})
Python
Type hints: Use type hints and Pydantic for validation in Python steps.
from pydantic import BaseModel

class DataInput(BaseModel):
    values: list[float]
    operation: str
Deployment
When deploying multi-language projects:
Ensure both runtimes are available
FROM node:20
RUN apt-get update && apt-get install -y python3 python3-pip
Install dependencies for both
npm install
pip install -r requirements.txt
Configure Motia to find both
export default {
  steps: ['./steps/**/*.step.{ts,py}'],
}
Best practices
Use the right tool: Choose TypeScript for APIs and Python for data processing. Play to each language’s strengths.
Communicate via queues: Use queues for inter-step communication. This keeps steps decoupled and scalable.
Share schemas: Document data schemas that cross language boundaries. Use JSON Schema or similar.
Handle errors consistently: Use similar error handling patterns in both languages for easier debugging.
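As a sketch of that last point on the Python side, the handler below catches failures, logs the same structured fields a TypeScript step would, and routes the payload to a failure topic. The processing-failed topic name and the logger/enqueue stand-ins are hypothetical, not Motia APIs:

```python
import asyncio

class RecordingLogger:
    """Stand-in for ctx.logger that records structured error fields."""
    def __init__(self):
        self.records = []

    def error(self, message, fields):
        self.records.append((message, fields))

async def safe_handler(input_data, logger, enqueue):
    """Catch failures, log consistent fields, and route the payload to
    a failure topic instead of letting the step crash."""
    request_id = input_data.get("requestId", "unknown")
    try:
        value = 10 / input_data["divisor"]  # stands in for real processing
        return {"ok": True, "value": value}
    except Exception as exc:
        # Same fields a TypeScript step would log: requestId + error
        logger.error("Processing failed",
                     {"requestId": request_id, "error": str(exc)})
        await enqueue({
            "topic": "processing-failed",  # hypothetical failure topic
            "data": {"requestId": request_id, "error": str(exc)},
        })
        return {"ok": False}

failed = []

async def fake_enqueue(message):
    failed.append(message)

logger = RecordingLogger()
result = asyncio.run(
    safe_handler({"requestId": "r1", "divisor": 0}, logger, fake_enqueue)
)
# result is {"ok": False}; the failure message carries the requestId
```

A TypeScript step can apply the same shape with try/catch, so a single downstream step (or dashboard query) can consume failures from either language.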
Next steps
Building APIs: Deep dive into API development with TypeScript
AI Integration: Build AI features using both TypeScript and Python