This guide helps you diagnose and resolve common issues when working with Memori. For additional support, visit our Discord community or GitHub Issues.
Installation Issues
ModuleNotFoundError: No module named 'memori'
Python cannot find the Memori module
Problem: Import fails with ModuleNotFoundError.
Symptoms: >>> from memori import Memori
ModuleNotFoundError: No module named 'memori'
Solutions:
Verify installation
pip show memori
If not listed, install it:
pip install memori
Check Python environment
Ensure you're using the correct Python interpreter: which python
python --version
If using virtual environments: source venv/bin/activate # Linux/Mac
venv\Scripts\activate # Windows
pip install memori
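After activating the environment, the interpreter actually in use can be confirmed from Python itself (stdlib only, no Memori API assumed):

```python
import sys

# Which interpreter is running this code
print("interpreter:", sys.executable)
# True when a virtual environment is active
print("in virtualenv:", sys.prefix != sys.base_prefix)
```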
Reinstall if necessary
pip uninstall memori
pip install memori
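To confirm the freshly installed package is visible to the active interpreter, a quick stdlib check can help (the installed() helper here is illustrative, not a Memori API):

```python
import importlib.util

def installed(name):
    """Return True if the active interpreter can locate the package."""
    return importlib.util.find_spec(name) is not None

print("memori installed:", installed("memori"))
```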
Missing Dependencies for Database Drivers
ImportError: No module named 'psycopg2' or similar
Problem: Database driver not installed.
Symptoms: ImportError: No module named 'psycopg2'
ImportError: No module named 'pymongo'
ImportError: No module named 'pymysql'
Solutions: Memori has minimal dependencies by default. Install database drivers as needed:
# PostgreSQL (psycopg2)
pip install psycopg2-binary
# PostgreSQL (psycopg3) - recommended for CockroachDB
pip install "memori[cockroachdb]"
# or manually:
pip install "psycopg[binary]>=3.1.0"
# MongoDB
pip install pymongo
# MySQL/MariaDB
pip install pymysql
# Oracle
pip install oracledb
All drivers at once (development):
pip install psycopg2-binary "psycopg[binary]" pymongo pymysql oracledb
SSL Certificate Errors
SSL: CERTIFICATE_VERIFY_FAILED
Problem: SSL verification fails when connecting to the Memori API or databases.
Symptoms: SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
Solutions:
Update certificates (recommended):
# macOS
pip install --upgrade certifi
# Ubuntu/Debian
sudo apt-get update
sudo apt-get install ca-certificates
# RHEL/CentOS
sudo yum update ca-certificates
Corporate proxy/firewall: Configure SSL verification:
import os
os.environ['REQUESTS_CA_BUNDLE'] = '/path/to/ca-bundle.crt'
Development only - suppress SSL warnings (NOT for production):
import urllib3
urllib3.disable_warnings()
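Note that urllib3.disable_warnings() only silences the warning text; it does not itself skip certificate checks. If a development script genuinely needs to bypass verification with the standard library, an unverified SSL context does so explicitly (again, never in production):

```python
import ssl
import urllib.request

# Development only: SSL context with certificate verification disabled
ctx = ssl._create_unverified_context()
print("hostname check:", ctx.check_hostname)  # False
print("verify mode:", ctx.verify_mode)        # CERT_NONE
# urllib.request.urlopen("https://self-signed.example", context=ctx)
```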
Configuration Issues
API Key Not Recognized
MEMORI_API_KEY environment variable issues
Problem: API key not being recognized or applied.
Symptoms:
Still hitting IP-based quota limits
Authentication errors
Quota command shows lower limits than expected
Solutions:
Verify API key is set
echo $MEMORI_API_KEY
Should output your API key. If empty, set it: export MEMORI_API_KEY="your_api_key_here"
Check for whitespace
Remove any extra whitespace: export MEMORI_API_KEY=$(echo "$MEMORI_API_KEY" | tr -d '[:space:]')
Restart your application
Environment variables are read at startup. Restart your application after setting the key.
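If restarting is inconvenient during development, the variable can also be set in-process before Memori is initialized (assuming, as the step above states, that the key is read at startup):

```python
import os

# Must happen before Memori reads its configuration
os.environ["MEMORI_API_KEY"] = "your_api_key_here"
print("key set:", bool(os.getenv("MEMORI_API_KEY")))
```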
Verify in code
import os
print(f"API Key: {os.getenv('MEMORI_API_KEY')}")
Make persistent (Linux/Mac)
Add to your shell profile: echo 'export MEMORI_API_KEY="your_key"' >> ~/.bashrc
source ~/.bashrc
Connection String Issues
Database connection string errors
Problem: Cannot connect to database using connection string.
Symptoms: OperationalError: could not connect to server
ConnectionError: connection refused
Solutions:
Verify connection string format:
# PostgreSQL
"postgresql://user:password@host:5432/database"
# CockroachDB
"postgresql://user:password@host:26257/defaultdb?sslmode=verify-full"
# MongoDB
"mongodb://user:password@host:27017/database"
# MySQL
"mysql://user:password@host:3306/database"
Check URL encoding for special characters:
from urllib.parse import quote_plus
password = "p@ssw0rd!#$"
encoded_password = quote_plus(password)
connection_string = f"postgresql://user:{encoded_password}@host/db"
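As a sanity check, quote_plus turns each reserved character in the example password into its percent-escaped form:

```python
from urllib.parse import quote_plus

# '@' -> %40, '!' -> %21, '#' -> %23, '$' -> %24
print(quote_plus("p@ssw0rd!#$"))  # p%40ssw0rd%21%23%24
```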
Test connectivity separately:
# PostgreSQL
psql "postgresql://user:password@host:5432/database"
# MongoDB
mongosh "mongodb://user:password@host:27017/database"
Verify network access:
# Test if port is accessible
nc -zv host 5432 # PostgreSQL
nc -zv host 26257 # CockroachDB
nc -zv host 27017 # MongoDB
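If nc is unavailable, the same reachability check can be done from Python with the standard library (port_open is an illustrative helper, not a Memori API):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("postgres reachable:", port_open("localhost", 5432))
```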
Embedding Model Download Failures
Cannot download sentence-transformers models
Runtime Issues
Memory Not Being Recalled
LLM interactions not recalling previous context
Problem: Memori not recalling memories from previous interactions.
Symptoms:
LLM doesn't remember previous conversations
No context from past sessions
Memories created but not retrieved
Solutions:
Verify attribution is set
from memori import Memori
from openai import OpenAI
mem = Memori()
client = mem.llm.register(OpenAI())
# REQUIRED: Set attribution before interactions
mem.attribution(entity_id="user_123", process_id="agent")
# Verify it's set
print(f"Entity: {mem.config.entity_id}")
print(f"Process: {mem.config.process_id}")
print(f"Session: {mem.config.session_id}")
Check recall configuration
# Ensure recall limits are reasonable
print(f"Embeddings limit: {mem.config.recall_embeddings_limit}")
print(f"Facts limit: {mem.config.recall_facts_limit}")
print(f"Relevance threshold: {mem.config.recall_relevance_threshold}")
# Increase if too restrictive
mem.config.recall_embeddings_limit = 2000
mem.config.recall_relevance_threshold = 0.05
Verify memories are being created
Check quota status
python -m memori quota
If quota is exceeded, new memories won't be created.
High latency or slow response times
Problem: Memori operations are slower than expected.
Symptoms:
High latency (>500ms) for memory recall
Slow LLM response times
Application feels sluggish
Solutions:
Optimize recall settings:
mem.config.recall_embeddings_limit = 500 # Reduce search space
mem.config.recall_facts_limit = 3 # Fewer facts
mem.config.recall_relevance_threshold = 0.2 # Stricter filtering
Use faster embedding model:
export MEMORI_EMBEDDINGS_MODEL="all-MiniLM-L6-v2"
python -m memori setup
Check database performance (BYODB):
-- PostgreSQL: Check slow queries
SELECT query, mean_exec_time
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
-- Ensure indexes exist
\di memories
Use connection pooling:
import psycopg2.pool
pool = psycopg2.pool.ThreadedConnectionPool(
    minconn=5,
    maxconn=20,
    host="localhost",
    database="memori"
)
mem = Memori().storage.register(pool.getconn())
Profile your code:
import time
start = time.time()
response = client.chat.completions.create(...)
print(f"LLM call: {time.time() - start:.3f}s")
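A small context manager (illustrative, not part of Memori) makes it easy to time individual calls the same way:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print wall-clock time for the wrapped block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label}: {time.perf_counter() - start:.3f}s")

with timed("sleep"):
    time.sleep(0.01)
```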
See Performance Tuning for more optimization strategies.
Memory Leaks
Application memory usage growing over time
Problem: Python process memory usage continuously increases.
Symptoms:
High RAM usage
OOM (Out of Memory) errors after running for extended periods
Slow degradation of performance
Solutions:
Reduce thread pool size:
from concurrent.futures import ThreadPoolExecutor
mem.config.thread_pool_executor = ThreadPoolExecutor(max_workers=5)
Explicitly close connections:
# For BYODB
connection = psycopg2.connect(...)
mem = Memori().storage.register(connection)
# When done
connection.close()
Process in batches:
for batch in chunked(items, 100):
    process_batch(batch)
    # Reset cache periodically
    mem.config.reset_cache()
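chunked above is assumed to come from a helper such as more_itertools.chunked; a minimal stdlib equivalent:

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

print(list(chunked(range(5), 2)))  # [[0, 1], [2, 3], [4]]
```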
Monitor with memory profiler:
pip install memory-profiler
python -m memory_profiler your_script.py
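The standard library's tracemalloc offers a dependency-free alternative for spotting allocation growth:

```python
import tracemalloc

tracemalloc.start()
# ... run the workload you suspect of leaking ...
data = [str(i) for i in range(10_000)]
current, peak = tracemalloc.get_traced_memory()
print(f"current={current} bytes, peak={peak} bytes")
tracemalloc.stop()
```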
LLM Provider Issues
OpenAI Integration Issues
OpenAI client not working with Memori
Problem: OpenAI client fails or doesn't integrate properly.
Symptoms: AttributeError: 'OpenAI' object has no attribute '_memori'
TypeError: unsupported operand type(s)
Solutions:
Verify OpenAI version:
pip show openai
# Requires openai>=1.0.0
pip install --upgrade openai
Correct initialization order:
from memori import Memori
from openai import OpenAI
# Correct: Register client with Memori
client = OpenAI()
mem = Memori().llm.register(client)
# Set attribution before use
mem.attribution(entity_id="user", process_id="agent")
# Now use client
response = client.chat.completions.create(...)
Check API key:
Test without Memori first:
from openai import OpenAI
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "test"}]
)
print(response.choices[0].message.content)
Anthropic Integration Issues
Anthropic Claude client issues
Problem: Anthropic client not working with Memori.
Solutions:
Verify Anthropic SDK version:
pip show anthropic
pip install --upgrade anthropic
Correct usage:
from memori import Memori
from anthropic import Anthropic
client = Anthropic()
mem = Memori().llm.register(client)
mem.attribution(entity_id="user_123", process_id="claude_agent")
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}]
)
Check API key:
Streaming Not Working
Streaming responses fail or are incomplete
Problem: Streaming LLM responses don't work properly with Memori.
Symptoms:
Stream stops prematurely
Incomplete responses
Memory not recorded for streamed content
Solutions:
Ensure proper iteration:
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell a story"}],
    stream=True
)
# Iterate through ALL chunks
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
# Memory recorded after stream completes
Don't manually close stream early:
# Wrong: Closing stream early
for i, chunk in enumerate(stream):
    if i > 10:
        break  # Memory won't be recorded!
# Correct: Consume entire stream
for chunk in stream:
    process(chunk)
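If only the first few chunks should be displayed, drain the rest instead of breaking; this sketch (an illustrative helper, not a Memori API) assumes, per the note above, that memory is recorded once the stream is exhausted:

```python
def consume(stream, show_first=10):
    """Print the first `show_first` chunks but still drain the stream."""
    for i, chunk in enumerate(stream):
        if i < show_first:
            print(chunk, end="")
    # Stream fully consumed here, so post-stream hooks can fire

consume(iter("hello"), show_first=2)  # prints "he" but consumes all 5 chunks
```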
Async streaming:
from openai import AsyncOpenAI
client = AsyncOpenAI()
mem = Memori().llm.register(client)
stream = await client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True
)
async for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
Database-Specific Issues
PostgreSQL Connection Issues
Cannot connect to PostgreSQL/CockroachDB
Problem: PostgreSQL or CockroachDB connection fails.
Common errors: psycopg2.OperationalError: could not connect to server
psycopg2.OperationalError: FATAL: password authentication failed
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED]
Solutions:
Verify connection parameters:
import psycopg2
connection = psycopg2.connect(
    host="localhost",
    port=5432,
    database="memori",
    user="postgres",
    password="your_password",
    sslmode="prefer"  # or "require" for CockroachDB
)
Test connection separately:
psql -h localhost -p 5432 -U postgres -d memori
CockroachDB SSL requirements:
# CockroachDB requires SSL
connection = psycopg2.connect(
    "postgresql://user:password@host:26257/defaultdb?sslmode=verify-full"
)
Check firewall:
# Test port accessibility
nc -zv localhost 5432
Verify PostgreSQL is running:
# Linux
sudo systemctl status postgresql
# macOS
brew services list | grep postgresql
# Docker
docker ps | grep postgres
MongoDB Connection Issues
Cannot connect to MongoDB
Problem: MongoDB connection fails.
Common errors: pymongo.errors.ServerSelectionTimeoutError
pymongo.errors.ConfigurationError
Solutions:
Verify connection string:
from pymongo import MongoClient
# Standard MongoDB
client = MongoClient("mongodb://localhost:27017/")
# MongoDB Atlas
client = MongoClient(
    "mongodb+srv://user:password@cluster.mongodb.net/?retryWrites=true&w=majority"
)
# With authentication
client = MongoClient(
    host="localhost",
    port=27017,
    username="user",
    password="password",
    authSource="admin"
)
Test connection:
mongosh "mongodb://localhost:27017"
Check MongoDB is running:
# Linux
sudo systemctl status mongod
# macOS
brew services list | grep mongodb
# Docker
docker ps | grep mongo
Verify network access:
SQLite Permission Issues
SQLite database permission errors
Problem: Cannot create or write to SQLite database.
Symptoms: sqlite3.OperationalError: unable to open database file
PermissionError: [Errno 13] Permission denied
Solutions:
Check file permissions:
ls -l memori.db
chmod 644 memori.db # Read/write for owner
Ensure directory exists and is writable:
mkdir -p ~/.memori
chmod 755 ~/.memori
Use absolute path:
import sqlite3
import os
from memori import Memori
db_path = os.path.expanduser("~/.memori/memori.db")
os.makedirs(os.path.dirname(db_path), exist_ok=True)
connection = sqlite3.connect(db_path)
mem = Memori().storage.register(connection)
Check disk space:
CockroachDB Cluster Management
Cluster Creation Fails
CockroachDB cluster start command fails
Problem: python -m memori cockroachdb cluster start fails.
Symptoms: RuntimeError: the cluster failed to come online
ModuleNotFoundError: No module named 'psycopg'
Solutions:
Install required dependencies:
pip install "memori[cockroachdb]"
# or manually:
pip install "psycopg[binary]>=3.1.0"
Wait longer (can take 2+ minutes):
The cluster creation process can take several minutes. Be patient.
Check network connectivity:
curl -I https://api.memorilabs.ai
Verify API access:
Try again:
If it fails, the cluster is automatically cleaned up. Simply retry:
python -m memori cockroachdb cluster start
Cannot Access Cluster
Lost connection string or cannot access cluster
Problem: Lost CockroachDB connection string or cannot connect.
Solutions:
Important: Memori Labs cannot recover lost connection strings. The connection string is displayed only once during cluster creation for security reasons.
If you lost your connection string:
Delete the old cluster:
python -m memori cockroachdb cluster delete
Create a new cluster:
python -m memori cockroachdb cluster start
Save the connection string immediately in a secure password manager.
Getting Help
When reporting issues, include:
# System information
python --version
pip show memori
pip show openai anthropic google-genai # If using
# Environment
echo $MEMORI_API_KEY | cut -c1-10 # First 10 chars only
echo $MEMORI_EMBEDDINGS_MODEL
# Test basic functionality
python -m memori quota
# Enable debug logging
export MEMORI_DEBUG=1
python your_script.py 2>&1 | tee debug.log
Enable Debug Logging
import logging
# Enable Memori debug logs
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("memori")
logger.setLevel(logging.DEBUG)
# Your code here
Common Error Messages
| Error Message | Likely Cause | Quick Fix |
| --- | --- | --- |
| ModuleNotFoundError: No module named 'memori' | Not installed | pip install memori |
| No attribution set | Missing attribution | mem.attribution(entity_id=..., process_id=...) |
| could not connect to server | Database unreachable | Check connection string, network |
| CERTIFICATE_VERIFY_FAILED | SSL certificate issue | Update certificates |
| quota exceeded | Hit quota limit | Sign up for API key or upgrade |
| No module named 'psycopg' | Missing driver | pip install "memori[cockroachdb]" |
Support Channels
Discord Community Get help from the community and Memori team
GitHub Issues Report bugs and request features
Documentation Browse complete documentation
Email Support Contact the Memori team directly
Next Steps
CLI Usage Master the Memori CLI
Performance Tuning Optimize your deployment
Quota Management Monitor and manage quota
Contributing Contribute to Memori