Overview

Smarty is SmartEat AI’s intelligent nutritional assistant, powered by LangChain and LangGraph. Smarty helps users manage their meal plans, swap recipes, and get personalized dietary advice through natural conversation.
Smarty uses Ollama with the Llama 3.1 model for local, private AI inference without external API dependencies.

Architecture

The agent is built with LangGraph, a framework for stateful, multi-step AI workflows.

Core Components

The file backend/app/services/agent/workflow.py defines the agent's state graph:
from langgraph.graph import StateGraph, END
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import ToolNode

# nutrition_tools, DietGraphState, agent, and should_continue come from
# elsewhere in the agent package (not shown here)

def build_graph():
    tool_node = ToolNode(nutrition_tools)
    workflow = StateGraph(DietGraphState)
    
    # Nodes
    workflow.add_node("nutricionista", agent.build_agent)
    workflow.add_node("tools", tool_node)
    
    # Edges
    workflow.set_entry_point("nutricionista")
    workflow.add_conditional_edges("nutricionista", should_continue, {
        "tools": "tools",
        END: END
    })
    workflow.add_edge("tools", "nutricionista")
    
    checkpointer = MemorySaver()
    return workflow.compile(checkpointer=checkpointer)
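The should_continue router wired into the conditional edge above is not shown in the source. A minimal sketch of the usual LangGraph routing pattern (END is normally imported via from langgraph.graph import END; its string sentinel is inlined here so the snippet runs standalone):

```python
# Stand-in for langgraph's END sentinel, inlined so this sketch is
# self-contained; in the real module you would import it instead.
END = "__end__"

def should_continue(state: dict) -> str:
    """Route to the tool node when the model's last message requested a
    tool call; otherwise end the turn and return the answer to the user."""
    last = state["messages"][-1]
    # AI messages carry a non-empty `tool_calls` list when the model
    # wants one or more tools executed before answering.
    if getattr(last, "tool_calls", None):
        return "tools"
    return END
```

Because the "tools" node loops back to "nutricionista", the agent can chain several tool calls before producing its final reply.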

Agent Tools

Smarty has access to 7 specialized tools for meal planning:

  • search_recipes_by_criteria: Search recipes by name, ingredients, or nutritional requirements
  • generate_weekly_plan: Create a complete 7-day meal plan based on the user's profile
  • get_current_plan_summary: Display the user's active meal plan with all meals
  • suggest_recipe_alternatives: Find 3 similar recipes for a specific meal
  • replace_meal_in_plan: Swap a meal with a chosen alternative
  • update_user_preference: Update dietary preferences or restrictions
  • get_user_profile_summary: View the user's dietary profile and goals
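The tools' internals are project-specific, but the filtering at the heart of search_recipes_by_criteria can be sketched roughly as follows. The recipe dict shape and field names here are illustrative assumptions, not the actual schema:

```python
def search_recipes_by_criteria(recipes, name=None, ingredient=None,
                               max_calories=None):
    """Filter a recipe collection by optional name, ingredient, and
    calorie criteria; every criterion that is given must match."""
    results = []
    for recipe in recipes:
        if name and name.lower() not in recipe["name"].lower():
            continue  # name substring did not match
        if ingredient and ingredient.lower() not in (
            i.lower() for i in recipe["ingredients"]
        ):
            continue  # required ingredient missing
        if max_calories is not None and recipe["calories"] > max_calories:
            continue  # over the calorie budget
        results.append(recipe)
    return results
```

In the real service this filtering would run as a database query rather than over an in-memory list.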

Tool Workflow Example

When a user says: “Change my Wednesday lunch to something lighter”
Step 1: Agent Understands Intent

Smarty parses the request and identifies:
  • Action: Replace meal
  • Day: Wednesday (day 3)
  • Meal: Lunch
  • Criteria: Lower calories
Step 2: Call suggest_recipe_alternatives

suggest_recipe_alternatives(
    user_id=123,
    day_of_week=3,
    meal_type="lunch"
)
Returns 3 nutritionally similar but lower-calorie options.
Step 3: Present Options

Smarty shows:
Here are 3 lighter lunch alternatives for Wednesday:

1. Grilled Chicken Salad (320 cal)
2. Quinoa Buddha Bowl (350 cal)
3. Turkey Wrap (280 cal)

Which would you like?
Step 4: User Chooses

User replies: “Option 2”
Step 5: Call replace_meal_in_plan

replace_meal_in_plan(
    user_id=123,
    day_of_week=3,
    meal_type="lunch",
    new_recipe_name="Quinoa Buddha Bowl"
)
Updates the database and confirms the change.
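The selection behind step 2 is project-specific; as an illustration only, a hypothetical helper that picks lighter candidates and ranks them by calorie proximity (so portions stay comparable) could look like this:

```python
def pick_lighter_alternatives(current_calories, candidates, n=3):
    """Return up to n candidates with fewer calories than the current
    meal, closest in calories first (most similar in portion size)."""
    lighter = [c for c in candidates if c["calories"] < current_calories]
    # Smallest calorie gap first: a 350-cal option beats a 280-cal one
    # when replacing a 420-cal lunch, because it is nutritionally closer.
    lighter.sort(key=lambda c: current_calories - c["calories"])
    return lighter[:n]
```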

State Management

DietGraphState Schema

from typing import Annotated, TypedDict

from langchain_core.messages import AnyMessage
from langgraph.graph.message import add_messages

# ProfileResponse is the project's profile schema (defined elsewhere)
class DietGraphState(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]
    profile: ProfileResponse
  • messages: Conversation history; the add_messages reducer appends new messages and deduplicates by message id
  • profile: User’s dietary profile (calories, restrictions, preferences)
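The add_messages reducer's merge semantics, appending new messages while a message with a duplicate id replaces the old one, can be imitated in plain Python. This is an illustration of the behavior, not LangGraph's implementation:

```python
def merge_messages(existing, updates):
    """Append messages from `updates`; a message whose id matches an
    existing one replaces it in place instead of duplicating it."""
    index_by_id = {m["id"]: i for i, m in enumerate(existing)}
    merged = list(existing)
    for msg in updates:
        if msg["id"] in index_by_id:
            merged[index_by_id[msg["id"]]] = msg  # same id: replace
        else:
            merged.append(msg)  # new id: append to the history
    return merged
```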

Memory Optimization

For environments with limited GPU memory (8 GB VRAM):
# Ollama configuration
OLLAMA_CONTEXT_LENGTH = 32768  # docker-compose
num_ctx = 16384                 # config_ollama
MAX_CONTEXT_TOKENS = 10000      # memory.py
GPU Memory Management: The agent is optimized for 8GB VRAM. Exceeding context limits will cause OOM errors. Monitor token usage in production.
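Staying under MAX_CONTEXT_TOKENS means dropping the oldest messages when the history grows. A rough sketch using the common ~4-characters-per-token estimate (the real service may count tokens exactly with the model's tokenizer):

```python
MAX_CONTEXT_TOKENS = 10000

def estimate_tokens(text):
    # Rough heuristic: about 4 characters per token for English text.
    return len(text) // 4 + 1

def trim_history(messages, budget=MAX_CONTEXT_TOKENS):
    """Keep the most recent messages whose combined token estimate fits
    in the budget; the oldest messages are dropped first."""
    kept, total = [], 0
    for msg in reversed(messages):  # walk newest to oldest
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break  # everything older than this is discarded
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order
```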

Conversation Flow

First Interaction

User: Hi!
Smarty: Hello! I'm Smarty, your nutritional assistant. I can help you:
- Generate personalized meal plans
- Swap meals in your plan
- Search for recipes
- Update your dietary preferences

What would you like to do today?

Meal Swap Flow

User: I don't like the breakfast for tomorrow
Smarty: [calls get_current_plan_summary]
       [calls suggest_recipe_alternatives]
       
       Your current breakfast for Thursday is "Oatmeal with Berries".
       Here are 3 alternatives:
       
       1. Greek Yogurt Parfait (250 cal)
       2. Avocado Toast (320 cal)
       3. Smoothie Bowl (280 cal)
       
       Which one would you prefer?

User: Number 3
Smarty: [calls replace_meal_in_plan]
       
       ✅ Done! I've updated your Thursday breakfast to 
       "Smoothie Bowl". Enjoy!

LangChain Integration

from langchain_ollama import ChatOllama, OllamaEmbeddings

OLLAMA_CONFIG = {
    "model": "llama3.1",
    "base_url": "http://ollama:11434",
    "temperature": 0,
    "num_ctx": 16384,
    "num_predict": 4096,
}

llm = ChatOllama(**OLLAMA_CONFIG)

API Integration

The agent is exposed via a REST API:
@router.post("/chat")
async def chat(
    request: ChatRequest,
    current_user: User = Depends(get_current_user),
    db: Session = Depends(get_db)
):
    profile = ProfileService.get_user_profile(db, current_user.id)
    
    response = app_graph.invoke(
        {
            "messages": [{"role": "user", "content": request.message}],
            "profile": profile
        },
        config={"configurable": {"thread_id": f"user_{current_user.id}"}}
    )
    
    return {"response": response["messages"][-1].content}

Performance Optimization

  • Lazy Loading: Model and tools are loaded once at startup
  • Message Trimming: Automatic context window management
  • Tool Result Caching: Database queries are cached per session
  • Async Processing: Non-blocking API calls with FastAPI
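Per-session caching of tool results can be as simple as a dict keyed by session and query. A minimal stdlib sketch (the real caching layer is not shown in the source):

```python
class SessionCache:
    """Cache database/tool results per session so repeated queries in
    one conversation skip the database."""

    def __init__(self):
        self._store = {}  # session_id -> {query_key: result}
        self.hits = 0

    def get_or_compute(self, session_id, key, compute):
        bucket = self._store.setdefault(session_id, {})
        if key in bucket:
            self.hits += 1  # served from cache
        else:
            bucket[key] = compute()  # only runs on a cache miss
        return bucket[key]
```

Keying by session keeps one user's cached plan from leaking into another user's conversation; the whole bucket can be dropped when the session ends.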

Debugging and Monitoring

import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Logs appear in console:
# 🤖 Agent initialized with 7 tools
# 📊 State: 8 msgs, 3421 tokens, tools: ['get_current_plan_summary']
# ✅ Optimized: 8 -> 6 messages, 3421 -> 2100 tokens
Set the logger level to logging.DEBUG to see token counts and optimization steps in real time.

Future Enhancements

1. Persistent Memory: Replace MemorySaver with PostgreSQL or Redis for conversation persistence across sessions
2. Multi-Language Support: Currently English-only; add i18n for Spanish, French, etc.
3. Voice Interface: Integrate with speech-to-text for voice-based meal planning
4. Proactive Suggestions: Agent initiates conversations, e.g. “It’s Sunday, want to generate next week’s plan?”
