## Overview
The Repository Client provides enhanced Autogen agents with Streamlit UI integration, file tracking, and message processing capabilities. This module includes specialized agent classes and utilities for building interactive AI applications.
## Core Components

### TrackableUserProxyAgent

An enhanced `UserProxyAgent` with Streamlit integration and file tracking.

```python
class TrackableUserProxyAgent(UserProxyAgent):
    def __init__(self, *args, **kwargs)
```
Features:
- Automatic Streamlit session state integration
- File creation monitoring and display
- Enhanced code execution with path handling
- Custom message processing and display
Example:

```python
from src.services.agents.agent_client import TrackableUserProxyAgent

executor = TrackableUserProxyAgent(
    name="executor",
    human_input_mode="NEVER",
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,
    },
)
```
#### Key Methods

##### execute_code_blocks

```python
def execute_code_blocks(code_blocks)
```

Overrides the default code execution to handle path issues and add the project root to the Python path.
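The exact path handling is not documented here; one plausible sketch of prepending the project root before executing code blocks (the marker file and lookup logic are assumptions, not the module's actual implementation):

```python
import os
import sys

# Hypothetical helper: walk upward until a repo marker is found, then make
# the project root importable for executed code blocks.
def add_project_root_to_path(start: str = ".", marker: str = "pyproject.toml") -> str:
    path = os.path.abspath(start)
    while not os.path.exists(os.path.join(path, marker)):
        parent = os.path.dirname(path)
        if parent == path:  # reached the filesystem root without finding the marker
            break
        path = parent
    if path not in sys.path:
        sys.path.insert(0, path)  # executed code can now import project modules
    return path
```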
##### chat_messages_for_summary

```python
def chat_messages_for_summary(agent: Agent) -> list[dict[str, Any]]
```

Returns messages for conversation summarization, excluding incomplete tool calls.
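The filtering rule is not spelled out above; a minimal sketch of one way to exclude incomplete tool calls, assuming a call is "incomplete" when no `tool` response with a matching `tool_call_id` exists (the helper name and logic are illustrative):

```python
from typing import Any

def drop_incomplete_tool_calls(messages: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Keep only messages whose tool calls all have a matching tool response."""
    answered = {m.get("tool_call_id") for m in messages if m.get("role") == "tool"}
    summary = []
    for m in messages:
        calls = m.get("tool_calls")
        if calls and not all(c.get("id") in answered for c in calls):
            continue  # skip assistant turns whose tool calls never completed
        summary.append(m)
    return summary

messages = [
    {"role": "user", "content": "hi"},
    # No matching {"role": "tool", "tool_call_id": "call_1"} response follows:
    {"role": "assistant", "tool_calls": [{"id": "call_1", "function": {"name": "f"}}]},
]
print(drop_incomplete_tool_calls(messages))  # [{'role': 'user', 'content': 'hi'}]
```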
### TrackableAssistantAgent

An enhanced `AssistantAgent` with Streamlit support and file display capabilities.

```python
class TrackableAssistantAgent(AssistantAgent):
    def __init__(self, *args, **kwargs)
```
Features:
- Streamlit integration for message display
- Automatic file tracking and presentation
- Context-aware message processing
Example:

```python
from src.services.agents.agent_client import TrackableAssistantAgent

assistant = TrackableAssistantAgent(
    name="assistant",
    llm_config=llm_config,
    system_message="You are a helpful AI assistant.",
)
```
#### Key Methods

##### display_files

Displays newly created files in the Streamlit interface with preview capabilities.
### EnhancedMessageProcessor

Static utility class for processing and displaying agent messages in Streamlit.
#### Core Display Methods

##### streamlit_display_message

```python
@staticmethod
def streamlit_display_message(
    st,
    message: Union[Dict, str],
    sender_name: str,
    receiver_name: str,
    llm_config: Dict,
    sender_role=None,
    save_to_history: bool = True,
    timestamp: str = None,
)
```

Display agent messages with rich formatting in Streamlit.
Parameters:
- `st`: Streamlit module instance
- `message`: Message to display (supports OpenAI message format)
- `sender_name`: Name of the message sender
- `receiver_name`: Name of the message receiver
- `llm_config`: LLM configuration dictionary
- `sender_role`: Role of the sender ("user" or "assistant")
- `save_to_history`: Whether to save this message to session history
- `timestamp`: ISO-format timestamp for the message
Example:

```python
import streamlit as st
from src.services.agents.agent_client import EnhancedMessageProcessor

EnhancedMessageProcessor.streamlit_display_message(
    st=st,
    message={"content": "Analysis complete!", "role": "assistant"},
    sender_name="AI Assistant",
    receiver_name="User",
    llm_config=llm_config,
    sender_role="assistant",
)
```
#### File Display Methods

##### display_files_batch

```python
@staticmethod
def display_files_batch(
    work_dir: str,
    previous_files_info: Dict[str, Dict],
    st
) -> Tuple[Dict[str, Dict], List[str]]
```

Display newly created files in batch with preview support.

Parameters:
- `work_dir`: Working directory to monitor for files
- `previous_files_info`: Previous file information from `get_directory_files()`
- `st`: Streamlit module instance

Returns:
- `Tuple[Dict[str, Dict], List[str]]`: a tuple of `(current_files_info, new_file_paths)`
Supported file types:
- Images: PNG, JPG, JPEG (with preview)
- Data: CSV, XLSX, JSON (with table preview)
- Documents: PDF, HTML, TXT (with content preview)
- Code: PY, JS, etc. (with syntax highlighting)
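The preview dispatch presumably keys off the file extension; a self-contained sketch of that mapping (the table and helper name are illustrative, not the module's actual code):

```python
from pathlib import Path

# Illustrative extension-to-preview mapping based on the list above.
PREVIEW_KINDS = {
    ".png": "image", ".jpg": "image", ".jpeg": "image",
    ".csv": "table", ".xlsx": "table", ".json": "table",
    ".pdf": "document", ".html": "document", ".txt": "document",
}

def preview_kind(path: str) -> str:
    """Return the preview category for a file path."""
    suffix = Path(path).suffix.lower()
    # Anything unrecognized falls back to syntax-highlighted code display.
    return PREVIEW_KINDS.get(suffix, "code")

print(preview_kind("chart.PNG"))    # image
print(preview_kind("analysis.py"))  # code
```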
Example:

```python
import streamlit as st

# Initialize
if "local_files_info" not in st.session_state:
    st.session_state.local_files_info = {}

# Display new files
current_files, new_files = EnhancedMessageProcessor.display_files_batch(
    work_dir="coding/workspace",
    previous_files_info=st.session_state.local_files_info,
    st=st,
)

# Update state
st.session_state.local_files_info = current_files
if new_files:
    st.success(f"Created {len(new_files)} new files!")
```
##### display_files_compact

```python
@staticmethod
def display_files_compact(new_files: List[str], st)
```

Display files in a compact one-line layout with thumbnails.

Example:

```python
EnhancedMessageProcessor.display_files_compact(
    new_files=["chart.png", "data.csv", "analysis.py"],
    st=st,
)
```
##### process_tool_calls

```python
@staticmethod
def process_tool_calls(tool_calls, st)
```

Display tool/function calls with formatted parameters and execution status.

Example:

```python
tool_calls = [
    {
        "function": {
            "name": "search_web",
            "arguments": '{"query": "latest AI news"}',
        }
    }
]
EnhancedMessageProcessor.process_tool_calls(tool_calls, st)
```
#### History Management

##### create_display_info

```python
@staticmethod
def create_display_info(
    message_content: str,
    sender_name: str,
    receiver_name: str,
    sender_role: str,
    llm_config: Dict = None,
    new_files: List[str] = None
) -> Dict
```

Create a standardized display information dictionary.

Returns a dictionary containing:
- `message`: Message content and role
- `sender_info`: Sender details
- `receiver_name`: Receiver name
- `llm_config`: LLM configuration
- `sender_role`: Sender role
- `timestamp`: ISO-format timestamp
- `new_files`: JSON array of new files
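The returned structure is roughly the following; the key names come from the list above, while the nested value shapes are assumptions for illustration:

```python
import json
from datetime import datetime, timezone

# Illustrative reconstruction of the display-info payload described above.
def create_display_info_sketch(message_content, sender_name, receiver_name,
                               sender_role, llm_config=None, new_files=None):
    return {
        "message": {"content": message_content, "role": sender_role},
        "sender_info": {"name": sender_name},
        "receiver_name": receiver_name,
        "llm_config": llm_config,
        "sender_role": sender_role,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "new_files": json.dumps(new_files or []),  # stored as a JSON array string
    }

info = create_display_info_sketch("Done!", "assistant", "User", "assistant",
                                  new_files=["chart.png"])
print(info["new_files"])  # ["chart.png"]
```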
##### replay_display_messages

```python
@staticmethod
def replay_display_messages(st, display_messages: List[Dict])
```

Replay saved conversation history with file displays.

Example:

```python
# Save conversation
if "display_messages" not in st.session_state:
    st.session_state.display_messages = []

# Replay later
EnhancedMessageProcessor.replay_display_messages(
    st=st,
    display_messages=st.session_state.display_messages,
)
```
#### Utility Functions

##### detect_new_files

```python
@staticmethod
def detect_new_files(
    work_dir: str,
    previous_files_info: Dict[str, Dict] = None
) -> Tuple[Dict[str, Dict], List[str]]
```

Detect new files created in a directory.
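A minimal sketch of directory-diff detection, assuming files are compared by path against a previously captured snapshot (the snapshot fields and walk logic are illustrative, not the module's actual implementation):

```python
import os

def detect_new_files_sketch(work_dir, previous_files_info=None):
    """Return (current snapshot, paths absent from the previous snapshot)."""
    previous_files_info = previous_files_info or {}
    current, new_paths = {}, []
    for root, _dirs, files in os.walk(work_dir):
        for name in files:
            path = os.path.join(root, name)
            stat = os.stat(path)
            # Record size and mtime so callers could also detect modifications.
            current[path] = {"size": stat.st_size, "mtime": stat.st_mtime}
            if path not in previous_files_info:
                new_paths.append(path)
    return current, new_paths
```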
##### fliter_message

```python
@staticmethod
def fliter_message(content: str) -> str
```

Filter and clean message content by removing internal paths.
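The exact cleaning rules are not documented; one illustrative approach, assuming absolute prefixes pointing into the work directory are rewritten as relative paths (the helper name and regex are assumptions):

```python
import re

def filter_paths(content: str, work_dir: str = "coding") -> str:
    """Replace absolute path prefixes ending in `work_dir` with a relative path."""
    pattern = r"(/[\w.\-]+)+/" + re.escape(work_dir) + r"/"
    return re.sub(pattern, f"{work_dir}/", content)

print(filter_paths("Saved to /home/user/project/coding/chart.png"))
# Saved to coding/chart.png
```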
## Helper Functions

### check_openai_message

```python
def check_openai_message(message: dict, st) -> bool
```

Validate OpenAI message format and filter system messages.
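A minimal sketch of what this validation might look like, assuming "valid" means an OpenAI-style dict with a recognized role and content, and that system messages are filtered out (the exact criteria are assumptions):

```python
def check_openai_message_sketch(message) -> bool:
    """Return True if the message should be displayed."""
    if not isinstance(message, dict):
        return False
    role = message.get("role")
    if role == "system":
        return False  # system prompts are filtered from the display
    return role in {"user", "assistant", "tool"} and "content" in message

print(check_openai_message_sketch({"role": "system", "content": "..."}))  # False
print(check_openai_message_sketch({"role": "user", "content": "hi"}))     # True
```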
## File Conversion Utilities

### read_html_as_image

```python
def read_html_as_image(file) -> Image
```

Convert an HTML file to an image for preview.

### convert_pdf_to_images

```python
def convert_pdf_to_images(pdf_file) -> List[Image]
```

Convert PDF pages to images for preview.
## Complete Example

```python
import asyncio

import streamlit as st
from src.services.agents.agent_client import (
    TrackableUserProxyAgent,
    TrackableAssistantAgent,
    EnhancedMessageProcessor,
)

# Initialize session state
if "messages" not in st.session_state:
    st.session_state.messages = []
if "display_messages" not in st.session_state:
    st.session_state.display_messages = []
if "local_files_info" not in st.session_state:
    st.session_state.local_files_info = {}

# Create agents
assistant = TrackableAssistantAgent(
    name="assistant",
    llm_config=llm_config,
    system_message="You are a helpful coding assistant.",
)
executor = TrackableUserProxyAgent(
    name="executor",
    human_input_mode="NEVER",
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,
    },
)

# Replay conversation history
if st.session_state.display_messages:
    EnhancedMessageProcessor.replay_display_messages(
        st=st,
        display_messages=st.session_state.display_messages,
    )

# Handle new input
if user_input := st.chat_input("Ask me anything..."):
    # User message
    with st.chat_message("user"):
        st.write(user_input)

    # Execute task (agents handle display automatically). `await` is not
    # valid at the top level of a Streamlit script, so run the coroutine:
    result = asyncio.run(executor.a_initiate_chat(assistant, message=user_input))
```
## Notes
- Agents automatically integrate with Streamlit session state when available
- File tracking uses the `file_monitor` module for robust detection
- Message display supports OpenAI message format including tool calls
- All file previews are automatically generated based on file type
- Conversation history can be replayed with full fidelity including files