
tessl/pypi-flaml

A fast library for automated machine learning and tuning


docs/autogen.md

Multi-Agent Conversations

Framework for building conversational AI applications with multiple agents that can collaborate, execute code, interact with humans, and solve complex problems through structured dialogue. The autogen module enables creating sophisticated AI agent systems with customizable behaviors and interaction patterns.

Capabilities

Core Agent Classes

ConversableAgent

Base conversational agent class that provides the foundation for all agent interactions.

class ConversableAgent:
    def __init__(self, name, system_message=None, llm_config=None, 
                 max_consecutive_auto_reply=None, human_input_mode="ALWAYS",
                 code_execution_config=False, **kwargs):
        """
        Initialize conversable agent.
        
        Args:
            name (str): Agent name identifier
            system_message (str): System message defining agent behavior
            llm_config (dict): Language model configuration
            max_consecutive_auto_reply (int): Maximum consecutive auto-replies
            human_input_mode (str): Human interaction mode - 'ALWAYS', 'TERMINATE', 'NEVER'
            code_execution_config (dict or bool): Code execution configuration
            **kwargs: Additional agent parameters
        """
        
    def send(self, message, recipient, request_reply=True, silent=False):
        """
        Send message to another agent.
        
        Args:
            message (str): Message content to send
            recipient (ConversableAgent): Recipient agent
            request_reply (bool): Whether to request a reply
            silent (bool): Whether to suppress output
        """
        
    def receive(self, message, sender, request_reply=None, silent=False):
        """
        Receive message from another agent.
        
        Args:
            message (str): Received message content
            sender (ConversableAgent): Sender agent
            request_reply (bool): Whether reply is requested
            silent (bool): Whether to suppress output
        """
        
    def initiate_chat(self, recipient, message=None, clear_history=True, silent=False):
        """
        Initiate conversation with another agent.
        
        Args:
            recipient (ConversableAgent): Agent to chat with
            message (str): Initial message
            clear_history (bool): Whether to clear chat history
            silent (bool): Whether to suppress output
        """
        
    def register_reply(self, trigger, reply_func=None, position=0, 
                      config=None, reset_config=None):
        """
        Register reply function for specific triggers.
        
        Args:
            trigger (callable or class): Trigger condition for reply
            reply_func (callable): Function to generate reply
            position (int): Position in reply function list
            config (dict): Configuration for reply function
            reset_config (callable): Function to reset configuration
        """
        
    def update_system_message(self, system_message):
        """
        Update agent's system message.
        
        Args:
            system_message (str): New system message
        """
        
    def reset(self):
        """Reset agent state and clear conversation history."""
        
    @property
    def system_message(self):
        """Current system message."""
        
    @property
    def chat_messages(self):
        """Dictionary of chat message histories with other agents."""
        
    def last_message(self, agent=None):
        """
        Get last message from conversation.
        
        Args:
            agent (ConversableAgent): Specific agent to get message from
            
        Returns:
            dict: Last message content
        """

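Messages exchanged between agents, and the entries returned by `last_message` and stored in `chat_messages`, are plain dictionaries in the OpenAI chat format. A minimal sketch of that shape (the `make_message` helper is illustrative, not part of the library; the exact keys FLAML attaches may vary, but `role` and `content` are standard):

```python
# Sketch of the message dicts agents exchange (OpenAI chat format).
# make_message is a hypothetical helper for illustration only.

def make_message(content, role="user", name=None):
    """Build a chat-format message dict like those stored in chat_messages."""
    msg = {"role": role, "content": content}
    if name is not None:
        msg["name"] = name  # group chats tag messages with the speaker's name
    return msg

history = [
    make_message("Write a factorial function.", role="user"),
    make_message("def factorial(n): ...", role="assistant", name="assistant"),
]
last = history[-1]  # analogous to agent.last_message()
```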
Specialized Agent Classes

class AssistantAgent(ConversableAgent):
    """
    AI assistant agent with default system message for helpful assistance.
    Inherits all ConversableAgent capabilities with assistant-specific defaults.
    """
    
    def __init__(self, name, llm_config, **kwargs):
        """
        Initialize assistant agent.
        
        Args:
            name (str): Agent name
            llm_config (dict): Language model configuration
            **kwargs: Additional ConversableAgent parameters
        """

class UserProxyAgent(ConversableAgent):
    """
    User proxy agent that acts on behalf of users with human input capabilities.
    Can execute code and interact with humans when configured.
    """
    
    def __init__(self, name, is_termination_msg=None, max_consecutive_auto_reply=None,
                 human_input_mode="ALWAYS", code_execution_config=None, **kwargs):
        """
        Initialize user proxy agent.
        
        Args:
            name (str): Agent name
            is_termination_msg (callable): Function to detect termination messages
            max_consecutive_auto_reply (int): Max consecutive auto-replies
            human_input_mode (str): Human interaction mode
            code_execution_config (dict): Code execution settings
            **kwargs: Additional ConversableAgent parameters
        """
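The `is_termination_msg` callable receives a message dict and returns `True` to end the conversation. A minimal sketch of such a predicate (the TERMINATE convention and the guard for `None` content are illustrative choices, not requirements of the API):

```python
def is_termination_msg(msg):
    """Return True when a message signals the conversation should end.

    msg is a chat-format dict; content may be None (e.g. for function
    calls), so guard with a default before testing.
    """
    content = msg.get("content") or ""
    return content.rstrip().endswith("TERMINATE")

# Quick checks of the predicate on plain dicts:
assert is_termination_msg({"content": "All done. TERMINATE"})
assert not is_termination_msg({"content": "Still working..."})
assert not is_termination_msg({"content": None})
```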

class Agent:
    """Abstract base agent class defining the agent interface."""

Group Conversations

Classes for managing multi-agent group conversations and coordination.

class GroupChat:
    def __init__(self, agents, messages=[], max_round=10, admin_name="Admin"):
        """
        Initialize group chat.
        
        Args:
            agents (list): List of participating agents
            messages (list): Initial conversation messages
            max_round (int): Maximum number of conversation rounds
            admin_name (str): Name of admin managing the chat
        """
        
    @property
    def agent_names(self):
        """List of agent names in the group."""
        
    def reset(self):
        """Reset group chat state."""
        
    def append(self, message, speaker):
        """
        Append message to group conversation.
        
        Args:
            message (dict): Message to append
            speaker (ConversableAgent): Agent who sent the message
        """
        
    def select_speaker(self, last_speaker, selector):
        """
        Select next speaker in group conversation.
        
        Args:
            last_speaker (ConversableAgent): Previous speaker
            selector (ConversableAgent): Agent selecting next speaker
            
        Returns:
            ConversableAgent: Next speaker
        """

class GroupChatManager(ConversableAgent):
    """
    Group chat manager that coordinates multi-agent conversations.
    Extends ConversableAgent with group management capabilities.
    """
    
    def __init__(self, groupchat, name="chat_manager", **kwargs):
        """
        Initialize group chat manager.
        
        Args:
            groupchat (GroupChat): Group chat instance to manage
            name (str): Manager agent name
            **kwargs: Additional ConversableAgent parameters
        """

OpenAI Integration

Classes and utilities for integrating with OpenAI language models.

class Completion:
    """OpenAI completion interface for text generation."""
    
    @staticmethod
    def create(engine=None, model=None, prompt=None, **kwargs):
        """
        Create completion request.
        
        Args:
            engine (str): OpenAI engine name
            model (str): Model name
            prompt (str): Input prompt
            **kwargs: Additional completion parameters
            
        Returns:
            dict: Completion response
        """

class ChatCompletion:
    """OpenAI chat completion interface for conversational AI."""
    
    @staticmethod
    def create(model=None, messages=None, **kwargs):
        """
        Create chat completion request.
        
        Args:
            model (str): Model name
            messages (list): Conversation messages
            **kwargs: Additional chat parameters
            
        Returns:
            dict: Chat completion response
        """

Configuration Utilities

Helper functions for managing language model configurations.

def get_config_list(api_keys=None, api_bases=None, api_versions=None,
                   api_types=None, models=None):
    """
    Generate configuration list for multiple language models.
    
    Args:
        api_keys (list): API keys for different services
        api_bases (list): API base URLs
        api_versions (list): API versions
        api_types (list): API types ('openai', 'azure', etc.)
        models (list): Model names
        
    Returns:
        list: Configuration list for language models
    """

def config_list_gpt4_gpt35(api_key=None, api_base=None, api_version=None):
    """
    Create configuration list for GPT-4 and GPT-3.5 models.
    
    Args:
        api_key (str): OpenAI API key
        api_base (str): API base URL
        api_version (str): API version
        
    Returns:
        list: Configuration list for GPT models
    """

def config_list_openai_aoai(openai_api_key=None, aoai_api_key=None, 
                           aoai_api_base=None, aoai_api_version=None):
    """
    Create configuration list for OpenAI and Azure OpenAI.
    
    Args:
        openai_api_key (str): OpenAI API key
        aoai_api_key (str): Azure OpenAI API key
        aoai_api_base (str): Azure OpenAI base URL
        aoai_api_version (str): Azure OpenAI API version
        
    Returns:
        list: Combined configuration list
    """

def config_list_from_models(model_list, api_key=None, api_base=None, api_version=None):
    """
    Create configuration list from model names.
    
    Args:
        model_list (list): List of model names
        api_key (str): API key
        api_base (str): API base URL
        api_version (str): API version
        
    Returns:
        list: Configuration list for specified models
    """

def config_list_from_json(json_file=None, file_location=None, filter_dict=None):
    """
    Load configuration list from JSON file.
    
    Args:
        json_file (str): Path to JSON configuration file
        file_location (str): Directory containing JSON file
        filter_dict (dict): Filter criteria for configurations
        
    Returns:
        list: Configuration list from JSON
    """

Constants

DEFAULT_MODEL = "gpt-4"  # Default language model
FAST_MODEL = "gpt-3.5-turbo"  # Fast language model for efficient operations

Usage Examples

Basic Two-Agent Conversation

from flaml.autogen import AssistantAgent, UserProxyAgent

# Configure language model
llm_config = {
    "model": "gpt-4",
    "api_key": "your-openai-api-key",
    "temperature": 0.7
}

# Create agents
assistant = AssistantAgent(
    name="assistant",
    llm_config=llm_config
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",  # No human input required
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False
    }
)

# Start conversation
user_proxy.initiate_chat(
    assistant,
    message="Write a Python function to calculate the factorial of a number."
)

Group Conversation with Multiple Agents

from flaml.autogen import AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager

# Create multiple agents with different roles
coder = AssistantAgent(
    name="coder",
    system_message="You are an expert Python programmer.",
    llm_config=llm_config
)

reviewer = AssistantAgent(
    name="reviewer", 
    system_message="You are a code reviewer who checks for bugs and improvements.",
    llm_config=llm_config
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    system_message="You execute code and provide feedback.",
    code_execution_config={"work_dir": "coding"}
)

# Create group chat
groupchat = GroupChat(
    agents=[user_proxy, coder, reviewer],
    messages=[],
    max_round=12
)

manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

# Start group conversation
user_proxy.initiate_chat(
    manager,
    message="Create a Python class for a binary search tree with insert and search methods."
)

Custom Agent with Specialized Behavior

from flaml.autogen import ConversableAgent

class DataAnalystAgent(ConversableAgent):
    """Custom agent specialized for data analysis tasks."""
    
    def __init__(self, name, **kwargs):
        system_message = """You are a data analyst expert. You help with:
        1. Data cleaning and preprocessing
        2. Statistical analysis and visualization
        3. Machine learning model recommendations
        Always provide code examples and explain your reasoning."""
        
        super().__init__(
            name=name,
            system_message=system_message,
            **kwargs
        )

# Use custom agent
analyst = DataAnalystAgent(
    name="data_analyst",
    llm_config=llm_config
)

user_proxy.initiate_chat(
    analyst,
    message="I have a dataset with missing values. How should I handle them?"
)

Code Execution Configuration

# Advanced code execution setup
code_execution_config = {
    "work_dir": "agent_workspace",
    "use_docker": True,
    "timeout": 120,
    "last_n_messages": 3
}

user_proxy = UserProxyAgent(
    name="executor",
    human_input_mode="TERMINATE",  # Ask human before terminating
    code_execution_config=code_execution_config,
    is_termination_msg=lambda msg: "TERMINATE" in msg.get("content", "")
)

Multi-Model Configuration

from flaml.autogen.oai import config_list_gpt4_gpt35

# Use multiple models for redundancy
config_list = config_list_gpt4_gpt35(api_key="your-api-key")

# Agent with fallback models
assistant = AssistantAgent(
    name="multi_model_assistant",
    llm_config={
        "config_list": config_list,
        "temperature": 0.5,
        "timeout": 60
    }
)

Custom Reply Functions

def custom_reply(recipient, messages, sender, config):
    """Custom reply function with specific logic."""
    last_msg = messages[-1]["content"]
    
    if "math" in last_msg.lower():
        return True, "I'll solve this mathematical problem step by step."
    return False, None

# Register custom reply. A class trigger fires for any sender of that type;
# the content check lives inside custom_reply itself.
assistant.register_reply(
    trigger=ConversableAgent,
    reply_func=custom_reply
)
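A reply function returns a `(final, reply)` tuple: `final=True` means the reply is used and later registered functions are skipped, while `final=False` defers to the next function in the list. That contract can be exercised without an LLM; the chain runner below is a simplified illustration (the real reply functions also receive `recipient`, `sender`, and `config`):

```python
def run_reply_chain(reply_funcs, messages):
    """Call reply functions in order until one returns final=True."""
    for func in reply_funcs:
        final, reply = func(messages)
        if final:
            return reply
    return None  # no function produced a final reply

def math_reply(messages):
    """Handle math questions; defer everything else."""
    if "math" in messages[-1]["content"].lower():
        return True, "I'll solve this step by step."
    return False, None

def fallback_reply(messages):
    """Always produce a final reply."""
    return True, "Let me think about that."

chain = [math_reply, fallback_reply]
assert run_reply_chain(chain, [{"content": "Help with a math proof"}]) \
    == "I'll solve this step by step."
assert run_reply_chain(chain, [{"content": "Tell me a joke"}]) \
    == "Let me think about that."
```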

OpenAI Configuration Utilities

Utility functions in the flaml.autogen.oai submodule for configuring language models and managing API configurations.

def get_config_list(config_list=None, api_type=None, **kwargs):
    """
    Get configuration list for language models.
    
    Args:
        config_list (list): Existing configuration list
        api_type (str): API type ('openai' or 'azure')
        **kwargs: Additional configuration parameters
        
    Returns:
        list: Configuration list for language models
    """

def config_list_from_models(model_list, **kwargs):
    """
    Create configuration list from model names.
    
    Args:
        model_list (list): List of model names
        **kwargs: Configuration parameters (api_key, base_url, etc.)
        
    Returns:
        list: Configuration list
    """

def config_list_from_json(json_file, filter_dict=None):
    """
    Load configuration list from JSON file.
    
    Args:
        json_file (str): Path to JSON configuration file
        filter_dict (dict): Filter criteria for configurations
        
    Returns:
        list: Filtered configuration list
    """

def config_list_gpt4_gpt35(api_key=None, base_url=None):
    """
    Create configuration for GPT-4 and GPT-3.5 models.
    
    Args:
        api_key (str): OpenAI API key
        base_url (str): Base URL for API
        
    Returns:
        list: Configuration list for GPT models
    """

def config_list_openai_aoai(**kwargs):
    """
    Create configuration for OpenAI and Azure OpenAI services.
    
    Args:
        **kwargs: Configuration parameters
        
    Returns:
        list: Configuration list for both services
    """

Enhanced Completion APIs

Wrapper classes for OpenAI completion APIs with additional functionality.

class Completion:
    """Enhanced completion API with tuning and optimization features."""
    
    @staticmethod
    def create(prompt, config_list=None, **kwargs):
        """Create completion with enhanced features."""
        
    @staticmethod
    def tune(data, metric, mode="max", **kwargs):
        """Tune completion parameters for optimal performance."""

class ChatCompletion:
    """Enhanced chat completion API with conversation management."""
    
    @staticmethod
    def create(messages, config_list=None, **kwargs):
        """Create chat completion with enhanced features."""

Integration Features

  • Multi-Model Support: Use different language models (GPT-4, GPT-3.5, Azure OpenAI)
  • Code Execution: Safe code execution in sandboxed environments
  • Human-in-the-Loop: Configurable human interaction points
  • Conversation Management: Persistent chat histories and state management
  • Custom Behaviors: Extensible agent classes with custom reply functions
  • Group Coordination: Sophisticated multi-agent conversation management
  • Error Handling: Robust error handling and recovery mechanisms
  • Configuration Management: Flexible API configuration and model selection utilities

Install with Tessl CLI

npx tessl i tessl/pypi-flaml
