# Multi-Agent Conversations

Framework for building conversational AI applications with multiple agents that can collaborate, execute code, interact with humans, and solve complex problems through structured dialogue. The autogen module enables creating sophisticated AI agent systems with customizable behaviors and interaction patterns.

## Capabilities

### Core Agent Classes

#### ConversableAgent

Base conversational agent class that provides the foundation for all agent interactions.

```python { .api }
class ConversableAgent:
    def __init__(self, name, system_message=None, llm_config=None,
                 max_consecutive_auto_reply=None, human_input_mode="ALWAYS",
                 code_execution_config=False, **kwargs):
        """
        Initialize conversable agent.

        Args:
            name (str): Agent name identifier
            system_message (str): System message defining agent behavior
            llm_config (dict): Language model configuration
            max_consecutive_auto_reply (int): Maximum consecutive auto-replies
            human_input_mode (str): Human interaction mode - 'ALWAYS', 'TERMINATE', 'NEVER'
            code_execution_config (dict or bool): Code execution configuration
            **kwargs: Additional agent parameters
        """

    def send(self, message, recipient, request_reply=True, silent=False):
        """
        Send message to another agent.

        Args:
            message (str): Message content to send
            recipient (ConversableAgent): Recipient agent
            request_reply (bool): Whether to request a reply
            silent (bool): Whether to suppress output
        """

    def receive(self, message, sender, request_reply=None, silent=False):
        """
        Receive message from another agent.

        Args:
            message (str): Received message content
            sender (ConversableAgent): Sender agent
            request_reply (bool): Whether reply is requested
            silent (bool): Whether to suppress output
        """

    def initiate_chat(self, recipient, message=None, clear_history=True, silent=False):
        """
        Initiate conversation with another agent.

        Args:
            recipient (ConversableAgent): Agent to chat with
            message (str): Initial message
            clear_history (bool): Whether to clear chat history
            silent (bool): Whether to suppress output
        """

    def register_reply(self, trigger, reply_func=None, position=0,
                       config=None, reset_config=None):
        """
        Register reply function for specific triggers.

        Args:
            trigger (callable or class): Trigger condition for reply
            reply_func (callable): Function to generate reply
            position (int): Position in reply function list
            config (dict): Configuration for reply function
            reset_config (callable): Function to reset configuration
        """

    def update_system_message(self, system_message):
        """
        Update agent's system message.

        Args:
            system_message (str): New system message
        """

    def reset(self):
        """Reset agent state and clear conversation history."""

    @property
    def system_message(self):
        """Current system message."""

    @property
    def chat_messages(self):
        """Dictionary of chat message histories with other agents."""

    def last_message(self, agent=None):
        """
        Get last message from conversation.

        Args:
            agent (ConversableAgent): Specific agent to get message from

        Returns:
            dict: Last message content
        """
```
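
The sketch below shows how these core methods fit together: `initiate_chat` drives the exchange, and `chat_messages` / `last_message` expose the recorded history. It assumes an `llm_config` dictionary like the one shown in the Usage Examples below and valid API credentials.

```python
from flaml.autogen import AssistantAgent, UserProxyAgent

# Minimal sketch; assumes `llm_config` is defined as in the Usage Examples below.
assistant = AssistantAgent(name="assistant", llm_config=llm_config)
user = UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,  # keep the exchange bounded for illustration
    code_execution_config=False,
)

# initiate_chat sends the first message and runs the reply loop.
user.initiate_chat(assistant, message="Explain binary search in one sentence.")

# chat_messages keys message histories by peer agent;
# last_message(agent) returns the most recent message exchanged with that agent.
history = user.chat_messages[assistant]
print(f"{len(history)} messages exchanged")
print(user.last_message(assistant)["content"])
```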

#### Specialized Agent Classes

```python { .api }
class AssistantAgent(ConversableAgent):
    """
    AI assistant agent with default system message for helpful assistance.
    Inherits all ConversableAgent capabilities with assistant-specific defaults.
    """

    def __init__(self, name, llm_config, **kwargs):
        """
        Initialize assistant agent.

        Args:
            name (str): Agent name
            llm_config (dict): Language model configuration
            **kwargs: Additional ConversableAgent parameters
        """

class UserProxyAgent(ConversableAgent):
    """
    User proxy agent that acts on behalf of users with human input capabilities.
    Can execute code and interact with humans when configured.
    """

    def __init__(self, name, is_termination_msg=None, max_consecutive_auto_reply=None,
                 human_input_mode="ALWAYS", code_execution_config=None, **kwargs):
        """
        Initialize user proxy agent.

        Args:
            name (str): Agent name
            is_termination_msg (callable): Function to detect termination messages
            max_consecutive_auto_reply (int): Max consecutive auto-replies
            human_input_mode (str): Human interaction mode
            code_execution_config (dict): Code execution settings
            **kwargs: Additional ConversableAgent parameters
        """

class Agent:
    """Abstract base agent class defining the agent interface."""
```

### Group Conversations

Classes for managing multi-agent group conversations and coordination.

```python { .api }
class GroupChat:
    def __init__(self, agents, messages=[], max_round=10, admin_name="Admin"):
        """
        Initialize group chat.

        Args:
            agents (list): List of participating agents
            messages (list): Initial conversation messages
            max_round (int): Maximum number of conversation rounds
            admin_name (str): Name of admin managing the chat
        """

    @property
    def agent_names(self):
        """List of agent names in the group."""

    def reset(self):
        """Reset group chat state."""

    def append(self, message, speaker):
        """
        Append message to group conversation.

        Args:
            message (dict): Message to append
            speaker (ConversableAgent): Agent who sent the message
        """

    def select_speaker(self, last_speaker, selector):
        """
        Select next speaker in group conversation.

        Args:
            last_speaker (ConversableAgent): Previous speaker
            selector (ConversableAgent): Agent selecting next speaker

        Returns:
            ConversableAgent: Next speaker
        """

class GroupChatManager(ConversableAgent):
    """
    Group chat manager that coordinates multi-agent conversations.
    Extends ConversableAgent with group management capabilities.
    """

    def __init__(self, groupchat, name="chat_manager", **kwargs):
        """
        Initialize group chat manager.

        Args:
            groupchat (GroupChat): Group chat instance to manage
            name (str): Manager agent name
            **kwargs: Additional ConversableAgent parameters
        """
```

### OpenAI Integration

Classes and utilities for integrating with OpenAI language models.

```python { .api }
class Completion:
    """OpenAI completion interface for text generation."""

    @staticmethod
    def create(engine=None, model=None, prompt=None, **kwargs):
        """
        Create completion request.

        Args:
            engine (str): OpenAI engine name
            model (str): Model name
            prompt (str): Input prompt
            **kwargs: Additional completion parameters

        Returns:
            dict: Completion response
        """

class ChatCompletion:
    """OpenAI chat completion interface for conversational AI."""

    @staticmethod
    def create(model=None, messages=None, **kwargs):
        """
        Create chat completion request.

        Args:
            model (str): Model name
            messages (list): Conversation messages
            **kwargs: Additional chat parameters

        Returns:
            dict: Chat completion response
        """
```
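
A minimal sketch of calling the chat completion interface directly, outside the agent loop. The import path mirrors the `flaml.autogen.oai` module used elsewhere in this document, and the response is assumed to follow the standard OpenAI chat completion format.

```python
from flaml.autogen.oai import ChatCompletion

# Direct call sketch; assumes a valid OpenAI API key is configured (e.g. OPENAI_API_KEY).
response = ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Name one benefit of multi-agent collaboration."},
    ],
)

# Assumes the standard OpenAI response shape.
print(response["choices"][0]["message"]["content"])
```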

### Configuration Utilities

Helper functions for managing language model configurations.

```python { .api }
def get_config_list(api_keys=None, api_bases=None, api_versions=None,
                    api_types=None, models=None):
    """
    Generate configuration list for multiple language models.

    Args:
        api_keys (list): API keys for different services
        api_bases (list): API base URLs
        api_versions (list): API versions
        api_types (list): API types ('openai', 'azure', etc.)
        models (list): Model names

    Returns:
        list: Configuration list for language models
    """

def config_list_gpt4_gpt35(api_key=None, api_base=None, api_version=None):
    """
    Create configuration list for GPT-4 and GPT-3.5 models.

    Args:
        api_key (str): OpenAI API key
        api_base (str): API base URL
        api_version (str): API version

    Returns:
        list: Configuration list for GPT models
    """

def config_list_openai_aoai(openai_api_key=None, aoai_api_key=None,
                            aoai_api_base=None, aoai_api_version=None):
    """
    Create configuration list for OpenAI and Azure OpenAI.

    Args:
        openai_api_key (str): OpenAI API key
        aoai_api_key (str): Azure OpenAI API key
        aoai_api_base (str): Azure OpenAI base URL
        aoai_api_version (str): Azure OpenAI API version

    Returns:
        list: Combined configuration list
    """

def config_list_from_models(model_list, api_key=None, api_base=None, api_version=None):
    """
    Create configuration list from model names.

    Args:
        model_list (list): List of model names
        api_key (str): API key
        api_base (str): API base URL
        api_version (str): API version

    Returns:
        list: Configuration list for specified models
    """

def config_list_from_json(json_file=None, file_location=None, filter_dict=None):
    """
    Load configuration list from JSON file.

    Args:
        json_file (str): Path to JSON configuration file
        file_location (str): Directory containing JSON file
        filter_dict (dict): Filter criteria for configurations

    Returns:
        list: Configuration list from JSON
    """
```
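
As a sketch of how these helpers combine, the example below loads configurations from a JSON file and filters them before building an `llm_config`; the file name `OAI_CONFIG_LIST` and its contents are illustrative assumptions.

```python
from flaml.autogen.oai import config_list_from_json

# Illustrative configuration file "OAI_CONFIG_LIST" (assumed), e.g.:
# [
#     {"model": "gpt-4", "api_key": "sk-..."},
#     {"model": "gpt-3.5-turbo", "api_key": "sk-..."}
# ]
config_list = config_list_from_json(
    "OAI_CONFIG_LIST",
    filter_dict={"model": ["gpt-4"]},  # keep only GPT-4 entries
)

# The filtered list can then feed an agent's llm_config.
llm_config = {"config_list": config_list, "temperature": 0.7}
```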

### Constants

```python { .api }
DEFAULT_MODEL = "gpt-4"  # Default language model
FAST_MODEL = "gpt-3.5-turbo"  # Fast language model for efficient operations
```
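
A brief, purely illustrative sketch of using these constants to pick a model per task; the import path is an assumption based on the module layout above.

```python
from flaml.autogen import DEFAULT_MODEL, FAST_MODEL  # assumed import path

def pick_model(simple_task: bool) -> str:
    """Route lightweight tasks to the fast model, everything else to the default."""
    return FAST_MODEL if simple_task else DEFAULT_MODEL

llm_config = {"model": pick_model(simple_task=True), "temperature": 0.2}
```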

### Usage Examples

#### Basic Two-Agent Conversation

```python
from flaml.autogen import AssistantAgent, UserProxyAgent

# Configure language model
llm_config = {
    "model": "gpt-4",
    "api_key": "your-openai-api-key",
    "temperature": 0.7
}

# Create agents
assistant = AssistantAgent(
    name="assistant",
    llm_config=llm_config
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",  # No human input required
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False
    }
)

# Start conversation
user_proxy.initiate_chat(
    assistant,
    message="Write a Python function to calculate the factorial of a number."
)
```

#### Group Conversation with Multiple Agents

```python
from flaml.autogen import AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager

# Create multiple agents with different roles
coder = AssistantAgent(
    name="coder",
    system_message="You are an expert Python programmer.",
    llm_config=llm_config
)

reviewer = AssistantAgent(
    name="reviewer",
    system_message="You are a code reviewer who checks for bugs and improvements.",
    llm_config=llm_config
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    system_message="You execute code and provide feedback.",
    code_execution_config={"work_dir": "coding"}
)

# Create group chat
groupchat = GroupChat(
    agents=[user_proxy, coder, reviewer],
    messages=[],
    max_round=12
)

manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

# Start group conversation
user_proxy.initiate_chat(
    manager,
    message="Create a Python class for a binary search tree with insert and search methods."
)
```

#### Custom Agent with Specialized Behavior

```python
from flaml.autogen import ConversableAgent

class DataAnalystAgent(ConversableAgent):
    """Custom agent specialized for data analysis tasks."""

    def __init__(self, name, **kwargs):
        system_message = """You are a data analyst expert. You help with:
        1. Data cleaning and preprocessing
        2. Statistical analysis and visualization
        3. Machine learning model recommendations
        Always provide code examples and explain your reasoning."""

        super().__init__(
            name=name,
            system_message=system_message,
            **kwargs
        )

# Use custom agent
analyst = DataAnalystAgent(
    name="data_analyst",
    llm_config=llm_config
)

user_proxy.initiate_chat(
    analyst,
    message="I have a dataset with missing values. How should I handle them?"
)
```

#### Code Execution Configuration

```python
# Advanced code execution setup
code_execution_config = {
    "work_dir": "agent_workspace",
    "use_docker": True,
    "timeout": 120,
    "last_n_messages": 3
}

user_proxy = UserProxyAgent(
    name="executor",
    human_input_mode="TERMINATE",  # Ask for human input only when termination is triggered
    code_execution_config=code_execution_config,
    is_termination_msg=lambda msg: "TERMINATE" in msg.get("content", "")
)
```

#### Multi-Model Configuration

```python
from flaml.autogen.oai import config_list_gpt4_gpt35

# Use multiple models for redundancy
config_list = config_list_gpt4_gpt35(api_key="your-api-key")

# Agent with fallback models
assistant = AssistantAgent(
    name="multi_model_assistant",
    llm_config={
        "config_list": config_list,
        "temperature": 0.5,
        "timeout": 60
    }
)
```

#### Custom Reply Functions

```python
def custom_reply(recipient, messages, sender, config):
    """Custom reply function with specific logic."""
    last_msg = messages[-1]["content"]

    if "math" in last_msg.lower():
        return True, "I'll solve this mathematical problem step by step."
    return False, None

# Register custom reply; a callable trigger receives only the sender agent,
# so the message content check lives inside the reply function itself.
assistant.register_reply(
    trigger=lambda sender: True,
    reply_func=custom_reply
)
```

### OpenAI Configuration Utilities

Utility functions for configuring language models and managing API configurations.

```python { .api }
def get_config_list(config_list=None, api_type=None, **kwargs):
    """
    Get configuration list for language models.

    Args:
        config_list (list): Existing configuration list
        api_type (str): API type ('openai' or 'azure')
        **kwargs: Additional configuration parameters

    Returns:
        list: Configuration list for language models
    """

def config_list_from_models(model_list, **kwargs):
    """
    Create configuration list from model names.

    Args:
        model_list (list): List of model names
        **kwargs: Configuration parameters (api_key, base_url, etc.)

    Returns:
        list: Configuration list
    """

def config_list_from_json(json_file, filter_dict=None):
    """
    Load configuration list from JSON file.

    Args:
        json_file (str): Path to JSON configuration file
        filter_dict (dict): Filter criteria for configurations

    Returns:
        list: Filtered configuration list
    """

def config_list_gpt4_gpt35(api_key=None, base_url=None):
    """
    Create configuration for GPT-4 and GPT-3.5 models.

    Args:
        api_key (str): OpenAI API key
        base_url (str): Base URL for API

    Returns:
        list: Configuration list for GPT models
    """

def config_list_openai_aoai(**kwargs):
    """
    Create configuration for OpenAI and Azure OpenAI services.

    Args:
        **kwargs: Configuration parameters

    Returns:
        list: Configuration list for both services
    """
```

### Enhanced Completion APIs

Wrapper classes for OpenAI completion APIs with additional functionality.

```python { .api }
class Completion:
    """Enhanced completion API with tuning and optimization features."""

    @staticmethod
    def create(prompt, config_list=None, **kwargs):
        """Create completion with enhanced features."""

    @staticmethod
    def tune(data, metric, mode="max", **kwargs):
        """Tune completion parameters for optimal performance."""

class ChatCompletion:
    """Enhanced chat completion API with conversation management."""

    @staticmethod
    def create(messages, config_list=None, **kwargs):
        """Create chat completion with enhanced features."""
```
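
The tuning entry point is easiest to see with a small sketch. The data records, evaluation function, and the extra keyword arguments below (`eval_func`, `prompt`, `num_samples`) are passed through `**kwargs` and are assumptions about the underlying tuner, not signatures confirmed by this document.

```python
from flaml.autogen.oai import Completion

# Hypothetical tuning data: each record pairs prompt fields with an expected answer.
tune_data = [
    {"question": "What is 2 + 2?", "expected": "4"},
    {"question": "What is 3 * 3?", "expected": "9"},
]

def exact_match(responses, expected, **record):
    """Hypothetical metric: 1.0 if any sampled response contains the expected answer."""
    return {"success": float(any(expected in r for r in responses))}

# Sketch only: the return value is assumed to be the best configuration plus a tuning analysis.
best_config, analysis = Completion.tune(
    data=tune_data,
    metric="success",
    mode="max",
    eval_func=exact_match,
    prompt="{question}",  # assumed prompt template over data fields
    num_samples=8,
)
print(best_config)
```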

## Integration Features

- **Multi-Model Support**: Use different language models (GPT-4, GPT-3.5, Azure OpenAI)
- **Code Execution**: Safe code execution in sandboxed environments
- **Human-in-the-Loop**: Configurable human interaction points
- **Conversation Management**: Persistent chat histories and state management
- **Custom Behaviors**: Extensible agent classes with custom reply functions
- **Group Coordination**: Sophisticated multi-agent conversation management
- **Error Handling**: Robust error handling and recovery mechanisms
- **Configuration Management**: Flexible API configuration and model selection utilities