# Message Handling

LangGraph provides specialized utilities for message-based workflows, including message merging, state management, and the deprecated MessageGraph class. These utilities simplify building conversational agents and chat applications.

## Imports

```python
from langgraph.graph import MessagesState, add_messages, MessageGraph
```

For advanced message operations:

```python
from langgraph.graph.message import push_message, REMOVE_ALL_MESSAGES
```

Import `RemoveMessage` from `langchain_core`:

```python
from langchain_core.messages import RemoveMessage
```

## Types

### Messages

Type alias for message inputs to the `add_messages` function.

```python { .api }
Messages = list[MessageLikeRepresentation] | MessageLikeRepresentation
```

Can be either a single message or a list of messages. `MessageLikeRepresentation` includes:

- Message objects from LangChain Core (`BaseMessage`, `HumanMessage`, `AIMessage`, `SystemMessage`, etc.)
- Dictionary representations of messages
- Tuples of `(role, content)` for simple messages
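For illustration, a minimal sketch of how these three forms can be normalized to a common shape. `normalize` and `StubMessage` below are hypothetical helpers for this sketch only; the real coercion happens inside `add_messages` and produces `BaseMessage` objects:

```python
# Hypothetical helper: normalize the three MessageLikeRepresentation
# forms into (role, content) pairs. The real coercion inside add_messages
# yields BaseMessage objects instead.
def normalize(msg):
    if isinstance(msg, tuple):            # (role, content) tuple
        return msg
    if isinstance(msg, dict):             # dict representation
        return (msg["role"], msg["content"])
    return (msg.type, msg.content)        # message object (BaseMessage-like)

class StubMessage:
    """Stand-in for a LangChain message object."""
    type = "human"
    content = "Hi"

inputs = [("user", "hey"), {"role": "assistant", "content": "hello"}, StubMessage()]
print([normalize(m) for m in inputs])
# [('user', 'hey'), ('assistant', 'hello'), ('human', 'Hi')]
```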

### REMOVE_ALL_MESSAGES

Special constant used to clear all messages in the message list.

```python { .api }
REMOVE_ALL_MESSAGES = "__remove_all__"
```

**Usage:**

```python
# Clear all messages
def clear_messages(state):
    return {"messages": [REMOVE_ALL_MESSAGES]}
```

## Capabilities

### Message State

#### MessagesState TypedDict

Pre-defined state schema for message-based workflows with automatic message merging.

```python { .api }
class MessagesState(TypedDict):
    """
    TypedDict with a messages field using the add_messages reducer.

    Provides a standard state schema for conversational workflows.
    The messages field automatically merges new messages using the
    add_messages reducer function.

    Attributes:
        messages: Annotated[list[AnyMessage], add_messages]
            List of messages with automatic merging by ID

    Usage:
        from langgraph.graph import StateGraph, MessagesState, START, END

        # Use MessagesState as the state schema
        graph = StateGraph(MessagesState)

        def chatbot(state: MessagesState) -> dict:
            # Access messages
            messages = state["messages"]

            # Return new messages (will be merged)
            return {"messages": [AIMessage(content="Hello!")]}

        graph.add_node("chat", chatbot)
        graph.add_edge(START, "chat")
        graph.add_edge("chat", END)

        app = graph.compile()

        # Messages are automatically merged
        result = app.invoke({
            "messages": [HumanMessage(content="Hi")]
        })
        # result["messages"] has both HumanMessage and AIMessage
    """

    messages: Annotated[list[AnyMessage], add_messages]
```

### Message Merging

#### RemoveMessage Class

Special marker class for deleting messages by ID in message lists.

```python { .api }
class RemoveMessage:
    """
    Marker to delete a message from the message list.

    Used with the add_messages reducer to remove messages by ID. This class
    is imported from langchain_core.messages and used in LangGraph message
    workflows.

    Attributes:
        id: str - ID of the message to remove

    Usage:
        from langchain_core.messages import RemoveMessage

        # Delete a specific message
        state = {"messages": [
            HumanMessage(content="Keep", id="msg-1"),
            HumanMessage(content="Delete", id="msg-2")
        ]}

        # Return RemoveMessage to delete
        return {"messages": [RemoveMessage(id="msg-2")]}
        # Result: only msg-1 remains

    Note:
        RemoveMessage is from langchain_core, not langgraph, but is
        commonly used in LangGraph message workflows with add_messages.
    """

    def __init__(self, *, id: str):
        """
        Initialize a RemoveMessage marker.

        Parameters:
            id: str - ID of the message to remove
        """

    id: str
```

#### Removing All Messages

You can remove all messages at once by using `RemoveMessage` with the special ID `"__remove_all__"` (the value of the `REMOVE_ALL_MESSAGES` constant):

```python
from langgraph.graph import add_messages
from langchain_core.messages import RemoveMessage, HumanMessage, AIMessage

messages = [HumanMessage(content="msg1"), AIMessage(content="msg2")]
result = add_messages(messages, [RemoveMessage(id="__remove_all__")])
# result == []
```

#### add_messages Function

Reducer function that merges two lists of messages, updating existing messages by ID.

```python { .api }
def add_messages(
    left: Messages,
    right: Messages,
    *,
    format: Literal["langchain-openai"] | None = None
) -> Messages:
    """
    Merge two lists of messages, updating existing messages by ID.

    Used as a reducer function for message state fields. When multiple
    nodes write to a messages field, this function merges the updates
    intelligently:
    - New messages (no ID) are appended
    - Messages with IDs that exist in left are updated
    - Messages with new IDs are appended
    - Messages can be deleted by passing RemoveMessage

    Parameters:
        left: Messages - Current message list (existing state)
        right: Messages - New messages to merge (updates)
        format: Optional[Literal["langchain-openai"]]
            Format for message serialization. Use "langchain-openai"
            for compatibility with the OpenAI format.

    Returns:
        Messages - Merged message list

    Message Types:
        - Single message: BaseMessage instance
        - List of messages: list[BaseMessage]
        - Tuple: (role, content) for a simple message
        - Dict: message as a dictionary
        - RemoveMessage: special marker to delete a message

    Usage:
        from typing import TypedDict, Annotated
        from langgraph.graph import add_messages

        class State(TypedDict):
            messages: Annotated[list, add_messages]

        # Messages with the same ID are updated
        left = [HumanMessage(content="Hi", id="1")]
        right = [HumanMessage(content="Hello", id="1")]
        result = add_messages(left, right)
        # result == [HumanMessage(content="Hello", id="1")]

        # New messages are appended
        left = [HumanMessage(content="Hi", id="1")]
        right = [AIMessage(content="Hello", id="2")]
        result = add_messages(left, right)
        # result has both messages

        # Delete by ID
        from langchain_core.messages import RemoveMessage
        left = [HumanMessage(content="Hi", id="1")]
        right = [RemoveMessage(id="1")]
        result = add_messages(left, right)
        # result == []
    """
```
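The merge rule can be sketched in plain Python. This is an illustrative model only: `merge_by_id` is a hypothetical function operating on dicts with an `"id"` key, whereas the real `add_messages` works on `BaseMessage` objects and also honors `RemoveMessage` markers:

```python
# Hypothetical model of the add_messages merge rule, using dicts with
# an "id" key in place of BaseMessage objects.
def merge_by_id(left, right):
    merged = list(left)
    index = {m["id"]: i for i, m in enumerate(merged)}
    for msg in right:
        if msg["id"] in index:
            merged[index[msg["id"]]] = msg       # existing ID: update in place
        else:
            index[msg["id"]] = len(merged)
            merged.append(msg)                   # new ID: append
    return merged

left = [{"id": "1", "content": "Hi"}]
right = [{"id": "1", "content": "Hello"}, {"id": "2", "content": "New"}]
print(merge_by_id(left, right))
# [{'id': '1', 'content': 'Hello'}, {'id': '2', 'content': 'New'}]
```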

#### push_message Function

Manually write a message to the messages stream mode. This is an advanced feature for fine-grained control over message streaming.

```python { .api }
def push_message(
    message: MessageLikeRepresentation | BaseMessageChunk,
    *,
    state_key: str | None = "messages"
) -> AnyMessage:
    """
    Write a message manually to the `messages` / `messages-tuple` stream mode.

    This is an advanced feature that allows direct control over message
    streaming within node or task execution. The message will be written to
    the channel specified in state_key and emitted to the stream.

    Parameters:
        message: MessageLikeRepresentation | BaseMessageChunk
            The message to push. Must have an ID.
        state_key: str | None
            The state key/channel to write the message to. Defaults to
            "messages". If None, the message is only emitted to the stream
            without being written to state.

    Returns:
        AnyMessage - The processed message with ID

    Raises:
        ValueError: If the message does not have an ID

    Usage:
        from langgraph.graph.message import push_message
        from langchain_core.messages import AIMessage

        def streaming_node(state):
            # Create a message with an ID
            message = AIMessage(content="Streaming...", id="msg-123")

            # Push to the stream immediately
            push_message(message)

            return {"messages": [message]}

    Note:
        - The message must have an ID (message.id cannot be None)
        - Requires proper runtime context (must be called within a node/task)
        - Automatically writes to the specified state_key channel
        - Emits to the messages stream mode for real-time streaming
    """
```
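The stream-versus-state behavior can be modeled in plain Python. `push_message_sketch`, `state`, and `stream` below are hypothetical stand-ins for the runtime machinery, not LangGraph APIs:

```python
# Hypothetical model: a pushed message is always emitted to the stream,
# and is also written to the state channel unless state_key is None.
def push_message_sketch(message, state, stream, state_key="messages"):
    if message.get("id") is None:
        raise ValueError("Message must have an ID")
    stream.append(message)                               # always emitted
    if state_key is not None:
        state.setdefault(state_key, []).append(message)  # written to state
    return message

state, stream = {}, []
push_message_sketch({"id": "msg-123", "content": "Streaming..."}, state, stream)
push_message_sketch({"id": "msg-124", "content": "Stream only"}, state, stream, state_key=None)
print(len(stream), len(state["messages"]))
# 2 1
```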

### MessageGraph (Deprecated)

#### MessageGraph Class

Deprecated graph class specialized for message-based workflows.

```python { .api }
class MessageGraph(StateGraph):
    """
    DEPRECATED in v1.0.0 - Use StateGraph with MessagesState instead.

    A StateGraph where nodes receive and return messages.

    Previously provided a specialized interface for message-based workflows.
    Now deprecated in favor of using StateGraph with MessagesState or a
    custom state schema with a messages field.

    Migration:
        # Old way (deprecated)
        from langgraph.graph import MessageGraph

        graph = MessageGraph()
        graph.add_node("chat", chatbot_node)

        # New way (recommended)
        from langgraph.graph import StateGraph, MessagesState

        graph = StateGraph(MessagesState)
        graph.add_node("chat", chatbot_node)

    All methods are identical to StateGraph:
    - add_node(node, action, ...)
    - add_edge(start_key, end_key)
    - add_conditional_edges(source, path, path_map)
    - compile(checkpointer, ...)
    """

    def __init__(self):
        """
        Initialize a MessageGraph.

        DEPRECATED: Use StateGraph(MessagesState) instead.
        """
```

## Usage Examples

### Basic Chatbot with MessagesState

```python
from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_core.messages import HumanMessage, AIMessage

def chatbot(state: MessagesState) -> dict:
    """Simple chatbot that echoes messages."""
    # Get the last message
    last_message = state["messages"][-1]

    # Generate a response
    response = AIMessage(
        content=f"Echo: {last_message.content}"
    )

    # Return the new message (will be merged with existing)
    return {"messages": [response]}

# Create a graph with MessagesState
graph = StateGraph(MessagesState)
graph.add_node("chat", chatbot)
graph.add_edge(START, "chat")
graph.add_edge("chat", END)

app = graph.compile()

# Run a conversation
result = app.invoke({
    "messages": [HumanMessage(content="Hello!")]
})

print(result["messages"])
# [HumanMessage(content="Hello!"), AIMessage(content="Echo: Hello!")]
```

### Multi-Turn Conversation with History

```python
from langgraph.graph import StateGraph, MessagesState, START, END
from langgraph.checkpoint.memory import MemorySaver
from langchain_core.messages import HumanMessage, AIMessage

def conversational_agent(state: MessagesState) -> dict:
    """Agent that maintains conversation history."""
    messages = state["messages"]

    # Access the full conversation history
    history = "\n".join([
        f"{m.__class__.__name__}: {m.content}"
        for m in messages
    ])

    # Generate a context-aware response
    last_msg = messages[-1].content
    response = AIMessage(
        content=f"I remember we discussed: {len(messages)} messages. You said: {last_msg}"
    )

    return {"messages": [response]}

# Create a graph with checkpointing for history
graph = StateGraph(MessagesState)
graph.add_node("agent", conversational_agent)
graph.add_edge(START, "agent")
graph.add_edge("agent", END)

checkpointer = MemorySaver()
app = graph.compile(checkpointer=checkpointer)

# Conversation with a thread
config = {"configurable": {"thread_id": "conversation-1"}}

# First message
result1 = app.invoke({
    "messages": [HumanMessage(content="Hello!")]
}, config)

# Second message - includes history
result2 = app.invoke({
    "messages": [HumanMessage(content="How are you?")]
}, config)

print(result2["messages"])
# All messages from both turns preserved
```

### Using add_messages with Custom State

```python
from typing import TypedDict, Annotated
from langgraph.graph import StateGraph, add_messages, START, END
from langchain_core.messages import HumanMessage, AIMessage

class CustomState(TypedDict):
    messages: Annotated[list, add_messages]
    user_id: str
    context: dict

def personalized_chat(state: CustomState) -> dict:
    """Chatbot with additional state."""
    user_id = state["user_id"]
    context = state["context"]
    last_message = state["messages"][-1]

    # Generate a personalized response
    response = AIMessage(
        content=f"Hello {user_id}! You said: {last_message.content}"
    )

    return {"messages": [response]}

graph = StateGraph(CustomState)
graph.add_node("chat", personalized_chat)
graph.add_edge(START, "chat")
graph.add_edge("chat", END)

app = graph.compile()

result = app.invoke({
    "messages": [HumanMessage(content="Hi")],
    "user_id": "alice",
    "context": {"preference": "friendly"}
})
```

### Updating Messages by ID

```python
from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_core.messages import HumanMessage

def editor_node(state: MessagesState) -> dict:
    """Edit existing messages by ID."""
    messages = state["messages"]

    # Find the message to edit
    message_to_edit = messages[0]

    # Create an updated version with the same ID
    updated_message = HumanMessage(
        content=message_to_edit.content + " (edited)",
        id=message_to_edit.id
    )

    # Return the updated message - it will replace the original
    return {"messages": [updated_message]}

graph = StateGraph(MessagesState)
graph.add_node("edit", editor_node)
graph.add_edge(START, "edit")
graph.add_edge("edit", END)

app = graph.compile()

# Create a message with an ID
msg = HumanMessage(content="Original", id="msg-1")
result = app.invoke({"messages": [msg]})

print(result["messages"][0].content)
# "Original (edited)"
```

### Deleting Messages

```python
from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_core.messages import HumanMessage, RemoveMessage

def cleanup_node(state: MessagesState) -> dict:
    """Remove old messages."""
    messages = state["messages"]

    # Keep only the last 5 messages
    if len(messages) > 5:
        messages_to_remove = [
            RemoveMessage(id=msg.id)
            for msg in messages[:-5]
        ]
        return {"messages": messages_to_remove}

    return {}

graph = StateGraph(MessagesState)
graph.add_node("cleanup", cleanup_node)
graph.add_edge(START, "cleanup")
graph.add_edge("cleanup", END)

app = graph.compile()

# Create a message history
messages = [
    HumanMessage(content=f"Message {i}", id=f"msg-{i}")
    for i in range(10)
]

result = app.invoke({"messages": messages})
print(len(result["messages"]))  # 5 (last 5 kept)
```

### Parallel Message Generation

```python
from langgraph.graph import StateGraph, MessagesState, START, END
from langgraph.types import Send
from langchain_core.messages import HumanMessage, AIMessage

def fan_out(state: MessagesState) -> list[Send]:
    """Create parallel message generation tasks."""
    last_msg = state["messages"][-1].content

    # Create multiple parallel processing tasks
    return [
        Send("responder", {"prompt": f"Respond to: {last_msg}", "style": "formal"}),
        Send("responder", {"prompt": f"Respond to: {last_msg}", "style": "casual"}),
        Send("responder", {"prompt": f"Respond to: {last_msg}", "style": "technical"})
    ]

def responder(state: dict) -> dict:
    """Generate a response in a specific style."""
    style = state["style"]
    prompt = state["prompt"]

    response = AIMessage(
        content=f"[{style}] Response to: {prompt}"
    )

    return {"messages": [response]}

def aggregate(state: MessagesState) -> dict:
    """Final aggregation."""
    return state

graph = StateGraph(MessagesState)
graph.add_node("responder", responder)
graph.add_node("aggregate", aggregate)

graph.add_conditional_edges(START, fan_out)
graph.add_edge("responder", "aggregate")
graph.add_edge("aggregate", END)

app = graph.compile()

result = app.invoke({
    "messages": [HumanMessage(content="Hello!")]
})

# result["messages"] has the original message plus 3 parallel responses
print(len(result["messages"]))  # 4
```

### Message Formatting

```python
from langgraph.graph import add_messages
from langchain_core.messages import HumanMessage, AIMessage

# Using the format parameter for OpenAI compatibility
messages_left = [HumanMessage(content="Hi", id="1")]
messages_right = [AIMessage(content="Hello", id="2")]

# Merge with the langchain-openai format
result = add_messages(
    messages_left,
    messages_right,
    format="langchain-openai"
)

# Messages formatted for the OpenAI API
```

### Migrating from MessageGraph

```python
# OLD (deprecated)
# from langgraph.graph import MessageGraph
#
# graph = MessageGraph()
# graph.add_node("chat", chatbot)
# graph.set_entry_point("chat")
# graph.set_finish_point("chat")
# app = graph.compile()

# NEW (recommended)
from langgraph.graph import StateGraph, MessagesState, START, END

# chatbot is a node function, e.g. as defined in the earlier examples
graph = StateGraph(MessagesState)
graph.add_node("chat", chatbot)
graph.add_edge(START, "chat")
graph.add_edge("chat", END)
app = graph.compile()
```

## Message Types

LangGraph works with LangChain message types:

```python
from langchain_core.messages import (
    HumanMessage,     # User input
    AIMessage,        # AI/assistant response
    SystemMessage,    # System instructions
    FunctionMessage,  # Function call result (deprecated)
    ToolMessage,      # Tool execution result
    ChatMessage       # Generic message with a role
)
```

## Notes

- `MessagesState` is the recommended state schema for conversational workflows
- `add_messages` intelligently merges messages by ID, enabling updates and deletions
- Messages without IDs are always appended
- Messages with matching IDs replace the existing messages
- `RemoveMessage` can delete messages by ID
- `MessageGraph` is deprecated - use `StateGraph(MessagesState)` instead
- The `format` parameter in `add_messages` enables compatibility with different message formats
- Message history is automatically preserved when using a checkpointer
- All LangChain message types are supported
- Messages can include metadata, `additional_kwargs`, and other custom fields