# LLM Integration

Comprehensive integration with large language models including OpenAI, Anthropic, and other providers in LlamaIndex.TS.

## Import

```typescript
import { OpenAI, Settings } from "llamaindex";
// Or specific LLM providers
import { OpenAI, Anthropic, Groq } from "llamaindex/llms";
```

## Overview

LlamaIndex.TS provides extensive support for various LLM providers with unified interfaces for chat, completion, streaming, and tool calling capabilities. The framework abstracts provider differences while exposing provider-specific features.

## Base LLM Interface

```typescript { .api }
interface LLM {
  chat(messages: ChatMessage[], options?: LLMChatParams): Promise<ChatResponse>;
  complete(prompt: string, options?: LLMCompletionParams): Promise<CompletionResponse>;
  metadata: LLMMetadata;
}

interface LLMMetadata {
  model: string;
  temperature?: number;
  topP?: number;
  topK?: number;
  maxTokens?: number;
  contextWindow: number;
  tokenizer?: (text: string) => string[];
}
```
## Chat Types

```typescript { .api }
interface ChatMessage {
  role: MessageType;
  content: MessageContent;
}

type MessageType = "system" | "user" | "assistant" | "tool";
type MessageContent = string | MessageContentDetail[];

interface MessageContentDetail {
  type: "text" | "image_url" | "audio" | "video" | "file";
  text?: string;
  image_url?: { url: string };
  audio?: { data: string };
}

interface ChatResponse {
  message: ChatMessage;
  raw?: any;
  delta?: string;
}
```
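Because `MessageContent` is a union, a message body can be a plain string or an array of `MessageContentDetail` parts. A sketch of a multimodal user message (the image URL is a placeholder):

```typescript
// A user message combining a text part and an image part,
// following the MessageContentDetail interface above
const message = {
  role: "user" as const,
  content: [
    { type: "text" as const, text: "What is shown in this image?" },
    {
      type: "image_url" as const,
      image_url: { url: "https://example.com/chart.png" },
    },
  ],
};
```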
## OpenAI Integration

```typescript { .api }
class OpenAI implements LLM {
  constructor(options?: {
    model?: string;
    temperature?: number;
    topP?: number;
    maxTokens?: number;
    apiKey?: string;
    additionalChatOptions?: any;
  });

  chat(messages: ChatMessage[], options?: LLMChatParams): Promise<ChatResponse>;
  complete(prompt: string, options?: LLMCompletionParams): Promise<CompletionResponse>;

  metadata: LLMMetadata;
}
```

## Basic Usage

```typescript
import { OpenAI, Settings } from "llamaindex";

// Configure global LLM
Settings.llm = new OpenAI({
  model: "gpt-4",
  temperature: 0.1,
  maxTokens: 2048,
});

// Use with chat
const response = await Settings.llm.chat([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Explain quantum computing" },
]);

console.log(response.message.content);
```
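Long prompts should be checked against `metadata.contextWindow` before sending. A sketch of such a guard; the 4-characters-per-token fallback is a rough heuristic of my choosing for when `metadata.tokenizer` is unset, not a library constant:

```typescript
// Rough guard against overflowing the model's context window.
// Uses the metadata tokenizer when available; otherwise falls back
// to an approximate 4-characters-per-token estimate.
function fitsContext(
  text: string,
  metadata: { contextWindow: number; tokenizer?: (t: string) => string[] },
  reserveForOutput = 512, // leave room for the completion itself
): boolean {
  const tokens = metadata.tokenizer
    ? metadata.tokenizer(text).length
    : Math.ceil(text.length / 4);
  return tokens + reserveForOutput <= metadata.contextWindow;
}
```

Called as `fitsContext(prompt, Settings.llm.metadata)` before issuing a request, this avoids provider-side context-length errors.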
## Provider Examples

### Anthropic

```typescript
import { Anthropic, Settings } from "llamaindex";

const claude = new Anthropic({
  model: "claude-3-opus-20240229",
  temperature: 0.0,
  maxTokens: 4096,
});

Settings.llm = claude;
```

### Tool Calling

```typescript
// LLMs with tool calling support
const toolCallLLM = new OpenAI({
  model: "gpt-4",
  additionalChatOptions: {
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Get weather information",
          parameters: {
            type: "object",
            properties: {
              location: { type: "string" },
            },
          },
        },
      },
    ],
  },
});
```
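When the model decides to call a tool, the provider's payload rides along on `response.raw`. The shape below assumes an OpenAI-style payload (`choices[0].message.tool_calls`, with `arguments` as a JSON string); other providers differ, so treat this as a sketch rather than a LlamaIndex.TS guarantee:

```typescript
// Extract tool calls from an OpenAI-shaped raw chat response.
// The payload shape here is an assumption about the provider, not
// something LlamaIndex.TS normalizes for you.
interface RawToolCall {
  function: { name: string; arguments: string }; // arguments is a JSON string
}

function extractToolCalls(raw: any): { name: string; args: any }[] {
  const calls: RawToolCall[] = raw?.choices?.[0]?.message?.tool_calls ?? [];
  return calls.map((c) => ({
    name: c.function.name,
    args: JSON.parse(c.function.arguments),
  }));
}
```

Each extracted call can then be dispatched to the matching local function (e.g. `get_weather`) and its result sent back as a `"tool"` role message.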
## Best Practices

```typescript
// Environment-specific configuration
if (process.env.NODE_ENV === "production") {
  Settings.llm = new OpenAI({
    model: "gpt-4",
    temperature: 0.0, // Consistent responses
  });
} else {
  Settings.llm = new OpenAI({
    model: "gpt-3.5-turbo", // Cheaper for development
    temperature: 0.1,
  });
}
```
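Provider calls can fail transiently (rate limits, timeouts), so production code usually wraps them in a retry. A minimal sketch; the attempt count and backoff delays are illustrative defaults, not library settings:

```typescript
// Generic retry helper with exponential backoff for transient errors.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Backoff doubles each attempt: 500ms, 1000ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Usage: `const res = await withRetry(() => Settings.llm.chat(messages));` retries the chat call up to three times before surfacing the last error.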