OpenAI integration for Spring AI, providing Java APIs for chat completion, embeddings, image generation, audio transcription and synthesis (text-to-speech), and content moderation within Spring Boot applications.
Add the dependency to your Maven `pom.xml`:

```xml
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai</artifactId>
    <version>1.1.2</version>
</dependency>
```

Quick start:

```java
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.chat.prompt.Prompt;

// Create API client
var openAiApi = OpenAiApi.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .build();

// Create chat model
var chatModel = OpenAiChatModel.builder()
        .openAiApi(openAiApi)
        .build();

// Generate response
var response = chatModel.call(new Prompt("Explain quantum computing"));
System.out.println(response.getResult().getOutput().getText());
```

High-level implementations of Spring AI interfaces:
| Model | Purpose | Reference |
|---|---|---|
| OpenAiChatModel | Chat completions (GPT-4, GPT-3.5, O-series) | Chat Models |
| OpenAiEmbeddingModel | Vector embeddings for semantic search | Embeddings |
| OpenAiImageModel | Image generation (DALL-E 2/3) | Images |
| OpenAiAudioSpeechModel | Text-to-speech synthesis | Audio |
| OpenAiAudioTranscriptionModel | Audio transcription (Whisper) | Audio |
| OpenAiModerationModel | Content moderation | Moderation |
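As a sketch of how these high-level models are used, an embedding call looks like the following (constructor and method names follow recent Spring AI releases; verify against the version you depend on):

```java
import org.springframework.ai.openai.OpenAiEmbeddingModel;
import org.springframework.ai.openai.api.OpenAiApi;

public class EmbeddingExample {
    public static void main(String[] args) {
        // Reuse the same low-level API client as the chat example
        var openAiApi = OpenAiApi.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .build();

        // OpenAiEmbeddingModel implements Spring AI's EmbeddingModel interface
        var embeddingModel = new OpenAiEmbeddingModel(openAiApi);

        // embed(...) returns the embedding vector as float[]
        float[] vector = embeddingModel.embed("semantic search query");
        System.out.println("Dimensions: " + vector.length);
    }
}
```

The other models in the table follow the same pattern: construct once over a shared `OpenAiApi` client, then call through the corresponding Spring AI interface.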
Low-level REST clients for direct API access:
| Client | Purpose | Reference |
|---|---|---|
| OpenAiApi | Chat & embeddings | API Clients |
| OpenAiImageApi | Image generation | API Clients |
| OpenAiAudioApi | Audio operations | API Clients |
| OpenAiModerationApi | Content moderation | API Clients |
| OpenAiFileApi | File management | API Clients |
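To illustrate the low-level path, here is a sketch of a direct chat completion call through `OpenAiApi`; the convenience constructor and record accessors shown follow recent Spring AI releases and should be checked against your version:

```java
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage.Role;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionRequest;

import java.util.List;

public class LowLevelChatExample {
    public static void main(String[] args) {
        var openAiApi = OpenAiApi.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .build();

        // Build the raw request: messages, model id, temperature
        var request = new ChatCompletionRequest(
                List.of(new ChatCompletionMessage("Hello", Role.USER)),
                "gpt-4o", 0.7);

        // Blocking call returning a ResponseEntity<ChatCompletion>
        var entity = openAiApi.chatCompletionEntity(request);
        System.out.println(entity.getBody().choices().get(0).message().content());
    }
}
```

The low-level clients skip Spring AI's portable abstractions, which is useful when you need an OpenAI-specific request field that the model layer does not expose.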
```
┌─────────────────────────────────────────┐
│ Spring Boot Application                 │
├─────────────────────────────────────────┤
│ Model Layer (OpenAiChatModel, etc.)     │
│   ↓ Spring AI Interfaces                │
├─────────────────────────────────────────┤
│ Options Layer (OpenAiChatOptions, etc.) │
│   ↓ Builder Pattern                     │
├─────────────────────────────────────────┤
│ API Client Layer (OpenAiApi, etc.)      │
│   ↓ REST Communication                  │
├─────────────────────────────────────────┤
│             OpenAI REST API             │
└─────────────────────────────────────────┘
```

Features:
Chat model with default options:

```java
var chatModel = OpenAiChatModel.builder()
        .openAiApi(openAiApi)
        .defaultOptions(OpenAiChatOptions.builder()
                .model(OpenAiApi.ChatModel.GPT_4_O.getValue())
                .temperature(0.7)
                .build())
        .build();
```

Streaming responses:

```java
chatModel.stream(new Prompt("Write a story"))
        .subscribe(response -> {
            System.out.print(response.getResult().getOutput().getText());
        });
```

Tool calling:

```java
// weatherTool and weatherCallback are defined elsewhere
var options = OpenAiChatOptions.builder()
        .tools(List.of(weatherTool))
        .toolCallbacks(Map.of("get_weather", weatherCallback))
        .build();
```

Observability:

```java
var chatModel = OpenAiChatModel.builder()
        .openAiApi(openAiApi)
        .observationRegistry(observationRegistry)
        .build();
```

Spring Boot configuration:

```properties
spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.chat.options.model=gpt-4o
spring.ai.openai.chat.options.temperature=0.7
```

REST controller example:

```java
@RestController
public class ChatController {

    private final OpenAiChatModel chatModel;

    public ChatController(OpenAiChatModel chatModel) {
        this.chatModel = chatModel;
    }

    @PostMapping("/chat")
    public String chat(@RequestBody String message) {
        return chatModel.call(new Prompt(message))
                .getResult().getOutput().getText();
    }
}
```

Available chat models:

```java
public enum OpenAiApi.ChatModel {
    // Reasoning: O4_MINI, O3, O3_MINI, O1, O1_MINI, O1_PRO
    // GPT-5: GPT_5, GPT_5_MINI, GPT_5_NANO
    // GPT-4o: GPT_4_O, GPT_4_O_MINI, GPT_4_O_AUDIO_PREVIEW
    // GPT-4.1: GPT_4_1, GPT_4_1_MINI, GPT_4_1_NANO
    // Legacy: GPT_4_TURBO, GPT_4, GPT_3_5_TURBO
}
```

Embedding models:

```java
public enum OpenAiApi.EmbeddingModel {
    TEXT_EMBEDDING_ADA_002,  // 1536 dimensions (fixed)
    TEXT_EMBEDDING_3_SMALL,  // up to 1536 dimensions
    TEXT_EMBEDDING_3_LARGE   // up to 3072 dimensions
}
```

Image models:

```java
public enum OpenAiImageApi.ImageModel {
    DALL_E_2,  // multiple images, lower cost
    DALL_E_3   // higher quality, single image
}
```

Error handling:

```java
import org.springframework.ai.openai.api.common.OpenAiApiClientErrorException;

try {
    var response = chatModel.call(new Prompt("Hello"));
} catch (OpenAiApiClientErrorException e) {
    switch (e.getStatusCode()) {
        case 401 -> System.err.println("Invalid API key");
        case 429 -> System.err.println("Rate limit exceeded");
        case 400 -> System.err.println("Invalid request");
        default -> System.err.println("API error: " + e.getMessage());
    }
}
```

→ Complete Error Handling Guide
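Transient failures such as HTTP 429 are often better handled through Spring AI's built-in retry support than hand-rolled catch blocks. A sketch of the relevant `spring.ai.retry.*` properties follows (names and defaults per recent Spring AI documentation; verify against your version):

```properties
# Maximum retry attempts before giving up
spring.ai.retry.max-attempts=10
# Exponential backoff: initial wait, multiplier, and ceiling
spring.ai.retry.backoff.initial-interval=2s
spring.ai.retry.backoff.multiplier=5
spring.ai.retry.backoff.max-interval=3m
# Do not retry 4xx client errors (they will not succeed on retry)
spring.ai.retry.on-client-errors=false
```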
Key imports:

```java
// Models
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiEmbeddingModel;
import org.springframework.ai.openai.OpenAiImageModel;

// Options
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.OpenAiEmbeddingOptions;

// API Clients
import org.springframework.ai.openai.api.OpenAiApi;
```

Common chat options:

- Temperature: 0.0 (deterministic) to 2.0 (creative)
- Max Tokens: limit response length
- Top P: alternative to temperature (0.0-1.0)
- Stop: sequences that stop generation

Monitor rate limits via response headers:

- `x-ratelimit-limit-requests`
- `x-ratelimit-remaining-requests`
- `x-ratelimit-limit-tokens`
- `x-ratelimit-remaining-tokens`

Read token usage from response metadata:

```java
var usage = response.getMetadata().getUsage();
System.out.println("Total tokens: " + usage.getTotalTokens());
```

Install with Tessl CLI:

```shell
npx tessl i tessl/maven-org-springframework-ai--spring-ai-openai
```