tessl/maven-org-springframework-ai--spring-ai-model

Core model interfaces and abstractions for the Spring AI framework, providing a portable API for chat, embeddings, images, audio, and tool calling across multiple AI providers


Quick Start Guide

This guide will help you get started with Spring AI Model in minutes.

Prerequisites

  • Java 17 or later
  • Maven or Gradle
  • Spring Boot 3.2 or later

Installation

Maven

Add to your pom.xml:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-model</artifactId>
    <version>1.1.2</version>
</dependency>

<!-- Add a provider implementation (e.g., OpenAI) -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>1.1.2</version>
</dependency>

Gradle

Add to your build.gradle:

implementation 'org.springframework.ai:spring-ai-model:1.1.2'
implementation 'org.springframework.ai:spring-ai-openai-spring-boot-starter:1.1.2'

Configuration

application.properties

spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.chat.options.model=gpt-4
spring.ai.openai.chat.options.temperature=0.7

application.yml

spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY}
      chat:
        options:
          model: gpt-4
          temperature: 0.7

Your First Chat Application

Step 1: Create a Service

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.stereotype.Service;

@Service
public class AiService {
    private final ChatModel chatModel;
    
    public AiService(ChatModel chatModel) {
        this.chatModel = chatModel;
    }
    
    public String ask(String question) {
        return chatModel.call(question);
    }
}

Step 2: Create a Controller

import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/api/ai")
public class AiController {
    private final AiService aiService;
    
    public AiController(AiService aiService) {
        this.aiService = aiService;
    }
    
    @PostMapping("/chat")
    public String chat(@RequestBody String question) {
        return aiService.ask(question);
    }
}

Step 3: Run Your Application

mvn spring-boot:run

Test with:

curl -X POST http://localhost:8080/api/ai/chat \
  -H "Content-Type: text/plain" \
  -d "What is the capital of France?"

Common Use Cases

1. Simple Chat

@Autowired
private ChatModel chatModel;

public String simpleChat(String userInput) {
    return chatModel.call(userInput);
}

2. Chat with System Instructions

import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.chat.messages.*;

public String chatWithContext(String userInput) {
    List<Message> messages = List.of(
        new SystemMessage("You are a helpful coding assistant."),
        new UserMessage(userInput)
    );
    
    Prompt prompt = new Prompt(messages);
    ChatResponse response = chatModel.call(prompt);
    return response.getResult().getOutput().getText();
}

3. Streaming Responses

import reactor.core.publisher.Flux;

public Flux<String> streamChat(String userInput) {
    Prompt prompt = new Prompt(userInput);
    
    return chatModel.stream(prompt)
        .map(response -> response.getResult().getOutput().getText());
}

4. Chat with Options

import org.springframework.ai.chat.prompt.ChatOptions;

public String chatWithOptions(String userInput) {
    ChatOptions options = ChatOptions.builder()
        .temperature(0.8)
        .maxTokens(1000)
        .build();
    
    Prompt prompt = new Prompt(userInput, options);
    return chatModel.call(prompt).getResult().getOutput().getText();
}

5. Generate Embeddings

@Autowired
private EmbeddingModel embeddingModel;

public float[] generateEmbedding(String text) {
    return embeddingModel.embed(text);
}

public List<float[]> batchEmbed(List<String> texts) {
    return embeddingModel.embed(texts);
}

6. Generate Images

@Autowired
private ImageModel imageModel;

public String generateImage(String prompt) {
    ImagePrompt imagePrompt = new ImagePrompt(prompt);
    ImageResponse response = imageModel.call(imagePrompt);
    return response.getResult().getOutput().getUrl();
}

7. Content Moderation

@Autowired
private ModerationModel moderationModel;

public boolean isContentSafe(String content) {
    ModerationPrompt prompt = new ModerationPrompt(content);
    ModerationResponse response = moderationModel.call(prompt);
    // A moderation result holds a list of per-category results
    return response.getResult().getOutput().getResults().stream()
        .noneMatch(ModerationResult::isFlagged);
}

8. Function Calling

@Component
public class WeatherTools {
    @Tool(description = "Get current weather for a city")
    public String getWeather(@ToolParam(description = "City name") String city) {
        // Your implementation
        return "{\"temp\": 72, \"condition\": \"sunny\"}";
    }
}

// Use with chat
ToolCallbackProvider provider = MethodToolCallbackProvider.builder()
    .toolObjects(weatherTools)
    .build();

ChatOptions options = ToolCallingChatOptions.builder()
    .toolCallbacks(provider.getToolCallbacks())
    .build();

9. Structured Output

record Person(String name, int age, String occupation) {}

BeanOutputConverter<Person> converter = new BeanOutputConverter<>(Person.class);

String prompt = "Extract person info from: John is a 30-year-old engineer\n\n" + 
                converter.getFormat();

String response = chatModel.call(prompt);
Person person = converter.convert(response);

10. Conversation Memory

@Autowired
private ChatMemory chatMemory;

public String chatWithMemory(String userId, String userInput) {
    // Copy the history (the returned list may be unmodifiable) and append the new message
    List<Message> messages = new ArrayList<>(chatMemory.get(userId));
    UserMessage userMessage = new UserMessage(userInput);
    messages.add(userMessage);
    
    // Call model
    ChatResponse response = chatModel.call(new Prompt(messages));
    
    // Save the exchange to memory
    chatMemory.add(userId, userMessage);
    chatMemory.add(userId, response.getResult().getOutput());
    
    return response.getResult().getOutput().getText();
}

Configuration Examples

Bean Configuration

@Configuration
public class AiConfig {
    
    @Bean
    public ChatOptions defaultChatOptions() {
        return ChatOptions.builder()
            .temperature(0.7)
            .maxTokens(2000)
            .build();
    }
    
    @Bean
    public ChatMemory chatMemory() {
        return MessageWindowChatMemory.builder()
            .chatMemoryRepository(new InMemoryChatMemoryRepository())
            .maxMessages(50)  // Keep last 50 messages
            .build();
    }
}
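To make the windowing behavior concrete, here is a plain-Java sketch of the idea behind a message-window memory: keep at most N messages and evict the oldest first. `SimpleWindowMemory` is a hypothetical illustration using only the JDK, not a Spring AI class.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Hypothetical sketch of a message-window memory: bounded history,
// oldest messages evicted first. Not a Spring AI class.
public class SimpleWindowMemory {
    private final Deque<String> messages = new ArrayDeque<>();
    private final int maxMessages;

    public SimpleWindowMemory(int maxMessages) {
        this.maxMessages = maxMessages;
    }

    public void add(String message) {
        messages.addLast(message);
        while (messages.size() > maxMessages) {
            messages.removeFirst(); // evict the oldest message
        }
    }

    public List<String> get() {
        return List.copyOf(messages); // oldest-first snapshot
    }

    public static void main(String[] args) {
        SimpleWindowMemory memory = new SimpleWindowMemory(2);
        memory.add("system: be helpful");
        memory.add("user: hi");
        memory.add("assistant: hello");
        System.out.println(memory.get()); // the system message has been evicted
    }
}
```

A window keeps prompt size bounded at the cost of dropping older context; the real `MessageWindowChatMemory` applies the same idea to `Message` objects stored in a repository.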

Multi-Provider Configuration

When multiple provider starters are on the classpath, each auto-configures its own concrete ChatModel bean (for example OpenAiChatModel and AnthropicChatModel), which you can inject by type:

@Service
public class MultiModelService {
    private final ChatModel openAiModel;
    private final ChatModel anthropicModel;

    public MultiModelService(OpenAiChatModel openAiModel, AnthropicChatModel anthropicModel) {
        this.openAiModel = openAiModel;
        this.anthropicModel = anthropicModel;
    }
}

Error Handling

public String robustChat(String input) {
    try {
        return chatModel.call(input);
    } catch (Exception e) {
        log.error("Chat failed: {}", e.getMessage());
        return "Sorry, I encountered an error. Please try again.";
    }
}
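Returning a fallback message is one option; another is to retry transient failures (such as rate-limit errors) before giving up. Below is a minimal retry-with-exponential-backoff sketch in plain Java; `RetryingCaller` and `callWithRetry` are hypothetical names, not Spring AI APIs.

```java
import java.util.concurrent.Callable;

// Hypothetical helper: retries a call with exponential backoff.
// Useful for transient failures such as rate-limit errors.
public class RetryingCaller {
    public static <T> T callWithRetry(Callable<T> call, int maxAttempts, long initialDelayMs)
            throws Exception {
        long delay = initialDelayMs;
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(delay);
                    delay *= 2; // double the wait between attempts
                }
            }
        }
        throw last; // all attempts failed
    }

    public static void main(String[] args) throws Exception {
        // Simulated call that fails twice, then succeeds on the third attempt
        int[] calls = {0};
        String result = callWithRetry(() -> {
            if (++calls[0] < 3) throw new RuntimeException("transient error");
            return "ok";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

Wrapping a real model call would look like `callWithRetry(() -> chatModel.call(input), 3, 500)`.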

Monitoring Usage

ChatResponse response = chatModel.call(prompt);

Usage usage = response.getMetadata().getUsage();
System.out.println("Tokens used: " + usage.getTotalTokens());
System.out.println("Prompt tokens: " + usage.getPromptTokens());
System.out.println("Completion tokens: " + usage.getCompletionTokens());

// Check rate limits
RateLimit rateLimit = response.getMetadata().getRateLimit();
if (rateLimit != null) {
    System.out.println("Requests remaining: " + rateLimit.getRequestsRemaining());
}

Next Steps

  • Explore Real-World Scenarios
  • Learn Integration Patterns
  • Review Best Practices
  • Browse Reference Documentation

Common Issues & Solutions

Issue: ChatModel bean not found

  • Solution: Ensure provider starter dependency is added (e.g., spring-ai-openai-spring-boot-starter)

Issue: API key not configured

  • Solution: Set environment variable or configure in application.properties

Issue: Rate limit errors

  • Solution: Monitor RateLimit metadata and implement backoff strategies

Issue: Token limit exceeded

  • Solution: Monitor Usage metadata and truncate inputs if needed
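As a rough illustration of input truncation, the sketch below trims input to an estimated token budget using the common ~4-characters-per-token heuristic. `InputTruncator` is hypothetical, not a Spring AI API; for exact counts you would use a real tokenizer.

```java
// Hypothetical guard: trims input to a rough token budget before sending
// it to the model. Assumes ~4 characters per token, a common heuristic
// rather than an exact tokenizer.
public class InputTruncator {
    static final int CHARS_PER_TOKEN = 4; // rough heuristic

    public static String truncateToTokenBudget(String input, int maxTokens) {
        int maxChars = maxTokens * CHARS_PER_TOKEN;
        if (input.length() <= maxChars) {
            return input;
        }
        return input.substring(0, maxChars);
    }

    public static void main(String[] args) {
        String longInput = "x".repeat(10_000);
        // 100 tokens * 4 chars/token = 400 characters kept
        System.out.println(truncateToTokenBudget(longInput, 100).length());
    }
}
```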

Additional Resources

  • Spring AI Documentation: https://docs.spring.io/spring-ai/reference/
  • Provider-specific documentation in respective starter modules
  • Complete API Reference

Install with Tessl CLI

npx tessl i tessl/maven-org-springframework-ai--spring-ai-model
