Quarkus extension for integrating Anthropic Claude LLM models into Quarkus applications via LangChain4j
The quarkus-langchain4j-anthropic extension provides CDI beans for ChatModel and StreamingChatModel that can be injected into your application components. The extension automatically creates these beans based on configuration at build time.
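To use the extension, add it to your build. The Maven coordinates below are an assumption inferred from the package name and version at the end of this document:

```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-anthropic</artifactId>
    <version>1.7.0</version>
</dependency>
```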
```java
package dev.langchain4j.model.chat;

/**
 * Synchronous chat model interface from LangChain4j.
 */
interface ChatModel {

    /**
     * Send a single message and get a complete response.
     */
    String chat(String userMessage);

    /**
     * Send multiple messages and get a structured response.
     */
    dev.langchain4j.model.chat.response.ChatResponse chat(
            dev.langchain4j.data.message.ChatMessage... messages
    );

    /**
     * Send a chat request and get a structured response.
     */
    dev.langchain4j.model.chat.response.ChatResponse chat(
            dev.langchain4j.model.chat.request.ChatRequest request
    );
}
```

```java
package dev.langchain4j.model.chat;

/**
 * Streaming chat model interface from LangChain4j.
 */
interface StreamingChatModel {

    /**
     * Stream a single message response.
     */
    void chat(
            String userMessage,
            dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler
    );

    /**
     * Stream a response to multiple messages.
     */
    void chat(
            java.util.List<dev.langchain4j.data.message.ChatMessage> messages,
            dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler
    );

    /**
     * Stream a response to a chat request.
     */
    void chat(
            dev.langchain4j.model.chat.request.ChatRequest request,
            dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler
    );
}
```

The simplest way to use the extension is to inject the default model:
```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;

@ApplicationScoped
public class MyService {

    @Inject
    ChatModel chatModel;

    @Inject
    StreamingChatModel streamingChatModel;

    public String askQuestion(String question) {
        return chatModel.chat(question);
    }
}
```

The default model uses the configuration properties prefixed with `quarkus.langchain4j.anthropic` (without any name qualifier).
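As an illustration, a minimal default-model configuration might look like this. The API key and model name are placeholders; the property names mirror the named-model configuration shown later in this document:

```properties
# Default (unnamed) model configuration
quarkus.langchain4j.anthropic.api-key=sk-ant-...
quarkus.langchain4j.anthropic.chat-model.model-name=claude-3-haiku-20240307
quarkus.langchain4j.anthropic.chat-model.temperature=0.5
```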
For applications requiring multiple models with different configurations, use named model injection with the @ModelName qualifier:
```java
package io.quarkiverse.langchain4j;

import static java.lang.annotation.ElementType.*;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import jakarta.inject.Qualifier;

@Qualifier
@Retention(RUNTIME)
@Target({METHOD, FIELD, PARAMETER, TYPE})
public @interface ModelName {

    /**
     * The name of the model configuration.
     */
    String value();
}
```

Configuration:
```properties
# Fast model for simple queries
quarkus.langchain4j.anthropic.fast.api-key=sk-ant-...
quarkus.langchain4j.anthropic.fast.chat-model.model-name=claude-3-haiku-20240307
quarkus.langchain4j.anthropic.fast.chat-model.temperature=0.3

# Smart model for complex reasoning
quarkus.langchain4j.anthropic.smart.api-key=sk-ant-...
quarkus.langchain4j.anthropic.smart.chat-model.model-name=claude-opus-4-20250514
quarkus.langchain4j.anthropic.smart.chat-model.temperature=0.7
quarkus.langchain4j.anthropic.smart.chat-model.thinking.type=enabled
quarkus.langchain4j.anthropic.smart.chat-model.thinking.budget-tokens=10000
```

Injection:
```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import io.quarkiverse.langchain4j.ModelName;
import dev.langchain4j.model.chat.ChatModel;

@ApplicationScoped
public class MultiModelService {

    @Inject
    @ModelName("fast")
    ChatModel fastModel;

    @Inject
    @ModelName("smart")
    ChatModel smartModel;

    public String simpleQuery(String question) {
        // Use the fast, cheap model for simple queries
        return fastModel.chat(question);
    }

    public String complexReasoning(String problem) {
        // Use the advanced model with thinking for complex problems
        return smartModel.chat(problem);
    }
}
```

The injected ChatModel and StreamingChatModel beans are application-scoped singletons. They are thread-safe and can be injected into any CDI bean.
The extension integrates with LangChain4j's declarative AI services pattern:
```java
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface Assistant {

    /**
     * Simple method with an implicit user message.
     */
    String chat(String userMessage);

    /**
     * Method with system and user messages.
     */
    @SystemMessage("You are a helpful coding assistant.")
    @UserMessage("Explain this code: {code}")
    String explainCode(String code);

    /**
     * Method with a dynamic system message.
     */
    String ask(@SystemMessage String systemMessage,
               @UserMessage String userMessage);
}
```

Usage:
```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class MyService {

    @Inject
    Assistant assistant;

    public void example() {
        String response = assistant.chat("Hello!");
        String explanation = assistant.explainCode(
                "def factorial(n): return 1 if n == 0 else n * factorial(n-1)");
    }
}
```

The @RegisterAiService annotation automatically creates a CDI bean that uses the default ChatModel.
Note: Named model selection for declarative AI services should be done via dependency injection and manual wiring if needed. The @ModelName qualifier is applied at the injection point, not on the interface itself.
For dynamic model selection based on runtime conditions:
```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Any;
import jakarta.enterprise.inject.Instance;
import jakarta.inject.Inject;

import dev.langchain4j.model.chat.ChatModel;
import io.quarkiverse.langchain4j.ModelName;

@ApplicationScoped
public class DynamicModelService {

    @Inject
    @Any
    Instance<ChatModel> allModels;

    public String chat(String message, String modelName) {
        ChatModel model = allModels
                .select(ModelName.Literal.of(modelName))
                .get();
        return model.chat(message);
    }
}
```

This allows selecting models dynamically based on configuration, user preferences, or request characteristics.
To disable automatic bean creation at build time:

```properties
quarkus.langchain4j.anthropic.chat-model.enabled=false
```

This is useful when you want to create custom beans programmatically or when the extension isn't needed.
The extension manages the lifecycle of the Anthropic client automatically; no manual resource management is required.

The extension also integrates with Quarkus observability features: metrics are exposed when quarkus-micrometer is present, and tracing is enabled when quarkus-opentelemetry is present.

In Quarkus Dev Mode, configuration changes are automatically detected: edits to application.properties trigger bean recreation.

In tests, you can easily mock or replace injected models:
```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.when;

import jakarta.inject.Inject;

import org.junit.jupiter.api.Test;

import io.quarkus.test.junit.QuarkusTest;
import io.quarkus.test.junit.mockito.InjectMock;
import dev.langchain4j.model.chat.ChatModel;

@QuarkusTest
public class MyServiceTest {

    @InjectMock
    ChatModel chatModel;

    @Inject
    MyService myService;

    @Test
    public void testChat() {
        when(chatModel.chat(anyString())).thenReturn("Mocked response");
        assertEquals("Mocked response", myService.askQuestion("Test"));
    }
}
```

The extension fully supports GraalVM native image compilation.
For advanced use cases, you can create custom producer methods:
```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

import io.quarkiverse.langchain4j.ModelName;
import dev.langchain4j.model.anthropic.AnthropicChatModel;
import dev.langchain4j.model.chat.ChatModel;

@ApplicationScoped
public class CustomModelProducer {

    @Produces
    @ApplicationScoped
    @ModelName("custom")
    public ChatModel customModel() {
        return AnthropicChatModel.builder()
                .apiKey("sk-ant-...")
                .baseUrl("https://api.anthropic.com/v1/")
                .version("2023-06-01")
                .modelName("claude-opus-4-20250514")
                .maxTokens(2048)
                .temperature(0.7)
                .timeout(java.time.Duration.ofSeconds(30))
                .logRequests(true)
                .build();
    }
}
```

Note: This approach provides maximum flexibility, but it bypasses the extension's automatic configuration and uses the standard LangChain4j AnthropicChatModel builder, which does not accept a custom client instance.
The extension creates the following CDI bean types (for the default model and, given the example configuration above, the named fast and smart models):

```java
// Default model beans
@ApplicationScoped
dev.langchain4j.model.chat.ChatModel

@ApplicationScoped
dev.langchain4j.model.chat.StreamingChatModel

// Named model beans
@ApplicationScoped
@ModelName("fast")
dev.langchain4j.model.chat.ChatModel

@ApplicationScoped
@ModelName("fast")
dev.langchain4j.model.chat.StreamingChatModel

@ApplicationScoped
@ModelName("smart")
dev.langchain4j.model.chat.ChatModel

@ApplicationScoped
@ModelName("smart")
dev.langchain4j.model.chat.StreamingChatModel
```

When enable-integration=false is set in configuration, the extension creates disabled model beans that throw exceptions:

```properties
quarkus.langchain4j.anthropic.enable-integration=false
```

This is useful, for example, when running tests or builds in environments where the Anthropic API is not available. The disabled beans implement the same interfaces but throw exceptions when their methods are called, making it clear that the integration is disabled.
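The disabled-bean pattern can be sketched as follows. This is a self-contained illustration with a stand-in `Chat` interface, not the extension's actual classes:

```java
// Minimal stand-in for the ChatModel interface (illustration only;
// the real interface comes from LangChain4j).
interface Chat {
    String chat(String userMessage);
}

// Sketch of a "disabled" bean: it satisfies injection points so the
// application still starts, but fails loudly if it is actually used.
class DisabledChatModel implements Chat {
    @Override
    public String chat(String userMessage) {
        throw new UnsupportedOperationException(
                "The Anthropic integration is disabled (enable-integration=false)");
    }
}
```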
The extension validates configuration at startup. If validation fails, application startup fails with a clear error message indicating the configuration problem.
A comprehensive example combining default models, named models, streaming, and multi-turn conversations:

```java
import java.util.List;
import java.util.stream.Collectors;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import io.quarkiverse.langchain4j.ModelName;

@ApplicationScoped
public class ComprehensiveService {

    // Default models
    @Inject
    ChatModel defaultChatModel;

    @Inject
    StreamingChatModel defaultStreamingModel;

    // Named models
    @Inject
    @ModelName("fast")
    ChatModel fastModel;

    @Inject
    @ModelName("smart")
    ChatModel smartModel;

    @Inject
    @ModelName("smart")
    StreamingChatModel smartStreamingModel;

    // Simple chat
    public String simpleChat(String message) {
        return defaultChatModel.chat(message);
    }

    // Streaming chat
    public void streamingChat(String message) {
        defaultStreamingModel.chat(message, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                System.out.println("\nComplete!");
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }

    // Multi-turn conversation
    public String conversation(List<String> messages) {
        List<ChatMessage> chatMessages = messages.stream()
                .map(UserMessage::from)
                .collect(Collectors.toList());
        ChatResponse response = smartModel.chat(
                chatMessages.toArray(new ChatMessage[0])
        );
        return response.aiMessage().text();
    }

    // Model selection based on complexity
    public String adaptiveChat(String message, boolean complex) {
        ChatModel model = complex ? smartModel : fastModel;
        return model.chat(message);
    }
}
```

Install with Tessl CLI:

```shell
npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-anthropic@1.7.0
```