Quarkus extension for integrating Anthropic Claude LLM models into Quarkus applications via LangChain4j
The quarkus-langchain4j-anthropic extension provides seamless integration between Quarkus applications and Anthropic's Claude family of Large Language Models through the LangChain4j framework. This extension enables developers to incorporate Claude models into Quarkus applications with support for declarative AI services, CDI-based model injection, streaming responses, and native compilation.
Package Name: quarkus-langchain4j-anthropic
Group ID: io.quarkiverse.langchain4j
Artifact ID: quarkus-langchain4j-anthropic
Package Type: Maven (Quarkus Extension)
Language: Java
Version: 1.7.4
License: Apache-2.0
Installation:
```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-anthropic</artifactId>
    <version>1.7.4</version>
</dependency>
```

Additional Dependencies:
For declarative AI services (@RegisterAiService), also add:

```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-core</artifactId>
    <version>1.7.4</version>
</dependency>
```

The extension follows the Quarkus CDI injection pattern. Models are injected, not imported:
```java
import jakarta.inject.Inject;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import io.quarkiverse.langchain4j.ModelName;
```

Set your API key in application.properties:

```properties
quarkus.langchain4j.anthropic.api-key=sk-ant-...
quarkus.langchain4j.anthropic.chat-model.model-name=claude-opus-4-20250514
```

Or use environment variable:
```shell
export QUARKUS_LANGCHAIN4J_ANTHROPIC_API_KEY=sk-ant-...
```

Basic usage:

```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import dev.langchain4j.model.chat.ChatModel;

@ApplicationScoped
public class MyService {

    @Inject
    ChatModel chatModel;

    public String askClaude(String question) {
        return chatModel.chat(question);
    }
}
```

Streaming responses:

```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.*;

@ApplicationScoped
public class MyStreamingService {

    @Inject
    StreamingChatModel streamingChatModel;

    public void streamChat(String prompt) {
        streamingChatModel.chat(prompt, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(PartialResponse response,
                                          PartialResponseContext context) {
                System.out.print(response.text());
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                System.out.println("\nComplete!");
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

CDI Injection and Model Management

Inject ChatModel and StreamingChatModel beans using standard CDI patterns. Both default and named model configurations are supported for multi-model applications.

Key APIs:

```java
// Default model injection
@Inject
ChatModel chatModel;

@Inject
StreamingChatModel streamingChatModel;

// Named model injection
@Inject
@ModelName("fast-model")
ChatModel fastModel;

@Inject
@ModelName("smart-model")
ChatModel smartModel;
```
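Each @ModelName qualifier maps to a named configuration block. A minimal sketch, assuming the `quarkus.langchain4j.anthropic.<name>.*` key pattern used for named configurations; the model choices here are illustrative:

```properties
# Configuration for @ModelName("fast-model")
quarkus.langchain4j.anthropic.fast-model.api-key=sk-ant-...
quarkus.langchain4j.anthropic.fast-model.chat-model.model-name=claude-3-haiku-20240307

# Configuration for @ModelName("smart-model")
quarkus.langchain4j.anthropic.smart-model.api-key=sk-ant-...
quarkus.langchain4j.anthropic.smart-model.chat-model.model-name=claude-opus-4-20250514
```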
Comprehensive configuration system based on SmallRye Config with support for connection settings, model parameters, prompt caching, extended thinking, and logging options. Supports both default and named model configurations.
Key Configuration Interfaces:
```java
// Root configuration
interface LangChain4jAnthropicConfig {
    AnthropicConfig defaultConfig();
    Map<String, AnthropicConfig> namedConfig();
}

// Model configuration group
interface AnthropicConfig {
    String baseUrl();
    String apiKey();
    String version();
    Optional<Duration> timeout();
    Boolean enableIntegration();
    ChatModelConfig chatModel();
}

// Chat model parameters
interface ChatModelConfig {
    String modelName();
    Integer maxTokens();
    OptionalDouble temperature();
    OptionalDouble topP();
    OptionalInt topK();
    Boolean cacheSystemMessages();
    Boolean cacheTools();
    ThinkingConfig thinking();
}
```

Direct access to the Anthropic REST API through QuarkusAnthropicClient for advanced use cases requiring custom request handling, beta feature access, or direct API control.
Key APIs:
```java
// Client class
class QuarkusAnthropicClient extends AnthropicClient {
    AnthropicCreateMessageResponse createMessage(
        AnthropicCreateMessageRequest request
    );
    void createMessage(
        AnthropicCreateMessageRequest request,
        AnthropicCreateMessageOptions options,
        StreamingChatResponseHandler handler
    );
    static void setLogCurlHint(boolean logCurl);
    static void setDisableBetaHint(boolean disableBetaHint);
}

// Builder
class QuarkusAnthropicClient.Builder
        extends AnthropicClient.Builder<QuarkusAnthropicClient, Builder> {
    boolean logCurl;
    boolean disableBetaHeader;
    QuarkusAnthropicClient build();
}
```

Extended capabilities including Claude's extended thinking mode for complex reasoning, prompt caching for cost reduction, streaming responses with partial updates, and tool/function calling support.
Key APIs:
```java
// Thinking configuration
interface ThinkingConfig {
    Optional<String> type();
    Optional<Integer> budgetTokens();
    Optional<Boolean> returnThinking();
    Optional<Boolean> sendThinking();
    Optional<Boolean> interleaved();
}

// Streaming handler
interface StreamingChatResponseHandler {
    void onPartialResponse(PartialResponse response,
                           PartialResponseContext context);
    void onPartialThinking(PartialThinking thinking,
                           PartialThinkingContext context);
    void onPartialToolCall(PartialToolCall toolCall,
                           PartialToolCallContext context);
    void onCompleteToolCall(CompleteToolCall toolCall);
    void onCompleteResponse(ChatResponse response);
    void onError(Throwable error);
}
```

The extension supports all Anthropic Claude models:
- claude-opus-4-20250514, claude-sonnet-4-20250514
- claude-3-5-sonnet-20241022
- claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-haiku-20240307 (default)

This Quarkus extension follows the standard two-module architecture pattern for Quarkus extensions, providing seamless integration between Quarkus applications and Anthropic's Claude models through the LangChain4j framework.
Runtime Module (quarkus-langchain4j-anthropic)

The runtime module contains all components included in the application at runtime:

- QuarkusAnthropicClient: REST-based client implementation extending LangChain4j's AnthropicClient, using Quarkus REST Client Reactive for HTTP communication
- AnthropicRestApi: JAX-RS interface defining REST endpoints for the Anthropic API
- Configuration interfaces: LangChain4jAnthropicConfig, AnthropicConfig, ChatModelConfig, and ThinkingConfig for comprehensive configuration management via SmallRye Config
- AnthropicRecorder: Runtime recorder that creates and configures CDI beans based on runtime configuration

Deployment Module (quarkus-langchain4j-anthropic-deployment)

The deployment module handles build-time processing and is only used during compilation:

- AnthropicProcessor: Build step processor that generates synthetic CDI beans at build time
- Build-time configuration: LangChain4jAnthropicBuildConfig and ChatModelBuildConfig for controlling bean creation
- Registers the langchain4j-anthropic feature and integrates with Quarkus build infrastructure

The extension uses Quarkus's synthetic bean generation to create CDI beans:

1. AnthropicProcessor analyzes configuration and registers bean creation steps
2. AnthropicRecorder executes to create bean suppliers based on runtime configuration
3. Named models are distinguished with the @ModelName qualifier

The extension creates beans for both ChatModel and StreamingChatModel interfaces from LangChain4j, enabling seamless integration with LangChain4j's declarative AI services pattern.
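That declarative AI services pattern (enabled by the quarkus-langchain4j-core dependency) can be sketched as follows; the Assistant interface name and prompt text are illustrative, not part of the extension's API:

```java
import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.SystemMessage;

// Quarkus generates an implementation at build time,
// backed by the configured Anthropic chat model
@RegisterAiService
public interface Assistant {

    @SystemMessage("You are a concise assistant.")
    String chat(String question);
}
```

The generated bean is then injected like any other: `@Inject Assistant assistant;` followed by `assistant.chat("...")`.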
The extension integrates multiple Quarkus and external frameworks:
- Reactive streaming responses (Multi<AnthropicStreamingData>)
- ChatModel and StreamingChatModel interfaces from the dev.langchain4j.model.chat package
- Request/response classes from the dev.langchain4j.model.anthropic.internal.api package

The extension fully supports GraalVM native image compilation.

Request flow:

1. The application injects ChatModel or StreamingChatModel via CDI
2. AnthropicChatModel or AnthropicStreamingChatModel wraps the client
3. QuarkusAnthropicClient formats requests according to Anthropic API specifications
4. HTTP calls go through the AnthropicRestApi interface executed by the Quarkus REST Client

When Quarkus observability extensions are present, the extension automatically integrates:

- Metrics (quarkus-micrometer): Automatic collection of request metrics
- Tracing (quarkus-opentelemetry): Distributed tracing for LLM calls

The extension uses a layered configuration approach:

1. Build-time configuration (enabled flag)
2. Default model configuration (quarkus.langchain4j.anthropic.api-key)
3. Named model configuration (quarkus.langchain4j.anthropic.fast.api-key)
4. Configuration profiles (%dev, %prod, etc.)

This architecture ensures maximum flexibility while maintaining type safety and compile-time validation.
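The profile layer can be combined with the other layers in application.properties. A sketch with illustrative model choices:

```properties
# Default for all profiles
quarkus.langchain4j.anthropic.chat-model.model-name=claude-3-haiku-20240307

# Dev profile: verbose logging for debugging
%dev.quarkus.langchain4j.anthropic.log-requests=true
%dev.quarkus.langchain4j.anthropic.log-responses=true

# Prod profile: stronger model, logging off
%prod.quarkus.langchain4j.anthropic.chat-model.model-name=claude-opus-4-20250514
%prod.quarkus.langchain4j.anthropic.log-requests=false
```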
Complete configuration example:

```properties
# Required
quarkus.langchain4j.anthropic.api-key=sk-ant-...

# Model selection
quarkus.langchain4j.anthropic.chat-model.model-name=claude-opus-4-20250514

# Model parameters
quarkus.langchain4j.anthropic.chat-model.max-tokens=2048
quarkus.langchain4j.anthropic.chat-model.temperature=0.7
quarkus.langchain4j.anthropic.chat-model.top-p=1.0
quarkus.langchain4j.anthropic.chat-model.top-k=40

# Extended thinking
quarkus.langchain4j.anthropic.chat-model.thinking.type=enabled
quarkus.langchain4j.anthropic.chat-model.thinking.budget-tokens=8000
quarkus.langchain4j.anthropic.chat-model.thinking.return-thinking=true

# Prompt caching
quarkus.langchain4j.anthropic.chat-model.cache-system-messages=true
quarkus.langchain4j.anthropic.chat-model.cache-tools=true

# Logging
quarkus.langchain4j.anthropic.log-requests=true
quarkus.langchain4j.anthropic.log-responses=false
```

```java
// Default model
@Inject ChatModel model;

// Named model
@Inject @ModelName("name") ChatModel model;

// Streaming
@Inject StreamingChatModel streamingModel;
```

Packages:

- io.quarkiverse.langchain4j.anthropic - Core client and REST API
- io.quarkiverse.langchain4j.anthropic.runtime.config - Configuration interfaces
- dev.langchain4j.model.chat - LangChain4j chat model interfaces
- dev.langchain4j.model.anthropic - Anthropic model implementations
- dev.langchain4j.model.anthropic.internal.api - Request/response classes

Install with Tessl CLI:

```shell
npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-anthropic
```