Quarkus LangChain4j Anthropic Extension

The quarkus-langchain4j-anthropic extension provides seamless integration between Quarkus applications and Anthropic's Claude family of Large Language Models through the LangChain4j framework. This extension enables developers to incorporate Claude models into Quarkus applications with support for declarative AI services, CDI-based model injection, streaming responses, and native compilation.

Package Information

  • Package Name: quarkus-langchain4j-anthropic

  • Group ID: io.quarkiverse.langchain4j

  • Artifact ID: quarkus-langchain4j-anthropic

  • Package Type: Maven (Quarkus Extension)

  • Language: Java

  • Version: 1.7.4

  • License: Apache-2.0

  • Installation:

    <dependency>
        <groupId>io.quarkiverse.langchain4j</groupId>
        <artifactId>quarkus-langchain4j-anthropic</artifactId>
        <version>1.7.4</version>
    </dependency>
  • Additional Dependencies: For declarative AI services (@RegisterAiService), also add:

    <dependency>
        <groupId>io.quarkiverse.langchain4j</groupId>
        <artifactId>quarkus-langchain4j-core</artifactId>
        <version>1.7.4</version>
    </dependency>
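With both dependencies in place, an AI service can be declared as an annotated interface. The sketch below is illustrative: the `Assistant` interface name and the prompt text are placeholders, and it assumes the standard `@RegisterAiService` pattern from quarkus-langchain4j-core:

```java
import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

// Quarkus generates an implementation backed by the configured Anthropic model
@RegisterAiService
public interface Assistant {

    @SystemMessage("You are a concise technical assistant.")
    String answer(@UserMessage String question);
}
```

The generated implementation is a CDI bean, so `Assistant` can be injected like any other bean and called directly.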

Core Imports

The extension follows the Quarkus CDI injection pattern: model beans are injected rather than constructed manually. The commonly used imports are:

import jakarta.inject.Inject;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import io.quarkiverse.langchain4j.ModelName;

Basic Usage

Configuration

Set your API key in application.properties:

quarkus.langchain4j.anthropic.api-key=sk-ant-...
quarkus.langchain4j.anthropic.chat-model.model-name=claude-opus-4-20250514

Or use an environment variable:

export QUARKUS_LANGCHAIN4J_ANTHROPIC_API_KEY=sk-ant-...

Simple Chat Example

import jakarta.inject.Inject;
import dev.langchain4j.model.chat.ChatModel;

public class MyService {
    @Inject
    ChatModel chatModel;

    public String askClaude(String question) {
        return chatModel.chat(question);
    }
}

Streaming Chat Example

import jakarta.inject.Inject;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.*;

public class MyStreamingService {
    @Inject
    StreamingChatModel streamingChatModel;

    public void streamChat(String prompt) {
        streamingChatModel.chat(prompt, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(PartialResponse response,
                                         PartialResponseContext context) {
                System.out.print(response.text());
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                System.out.println("\nComplete!");
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}

Capabilities

CDI Injection and Model Management

Inject ChatModel and StreamingChatModel beans using standard CDI patterns. Supports both default and named model configurations for multi-model applications.

Key APIs:

// Default model injection
@Inject
ChatModel chatModel;

@Inject
StreamingChatModel streamingChatModel;

// Named model injection
@Inject
@ModelName("fast-model")
ChatModel fastModel;

@Inject
@ModelName("smart-model")
ChatModel smartModel;
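Named injections resolve against correspondingly named configuration blocks in application.properties. A sketch of the matching configuration (the names fast-model and smart-model are placeholders and must match the @ModelName qualifiers):

```properties
# Default model
quarkus.langchain4j.anthropic.api-key=sk-ant-...

# Named model "fast-model"
quarkus.langchain4j.anthropic.fast-model.chat-model.model-name=claude-3-haiku-20240307

# Named model "smart-model"
quarkus.langchain4j.anthropic.smart-model.chat-model.model-name=claude-opus-4-20250514
```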

Configuration

Comprehensive configuration system based on SmallRye Config with support for connection settings, model parameters, prompt caching, extended thinking, and logging options. Supports both default and named model configurations.

Key Configuration Interfaces:

// Root configuration
interface LangChain4jAnthropicConfig {
    AnthropicConfig defaultConfig();
    Map<String, AnthropicConfig> namedConfig();
}

// Model configuration group
interface AnthropicConfig {
    String baseUrl();
    String apiKey();
    String version();
    Optional<Duration> timeout();
    Boolean enableIntegration();
    ChatModelConfig chatModel();
}

// Chat model parameters
interface ChatModelConfig {
    String modelName();
    Integer maxTokens();
    OptionalDouble temperature();
    OptionalDouble topP();
    OptionalInt topK();
    Boolean cacheSystemMessages();
    Boolean cacheTools();
    ThinkingConfig thinking();
}

Low-Level Client API

Direct access to the Anthropic REST API through QuarkusAnthropicClient for advanced use cases requiring custom request handling, beta feature access, or direct API control.

Key APIs:

// Client class
class QuarkusAnthropicClient extends AnthropicClient {
    AnthropicCreateMessageResponse createMessage(
        AnthropicCreateMessageRequest request
    );

    void createMessage(
        AnthropicCreateMessageRequest request,
        AnthropicCreateMessageOptions options,
        StreamingChatResponseHandler handler
    );

    static void setLogCurlHint(boolean logCurl);
    static void setDisableBetaHint(boolean disableBetaHint);
}

// Builder
class QuarkusAnthropicClient.Builder
    extends AnthropicClient.Builder<QuarkusAnthropicClient, Builder> {
    boolean logCurl;
    boolean disableBetaHeader;
    QuarkusAnthropicClient build();
}
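A hypothetical usage sketch follows. The fluent `baseUrl`/`apiKey` setters are assumed to be inherited from `AnthropicClient.Builder` (they are not confirmed by this reference), and request construction is elided, so treat this as an outline rather than a verified API:

```java
// Hypothetical sketch: setter names assumed from AnthropicClient.Builder
QuarkusAnthropicClient client = new QuarkusAnthropicClient.Builder()
        .baseUrl("https://api.anthropic.com/v1/")    // assumption: inherited setter
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))  // assumption: inherited setter
        .build();

// Synchronous call; building the AnthropicCreateMessageRequest is elided
AnthropicCreateMessageResponse response = client.createMessage(request);
```

In a typical Quarkus application the injected ChatModel beans are preferred; the low-level client is only needed for custom request handling or beta features.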

Advanced Features

Extended capabilities including Claude's extended thinking mode for complex reasoning, prompt caching for cost reduction, streaming responses with partial updates, and tool/function calling support.

Key APIs:

// Thinking configuration
interface ThinkingConfig {
    Optional<String> type();
    Optional<Integer> budgetTokens();
    Optional<Boolean> returnThinking();
    Optional<Boolean> sendThinking();
    Optional<Boolean> interleaved();
}

// Streaming handler
interface StreamingChatResponseHandler {
    void onPartialResponse(PartialResponse response,
                          PartialResponseContext context);
    void onPartialThinking(PartialThinking thinking,
                          PartialThinkingContext context);
    void onPartialToolCall(PartialToolCall toolCall,
                          PartialToolCallContext context);
    void onCompleteToolCall(CompleteToolCall toolCall);
    void onCompleteResponse(ChatResponse response);
    void onError(Throwable error);
}

Supported Claude Models

The extension supports the Anthropic Claude model family, including:

  • Claude 4: claude-opus-4-20250514, claude-sonnet-4-20250514
  • Claude 3.5: claude-3-5-sonnet-20241022
  • Claude 3: claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-haiku-20240307 (default)

Extension Architecture

This extension follows the standard Quarkus two-module architecture, split into a runtime module and a deployment module.

Module Structure

Runtime Module (quarkus-langchain4j-anthropic)

The runtime module contains all components included in the application at runtime:

  • QuarkusAnthropicClient: REST-based client implementation extending LangChain4j's AnthropicClient, using Quarkus REST Client Reactive for HTTP communication
  • AnthropicRestApi: JAX-RS interface defining REST endpoints for Anthropic API
  • Configuration Interfaces: LangChain4jAnthropicConfig, AnthropicConfig, ChatModelConfig, and ThinkingConfig for comprehensive configuration management via SmallRye Config
  • AnthropicRecorder: Runtime recorder that creates and configures CDI beans based on runtime configuration
  • GraalVM Substitutions: Native image support through substitution classes for seamless GraalVM compilation

Deployment Module (quarkus-langchain4j-anthropic-deployment)

The deployment module handles build-time processing and is only used during compilation:

  • AnthropicProcessor: Build step processor that generates synthetic CDI beans at build time
  • Build-time Configuration: LangChain4jAnthropicBuildConfig and ChatModelBuildConfig for controlling bean creation
  • Feature Registration: Registers the langchain4j-anthropic feature and integrates with Quarkus build infrastructure
  • Provider Discovery: Registers Anthropic as a chat model provider in the LangChain4j ecosystem

CDI Bean Creation

The extension uses Quarkus's synthetic bean generation to create CDI beans:

  1. Build Time: AnthropicProcessor analyzes configuration and registers bean creation steps
  2. Runtime Init: AnthropicRecorder executes to create bean suppliers based on runtime configuration
  3. Bean Lifecycle: Beans are application-scoped singletons, thread-safe, and automatically managed
  4. Named Models: Supports multiple named configurations, each creating separate bean instances with @ModelName qualifier

The extension creates beans for both ChatModel and StreamingChatModel interfaces from LangChain4j, enabling seamless integration with LangChain4j's declarative AI services pattern.

Integration Architecture

The extension integrates multiple Quarkus and external frameworks:

  • Quarkus REST Client Reactive: Provides non-blocking HTTP communication with Anthropic API
  • SmallRye Config: Configuration management with support for environment variables, profiles, and precedence
  • SmallRye Mutiny: Reactive streams support for streaming responses (Multi<AnthropicStreamingData>)
  • Jackson with Snake Case: JSON serialization/deserialization following Anthropic API conventions
  • LangChain4j Core: Implements ChatModel and StreamingChatModel interfaces from dev.langchain4j.model.chat package
  • LangChain4j Anthropic: Uses request/response classes from dev.langchain4j.model.anthropic.internal.api package
  • CDI: Full dependency injection support with qualifiers for named models

Native Image Support

The extension fully supports GraalVM native image compilation:

  • Reflection Configuration: Automatically handled by Quarkus build infrastructure
  • Substitution Classes: GraalVM substitutions in the runtime module handle native compilation requirements
  • REST Client Optimization: REST client is configured for optimal native image performance
  • Build-time vs Runtime: Configuration is properly separated between build and runtime phases

Request Flow

  1. User Code: Injects ChatModel or StreamingChatModel via CDI
  2. Model Wrapper: LangChain4j's AnthropicChatModel or AnthropicStreamingChatModel wraps the client
  3. Client Layer: QuarkusAnthropicClient formats requests according to Anthropic API specifications
  4. REST Layer: AnthropicRestApi interface executed by Quarkus REST Client
  5. HTTP Transport: Quarkus REST Client Reactive handles HTTP communication
  6. Response Processing: Streaming or synchronous responses processed and returned through LangChain4j interfaces

Observability Integration

When Quarkus observability extensions are present, the extension automatically integrates:

  • Metrics (via quarkus-micrometer): Automatic collection of request metrics
  • Tracing (via quarkus-opentelemetry): Distributed tracing for LLM calls
  • Logging: Structured logging through Quarkus logging infrastructure with optional request/response/cURL logging

Configuration Architecture

The extension uses a layered configuration approach:

  1. Build-time Config: Controls bean creation (enabled flag)
  2. Runtime Config: Connection settings, API keys, model parameters loaded at runtime
  3. Default Configuration: Properties without name prefix (e.g., quarkus.langchain4j.anthropic.api-key)
  4. Named Configurations: Properties with name prefix (e.g., quarkus.langchain4j.anthropic.fast.api-key)
  5. Config Inheritance: Named configurations can inherit unset properties from the default configuration
  6. Profile Support: Full support for Quarkus profiles (%dev, %prod, etc.)
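The layering above can be illustrated in application.properties (the "fast" configuration name and model names are placeholders):

```properties
# Default configuration (applies unless overridden)
quarkus.langchain4j.anthropic.api-key=sk-ant-...

# Named configuration "fast" inherits the default api-key
quarkus.langchain4j.anthropic.fast.chat-model.model-name=claude-3-haiku-20240307

# Profile-specific override: only active under the dev profile
%dev.quarkus.langchain4j.anthropic.log-requests=true
```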

This architecture ensures maximum flexibility while maintaining type safety and compile-time validation.

Dependencies and Integration

The extension integrates with:

  • Quarkus REST Client Reactive: HTTP client infrastructure
  • SmallRye Config: Configuration management
  • SmallRye Mutiny: Reactive streams for streaming responses
  • Jackson: JSON serialization with snake_case convention
  • LangChain4j Core: Model interfaces and abstractions
  • LangChain4j Anthropic: Anthropic-specific request/response classes
  • Quarkus CDI: Dependency injection and lifecycle management
  • GraalVM: Native image compilation support

Quick Reference

Essential Configuration Properties

# Required
quarkus.langchain4j.anthropic.api-key=sk-ant-...

# Model selection
quarkus.langchain4j.anthropic.chat-model.model-name=claude-opus-4-20250514

# Model parameters
quarkus.langchain4j.anthropic.chat-model.max-tokens=2048
quarkus.langchain4j.anthropic.chat-model.temperature=0.7
quarkus.langchain4j.anthropic.chat-model.top-p=1.0
quarkus.langchain4j.anthropic.chat-model.top-k=40

# Extended thinking
quarkus.langchain4j.anthropic.chat-model.thinking.type=enabled
quarkus.langchain4j.anthropic.chat-model.thinking.budget-tokens=8000
quarkus.langchain4j.anthropic.chat-model.thinking.return-thinking=true

# Prompt caching
quarkus.langchain4j.anthropic.chat-model.cache-system-messages=true
quarkus.langchain4j.anthropic.chat-model.cache-tools=true

# Logging
quarkus.langchain4j.anthropic.log-requests=true
quarkus.langchain4j.anthropic.log-responses=false

Common Injection Patterns

// Default model
@Inject ChatModel model;

// Named model
@Inject @ModelName("name") ChatModel model;

// Streaming
@Inject StreamingChatModel streamingModel;

Key Java Packages

  • io.quarkiverse.langchain4j.anthropic - Core client and REST API
  • io.quarkiverse.langchain4j.anthropic.runtime.config - Configuration interfaces
  • dev.langchain4j.model.chat - LangChain4j chat model interfaces
  • dev.langchain4j.model.anthropic - Anthropic model implementations
  • dev.langchain4j.model.anthropic.internal.api - Request/response classes
