
Quarkus extension for Azure OpenAI integration with LangChain4j, providing ChatModel, StreamingChatModel, EmbeddingModel, and ImageModel implementations with Azure-specific authentication and configuration support.


Quarkus LangChain4j Azure OpenAI Extension

This Quarkus extension provides seamless integration between Quarkus applications and Azure OpenAI services through the LangChain4j framework. It enables developers to leverage Azure-hosted OpenAI language models for chat completion, image generation, and text embedding capabilities with enterprise-grade features including Azure-specific authentication, retry mechanisms, proxy support, and integration with the Quarkus observability stack.

Package Information

  • Package Name: io.quarkiverse.langchain4j:quarkus-langchain4j-azure-openai
  • Package Type: Maven
  • Language: Java
  • Installation:
    <dependency>
        <groupId>io.quarkiverse.langchain4j</groupId>
        <artifactId>quarkus-langchain4j-azure-openai</artifactId>
        <version>1.7.4</version>
    </dependency>

Core Imports

import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiChatModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiEmbeddingModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiImageModel;

For declarative AI services with CDI injection:

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.image.ImageModel;
import io.quarkiverse.langchain4j.RegisterAiService;
import io.quarkiverse.langchain4j.ModelName;

Basic Usage

Declarative Configuration via application.properties

The simplest way to use Azure OpenAI models in Quarkus is through configuration:

# Azure OpenAI configuration
quarkus.langchain4j.azure-openai.api-key=your-api-key
quarkus.langchain4j.azure-openai.resource-name=your-resource-name
quarkus.langchain4j.azure-openai.deployment-name=your-deployment-name

# Or use direct endpoint
# quarkus.langchain4j.azure-openai.endpoint=https://your-resource.openai.azure.com/openai/deployments/your-deployment

# Optional: Configure chat model parameters
quarkus.langchain4j.azure-openai.chat-model.temperature=0.7
quarkus.langchain4j.azure-openai.chat-model.max-tokens=1000

Then inject the model in your application:

import dev.langchain4j.model.chat.ChatModel;
import jakarta.inject.Inject;

public class MyService {
    @Inject
    ChatModel chatModel;

    public String chat(String message) {
        return chatModel.chat(message);
    }
}

Programmatic Model Creation

For programmatic control, use the builder pattern:

import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiChatModel;
import java.time.Duration;

AzureOpenAiChatModel chatModel = AzureOpenAiChatModel.builder()
    .endpoint("https://your-resource.openai.azure.com/openai/deployments/your-deployment")
    .apiKey("your-api-key")
    .apiVersion("2024-10-21")
    .temperature(0.7)
    .maxTokens(1000)
    .timeout(Duration.ofSeconds(60))
    .build();

Capabilities

This extension provides four main model types, each with comprehensive configuration options and builder APIs:

Chat Models

Synchronous and streaming chat completion models for conversational AI applications.

public class AzureOpenAiChatModel implements ChatModel {
    public ChatResponse doChat(ChatRequest chatRequest);
    public static Builder builder();
}

public class AzureOpenAiStreamingChatModel implements StreamingChatModel {
    public void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler);
    public static Builder builder();
}

Chat models support:

  • Synchronous and streaming responses
  • Tool/function calling (API version >= 2023-12-01)
  • Temperature, top-p, and seed parameters for response control
  • Presence and frequency penalties
  • Max tokens and response format configuration
  • Model listeners for observability
  • Azure AD token or API key authentication

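As a sketch of the streaming variant, the injected StreamingChatModel accepts a StreamingChatResponseHandler whose callbacks receive partial tokens, the complete response, or an error (the service class and message below are illustrative):

```java
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import jakarta.inject.Inject;

public class StreamingService {
    @Inject
    StreamingChatModel streamingModel;

    public void streamChat(String message) {
        streamingModel.chat(message, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // emit tokens as they arrive
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println(); // full response is also available here
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```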
Chat Models Documentation

Embedding Models

Generate vector embeddings from text for semantic search, RAG, and similarity tasks.

public class AzureOpenAiEmbeddingModel implements EmbeddingModel {
    public Response<List<Embedding>> embedAll(List<TextSegment> textSegments);
    public static Builder builder();
}

Embedding models support:

  • Batch processing (up to 16 segments per request)
  • Automatic retry with configurable attempts
  • Token usage tracking
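The 16-segment limit means larger inputs are split into multiple requests. The partitioning the model applies internally can be illustrated in plain Java (a hypothetical sketch, not the extension's actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class EmbeddingBatching {
    // Split a list into batches of at most batchSize elements,
    // mirroring the 16-segments-per-request limit described above.
    static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(new ArrayList<>(items.subList(i, Math.min(i + batchSize, items.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> segments = new ArrayList<>();
        for (int i = 0; i < 40; i++) segments.add(i);
        // 40 segments become 3 requests: 16 + 16 + 8
        System.out.println(partition(segments, 16).size());
    }
}
```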

Embedding Models Documentation

Image Models

Generate images from text prompts using Azure OpenAI DALL-E models.

public class AzureOpenAiImageModel implements ImageModel {
    public Response<Image> generate(String prompt);
    public Response<List<Image>> generate(String prompt, int n);
    public static Builder builder();
}

Image models support:

  • Single and multiple image generation
  • Image size, quality, and style configuration
  • Image persistence to local filesystem
  • Base64 or URL response formats
  • DALL-E 2 and DALL-E 3 models
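A sketch of programmatic image generation; the size and quality builder methods are assumed to mirror the configuration keys and may differ in name, so check the Image Models Documentation before relying on them:

```java
import dev.langchain4j.data.image.Image;
import dev.langchain4j.model.output.Response;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiImageModel;

AzureOpenAiImageModel imageModel = AzureOpenAiImageModel.builder()
    .endpoint("https://your-resource.openai.azure.com/openai/deployments/your-deployment")
    .apiKey("your-api-key")
    .size("1024x1024")      // assumed builder method
    .quality("standard")    // assumed builder method
    .build();

// Single-image generation; the response carries a URL or base64 payload
Response<Image> response = imageModel.generate("A lighthouse at dawn, watercolor style");
System.out.println(response.content().url());
```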

Image Models Documentation

Configuration System

Quarkus configuration interfaces for declarative setup via application.properties with support for multiple named configurations.

Configuration system supports:

  • Declarative configuration via application.properties
  • Multiple named model configurations
  • Per-model-type configuration overrides (chat, embedding, image)
  • Environment-specific settings
  • Proxy configuration
  • Request/response logging

Configuration Documentation

Authentication

Azure OpenAI supports two authentication methods (mutually exclusive):

  1. API Key Authentication: Provide API key via configuration or builder

    quarkus.langchain4j.azure-openai.api-key=your-key
  2. Azure Active Directory: Provide AD token via configuration or builder

    quarkus.langchain4j.azure-openai.ad-token=your-token

Endpoint Configuration

Two methods to specify the Azure OpenAI endpoint:

  1. Direct endpoint URL:

    quarkus.langchain4j.azure-openai.endpoint=https://resource.openai.azure.com/openai/deployments/deployment
  2. Composite (resource + domain + deployment):

    quarkus.langchain4j.azure-openai.resource-name=your-resource
    quarkus.langchain4j.azure-openai.domain-name=openai.azure.com
    quarkus.langchain4j.azure-openai.deployment-name=your-deployment

    The endpoint is constructed as: https://{resource-name}.{domain-name}/openai/deployments/{deployment-name}
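The composite construction above can be expressed directly (the argument values are placeholders):

```java
public class EndpointBuilder {
    // Mirrors the composite endpoint layout:
    // https://{resource-name}.{domain-name}/openai/deployments/{deployment-name}
    static String buildEndpoint(String resourceName, String domainName, String deploymentName) {
        return String.format("https://%s.%s/openai/deployments/%s",
                resourceName, domainName, deploymentName);
    }

    public static void main(String[] args) {
        System.out.println(buildEndpoint("your-resource", "openai.azure.com", "your-deployment"));
        // → https://your-resource.openai.azure.com/openai/deployments/your-deployment
    }
}
```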

Named Configurations

Configure multiple Azure OpenAI instances for different use cases:

# Default configuration
quarkus.langchain4j.azure-openai.api-key=default-key
quarkus.langchain4j.azure-openai.resource-name=default-resource
quarkus.langchain4j.azure-openai.deployment-name=default-deployment

# Named configuration for a specific model
quarkus.langchain4j.azure-openai.creative.api-key=creative-key
quarkus.langchain4j.azure-openai.creative.resource-name=creative-resource
quarkus.langchain4j.azure-openai.creative.deployment-name=creative-deployment
quarkus.langchain4j.azure-openai.creative.chat-model.temperature=0.9

Use named configurations with CDI injection:

@Inject
@ModelName("creative")
ChatModel creativeModel;

Or in AI service definitions:

@RegisterAiService(modelName = "creative")
public interface CreativeAssistant {
    String generate(String prompt);
}
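Calling the AI service is then ordinary CDI injection; a sketch using the CreativeAssistant interface defined above (the resource class and prompt are illustrative):

```java
import jakarta.inject.Inject;

public class StoryResource {
    @Inject
    CreativeAssistant assistant; // backed by the "creative" named configuration

    public String story() {
        return assistant.generate("Write a short story about a robot");
    }
}
```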

API Version Compatibility

The API version determines available features:

  • 2023-12-01 and later: Support tools parameter for function calling
  • Earlier versions: Use deprecated functions parameter (automatically handled)

Default API version is 2024-10-21, configurable via:

quarkus.langchain4j.azure-openai.api-version=2024-10-21

Key Features

  • Native Compilation: Full support for Quarkus native executables
  • Declarative AI Services: Use @RegisterAiService for automatic CDI bean creation
  • Tool Support: Function calling with automatic parameter extraction
  • Observability: Integration with Quarkus metrics and tracing
  • Error Handling: Automatic retry mechanisms with configurable attempts
  • Proxy Support: HTTP/HTTPS proxy configuration for enterprise environments
  • Logging: Request/response logging with optional cURL format
  • Model Listeners: Extensible listener system for custom behavior and monitoring

Install with Tessl CLI

npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-azure-openai@1.7.0
Describes: pkg:maven/io.quarkiverse.langchain4j/quarkus-langchain4j-azure-openai@1.7.x