Quarkus extension for Azure OpenAI integration with LangChain4j, providing ChatModel, StreamingChatModel, EmbeddingModel, and ImageModel implementations with Azure-specific authentication and configuration support.
This Quarkus extension provides seamless integration between Quarkus applications and Azure OpenAI services through the LangChain4j framework. It enables developers to leverage Azure-hosted OpenAI language models for chat completion, image generation, and text embedding capabilities with enterprise-grade features including Azure-specific authentication, retry mechanisms, proxy support, and integration with the Quarkus observability stack.
```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-azure-openai</artifactId>
    <version>1.7.4</version>
</dependency>
```

```java
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiChatModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiEmbeddingModel;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiImageModel;
```

For declarative AI services with CDI injection:
```java
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.image.ImageModel;
import io.quarkiverse.langchain4j.RegisterAiService;
import io.quarkiverse.langchain4j.ModelName;
```

The simplest way to use Azure OpenAI models in Quarkus is through configuration:
```properties
# Azure OpenAI configuration
quarkus.langchain4j.azure-openai.api-key=your-api-key
quarkus.langchain4j.azure-openai.resource-name=your-resource-name
quarkus.langchain4j.azure-openai.deployment-name=your-deployment-name

# Or use the direct endpoint
# quarkus.langchain4j.azure-openai.endpoint=https://your-resource.openai.azure.com/openai/deployments/your-deployment

# Optional: configure chat model parameters
quarkus.langchain4j.azure-openai.chat-model.temperature=0.7
quarkus.langchain4j.azure-openai.chat-model.max-tokens=1000
```

Then inject the model in your application:
```java
import dev.langchain4j.model.chat.ChatModel;
import jakarta.inject.Inject;

public class MyService {

    @Inject
    ChatModel chatModel;

    public String chat(String message) {
        return chatModel.chat(message);
    }
}
```

For programmatic control, use the builder pattern:
```java
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiChatModel;
import java.time.Duration;

AzureOpenAiChatModel chatModel = AzureOpenAiChatModel.builder()
        .endpoint("https://your-resource.openai.azure.com/openai/deployments/your-deployment")
        .apiKey("your-api-key")
        .apiVersion("2024-10-21")
        .temperature(0.7)
        .maxTokens(1000)
        .timeout(Duration.ofSeconds(60))
        .build();
```

This extension provides four main model types, each with comprehensive configuration options and builder APIs.
Synchronous and streaming chat completion models for conversational AI applications.
```java
public class AzureOpenAiChatModel implements ChatModel {
    public ChatResponse doChat(ChatRequest chatRequest);
    public static Builder builder();
}

public class AzureOpenAiStreamingChatModel implements StreamingChatModel {
    public void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler);
    public static Builder builder();
}
```

Chat models support both synchronous and streaming completion.
Generate vector embeddings from text for semantic search, RAG, and similarity tasks.
```java
public class AzureOpenAiEmbeddingModel implements EmbeddingModel {
    public Response<List<Embedding>> embedAll(List<TextSegment> textSegments);
    public static Builder builder();
}
```

Embedding models support batch embedding of multiple text segments in a single call.
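A hedged sketch of batch embedding via `embedAll` (the deployment name, endpoint, and key are placeholders; `Embedding.dimension()` is LangChain4j's accessor for the vector length):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.output.Response;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiEmbeddingModel;

import java.util.List;

public class EmbeddingExample {
    public static void main(String[] args) {
        AzureOpenAiEmbeddingModel model = AzureOpenAiEmbeddingModel.builder()
                .endpoint("https://your-resource.openai.azure.com/openai/deployments/your-embedding-deployment")
                .apiKey("your-api-key")
                .build();

        // Embed a batch of segments in one request
        Response<List<Embedding>> response = model.embedAll(List.of(
                TextSegment.from("Quarkus is a Kubernetes-native Java framework"),
                TextSegment.from("LangChain4j integrates LLMs into Java applications")));

        for (Embedding embedding : response.content()) {
            System.out.println("dimensions: " + embedding.dimension());
        }
    }
}
```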
Generate images from text prompts using Azure OpenAI DALL-E models.
```java
public class AzureOpenAiImageModel implements ImageModel {
    public Response<Image> generate(String prompt);
    public Response<List<Image>> generate(String prompt, int n);
    public static Builder builder();
}
```

Image models support generating a single image or multiple images from one prompt.
Quarkus configuration interfaces for declarative setup via application.properties with support for multiple named configurations.
The configuration system supports a default configuration plus named configurations, with per-model-type settings (chat, embedding, image).
Azure OpenAI supports two authentication methods (mutually exclusive):

API Key Authentication: provide an API key via configuration or builder:

```properties
quarkus.langchain4j.azure-openai.api-key=your-key
```

Azure Active Directory: provide an AD token via configuration or builder:

```properties
quarkus.langchain4j.azure-openai.ad-token=your-token
```

There are two methods to specify the Azure OpenAI endpoint:

Direct endpoint URL:

```properties
quarkus.langchain4j.azure-openai.endpoint=https://resource.openai.azure.com/openai/deployments/deployment
```

Composite (resource + domain + deployment):

```properties
quarkus.langchain4j.azure-openai.resource-name=your-resource
quarkus.langchain4j.azure-openai.domain-name=openai.azure.com
quarkus.langchain4j.azure-openai.deployment-name=your-deployment
```

The endpoint is constructed as: `https://{resource-name}.{domain-name}/openai/deployments/{deployment-name}`
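The composite form amounts to simple string interpolation. A minimal sketch of that construction (the `buildEndpoint` helper is illustrative, not part of the extension's API):

```java
public class EndpointBuilder {

    // Mirrors the documented pattern:
    // https://{resource-name}.{domain-name}/openai/deployments/{deployment-name}
    static String buildEndpoint(String resourceName, String domainName, String deploymentName) {
        return "https://" + resourceName + "." + domainName
                + "/openai/deployments/" + deploymentName;
    }

    public static void main(String[] args) {
        System.out.println(buildEndpoint("your-resource", "openai.azure.com", "your-deployment"));
        // → https://your-resource.openai.azure.com/openai/deployments/your-deployment
    }
}
```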
Configure multiple Azure OpenAI instances for different use cases:

```properties
# Default configuration
quarkus.langchain4j.azure-openai.api-key=default-key
quarkus.langchain4j.azure-openai.resource-name=default-resource
quarkus.langchain4j.azure-openai.deployment-name=default-deployment

# Named configuration for a specific model
quarkus.langchain4j.azure-openai.creative.api-key=creative-key
quarkus.langchain4j.azure-openai.creative.resource-name=creative-resource
quarkus.langchain4j.azure-openai.creative.deployment-name=creative-deployment
quarkus.langchain4j.azure-openai.creative.chat-model.temperature=0.9
```

Use named configurations with CDI injection:
```java
@Inject
@ModelName("creative")
ChatModel creativeModel;
```

Or in AI service definitions:
```java
@RegisterAiService(modelName = "creative")
public interface CreativeAssistant {
    String generate(String prompt);
}
```

The API version determines available features:
- `tools` parameter for function calling
- `functions` parameter (automatically handled)

The default API version is 2024-10-21, configurable via:

```properties
quarkus.langchain4j.azure-openai.api-version=2024-10-21
```

Use `@RegisterAiService` for automatic CDI bean creation.

Install with Tessl CLI:
```shell
npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-azure-openai
```