tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment

Quarkus extension deployment module for integrating Ollama LLM models with Quarkus applications through the LangChain4j framework

Build Step Processing

The OllamaProcessor class contains build steps that participate in the Quarkus build chain. These methods are annotated with @BuildStep and execute during the application build to configure Ollama integration, register providers, and create runtime beans.

OllamaProcessor Class

package io.quarkiverse.langchain4j.ollama.deployment;

import java.util.List;
import jakarta.enterprise.context.ApplicationScoped;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.annotations.ExecutionTime;
import io.quarkus.deployment.annotations.Record;
import io.quarkus.deployment.builditem.*;
import io.quarkus.deployment.Capabilities;
import io.quarkus.deployment.IsNormal;
import io.quarkus.arc.deployment.SyntheticBeanBuildItem;
import io.quarkiverse.langchain4j.deployment.items.*;
import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
import io.quarkiverse.langchain4j.ollama.runtime.OllamaRecorder;
import io.quarkiverse.langchain4j.ollama.runtime.config.LangChain4jOllamaFixedRuntimeConfig;
import io.quarkus.resteasy.reactive.spi.MessageBodyReaderOverrideBuildItem;
import io.quarkus.resteasy.reactive.spi.MessageBodyWriterOverrideBuildItem;

public class OllamaProcessor {
    private static final String FEATURE = "langchain4j-ollama";
    private static final String PROVIDER = "ollama";

    @BuildStep
    FeatureBuildItem feature() { /* see Feature Registration */ }

    @BuildStep
    IndexDependencyBuildItem indexUpstreamOllamaModule() { /* see Dependency Indexing */ }

    @BuildStep
    void nativeSupport(
        BuildProducer<ServiceProviderBuildItem> serviceProviderProducer,
        BuildProducer<ReflectiveClassBuildItem> reflectiveClassProducer,
        BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyProducer
    ) { /* see Native Image Support */ }

    @BuildStep
    void providerCandidates(
        BuildProducer<ChatModelProviderCandidateBuildItem> chatProducer,
        BuildProducer<EmbeddingModelProviderCandidateBuildItem> embeddingProducer,
        LangChain4jOllamaOpenAiBuildConfig config
    ) { /* see Provider Candidate Registration */ }

    @BuildStep
    void implicitlyConfiguredProviders(
        LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
        BuildProducer<ImplicitlyUserConfiguredChatProviderBuildItem> producer
    ) { /* see Implicit Configuration Detection */ }

    @BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
    void devServicesSupport(
        List<SelectedChatModelProviderBuildItem> selectedChatModels,
        List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
        LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
        BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
        BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer
    ) { /* see DevServices Support */ }

    @BuildStep
    @Record(ExecutionTime.RUNTIME_INIT)
    void generateBeans(
        OllamaRecorder recorder,
        List<SelectedChatModelProviderBuildItem> selectedChatItem,
        List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbedding,
        BuildProducer<SyntheticBeanBuildItem> beanProducer
    ) { /* see Synthetic Bean Generation */ }

    @BuildStep
    void deprioritizeJsonb(
        Capabilities capabilities,
        BuildProducer<MessageBodyReaderOverrideBuildItem> readerOverrideProducer,
        BuildProducer<MessageBodyWriterOverrideBuildItem> writerOverrideProducer
    ) { /* see JSON-B Deprioritization */ }
}

Constants:

  • FEATURE = "langchain4j-ollama" - Feature identifier
  • PROVIDER = "ollama" - Provider identifier used throughout the build chain

Build Steps

Feature Registration

@BuildStep
FeatureBuildItem feature() {
    return new FeatureBuildItem(FEATURE);
}

Purpose: Registers the "langchain4j-ollama" feature with Quarkus.

Produces: FeatureBuildItem

Description: This build step registers the extension as a Quarkus feature, which is reported in build logs and can be queried by other extensions. The feature name appears in the build output and is used for feature detection.

Dependency Indexing

@BuildStep
IndexDependencyBuildItem indexUpstreamOllamaModule() {
    return new IndexDependencyBuildItem("dev.langchain4j", "langchain4j-ollama");
}

Purpose: Indexes the upstream LangChain4j Ollama module for Jandex processing.

Produces: IndexDependencyBuildItem

Description: This build step ensures that the upstream dev.langchain4j:langchain4j-ollama module is indexed by Jandex, making its classes available for annotation scanning and reflection configuration. This is necessary for the build process to discover and process annotations in the upstream module.
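Without this build step, users would have to request the indexing themselves. A sketch of the equivalent user-side configuration, assuming Quarkus's standard quarkus.index-dependency mechanism:

```properties
# Equivalent manual configuration that indexUpstreamOllamaModule() makes unnecessary.
# "langchain4j-ollama" here is just an arbitrary local key for the entry.
quarkus.index-dependency.langchain4j-ollama.group-id=dev.langchain4j
quarkus.index-dependency.langchain4j-ollama.artifact-id=langchain4j-ollama
```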

Native Image Support

@BuildStep
void nativeSupport(
    BuildProducer<ServiceProviderBuildItem> serviceProviderProducer,
    BuildProducer<ReflectiveClassBuildItem> reflectiveClassProducer,
    BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyProducer
) {
    // Register all ConfigSourceInterceptor service providers
    serviceProviderProducer.produce(
        ServiceProviderBuildItem.allProvidersFromClassPath(
            ConfigSourceInterceptor.class.getName()
        )
    );

    // Register OllamaChatRequest for reflection with full hierarchy
    reflectiveHierarchyProducer.produce(
        ReflectiveHierarchyBuildItem.builder("dev.langchain4j.model.ollama.OllamaChatRequest")
            .source(getClass().getSimpleName())
            .build()
    );

    // Register OllamaChatResponse for reflection with nested types
    reflectiveHierarchyProducer.produce(
        ReflectiveHierarchyBuildItem.builder("dev.langchain4j.model.ollama.OllamaChatResponse")
            .source(getClass().getSimpleName())
            .ignoreNested(false)
            .build()
    );

    // Register serializers/deserializers for reflection
    reflectiveClassProducer.produce(
        ReflectiveClassBuildItem.builder(
            "dev.langchain4j.model.ollama.FormatSerializer",
            "dev.langchain4j.model.ollama.OllamaDateDeserializer"
        )
        .constructors()
        .methods(false)
        .fields(false)
        .build()
    );
}

Purpose: Configures GraalVM native image compilation support.

Produces:

  • ServiceProviderBuildItem - Registers service providers
  • ReflectiveClassBuildItem - Registers classes for reflection
  • ReflectiveHierarchyBuildItem - Registers class hierarchies for reflection

Description: This build step configures reflection and service provider registration for native image compilation:

  1. Service Providers: Registers all ConfigSourceInterceptor implementations found on the classpath
  2. Request Classes: Registers OllamaChatRequest with full hierarchy reflection
  3. Response Classes: Registers OllamaChatResponse with nested type reflection
  4. Serializers: Registers FormatSerializer and OllamaDateDeserializer with constructor reflection

This ensures that these classes can be accessed via reflection in native images, which is necessary for JSON serialization/deserialization and configuration interception.
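Conceptually, the registrations above correspond to GraalVM reflection metadata along these lines (a sketch only; Quarkus generates the actual native-image configuration, and the exact flags per class are an assumption):

```json
[
  {
    "name": "dev.langchain4j.model.ollama.OllamaChatRequest",
    "allDeclaredConstructors": true,
    "allDeclaredMethods": true,
    "allDeclaredFields": true
  },
  {
    "name": "dev.langchain4j.model.ollama.FormatSerializer",
    "allDeclaredConstructors": true
  }
]
```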

Provider Candidate Registration

@BuildStep
public void providerCandidates(
    BuildProducer<ChatModelProviderCandidateBuildItem> chatProducer,
    BuildProducer<EmbeddingModelProviderCandidateBuildItem> embeddingProducer,
    LangChain4jOllamaOpenAiBuildConfig config
) {
    if (config.chatModel().enabled().isEmpty() || config.chatModel().enabled().get()) {
        chatProducer.produce(new ChatModelProviderCandidateBuildItem(PROVIDER));
    }
    if (config.embeddingModel().enabled().isEmpty() || config.embeddingModel().enabled().get()) {
        embeddingProducer.produce(new EmbeddingModelProviderCandidateBuildItem(PROVIDER));
    }
}

Purpose: Registers Ollama as a provider candidate for chat and embedding models.

Consumes: LangChain4jOllamaOpenAiBuildConfig

Produces:

  • ChatModelProviderCandidateBuildItem (conditionally)
  • EmbeddingModelProviderCandidateBuildItem (conditionally)

Description: This build step examines the build-time configuration to determine whether to register Ollama as a provider candidate:

  • Chat Model: If chat-model.enabled is not set or is true, produces ChatModelProviderCandidateBuildItem with provider name "ollama"
  • Embedding Model: If embedding-model.enabled is not set or is true, produces EmbeddingModelProviderCandidateBuildItem with provider name "ollama"

These build items participate in the provider selection mechanism, allowing the Quarkus LangChain4j extension to choose Ollama as the active provider for models.
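A user can withdraw Ollama from candidacy via build-time configuration. A minimal sketch, assuming the usual `quarkus.langchain4j.ollama` property prefix for `LangChain4jOllamaOpenAiBuildConfig`:

```properties
# Both flags default to enabled when unset; setting false prevents the
# corresponding candidate build item from being produced.
quarkus.langchain4j.ollama.chat-model.enabled=false
quarkus.langchain4j.ollama.embedding-model.enabled=false
```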

Implicit Configuration Detection

@BuildStep
public void implicitlyConfiguredProviders(
    LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
    BuildProducer<ImplicitlyUserConfiguredChatProviderBuildItem> producer
) {
    fixedRuntimeConfig.namedConfig().keySet().forEach(configName -> {
        producer.produce(new ImplicitlyUserConfiguredChatProviderBuildItem(configName, PROVIDER));
    });
}

Purpose: Detects and registers named Ollama configurations as implicitly configured providers.

Consumes: LangChain4jOllamaFixedRuntimeConfig

Produces: ImplicitlyUserConfiguredChatProviderBuildItem

Description: This build step examines the fixed runtime configuration to discover named Ollama instances. For each named configuration found (e.g., quarkus.langchain4j.ollama.my-instance.*), it produces an ImplicitlyUserConfiguredChatProviderBuildItem indicating that the user has implicitly configured a provider through named configuration.

This enables the provider selection mechanism to recognize and handle named configurations automatically.
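For illustration, a hypothetical named configuration whose mere presence would trigger this build step (the property names are a sketch based on the `quarkus.langchain4j.ollama.<name>.*` pattern described above):

```properties
# A named configuration "my-instance"; its presence alone is enough for
# implicitlyConfiguredProviders() to register Ollama for that config name.
quarkus.langchain4j.ollama.my-instance.base-url=http://localhost:11434
quarkus.langchain4j.ollama.my-instance.chat-model.model-id=llama3.2
```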

DevServices Support

@BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
public void devServicesSupport(
    List<SelectedChatModelProviderBuildItem> selectedChatModels,
    List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
    LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
    BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
    BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer
) {
    // For each selected chat model using Ollama provider
    for (var selected : selectedChatModels) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            String baseUrlProperty = String.format(
                "quarkus.langchain4j.ollama%s%s",
                NamedConfigUtil.isDefault(configName) ? "." : ("." + configName + "."),
                "base-url"
            );

            // Only need DevServices if base URL not configured or points to localhost
            if (canUseDevServices(baseUrlProperty)) {
                String modelId = NamedConfigUtil.isDefault(configName)
                    ? fixedRuntimeConfig.defaultConfig().chatModel().modelId()
                    : fixedRuntimeConfig.namedConfig().get(configName).chatModel().modelId();
                chatProducer.produce(
                    new DevServicesChatModelRequiredBuildItem(PROVIDER, modelId, baseUrlProperty)
                );
            }
        }
    }

    // For each selected embedding model using Ollama provider
    for (var selected : selectedEmbeddingModels) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            String baseUrlProperty = String.format(
                "quarkus.langchain4j.ollama%s%s",
                NamedConfigUtil.isDefault(configName) ? "." : ("." + configName + "."),
                "base-url"
            );

            // Only need DevServices if base URL not configured or points to localhost
            if (canUseDevServices(baseUrlProperty)) {
                String modelId = NamedConfigUtil.isDefault(configName)
                    ? fixedRuntimeConfig.defaultConfig().embeddingModel().modelId()
                    : fixedRuntimeConfig.namedConfig().get(configName).embeddingModel().modelId();
                embeddingProducer.produce(
                    new DevServicesEmbeddingModelRequiredBuildItem(PROVIDER, modelId, baseUrlProperty)
                );
            }
        }
    }
}

Conditions:

  • onlyIfNot = IsNormal.class - Only runs in development/test modes
  • onlyIf = Langchain4jDevServicesEnabled.class - Only runs if LangChain4j DevServices are enabled

Consumes:

  • List<SelectedChatModelProviderBuildItem>
  • List<SelectedEmbeddingModelCandidateBuildItem>
  • LangChain4jOllamaFixedRuntimeConfig

Produces:

  • DevServicesChatModelRequiredBuildItem (conditionally)
  • DevServicesEmbeddingModelRequiredBuildItem (conditionally)

Description: This build step determines which models need DevServices. For each selected model using the Ollama provider, it:

  1. Constructs the base URL property name
  2. Checks if DevServices can be used (base URL not configured or points to localhost)
  3. Retrieves the model ID from configuration
  4. Produces a build item indicating DevServices are required

The canUseDevServices(String baseUrlProperty) helper method checks if the base URL is unset or points to localhost.
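The property-name construction in steps 1–2 can be sketched as a standalone snippet. The `"<default>"` sentinel standing in for `NamedConfigUtil.isDefault` is an assumption for illustration; the real check lives in that utility class:

```java
public class BaseUrlPropertyDemo {

    // Hypothetical stand-in for NamedConfigUtil.isDefault: assumes the
    // default configuration is represented by the sentinel "<default>".
    static boolean isDefault(String configName) {
        return "<default>".equals(configName);
    }

    // Mirrors the String.format call in devServicesSupport.
    static String baseUrlProperty(String configName) {
        return String.format(
                "quarkus.langchain4j.ollama%s%s",
                isDefault(configName) ? "." : ("." + configName + "."),
                "base-url");
    }

    public static void main(String[] args) {
        // Default config -> quarkus.langchain4j.ollama.base-url
        System.out.println(baseUrlProperty("<default>"));
        // Named config -> quarkus.langchain4j.ollama.my-instance.base-url
        System.out.println(baseUrlProperty("my-instance"));
    }
}
```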

Synthetic Bean Generation

@BuildStep
@Record(ExecutionTime.RUNTIME_INIT)
void generateBeans(
    OllamaRecorder recorder,
    List<SelectedChatModelProviderBuildItem> selectedChatItem,
    List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbedding,
    BuildProducer<SyntheticBeanBuildItem> beanProducer
) {
    // Generate ChatModel and StreamingChatModel beans
    for (var selected : selectedChatItem) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();

            // ChatModel bean
            var builder = SyntheticBeanBuildItem
                .configure(CHAT_MODEL)
                .setRuntimeInit()
                .defaultBean()
                .scope(ApplicationScoped.class)
                .addInjectionPoint(ParameterizedType.create(
                    DotNames.CDI_INSTANCE,
                    new Type[] { ClassType.create(DotNames.CHAT_MODEL_LISTENER) },
                    null
                ))
                .createWith(recorder.chatModel(configName));
            addQualifierIfNecessary(builder, configName);
            beanProducer.produce(builder.done());

            // StreamingChatModel bean
            var streamingBuilder = SyntheticBeanBuildItem
                .configure(STREAMING_CHAT_MODEL)
                .setRuntimeInit()
                .defaultBean()
                .scope(ApplicationScoped.class)
                .addInjectionPoint(ParameterizedType.create(
                    DotNames.CDI_INSTANCE,
                    new Type[] { ClassType.create(DotNames.CHAT_MODEL_LISTENER) },
                    null
                ))
                .createWith(recorder.streamingChatModel(configName));
            addQualifierIfNecessary(streamingBuilder, configName);
            beanProducer.produce(streamingBuilder.done());
        }
    }

    // Generate EmbeddingModel beans
    for (var selected : selectedEmbedding) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            var builder = SyntheticBeanBuildItem
                .configure(EMBEDDING_MODEL)
                .setRuntimeInit()
                .defaultBean()
                .unremovable()
                .scope(ApplicationScoped.class)
                .supplier(recorder.embeddingModel(configName));
            addQualifierIfNecessary(builder, configName);
            beanProducer.produce(builder.done());
        }
    }
}

Execution Time: RUNTIME_INIT - The build step itself runs at build time; the bean-creation logic it records via the recorder executes during runtime initialization.

Consumes:

  • OllamaRecorder - Runtime recorder for bean creation
  • List<SelectedChatModelProviderBuildItem>
  • List<SelectedEmbeddingModelCandidateBuildItem>

Produces: SyntheticBeanBuildItem

Description: This build step creates synthetic CDI beans for the selected models. For each selected model using the Ollama provider:

Chat Models:

  • Creates a ChatModel bean with @ApplicationScoped scope
  • Adds injection point for Instance<ChatModelListener>
  • Uses recorder.chatModel(configName) to create the bean at runtime
  • Adds @ModelName qualifier for named configurations

Streaming Chat Models:

  • Creates a StreamingChatModel bean with @ApplicationScoped scope
  • Adds injection point for Instance<ChatModelListener>
  • Uses recorder.streamingChatModel(configName) to create the bean at runtime
  • Adds @ModelName qualifier for named configurations

Embedding Models:

  • Creates an EmbeddingModel bean with @ApplicationScoped scope
  • Marked as unremovable() to prevent removal during optimization
  • Uses recorder.embeddingModel(configName) to supply the bean at runtime
  • Adds @ModelName qualifier for named configurations

The addQualifierIfNecessary(builder, configName) helper method adds a @ModelName qualifier for non-default configurations.
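From the application's perspective, these synthetic beans are injected like any other CDI bean. A minimal sketch, assuming the standard LangChain4j model interfaces and the `@ModelName` qualifier described above (the class and field names are illustrative):

```
// Hypothetical application code consuming the synthetic beans.
@ApplicationScoped
public class ChatService {

    @Inject
    ChatModel defaultChat;            // default (unqualified) configuration

    @Inject
    @ModelName("my-instance")
    ChatModel namedChat;              // named configuration "my-instance"

    @Inject
    EmbeddingModel embeddingModel;    // marked unremovable, so never optimized away
}
```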

JSON-B Deprioritization

@BuildStep
public void deprioritizeJsonb(
    Capabilities capabilities,
    BuildProducer<MessageBodyReaderOverrideBuildItem> readerOverrideProducer,
    BuildProducer<MessageBodyWriterOverrideBuildItem> writerOverrideProducer
) {
    if (capabilities.isPresent(Capability.REST_CLIENT_REACTIVE_JSONB)) {
        readerOverrideProducer.produce(
            new MessageBodyReaderOverrideBuildItem(
                "org.jboss.resteasy.reactive.server.jsonb.JsonbMessageBodyReader",
                Priorities.APPLICATION + 1,
                true
            )
        );
        writerOverrideProducer.produce(
            new MessageBodyWriterOverrideBuildItem(
                "org.jboss.resteasy.reactive.server.jsonb.JsonbMessageBodyWriter",
                Priorities.APPLICATION + 1,
                true
            )
        );
    }
}

Consumes: Capabilities

Produces:

  • MessageBodyReaderOverrideBuildItem (conditionally)
  • MessageBodyWriterOverrideBuildItem (conditionally)

Description: This build step ensures Jackson is preferred over JSON-B for REST clients when both are present on the classpath. It checks for the REST_CLIENT_REACTIVE_JSONB capability and, if present, overrides the priority of JSON-B message body readers and writers to APPLICATION + 1, making them lower priority than Jackson (which uses APPLICATION priority).

This is necessary because the Ollama REST client uses Jackson-specific features, and having JSON-B selected would cause serialization issues.

Build Step Execution Order

Quarkus automatically determines the execution order of build steps based on their inputs and outputs. The typical execution order for Ollama processor build steps is:

  1. feature() - Early registration
  2. indexUpstreamOllamaModule() - Early indexing
  3. nativeSupport() - Reflection configuration
  4. providerCandidates() - Provider registration
  5. implicitlyConfiguredProviders() - Named configuration detection
  6. Provider selection (external to this processor)
  7. devServicesSupport() - DevServices requirement detection
  8. DevServices startup (external processor)
  9. generateBeans() - Runtime bean creation
  10. deprioritizeJsonb() - Serialization configuration

Helper Methods

canUseDevServices

private boolean canUseDevServices(String baseUrlProperty) {
    SmallRyeConfig smallRyeConfig = ConfigProvider.getConfig()
        .unwrap(SmallRyeConfig.class);
    ConfigValue configValue = smallRyeConfig.getConfigValue(baseUrlProperty);
    if (configValue.getValue() == null) {
        return true;
    }
    return configValue.getValue().startsWith("http://localhost");
}

Purpose: Determines if DevServices can be used for a given base URL property.

Returns: true if DevServices can be used, false otherwise

Logic:

  • Returns true if the base URL is not configured
  • Returns true if the base URL starts with "http://localhost"
  • Returns false otherwise
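The decision logic, stripped of the SmallRyeConfig lookup, can be sketched as a pure function over the resolved value:

```java
public class DevServicesCheckDemo {

    // Simplified version of canUseDevServices: operates directly on the
    // resolved base-url value instead of querying SmallRyeConfig.
    static boolean canUseDevServices(String baseUrlValue) {
        return baseUrlValue == null || baseUrlValue.startsWith("http://localhost");
    }

    public static void main(String[] args) {
        System.out.println(canUseDevServices(null));                         // true: unset, DevServices may start a container
        System.out.println(canUseDevServices("http://localhost:11434"));     // true: points at a local Ollama
        System.out.println(canUseDevServices("http://ollama.internal:80"));  // false: user runs their own instance
    }
}
```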

addQualifierIfNecessary

private void addQualifierIfNecessary(
    SyntheticBeanBuildItem.ExtendedBeanConfigurator builder,
    String configName
) {
    if (!NamedConfigUtil.isDefault(configName)) {
        builder.addQualifier(
            AnnotationInstance.builder(ModelName.class)
                .add("value", configName)
                .build()
        );
    }
}

Purpose: Adds a @ModelName qualifier to a bean for named configurations.

Parameters:

  • builder - Bean configurator to add qualifier to
  • configName - Configuration name

Logic:

  • Checks if the configuration name is the default (unnamed) configuration
  • If not default, adds a @ModelName(configName) qualifier to the bean
  • Default configurations have no qualifier, making them the default injectable bean

Notes

  • Build steps run during application build, not at runtime
  • The order of build steps is determined by Quarkus based on dependencies
  • Build steps can be conditional based on launch mode, configuration, or capabilities
  • Synthetic beans are created programmatically and injected into application code
  • The recorder is used to defer actual bean creation to runtime initialization
  • Build items communicate information between different processors in the build chain

Install with Tessl CLI

npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment@1.7.0
