tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment

Quarkus extension deployment module for integrating Ollama LLM models with Quarkus applications through the LangChain4j framework


docs/devservices.md

DevServices Configuration

The Ollama DevServices functionality automatically starts an Ollama container during development and testing, eliminating the need for manual Ollama installation and configuration. This feature is part of Quarkus DevServices, which provides zero-configuration development experiences.

DevServices Configuration Interface

package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.util.OptionalInt;
import io.quarkus.runtime.annotations.ConfigGroup;
import io.smallrye.config.WithDefault;

@ConfigGroup
public interface OllamaDevServicesBuildConfig {
    /**
     * Default docker image name.
     */
    String OLLAMA_IMAGE = "ollama/ollama:latest";

    /**
     * If Dev Services for Ollama has been explicitly enabled or disabled.
     * Dev Services are generally enabled by default, unless there is an
     * existing configuration present.
     */
    @WithDefault("true")
    boolean enabled();

    /**
     * The Ollama container image to use.
     */
    @WithDefault(OLLAMA_IMAGE)
    String imageName();

    /**
     * The port that the dev service should be exposed on.
     * Default: A random free port.
     */
    OptionalInt port();
}

Configuration Prefix: quarkus.langchain4j.ollama.devservices

Methods:

  • enabled() - Whether DevServices are enabled (default: true)
  • imageName() - Docker image to use (default: "ollama/ollama:latest")
  • port() - Optional fixed port for the service (default: random free port)

Constant:

  • OLLAMA_IMAGE - Default docker image name constant

Configuration Properties

DevServices Enabled

quarkus.langchain4j.ollama.devservices.enabled=true

Type: boolean

Default: true

Description: Controls whether DevServices should automatically start an Ollama container. DevServices are enabled by default but will not start if:

  • The application is running in production mode
  • A base URL is explicitly configured (and doesn't point to localhost)
  • Ollama is already running on localhost:11434
  • Docker is not available

Container Image Name

quarkus.langchain4j.ollama.devservices.image-name=ollama/ollama:latest

Type: String

Default: ollama/ollama:latest

Description: Specifies the Docker image to use for the Ollama container. You can specify a different version or custom image if needed.

Fixed Port

quarkus.langchain4j.ollama.devservices.port=11434

Type: OptionalInt

Default: (random free port)

Description: Specifies a fixed port for the DevServices container. If not set, a random free port is selected. Setting a fixed port is useful for:

  • Consistent development environments
  • Integration with external tools
  • Debugging and testing

DevServices Processor

The OllamaDevServicesProcessor manages the DevServices lifecycle.

package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.util.List;
import java.util.Optional;
import io.quarkus.deployment.IsNormal;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.annotations.BuildSteps;
import io.quarkus.deployment.builditem.*;
import io.quarkus.deployment.console.ConsoleInstalledBuildItem;
import io.quarkus.deployment.console.StartupLogCompressor;
import io.quarkus.deployment.dev.devservices.DevServicesConfig;
import io.quarkus.deployment.logging.LoggingSetupBuildItem;
import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
import io.quarkiverse.langchain4j.deployment.items.*;
import io.quarkiverse.langchain4j.ollama.deployment.LangChain4jOllamaOpenAiBuildConfig;

@BuildSteps(onlyIfNot = IsNormal.class, onlyIf = DevServicesConfig.Enabled.class)
public class OllamaDevServicesProcessor {
    public static final String FEATURE = "langchain4j-ollama-dev-service";
    public static final String PROVIDER = "ollama";
    static final String DEV_SERVICE_LABEL = "quarkus-dev-service-ollama";

    static volatile DevServicesResultBuildItem.RunningDevService devService;
    static volatile OllamaDevServicesBuildConfig cfg;
    static volatile boolean first = true;

    @BuildStep
    FeatureBuildItem feature() {
        return new FeatureBuildItem(FEATURE);
    }

    @BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
    public void startOllamaDevService(
        DockerStatusBuildItem dockerStatusBuildItem,
        LaunchModeBuildItem launchMode,
        LangChain4jOllamaOpenAiBuildConfig ollamaBuildConfig,
        Optional<ConsoleInstalledBuildItem> consoleInstalledBuildItem,
        LoggingSetupBuildItem loggingSetupBuildItem,
        List<DevServicesSharedNetworkBuildItem> devServicesSharedNetworkBuildItem,
        List<DevServicesChatModelRequiredBuildItem> devServicesChatModels,
        List<DevServicesEmbeddingModelRequiredBuildItem> devServicesEmbeddingModels,
        BuildProducer<DevServicesOllamaConfigBuildItem> ollamaDevServicesBuildItemBuildProducer,
        BuildProducer<DevServicesResultBuildItem> devServicesResultProducer
    );
}

Constants:

  • FEATURE - Feature name for DevServices ("langchain4j-ollama-dev-service")
  • PROVIDER - Provider identifier ("ollama")
  • DEV_SERVICE_LABEL - Label for shared dev service containers ("quarkus-dev-service-ollama")

Static Fields (Lifecycle Management):

  • devService (static volatile DevServicesResultBuildItem.RunningDevService) - Currently running DevService instance. Marked volatile to ensure visibility across threads during hot-reload scenarios in development mode.
  • cfg (static volatile OllamaDevServicesBuildConfig) - Current DevServices configuration. Used to detect configuration changes that require container restart. Marked volatile for thread-safe configuration comparison.
  • first (static volatile boolean) - Initialization flag tracking first run for cleanup hook registration. Ensures cleanup hooks are registered only once per ClassLoader lifecycle.

DevServices Activation Conditions

DevServices are activated only when ALL of the following conditions are met:

  1. Not Production Mode: The application is not running in normal (production) mode
  2. DevServices Enabled: DevServicesConfig.Enabled condition is true
  3. LangChain4j DevServices Enabled: Langchain4jDevServicesEnabled condition is true
  4. Configuration Check: Either:
    • No base URL is configured, OR
    • The configured base URL points to localhost
  5. Docker Available: Docker runtime is available and working
  6. Ollama Not Running: Ollama is not already running on localhost:11434
  7. Model Selected: At least one chat or embedding model has selected Ollama as its provider
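Condition 6 can be verified with a plain TCP probe against the default port. The sketch below is illustrative only (the processor may use a different detection mechanism); `OllamaProbe` and `isListening` are hypothetical names:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class OllamaProbe {
    /** Returns true if something is already accepting connections on host:port. */
    public static boolean isListening(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false; // nothing listening, or connection refused/timed out
        }
    }

    public static void main(String[] args) {
        // If a local Ollama answers on the default port, DevServices would skip startup
        boolean running = isListening("localhost", 11434, 250);
        System.out.println("Ollama already running locally: " + running);
    }
}
```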

DevServices Lifecycle

Startup Process

  1. Check Existing Instance: If an Ollama instance is already running on localhost:11434, DevServices will not start
  2. Configuration Comparison: If DevServices was previously started, the new OllamaDevServicesBuildConfig is compared with the stored configuration using the equals() method. The comparison checks if enabled(), imageName(), or port() values have changed.
  3. Shutdown if Changed: If configuration changed (configurations are not equal), shut down the existing container to allow restart with new settings
  4. Collect Requirements: Gather all base URL properties that need DevServices
  5. Start Container: Create and start the Ollama container using the configured image and port
  6. Configure Application: Update all model base URL properties to point to the container endpoint
  7. Register Cleanup: Set up shutdown hooks to clean up the container

Container Configuration

The OllamaContainer class wraps Testcontainers' Ollama container:

package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.time.Duration;
import java.util.Map;
import java.util.OptionalInt;
import org.testcontainers.utility.DockerImageName;

// Extends the Testcontainers Ollama container via its fully qualified name,
// since Java does not support import aliasing
public class OllamaContainer extends org.testcontainers.ollama.OllamaContainer {
    public static final String CONFIG_OLLAMA_PORT = "langchain4j-ollama-dev-service.ollama.port";
    public static final String CONFIG_OLLAMA_HTTP_SERVER = "langchain4j-ollama-dev-service.ollama.host";
    public static final String CONFIG_OLLAMA_ENDPOINT = "langchain4j-ollama-dev-service.ollama.endpoint";
    public static final int DEFAULT_OLLAMA_PORT = 11434;

    // Package-private constructor - not intended for direct instantiation
    // Only instantiated by OllamaDevServicesProcessor
    OllamaContainer(OllamaDevServicesBuildConfig config, boolean useSharedNetwork);

    public Map<String, String> getExposedConfig();
    public int getPort();
    public String getHost();  // Overrides parent class method
}

Constants:

  • CONFIG_OLLAMA_PORT - Configuration key for Ollama port
  • CONFIG_OLLAMA_HTTP_SERVER - Configuration key for Ollama host
  • CONFIG_OLLAMA_ENDPOINT - Configuration key for Ollama endpoint
  • DEFAULT_OLLAMA_PORT - Default Ollama port (11434)

Constructor:

  • OllamaContainer(OllamaDevServicesBuildConfig, boolean) - Package-private constructor, instantiated only by OllamaDevServicesProcessor. Do not instantiate directly; use DevServices configuration instead.

Methods:

  • getExposedConfig() - Returns configuration map with port, host, endpoint, and environment variables. Includes the getEndpoint() value inherited from parent Testcontainers class.
  • getPort() - Returns the exposed port (fixed or mapped from configuration)
  • getHost() - Returns the host name (shared network hostname or container host). Overrides parent class method.

Directory Binding

The DevServices container automatically binds the local Ollama directory for model caching:

  1. Checks for OLLAMA_MODELS environment variable
  2. Falls back to ~/.ollama directory
  3. Creates the directory if it doesn't exist
  4. Binds it to /root/.ollama in the container
  5. Uses READ_WRITE mode to allow model downloads to persist

This ensures that downloaded models are cached locally and shared between container restarts and multiple applications.
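The lookup order in steps 1 and 2 can be sketched as a small pure function; `ModelsDirResolver` and `resolveModelsDir` are illustrative names, not part of the extension's API:

```java
import java.nio.file.Path;
import java.util.Map;

public class ModelsDirResolver {
    /**
     * Resolves the host directory to bind to /root/.ollama in the container:
     * OLLAMA_MODELS if set, otherwise ~/.ollama.
     */
    static Path resolveModelsDir(Map<String, String> env, String userHome) {
        String fromEnv = env.get("OLLAMA_MODELS");
        if (fromEnv != null && !fromEnv.isBlank()) {
            return Path.of(fromEnv);
        }
        return Path.of(userHome, ".ollama");
    }

    public static void main(String[] args) {
        Path dir = resolveModelsDir(System.getenv(), System.getProperty("user.home"));
        System.out.println("Would bind " + dir + " -> /root/.ollama (READ_WRITE)");
    }
}
```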

Shutdown Process

  1. Cleanup Trigger: Container cleanup is triggered when:
    • The ClassLoader is closed (via registered close task)
    • The application stops
    • The configuration changes (triggering restart)
  2. Close Task Registration: On first DevServices startup, a cleanup hook is registered with the parent QuarkusClassLoader using ((QuarkusClassLoader) cl.parent()).addCloseTask(closeTask). This ensures the container is shut down when the ClassLoader is closed during application shutdown or reload.
  3. Close Container: The container's close() method is called via the RunningDevService.close() interface
  4. Resource Cleanup: All container resources (ports, volumes, network) are released
  5. Reset State: Static fields (devService, cfg, first) are reset for next run
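The `first`-flag guard from step 2 and the state reset from step 5 can be modeled in isolation. This is a simplified stand-in for `QuarkusClassLoader.addCloseTask` using a local registry (`CleanupRegistry` is a hypothetical name):

```java
import java.util.ArrayList;
import java.util.List;

public class CleanupRegistry {
    private static volatile boolean first = true;
    private static final List<Runnable> closeTasks = new ArrayList<>();

    /** Registers the close task only on the first startup, mirroring the 'first' flag. */
    static synchronized void registerOnce(Runnable closeTask) {
        if (first) {
            closeTasks.add(closeTask);
            first = false;
        }
    }

    /** Runs all close tasks and resets static state for the next run. */
    static synchronized void runCloseTasks() {
        closeTasks.forEach(Runnable::run);
        closeTasks.clear();
        first = true;
    }
}
```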

Shared Network Support

DevServices can use a shared Docker network when multiple Quarkus applications with DevServices are running:

  1. Network Detection: Checks for DevServicesSharedNetworkBuildItem in the build chain
  2. Network Configuration: Calls ConfigureUtil.configureSharedNetwork(this, "ollama") to join the container to a shared Docker network named "quarkus-shared-network"
  3. Service Discovery: Other applications can discover the running Ollama service via the label quarkus-dev-service-ollama
  4. Resource Sharing: Multiple applications share the same Ollama container, reducing memory and startup time
  5. Network Isolation: Shared network allows containers to communicate via service names rather than IP addresses

Configuration Injection

When DevServices starts successfully, it injects configuration into the application:

var devServiceConfig = new LinkedHashMap<>(devService.getConfig());
modelBaseUrlKeys.forEach(baseUrlKey ->
    devServiceConfig.put(baseUrlKey, devServiceConfig.get(CONFIG_OLLAMA_ENDPOINT))
);

This updates all model base URL properties to point to the DevServices container endpoint, ensuring that the application connects to the correct Ollama instance.
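The remapping shown above can be exercised in isolation with plain maps standing in for the dev service config (`ConfigInjection` and `inject` are illustrative names; the endpoint value is an example mapped port):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ConfigInjection {
    static final String CONFIG_OLLAMA_ENDPOINT = "langchain4j-ollama-dev-service.ollama.endpoint";

    /** Points every model's base-url property at the dev service endpoint. */
    static Map<String, String> inject(Map<String, String> devServiceConfig,
                                      List<String> modelBaseUrlKeys) {
        var merged = new LinkedHashMap<>(devServiceConfig);
        modelBaseUrlKeys.forEach(key ->
            merged.put(key, merged.get(CONFIG_OLLAMA_ENDPOINT)));
        return merged;
    }

    public static void main(String[] args) {
        var cfg = Map.of(CONFIG_OLLAMA_ENDPOINT, "http://localhost:32768");
        var result = inject(cfg, List.of(
            "quarkus.langchain4j.ollama.base-url",
            "quarkus.langchain4j.ollama.remote.base-url"));
        System.out.println(result.get("quarkus.langchain4j.ollama.base-url"));
        // -> http://localhost:32768
    }
}
```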

Usage Examples

Basic DevServices (Default)

No configuration needed - DevServices automatically start in development mode:

# No DevServices configuration needed - uses defaults
# Container will use ollama/ollama:latest on random port

Custom Image Version

quarkus.langchain4j.ollama.devservices.image-name=ollama/ollama:0.1.23

Fixed Port

quarkus.langchain4j.ollama.devservices.port=11434

Disable DevServices

quarkus.langchain4j.ollama.devservices.enabled=false

Use this when:

  • You have Ollama installed locally and want to use it
  • You're connecting to a remote Ollama instance
  • You're running in production mode

Named Configuration with DevServices

# Default instance uses DevServices
quarkus.langchain4j.ollama.devservices.enabled=true

# Named instance connects to remote Ollama
quarkus.langchain4j.ollama.remote.base-url=https://ollama.example.com

DevServices Requirements Detection

The OllamaProcessor.devServicesSupport() build step determines which models need DevServices:

@BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
public void devServicesSupport(
    List<SelectedChatModelProviderBuildItem> selectedChatModels,
    List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
    LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
    BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
    BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer
);

This build step:

  1. Examines all selected chat and embedding models
  2. Checks if each model uses the "ollama" provider
  3. Determines the base URL property for each model
  4. Checks if the base URL is not configured or points to localhost
  5. Produces DevServicesChatModelRequiredBuildItem or DevServicesEmbeddingModelRequiredBuildItem for models that need DevServices
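The base-URL check in step 4 amounts to: no URL configured, or a URL whose host is local. A minimal sketch under those assumptions (`DevServiceRequirement` is a hypothetical name, and the build step's exact host handling may differ):

```java
import java.net.URI;
import java.util.Optional;
import java.util.Set;

public class DevServiceRequirement {
    private static final Set<String> LOCAL_HOSTS = Set.of("localhost", "127.0.0.1");

    /**
     * A model needs DevServices when no base URL is configured,
     * or when the configured URL points at localhost.
     */
    static boolean needsDevService(Optional<String> baseUrl) {
        return baseUrl
            .map(url -> LOCAL_HOSTS.contains(URI.create(url).getHost()))
            .orElse(true);
    }

    public static void main(String[] args) {
        System.out.println(needsDevService(Optional.empty()));                          // true
        System.out.println(needsDevService(Optional.of("http://localhost:11434")));     // true
        System.out.println(needsDevService(Optional.of("https://ollama.example.com"))); // false
    }
}
```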

Dev UI Integration

The DevServices integration includes Dev UI support through OllamaDevUiProcessor:

package io.quarkiverse.langchain4j.ollama.deployment.devui;

import java.util.Map;
import io.quarkiverse.langchain4j.deployment.devui.AdditionalDevUiCardBuildItem;
import io.quarkus.deployment.IsDevelopment;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;

public final class OllamaDevUiProcessor {
    @BuildStep(onlyIf = IsDevelopment.class)
    void registerOpenWebUiCard(BuildProducer<AdditionalDevUiCardBuildItem> producer) {
        producer.produce(new AdditionalDevUiCardBuildItem(
            "Open WebUI",
            "font-awesome-solid:globe",
            "qwc-open-webui.js",
            Map.of("envVarMappings", Map.of("OLLAMA_BASE_URL", "quarkus.langchain4j.ollama.base-url"))
        ));
    }
}

This adds an "Open WebUI" card to the Quarkus Dev UI in development mode, providing quick access to the Ollama web interface with the correct base URL.

Notes

  • DevServices only run in development and test modes, never in production
  • The container automatically shuts down when the application stops
  • Downloaded models are persisted in ~/.ollama (or $OLLAMA_MODELS directory)
  • Multiple applications can share the same DevServices container via the shared network feature
  • DevServices will not start if Ollama is already running locally on port 11434
  • The startup timeout is 1 minute to allow time for the container to initialize
  • DevServices use Testcontainers internally, which requires Docker or a compatible container runtime

Install with Tessl CLI

npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment
