Quarkus extension deployment module for integrating Ollama LLM models with Quarkus applications through the LangChain4j framework.
The Ollama DevServices functionality automatically starts an Ollama container during development and testing, eliminating the need for manual Ollama installation and configuration. This feature is part of Quarkus DevServices, which provides zero-configuration development experiences.
```java
package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.util.OptionalInt;

import io.quarkus.runtime.annotations.ConfigGroup;
import io.smallrye.config.WithDefault;

@ConfigGroup
public interface OllamaDevServicesBuildConfig {

    /**
     * Default docker image name.
     */
    String OLLAMA_IMAGE = "ollama/ollama:latest";

    /**
     * If Dev Services for Ollama has been explicitly enabled or disabled.
     * Dev Services are generally enabled by default, unless there is an
     * existing configuration present.
     */
    @WithDefault("true")
    boolean enabled();

    /**
     * The Ollama container image to use.
     */
    @WithDefault(OLLAMA_IMAGE)
    String imageName();

    /**
     * The port that the dev service should be exposed on.
     * Default: A random free port.
     */
    OptionalInt port();
}
```

Configuration Prefix: `quarkus.langchain4j.ollama.devservices`
Methods:
- enabled() - Whether DevServices are enabled (default: true)
- imageName() - Docker image to use (default: "ollama/ollama:latest")
- port() - Optional fixed port for the service (default: random free port)

Constant:
- OLLAMA_IMAGE - Default docker image name constant

quarkus.langchain4j.ollama.devservices.enabled=true
Type: boolean
Default: true
Description: Controls whether DevServices should automatically start an Ollama container. DevServices are enabled by default, but will not start if they have been explicitly disabled or if an existing Ollama configuration (such as a base URL) is already present.
quarkus.langchain4j.ollama.devservices.image-name=ollama/ollama:latest
Type: String
Default: ollama/ollama:latest
Description: Specifies the Docker image to use for the Ollama container. You can specify a different version or a custom image if needed.
quarkus.langchain4j.ollama.devservices.port=11434
Type: OptionalInt
Default: (random free port)
Description: Specifies a fixed port for the DevServices container. If not set, a random free port is selected. Setting a fixed port is useful when external tools or scripts need a stable, predictable endpoint across restarts.
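The fixed-or-random port rule can be sketched with plain JDK calls. Here `resolvePort` and `findRandomFreePort` are illustrative helpers, not part of the extension's API; binding a `ServerSocket` to port 0 is a common way to let the OS pick a free port:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.util.OptionalInt;

public class PortSelection {

    // Hypothetical helper mirroring the documented rule: use the configured
    // port when present, otherwise fall back to a random free port.
    static int resolvePort(OptionalInt configured) {
        return configured.orElseGet(PortSelection::findRandomFreePort);
    }

    // Binding to port 0 asks the OS for any free port.
    static int findRandomFreePort() {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(resolvePort(OptionalInt.of(11434))); // fixed port wins
        System.out.println(resolvePort(OptionalInt.empty()) > 0); // random free port
    }
}
```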
The OllamaDevServicesProcessor manages the DevServices lifecycle.
```java
package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.util.List;
import java.util.Optional;

import io.quarkus.deployment.IsNormal;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.annotations.BuildSteps;
import io.quarkus.deployment.builditem.*;
import io.quarkus.deployment.console.ConsoleInstalledBuildItem;
import io.quarkus.deployment.console.StartupLogCompressor;
import io.quarkus.deployment.dev.devservices.DevServicesConfig;
import io.quarkus.deployment.logging.LoggingSetupBuildItem;
import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
import io.quarkiverse.langchain4j.deployment.items.*;
import io.quarkiverse.langchain4j.ollama.deployment.LangChain4jOllamaOpenAiBuildConfig;

@BuildSteps(onlyIfNot = IsNormal.class, onlyIf = DevServicesConfig.Enabled.class)
public class OllamaDevServicesProcessor {

    public static final String FEATURE = "langchain4j-ollama-dev-service";
    public static final String PROVIDER = "ollama";
    static final String DEV_SERVICE_LABEL = "quarkus-dev-service-ollama";

    static volatile DevServicesResultBuildItem.RunningDevService devService;
    static volatile OllamaDevServicesBuildConfig cfg;
    static volatile boolean first = true;

    @BuildStep
    FeatureBuildItem feature() {
        return new FeatureBuildItem(FEATURE);
    }

    // Signature only - implementation omitted
    @BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
    public void startOllamaDevService(
            DockerStatusBuildItem dockerStatusBuildItem,
            LaunchModeBuildItem launchMode,
            LangChain4jOllamaOpenAiBuildConfig ollamaBuildConfig,
            Optional<ConsoleInstalledBuildItem> consoleInstalledBuildItem,
            LoggingSetupBuildItem loggingSetupBuildItem,
            List<DevServicesSharedNetworkBuildItem> devServicesSharedNetworkBuildItem,
            List<DevServicesChatModelRequiredBuildItem> devServicesChatModels,
            List<DevServicesEmbeddingModelRequiredBuildItem> devServicesEmbeddingModels,
            BuildProducer<DevServicesOllamaConfigBuildItem> ollamaDevServicesBuildItemBuildProducer,
            BuildProducer<DevServicesResultBuildItem> devServicesResultProducer);
}
```

Constants:
- FEATURE - Feature name for DevServices ("langchain4j-ollama-dev-service")
- PROVIDER - Provider identifier ("ollama")
- DEV_SERVICE_LABEL - Label for shared dev service containers ("quarkus-dev-service-ollama")

Static Fields (Lifecycle Management):
- devService (static volatile DevServicesResultBuildItem.RunningDevService) - Currently running DevService instance. Marked volatile to ensure visibility across threads during hot-reload scenarios in development mode.
- cfg (static volatile OllamaDevServicesBuildConfig) - Current DevServices configuration. Used to detect configuration changes that require a container restart. Marked volatile for thread-safe configuration comparison.
- first (static volatile boolean) - Initialization flag tracking the first run for cleanup hook registration. Ensures cleanup hooks are registered only once per ClassLoader lifecycle.

DevServices are activated only when ALL of the following conditions are met:
- The DevServicesConfig.Enabled condition is true
- The Langchain4jDevServicesEnabled condition is true

On live reload, the new OllamaDevServicesBuildConfig is compared with the stored configuration using the equals() method; the comparison checks whether the enabled(), imageName(), or port() values have changed, and a change triggers a container restart.

The OllamaContainer class wraps Testcontainers' Ollama container:
```java
package io.quarkiverse.langchain4j.ollama.deployment.devservices;

import java.time.Duration;
import java.util.Map;
import java.util.OptionalInt;

import org.testcontainers.utility.DockerImageName;

// Java has no import aliasing, so the Testcontainers base class with the
// same simple name is referenced by its fully qualified name.
public class OllamaContainer extends org.testcontainers.ollama.OllamaContainer {

    public static final String CONFIG_OLLAMA_PORT = "langchain4j-ollama-dev-service.ollama.port";
    public static final String CONFIG_OLLAMA_HTTP_SERVER = "langchain4j-ollama-dev-service.ollama.host";
    public static final String CONFIG_OLLAMA_ENDPOINT = "langchain4j-ollama-dev-service.ollama.endpoint";
    public static final int DEFAULT_OLLAMA_PORT = 11434;

    // Package-private constructor - not intended for direct instantiation.
    // Only instantiated by OllamaDevServicesProcessor.
    OllamaContainer(OllamaDevServicesBuildConfig config, boolean useSharedNetwork);

    public Map<String, String> getExposedConfig();

    public int getPort();

    public String getHost(); // Overrides parent class method
}
```

Constants:
- CONFIG_OLLAMA_PORT - Configuration key for the Ollama port
- CONFIG_OLLAMA_HTTP_SERVER - Configuration key for the Ollama host
- CONFIG_OLLAMA_ENDPOINT - Configuration key for the Ollama endpoint
- DEFAULT_OLLAMA_PORT - Default Ollama port (11434)

Constructor:
- OllamaContainer(OllamaDevServicesBuildConfig, boolean) - Package-private constructor, instantiated only by OllamaDevServicesProcessor. Do not instantiate directly; use the DevServices configuration instead.

Methods:
- getExposedConfig() - Returns a configuration map with port, host, endpoint, and environment variables. Includes the getEndpoint() value inherited from the parent Testcontainers class.
- getPort() - Returns the exposed port (fixed or mapped from configuration)
- getHost() - Returns the host name (shared network hostname or container host). Overrides the parent class method.

The DevServices container automatically binds the local Ollama directory for model caching:
- The OLLAMA_MODELS environment variable
- The host's ~/.ollama directory, mounted as /root/.ollama in the container
- READ_WRITE mode, so that model downloads persist

This ensures that downloaded models are cached locally and shared between container restarts and multiple applications.
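The host-side cache location can be sketched with plain JDK calls. The `resolve` helper is hypothetical, and treating OLLAMA_MODELS as an override of the default ~/.ollama directory is an assumption based on the description above:

```java
import java.nio.file.Path;

public class ModelCacheDir {

    // Hypothetical helper: prefer the OLLAMA_MODELS directory when set,
    // otherwise fall back to ~/.ollama under the user's home directory.
    static Path resolve(String ollamaModelsEnv, String userHome) {
        if (ollamaModelsEnv != null && !ollamaModelsEnv.isBlank()) {
            return Path.of(ollamaModelsEnv);
        }
        return Path.of(userHome, ".ollama");
    }

    public static void main(String[] args) {
        System.out.println(resolve(null, "/home/dev"));           // default location
        System.out.println(resolve("/data/models", "/home/dev")); // env override
    }
}
```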
Cleanup is wired through the ClassLoader: a close task is registered via ((QuarkusClassLoader) cl.parent()).addCloseTask(closeTask). This ensures the container is shut down when the ClassLoader is closed during application shutdown or reload:
- The close() method is called via the RunningDevService.close() interface
- The static fields (devService, cfg, first) are reset for the next run

DevServices can use a shared Docker network when multiple Quarkus applications with DevServices are running:
- A DevServicesSharedNetworkBuildItem is present in the build chain
- The container calls ConfigureUtil.configureSharedNetwork(this, "ollama") to join a shared Docker network named "quarkus-shared-network"
- Shared containers carry the quarkus-dev-service-ollama label

When DevServices starts successfully, it injects configuration into the application:
```java
var devServiceConfig = new LinkedHashMap<>(devService.getConfig());
modelBaseUrlKeys.forEach(baseUrlKey ->
        devServiceConfig.put(baseUrlKey, devServiceConfig.get(CONFIG_OLLAMA_ENDPOINT)));
```

This updates all model base URL properties to point to the DevServices container endpoint, ensuring that the application connects to the correct Ollama instance.
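The injection step can be sketched as a self-contained program; the endpoint value and the named-model key `quarkus.langchain4j.ollama.m2.base-url` are illustrative, not values the extension guarantees:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class EndpointInjection {

    static final String CONFIG_OLLAMA_ENDPOINT = "langchain4j-ollama-dev-service.ollama.endpoint";

    public static void main(String[] args) {
        // Config exposed by the running dev service (endpoint value is illustrative)
        Map<String, String> devServiceConfig = new LinkedHashMap<>(
                Map.of(CONFIG_OLLAMA_ENDPOINT, "http://localhost:32771"));

        // Base-url keys collected for the models that requested DevServices
        List<String> modelBaseUrlKeys = List.of(
                "quarkus.langchain4j.ollama.base-url",
                "quarkus.langchain4j.ollama.m2.base-url");

        // Point every model's base URL at the dev service endpoint
        modelBaseUrlKeys.forEach(baseUrlKey ->
                devServiceConfig.put(baseUrlKey, devServiceConfig.get(CONFIG_OLLAMA_ENDPOINT)));

        System.out.println(devServiceConfig.get("quarkus.langchain4j.ollama.base-url"));
    }
}
```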
No configuration is needed - DevServices start automatically in development mode:

```properties
# No DevServices configuration needed - uses defaults
# Container will use ollama/ollama:latest on a random port
```

To pin a specific image version and port:

```properties
quarkus.langchain4j.ollama.devservices.image-name=ollama/ollama:0.1.23
quarkus.langchain4j.ollama.devservices.port=11434
```

To disable DevServices entirely:

```properties
quarkus.langchain4j.ollama.devservices.enabled=false
```

Use this when you already run Ollama yourself or connect to a remote instance.
```properties
# Default instance uses DevServices
quarkus.langchain4j.ollama.devservices.enabled=true

# Named instance connects to remote Ollama
quarkus.langchain4j.ollama.remote.base-url=https://ollama.example.com
```

The OllamaProcessor.devServicesSupport() build step determines which models need DevServices:
```java
@BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
public void devServicesSupport(
        List<SelectedChatModelProviderBuildItem> selectedChatModels,
        List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
        LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
        BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
        BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer);
```

This build step:
- Inspects the selected chat and embedding model providers
- Produces a DevServicesChatModelRequiredBuildItem or DevServicesEmbeddingModelRequiredBuildItem for each model that needs DevServices

The DevServices integration includes Dev UI support through OllamaDevUiProcessor:
```java
package io.quarkiverse.langchain4j.ollama.deployment.devui;

import java.util.Map;

import io.quarkiverse.langchain4j.deployment.devui.AdditionalDevUiCardBuildItem;
import io.quarkus.deployment.IsDevelopment;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;

public final class OllamaDevUiProcessor {

    @BuildStep(onlyIf = IsDevelopment.class)
    void registerOpenWebUiCard(BuildProducer<AdditionalDevUiCardBuildItem> producer) {
        producer.produce(new AdditionalDevUiCardBuildItem(
                "Open WebUI",
                "font-awesome-solid:globe",
                "qwc-open-webui.js",
                Map.of("envVarMappings", Map.of("OLLAMA_BASE_URL", "quarkus.langchain4j.ollama.base-url"))));
    }
}
```

This adds an "Open WebUI" card to the Quarkus Dev UI in development mode, providing quick access to the Ollama web interface with the correct base URL.
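The envVarMappings entry links a container environment variable to a Quarkus configuration property. A minimal sketch of how such a mapping could be resolved against the application's config; the `resolve` helper is hypothetical, not the Dev UI's actual API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EnvVarMapping {

    // Illustrative helper: turn { ENV_VAR -> property key } into
    // { ENV_VAR -> property value } using the application's configuration.
    static Map<String, String> resolve(Map<String, String> envVarMappings,
                                       Map<String, String> appConfig) {
        Map<String, String> resolved = new LinkedHashMap<>();
        envVarMappings.forEach((envVar, propertyKey) ->
                resolved.put(envVar, appConfig.get(propertyKey)));
        return resolved;
    }

    public static void main(String[] args) {
        Map<String, String> mappings =
                Map.of("OLLAMA_BASE_URL", "quarkus.langchain4j.ollama.base-url");
        Map<String, String> appConfig =
                Map.of("quarkus.langchain4j.ollama.base-url", "http://localhost:11434");
        System.out.println(resolve(mappings, appConfig).get("OLLAMA_BASE_URL"));
    }
}
```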
Install with Tessl CLI
npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment