Quarkus extension deployment module for integrating Ollama LLM models with Quarkus applications through the LangChain4j framework.
The OllamaProcessor class contains build steps that participate in the Quarkus build chain. These methods are annotated with @BuildStep and execute during the application build to configure Ollama integration, register providers, and create runtime beans.
```java
package io.quarkiverse.langchain4j.ollama.deployment;

import java.util.List;

import jakarta.enterprise.context.ApplicationScoped;

import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.annotations.ExecutionTime;
import io.quarkus.deployment.annotations.Record;
import io.quarkus.deployment.builditem.*;
import io.quarkus.deployment.Capabilities;
import io.quarkus.deployment.IsNormal;
import io.quarkus.arc.deployment.SyntheticBeanBuildItem;
import io.quarkiverse.langchain4j.deployment.items.*;
import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
import io.quarkiverse.langchain4j.ollama.runtime.OllamaRecorder;
import io.quarkiverse.langchain4j.ollama.runtime.config.LangChain4jOllamaFixedRuntimeConfig;
import io.quarkus.resteasy.reactive.spi.MessageBodyReaderOverrideBuildItem;
import io.quarkus.resteasy.reactive.spi.MessageBodyWriterOverrideBuildItem;

// Overview of the build steps; the implementations are shown in the sections below.
public class OllamaProcessor {

    private static final String FEATURE = "langchain4j-ollama";
    private static final String PROVIDER = "ollama";

    @BuildStep
    FeatureBuildItem feature();

    @BuildStep
    IndexDependencyBuildItem indexUpstreamOllamaModule();

    @BuildStep
    void nativeSupport(
            BuildProducer<ServiceProviderBuildItem> serviceProviderProducer,
            BuildProducer<ReflectiveClassBuildItem> reflectiveClassProducer,
            BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyProducer);

    @BuildStep
    void providerCandidates(
            BuildProducer<ChatModelProviderCandidateBuildItem> chatProducer,
            BuildProducer<EmbeddingModelProviderCandidateBuildItem> embeddingProducer,
            LangChain4jOllamaOpenAiBuildConfig config);

    @BuildStep
    void implicitlyConfiguredProviders(
            LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
            BuildProducer<ImplicitlyUserConfiguredChatProviderBuildItem> producer);

    @BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
    void devServicesSupport(
            List<SelectedChatModelProviderBuildItem> selectedChatModels,
            List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
            LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
            BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
            BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer);

    @BuildStep
    @Record(ExecutionTime.RUNTIME_INIT)
    void generateBeans(
            OllamaRecorder recorder,
            List<SelectedChatModelProviderBuildItem> selectedChatItem,
            List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbedding,
            BuildProducer<SyntheticBeanBuildItem> beanProducer);

    @BuildStep
    void deprioritizeJsonb(
            Capabilities capabilities,
            BuildProducer<MessageBodyReaderOverrideBuildItem> readerOverrideProducer,
            BuildProducer<MessageBodyWriterOverrideBuildItem> writerOverrideProducer);
}
```

Constants:
- `FEATURE = "langchain4j-ollama"` - Feature identifier
- `PROVIDER = "ollama"` - Provider identifier used throughout the build chain

```java
@BuildStep
FeatureBuildItem feature() {
    return new FeatureBuildItem(FEATURE);
}
```

Purpose: Registers the "langchain4j-ollama" feature with Quarkus.
Produces: FeatureBuildItem
Description: This build step registers the extension as a Quarkus feature, which is reported in build logs and can be queried by other extensions. The feature name appears in the build output and is used for feature detection.
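For illustration, a registered feature shows up in the feature summary that Quarkus prints at startup, along these lines (the other features listed here are illustrative and vary by application):

```
Installed features: [cdi, langchain4j-ollama, rest-client-reactive]
```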
```java
@BuildStep
IndexDependencyBuildItem indexUpstreamOllamaModule() {
    return new IndexDependencyBuildItem("dev.langchain4j", "langchain4j-ollama");
}
```

Purpose: Indexes the upstream LangChain4j Ollama module for Jandex processing.
Produces: IndexDependencyBuildItem
Description: This build step ensures that the upstream dev.langchain4j:langchain4j-ollama module is indexed by Jandex, making its classes available for annotation scanning and reflection configuration. This is necessary for the build process to discover and process annotations in the upstream module.
```java
@BuildStep
void nativeSupport(
        BuildProducer<ServiceProviderBuildItem> serviceProviderProducer,
        BuildProducer<ReflectiveClassBuildItem> reflectiveClassProducer,
        BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyProducer) {
    // Register all ConfigSourceInterceptor (SmallRye Config) service providers
    serviceProviderProducer.produce(
            ServiceProviderBuildItem.allProvidersFromClassPath(
                    ConfigSourceInterceptor.class.getName()));

    // Register OllamaChatRequest for reflection with its full hierarchy
    reflectiveHierarchyProducer.produce(
            ReflectiveHierarchyBuildItem.builder("dev.langchain4j.model.ollama.OllamaChatRequest")
                    .source(getClass().getSimpleName())
                    .build());

    // Register OllamaChatResponse for reflection, including nested types
    reflectiveHierarchyProducer.produce(
            ReflectiveHierarchyBuildItem.builder("dev.langchain4j.model.ollama.OllamaChatResponse")
                    .source(getClass().getSimpleName())
                    .ignoreNested(false)
                    .build());

    // Register serializers/deserializers for constructor-only reflection
    reflectiveClassProducer.produce(
            ReflectiveClassBuildItem.builder(
                    "dev.langchain4j.model.ollama.FormatSerializer",
                    "dev.langchain4j.model.ollama.OllamaDateDeserializer")
                    .constructors()
                    .methods(false)
                    .fields(false)
                    .build());
}
```

Purpose: Configures GraalVM native image compilation support.
Produces:
- ServiceProviderBuildItem - Registers service providers
- ReflectiveClassBuildItem - Registers classes for reflection
- ReflectiveHierarchyBuildItem - Registers class hierarchies for reflection

Description: This build step configures reflection and service provider registration for native image compilation:
- ConfigSourceInterceptor implementations found on the classpath
- OllamaChatRequest with full hierarchy reflection
- OllamaChatResponse with nested type reflection
- FormatSerializer and OllamaDateDeserializer with constructor reflection

This ensures that these classes can be accessed via reflection in native images, which is necessary for JSON serialization/deserialization and configuration interception.
```java
@BuildStep
public void providerCandidates(
        BuildProducer<ChatModelProviderCandidateBuildItem> chatProducer,
        BuildProducer<EmbeddingModelProviderCandidateBuildItem> embeddingProducer,
        LangChain4jOllamaOpenAiBuildConfig config) {
    if (config.chatModel().enabled().isEmpty() || config.chatModel().enabled().get()) {
        chatProducer.produce(new ChatModelProviderCandidateBuildItem(PROVIDER));
    }
    if (config.embeddingModel().enabled().isEmpty() || config.embeddingModel().enabled().get()) {
        embeddingProducer.produce(new EmbeddingModelProviderCandidateBuildItem(PROVIDER));
    }
}
```

Purpose: Registers Ollama as a provider candidate for chat and embedding models.
Consumes: LangChain4jOllamaOpenAiBuildConfig

Produces:
- ChatModelProviderCandidateBuildItem (conditionally)
- EmbeddingModelProviderCandidateBuildItem (conditionally)

Description: This build step examines the build-time configuration to determine whether to register Ollama as a provider candidate:
- If chat-model.enabled is not set or is true, produces ChatModelProviderCandidateBuildItem with provider name "ollama"
- If embedding-model.enabled is not set or is true, produces EmbeddingModelProviderCandidateBuildItem with provider name "ollama"

These build items participate in the provider selection mechanism, allowing the Quarkus LangChain4j extension to choose Ollama as the active provider for models.
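The "unset or true" condition can be sketched as standalone Java; `shouldRegister` is a hypothetical helper modeling the check, not part of the extension's API:

```java
import java.util.Optional;

public class EnabledCheck {
    // A candidate is registered when the enabled flag is absent (default)
    // or explicitly set to true; only an explicit false opts out.
    static boolean shouldRegister(Optional<Boolean> enabled) {
        return enabled.isEmpty() || enabled.get();
    }

    public static void main(String[] args) {
        System.out.println(shouldRegister(Optional.empty()));    // true
        System.out.println(shouldRegister(Optional.of(true)));   // true
        System.out.println(shouldRegister(Optional.of(false)));  // false
    }
}
```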
```java
@BuildStep
public void implicitlyConfiguredProviders(
        LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
        BuildProducer<ImplicitlyUserConfiguredChatProviderBuildItem> producer) {
    fixedRuntimeConfig.namedConfig().keySet().forEach(configName -> {
        producer.produce(new ImplicitlyUserConfiguredChatProviderBuildItem(configName, PROVIDER));
    });
}
```

Purpose: Detects and registers named Ollama configurations as implicitly configured providers.
Consumes: LangChain4jOllamaFixedRuntimeConfig
Produces: ImplicitlyUserConfiguredChatProviderBuildItem
Description: This build step examines the fixed runtime configuration to discover named Ollama instances. For each named configuration found (e.g., quarkus.langchain4j.ollama.my-instance.*), it produces an ImplicitlyUserConfiguredChatProviderBuildItem indicating that the user has implicitly configured a provider through named configuration.
This enables the provider selection mechanism to recognize and handle named configurations automatically.
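For example, a named configuration in application.properties along these lines (the URL and model ID are illustrative) would cause this build step to register `my-instance` as an implicitly configured Ollama chat provider:

```properties
quarkus.langchain4j.ollama.my-instance.base-url=http://localhost:11434
quarkus.langchain4j.ollama.my-instance.chat-model.model-id=llama3.2
```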
```java
@BuildStep(onlyIfNot = IsNormal.class, onlyIf = Langchain4jDevServicesEnabled.class)
public void devServicesSupport(
        List<SelectedChatModelProviderBuildItem> selectedChatModels,
        List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbeddingModels,
        LangChain4jOllamaFixedRuntimeConfig fixedRuntimeConfig,
        BuildProducer<DevServicesChatModelRequiredBuildItem> chatProducer,
        BuildProducer<DevServicesEmbeddingModelRequiredBuildItem> embeddingProducer) {
    // For each selected chat model using the Ollama provider
    for (var selected : selectedChatModels) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            String baseUrlProperty = String.format(
                    "quarkus.langchain4j.ollama%s%s",
                    NamedConfigUtil.isDefault(configName) ? "." : ("." + configName + "."),
                    "base-url");
            // DevServices are only needed if the base URL is not configured
            // or points to localhost
            if (canUseDevServices(baseUrlProperty)) {
                String modelId = NamedConfigUtil.isDefault(configName)
                        ? fixedRuntimeConfig.defaultConfig().chatModel().modelId()
                        : fixedRuntimeConfig.namedConfig().get(configName).chatModel().modelId();
                chatProducer.produce(
                        new DevServicesChatModelRequiredBuildItem(PROVIDER, modelId, baseUrlProperty));
            }
        }
    }

    // For each selected embedding model using the Ollama provider
    for (var selected : selectedEmbeddingModels) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            String baseUrlProperty = String.format(
                    "quarkus.langchain4j.ollama%s%s",
                    NamedConfigUtil.isDefault(configName) ? "." : ("." + configName + "."),
                    "base-url");
            // DevServices are only needed if the base URL is not configured
            // or points to localhost
            if (canUseDevServices(baseUrlProperty)) {
                String modelId = NamedConfigUtil.isDefault(configName)
                        ? fixedRuntimeConfig.defaultConfig().embeddingModel().modelId()
                        : fixedRuntimeConfig.namedConfig().get(configName).embeddingModel().modelId();
                embeddingProducer.produce(
                        new DevServicesEmbeddingModelRequiredBuildItem(PROVIDER, modelId, baseUrlProperty));
            }
        }
    }
}
```

Conditions:
- onlyIfNot = IsNormal.class - Only runs in development/test mode
- onlyIf = Langchain4jDevServicesEnabled.class - Only runs if LangChain4j DevServices are enabled

Consumes:
- List<SelectedChatModelProviderBuildItem>
- List<SelectedEmbeddingModelCandidateBuildItem>
- LangChain4jOllamaFixedRuntimeConfig

Produces:
- DevServicesChatModelRequiredBuildItem (conditionally)
- DevServicesEmbeddingModelRequiredBuildItem (conditionally)

Description: This build step determines which models need DevServices. For each selected model using the Ollama provider, it:
- builds the base-url property name for the model's configuration
- checks whether DevServices can be used for that property
- resolves the configured model ID from the fixed runtime configuration
- produces the corresponding required-model build item
The canUseDevServices(String baseUrlProperty) helper method checks if the base URL is unset or points to localhost.
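The property-name construction and the localhost check can be sketched as standalone Java. Both helpers below are hypothetical stand-ins that mirror the logic above, and treating `"<default>"` as the default configuration name is an assumption for illustration:

```java
public class DevServicesCheck {
    // Assumption: the default configuration name recognized by NamedConfigUtil.
    static final String DEFAULT = "<default>";

    // Mirrors the property-name construction in devServicesSupport():
    // the default config omits the name segment, named configs include it.
    static String baseUrlProperty(String configName) {
        String infix = DEFAULT.equals(configName) ? "." : ("." + configName + ".");
        return String.format("quarkus.langchain4j.ollama%s%s", infix, "base-url");
    }

    // Mirrors canUseDevServices(): DevServices apply when the base URL
    // is unset or points at localhost.
    static boolean canUseDevServices(String configuredBaseUrl) {
        return configuredBaseUrl == null
                || configuredBaseUrl.startsWith("http://localhost");
    }

    public static void main(String[] args) {
        System.out.println(baseUrlProperty("<default>"));    // quarkus.langchain4j.ollama.base-url
        System.out.println(baseUrlProperty("my-instance"));  // quarkus.langchain4j.ollama.my-instance.base-url
        System.out.println(canUseDevServices(null));                       // true
        System.out.println(canUseDevServices("http://localhost:11434"));   // true
        System.out.println(canUseDevServices("http://ollama.prod:11434")); // false
    }
}
```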
```java
@BuildStep
@Record(ExecutionTime.RUNTIME_INIT)
void generateBeans(
        OllamaRecorder recorder,
        List<SelectedChatModelProviderBuildItem> selectedChatItem,
        List<SelectedEmbeddingModelCandidateBuildItem> selectedEmbedding,
        BuildProducer<SyntheticBeanBuildItem> beanProducer) {
    // Generate ChatModel and StreamingChatModel beans.
    // CHAT_MODEL, STREAMING_CHAT_MODEL, EMBEDDING_MODEL and DotNames are
    // Jandex DotName constants whose imports are omitted in this overview.
    for (var selected : selectedChatItem) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();

            // ChatModel bean
            var builder = SyntheticBeanBuildItem
                    .configure(CHAT_MODEL)
                    .setRuntimeInit()
                    .defaultBean()
                    .scope(ApplicationScoped.class)
                    .addInjectionPoint(ParameterizedType.create(
                            DotNames.CDI_INSTANCE,
                            new Type[] { ClassType.create(DotNames.CHAT_MODEL_LISTENER) },
                            null))
                    .createWith(recorder.chatModel(configName));
            addQualifierIfNecessary(builder, configName);
            beanProducer.produce(builder.done());

            // StreamingChatModel bean
            var streamingBuilder = SyntheticBeanBuildItem
                    .configure(STREAMING_CHAT_MODEL)
                    .setRuntimeInit()
                    .defaultBean()
                    .scope(ApplicationScoped.class)
                    .addInjectionPoint(ParameterizedType.create(
                            DotNames.CDI_INSTANCE,
                            new Type[] { ClassType.create(DotNames.CHAT_MODEL_LISTENER) },
                            null))
                    .createWith(recorder.streamingChatModel(configName));
            addQualifierIfNecessary(streamingBuilder, configName);
            beanProducer.produce(streamingBuilder.done());
        }
    }

    // Generate EmbeddingModel beans
    for (var selected : selectedEmbedding) {
        if (PROVIDER.equals(selected.getProvider())) {
            String configName = selected.getConfigName();
            var builder = SyntheticBeanBuildItem
                    .configure(EMBEDDING_MODEL)
                    .setRuntimeInit()
                    .defaultBean()
                    .unremovable()
                    .scope(ApplicationScoped.class)
                    .supplier(recorder.embeddingModel(configName));
            addQualifierIfNecessary(builder, configName);
            beanProducer.produce(builder.done());
        }
    }
}
```

Execution Time: RUNTIME_INIT - Executes at runtime initialization
Consumes:
- OllamaRecorder - Runtime recorder for bean creation
- List<SelectedChatModelProviderBuildItem>
- List<SelectedEmbeddingModelCandidateBuildItem>

Produces: SyntheticBeanBuildItem

Description: This build step creates synthetic CDI beans for the selected models. For each selected model using the Ollama provider:

Chat Models:
- ChatModel bean with @ApplicationScoped scope
- Injection point for Instance<ChatModelListener>
- Uses recorder.chatModel(configName) to create the bean at runtime
- @ModelName qualifier for named configurations

Streaming Chat Models:
- StreamingChatModel bean with @ApplicationScoped scope
- Injection point for Instance<ChatModelListener>
- Uses recorder.streamingChatModel(configName) to create the bean at runtime
- @ModelName qualifier for named configurations

Embedding Models:
- EmbeddingModel bean with @ApplicationScoped scope
- Marked unremovable() to prevent removal during optimization
- Uses recorder.embeddingModel(configName) to supply the bean at runtime
- @ModelName qualifier for named configurations

The addQualifierIfNecessary(builder, configName) helper method adds a @ModelName qualifier for non-default configurations.
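The qualifier decision can be sketched as standalone Java; `qualifierFor` is a hypothetical helper modeling addQualifierIfNecessary, and `"<default>"` as the default configuration name is an assumption:

```java
import java.util.Optional;

public class QualifierSketch {
    // Assumption: the default configuration name recognized by NamedConfigUtil.
    static final String DEFAULT = "<default>";

    // Default config gets no qualifier (plain @Inject works);
    // named configs get a @ModelName(configName) qualifier.
    static Optional<String> qualifierFor(String configName) {
        return DEFAULT.equals(configName)
                ? Optional.empty()
                : Optional.of("@ModelName(\"" + configName + "\")");
    }

    public static void main(String[] args) {
        System.out.println(qualifierFor("<default>"));   // Optional.empty
        System.out.println(qualifierFor("my-instance")); // Optional[@ModelName("my-instance")]
    }
}
```

In application code this corresponds to injecting the default bean with a bare `@Inject`, while a named configuration is selected with `@Inject @ModelName("my-instance")`.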
```java
@BuildStep
public void deprioritizeJsonb(
        Capabilities capabilities,
        BuildProducer<MessageBodyReaderOverrideBuildItem> readerOverrideProducer,
        BuildProducer<MessageBodyWriterOverrideBuildItem> writerOverrideProducer) {
    if (capabilities.isPresent(Capability.REST_CLIENT_REACTIVE_JSONB)) {
        readerOverrideProducer.produce(
                new MessageBodyReaderOverrideBuildItem(
                        "org.jboss.resteasy.reactive.server.jsonb.JsonbMessageBodyReader",
                        Priorities.APPLICATION + 1,
                        true));
        writerOverrideProducer.produce(
                new MessageBodyWriterOverrideBuildItem(
                        "org.jboss.resteasy.reactive.server.jsonb.JsonbMessageBodyWriter",
                        Priorities.APPLICATION + 1,
                        true));
    }
}
```

Consumes: Capabilities
Produces:
- MessageBodyReaderOverrideBuildItem (conditionally)
- MessageBodyWriterOverrideBuildItem (conditionally)

Description: This build step ensures Jackson is preferred over JSON-B for REST clients when both are present on the classpath. It checks for the REST_CLIENT_REACTIVE_JSONB capability and, if present, overrides the priority of the JSON-B message body reader and writer to APPLICATION + 1, making them lower priority than Jackson (which uses APPLICATION priority).
This is necessary because the Ollama REST client uses Jackson-specific features, and having JSON-B selected would cause serialization issues.
Quarkus automatically determines the execution order of build steps based on their inputs and outputs. The typical execution order for Ollama processor build steps is:
1. feature() - Early registration
2. indexUpstreamOllamaModule() - Early indexing
3. nativeSupport() - Reflection configuration
4. providerCandidates() - Provider registration
5. implicitlyConfiguredProviders() - Named configuration detection
6. devServicesSupport() - DevServices requirement detection
7. generateBeans() - Runtime bean creation
8. deprioritizeJsonb() - Serialization configuration

```java
private boolean canUseDevServices(String baseUrlProperty) {
    SmallRyeConfig smallRyeConfig = ConfigProvider.getConfig()
            .unwrap(SmallRyeConfig.class);
    ConfigValue configValue = smallRyeConfig.getConfigValue(baseUrlProperty);
    if (configValue.getValue() == null) {
        return true;
    }
    return configValue.getValue().startsWith("http://localhost");
}
```

Purpose: Determines if DevServices can be used for a given base URL property.
Returns: true if DevServices can be used, false otherwise
Logic:
- true if the base URL is not configured
- true if the base URL starts with "http://localhost"
- false otherwise

```java
private void addQualifierIfNecessary(
        SyntheticBeanBuildItem.ExtendedBeanConfigurator builder,
        String configName) {
    if (!NamedConfigUtil.isDefault(configName)) {
        builder.addQualifier(
                AnnotationInstance.builder(ModelName.class)
                        .add("value", configName)
                        .build());
    }
}
```

Purpose: Adds a @ModelName qualifier to a bean for named configurations.
Parameters:
- builder - Bean configurator to add the qualifier to
- configName - Configuration name

Logic:
- If configName is not the default configuration, adds a @ModelName(configName) qualifier to the bean

Install with Tessl CLI:

```shell
npx tessl i tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-ollama-deployment
```