tessl/maven-dev-langchain4j--langchain4j-ollama

Java integration library enabling LangChain4j applications to use Ollama's local language models with support for chat, streaming, embeddings, and advanced reasoning features

Service Provider Interface (SPI)

The langchain4j-ollama module defines Service Provider Interface (SPI) factory interfaces that supply custom builder implementations. These interfaces enable framework integrations and dependency-injection scenarios.

Overview

SPI interfaces follow the Java ServiceLoader pattern, allowing custom implementations to provide builder instances. Each model class has a corresponding factory interface.

Common use cases:

  • Dependency injection frameworks (Spring, CDI, Guice)
  • Custom builder configuration and initialization
  • Framework-level model management
  • Testing and mocking infrastructure
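
The pattern itself is ordinary Java: each factory is just a `Supplier` specialized to a builder type. A self-contained sketch of the contract, using placeholder `WidgetBuilder`/`WidgetBuilderFactory` types rather than the real langchain4j classes:

```java
import java.util.function.Supplier;

public class FactoryPatternSketch {

    // Placeholder builder standing in for OllamaChatModel.OllamaChatModelBuilder
    static class WidgetBuilder {
        String baseUrl;
        WidgetBuilder baseUrl(String url) { this.baseUrl = url; return this; }
    }

    // Factory interface mirroring the shape of OllamaChatModelBuilderFactory:
    // no new methods, just Supplier specialized to the builder type
    interface WidgetBuilderFactory extends Supplier<WidgetBuilder> { }

    public static void main(String[] args) {
        // A stateless factory: every get() yields a fresh, pre-configured builder
        WidgetBuilderFactory factory =
            () -> new WidgetBuilder().baseUrl("http://localhost:11434");

        WidgetBuilder a = factory.get();
        WidgetBuilder b = factory.get();
        System.out.println(a != b);                       // distinct instances
        System.out.println(a.baseUrl.equals(b.baseUrl));  // same defaults
    }
}
```

Because the interface adds no abstract methods beyond `get()`, implementations can be as small as a lambda.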

OllamaChatModelBuilderFactory

Factory interface for creating OllamaChatModel builder instances.

Interface Signature

package dev.langchain4j.model.ollama.spi;

import dev.langchain4j.model.ollama.OllamaChatModel;
import java.util.function.Supplier;

public interface OllamaChatModelBuilderFactory
    extends Supplier<OllamaChatModel.OllamaChatModelBuilder> {
}
Thread Safety: Implementations should be stateless and thread-safe

Method

OllamaChatModel.OllamaChatModelBuilder get()

Returns a new builder instance for OllamaChatModel.

Returns: a fresh OllamaChatModelBuilder instance

  • Never null
  • Each call returns a new, independent builder

Thread Safety: Must be safe for concurrent calls

Example Implementation:

import java.time.Duration;

public class CustomOllamaChatModelBuilderFactory
    implements OllamaChatModelBuilderFactory {

    @Override
    public OllamaChatModel.OllamaChatModelBuilder get() {
        // loadFromConfig is an application-specific helper, not part of langchain4j
        return OllamaChatModel.builder()
            .baseUrl(loadFromConfig("ollama.baseUrl"))
            .timeout(Duration.ofMinutes(5))
            .logRequests(true);
    }
}

ServiceLoader Registration

Create file META-INF/services/dev.langchain4j.model.ollama.spi.OllamaChatModelBuilderFactory:

com.example.CustomOllamaChatModelBuilderFactory
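
In a typical Maven or Gradle layout, that registration file lives under `src/main/resources` so it is packaged onto the classpath:

```
src/main/resources/
└── META-INF/
    └── services/
        └── dev.langchain4j.model.ollama.spi.OllamaChatModelBuilderFactory
```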

Loading via ServiceLoader

import dev.langchain4j.model.ollama.spi.OllamaChatModelBuilderFactory;
import java.util.ServiceLoader;

ServiceLoader<OllamaChatModelBuilderFactory> loader =
    ServiceLoader.load(OllamaChatModelBuilderFactory.class);

OllamaChatModelBuilderFactory factory = loader.findFirst()
    .orElseThrow(() -> new IllegalStateException("No factory found"));

OllamaChatModel model = factory.get()
    .modelName("llama2")
    .build();
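
When no provider is registered, a common alternative to throwing is falling back to a default builder via `Optional.orElseGet`. A self-contained sketch of that fallback (using a placeholder interface, since no real provider is on the classpath here):

```java
import java.util.ServiceLoader;
import java.util.function.Supplier;

public class ServiceLoaderFallback {

    // Placeholder factory interface standing in for OllamaChatModelBuilderFactory
    public interface GreeterFactory extends Supplier<String> { }

    public static void main(String[] args) {
        // No META-INF/services entry exists for GreeterFactory in this sketch,
        // so the loader finds nothing and the default supplier is used instead.
        String greeting = ServiceLoader.load(GreeterFactory.class)
            .findFirst()          // Optional<GreeterFactory>, empty here
            .map(Supplier::get)   // would invoke the provider's get() if present
            .orElseGet(() -> "default");

        System.out.println(greeting);  // prints "default"
    }
}
```

With the real interface, the fallback branch would simply return `OllamaChatModel.builder()` so applications work with or without a registered provider.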

Best Practices

1. Provide Sensible Defaults

@Override
public OllamaChatModel.OllamaChatModelBuilder get() {
    return OllamaChatModel.builder()
        .baseUrl("http://localhost:11434")  // Default URL
        .timeout(Duration.ofMinutes(5))      // Reasonable timeout
        .maxRetries(3);                       // Automatic retries
}

2. Thread Safety

// Factories should be stateless and thread-safe
public class ThreadSafeFactory implements OllamaChatModelBuilderFactory {
    @Override
    public OllamaChatModel.OllamaChatModelBuilder get() {
        // Returns new builder instance each time - thread safe
        return OllamaChatModel.builder()
            .baseUrl(getConfiguredUrl());
    }

    private String getConfiguredUrl() {
        // Read from thread-safe configuration source
        return ConfigurationManager.getOllamaUrl();
    }
}
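
For the testing and mocking use case listed above, a factory can act as a seam: a test installs a factory that records invocations or hands back a canned builder. A minimal self-contained sketch with placeholder types (not the real langchain4j API):

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

public class TestDoubleFactorySketch {

    // Placeholder factory interface; StringBuilder stands in for a model builder
    interface ConfigBuilderFactory extends Supplier<StringBuilder> { }

    public static void main(String[] args) {
        AtomicInteger calls = new AtomicInteger();

        // Test double: counts how often code under test requests a builder
        // and always returns a canned, pre-configured instance
        ConfigBuilderFactory recording = () -> {
            calls.incrementAndGet();
            return new StringBuilder("stub-config");
        };

        // Code under test would call factory.get(); simulate two uses
        recording.get();
        StringBuilder b = recording.get();

        System.out.println(calls.get());  // prints 2
        System.out.println(b);            // prints stub-config
    }
}
```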

See Also

  • Chat Models - OllamaChatModel
  • Embedding Model - OllamaEmbeddingModel
  • Java ServiceLoader - Standard Java SPI mechanism

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-ollama
