
tessl/maven-dev-langchain4j--langchain4j-ollama

Java integration library enabling LangChain4j applications to use Ollama's local language models with support for chat, streaming, embeddings, and advanced reasoning features


Request Parameters

OllamaChatRequestParameters extends the standard LangChain4j chat request parameters with Ollama-specific options for fine-grained control over model behavior.

OllamaChatRequestParameters

Ollama-specific request parameters for chat models.

Class Signature

package dev.langchain4j.model.ollama;

public class OllamaChatRequestParameters extends DefaultChatRequestParameters

Immutability: Immutable after creation

Thread Safety: Thread-safe; can be shared across threads

Nullability: Fields may be null; a null field falls back to the value from the previous configuration layer (or the model default)

Creating an Instance

public static OllamaChatRequestParameters.Builder builder()

Returns a builder for creating OllamaChatRequestParameters instances.

Returns: Fresh builder instance

  • Never null
  • Not thread-safe

Example:

OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
    .temperature(0.7)
    .mirostat(2)
    .numCtx(4096)
    .build();
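Once built, the parameters object is typically attached to a request. The following is a minimal sketch assuming the LangChain4j `ChatRequest` builder and `UserMessage` APIs; per-request parameters override any defaults configured on the model itself:

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.ollama.OllamaChatRequestParameters;

public class RequestParamsExample {
    public static void main(String[] args) {
        OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
                .temperature(0.7)
                .mirostat(2)
                .numCtx(4096)
                .build();

        // Attach the Ollama-specific parameters to a single chat request.
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Summarize Mirostat sampling in one sentence."))
                .parameters(params)
                .build();
    }
}
```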

Builder Methods

mirostat

public Builder mirostat(Integer mirostat)

Sets the Mirostat sampling mode for perplexity control.

Parameters:

  • mirostat - Mirostat mode
    • Valid values: 0 (disabled), 1 (Mirostat), 2 (Mirostat 2.0)
    • Default: 0
    • Null means use default

Returns: This builder (never null)

Throws:

  • IllegalArgumentException at build() - If value not in {0, 1, 2}

Example:

OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
    .mirostat(2)  // Use Mirostat 2.0
    .build();

numCtx

public Builder numCtx(Integer numCtx)

Sets the context window size in tokens.

Parameters:

  • numCtx - Context window size
    • Valid range: > 0
    • Default: Model-specific default
    • Typical values: 2048, 4096, 8192
    • Null means use model default

Returns: This builder (never null)

Throws:

  • IllegalArgumentException at build() - If numCtx <= 0

Note: Larger contexts use more memory and are slower

Example:

OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
    .numCtx(4096)  // 4K context window
    .build();

think

public Builder think(Boolean think)

Controls thinking/reasoning mode for models like DeepSeek R1.

Parameters:

  • think - Thinking mode
    • true: the model reasons and returns its thoughts in a separate thinking field
    • false: the model does not reason
    • null (default): reasoning models prepend their thoughts to the answer text, wrapped in <think> tags

Returns: This builder (never null)

Example:

OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
    .think(true)  // Enable structured thinking
    .build();
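When `think(true)` is set, the reasoning is surfaced separately from the answer. A sketch of retrieving it, assuming the LangChain4j `OllamaChatModel` builder accepts `defaultRequestParameters` and that `AiMessage` exposes a `thinking()` accessor (both present in recent LangChain4j versions), and assuming a local Ollama server with a reasoning model pulled:

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.ollama.OllamaChatRequestParameters;

public class ThinkingExample {
    public static void main(String[] args) {
        // Requires a running Ollama server at the given URL.
        ChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("deepseek-r1")
                .defaultRequestParameters(OllamaChatRequestParameters.builder()
                        .think(true)
                        .build())
                .build();

        ChatResponse response = model.chat(UserMessage.from("Why is the sky blue?"));

        // With think(true), reasoning arrives in a separate field rather
        // than inline inside <think> tags in the answer text.
        System.out.println("thinking: " + response.aiMessage().thinking());
        System.out.println("answer:   " + response.aiMessage().text());
    }
}
```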

Parameter Reference

Ollama-Specific Parameters

| Parameter     | Type    | Valid Range     | Default       |
|---------------|---------|-----------------|---------------|
| mirostat      | Integer | 0, 1, 2         | 0             |
| mirostatEta   | Double  | > 0.0           | 0.1           |
| mirostatTau   | Double  | > 0.0           | 5.0           |
| numCtx        | Integer | > 0             | Model default |
| repeatPenalty | Double  | >= 0.0          | 1.0           |
| repeatLastN   | Integer | >= 0            | 64            |
| seed          | Integer | Any             | Random        |
| minP          | Double  | 0.0-1.0         | 0.0           |
| keepAlive     | Integer | >= 0            | 300 (5 min)   |
| think         | Boolean | true/false/null | null          |
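Several parameters in the table (mirostatEta, mirostatTau, repeatPenalty, repeatLastN, seed, minP, keepAlive) have no dedicated section above. A sketch combining them, assuming each has a matching builder method and accessor of the same name:

```java
import dev.langchain4j.model.ollama.OllamaChatRequestParameters;

public class SamplingParamsExample {
    public static void main(String[] args) {
        // A reproducible, repetition-averse configuration using the
        // Ollama-specific knobs from the table above.
        OllamaChatRequestParameters params = OllamaChatRequestParameters.builder()
                .mirostat(1)         // enable Mirostat sampling
                .mirostatEta(0.1)    // learning rate for the perplexity feedback loop
                .mirostatTau(5.0)    // target perplexity
                .repeatPenalty(1.1)  // values > 1.0 discourage repetition
                .repeatLastN(64)     // window the repeat penalty looks back over
                .minP(0.05)          // drop tokens below 5% of the top token's probability
                .seed(42)            // fixed seed for reproducible sampling
                .keepAlive(600)      // keep the model loaded for 10 minutes
                .build();
    }
}
```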

See Also

  • Chat Models - Using parameters with chat models
  • Language Models - Language model configuration

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-ollama@1.11.0
