
tessl/maven-dev-langchain4j--langchain4j-github-models

This package provides a deprecated integration module that lets Java applications interact with GitHub Models through the LangChain4j framework. It offers synchronous and streaming chat models, embedding models, and support for AI services with tool integration, JSON schema responses, and responsible AI features. The module wraps the Azure AI Inference SDK to expose a unified API for language models hosted on GitHub Models, covering chat completion, embedding generation, and content filtering management. As of version 1.10.0, the module is deprecated and scheduled for removal; users are advised to migrate to the langchain4j-openai-official module for better functionality and integration. The library is intended as a foundational component for LLM-powered Java applications that target GitHub-hosted AI models, offering builder patterns for configuration, proxy options, custom timeouts, and model service versioning.


docs/api/chat-model-api.md

Chat Model API Reference

API reference for GitHubModelsChatModel - synchronous chat completion.

Class

package dev.langchain4j.model.github;

public class GitHubModelsChatModel implements ChatModel {
    public ChatResponse chat(ChatRequest chatRequest);
    public Set<Capability> supportedCapabilities();
    public List<ChatModelListener> listeners();
    public ModelProvider provider();
    public static Builder builder();
}

Methods

chat

public ChatResponse chat(ChatRequest chatRequest);

Execute synchronous chat completion.

Parameters:

  • chatRequest: ChatRequest - Messages, tools, and parameters

Returns:

  • ChatResponse - AI message, token usage, finish reason

Throws:

  • HttpResponseException - On API errors
  • UnsupportedFeatureException - On unsupported features (e.g., multiple tools with REQUIRED)
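As a minimal sketch of a synchronous chat call: the token is read from the environment, and `"gpt-4o-mini"` is a placeholder model identifier, not a value prescribed by this module.

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.github.GitHubModelsChatModel;

public class ChatExample {
    public static void main(String[] args) {
        // Build the model; token and model name are illustrative
        GitHubModelsChatModel model = GitHubModelsChatModel.builder()
                .gitHubToken(System.getenv("GITHUB_TOKEN"))
                .modelName("gpt-4o-mini")
                .build();

        // A ChatRequest carries the messages (and optionally tools/parameters)
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Say hello in one word."))
                .build();

        // ChatResponse exposes the AI message, token usage, and finish reason
        ChatResponse response = model.chat(request);
        System.out.println(response.aiMessage().text());
    }
}
```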

supportedCapabilities

public Set<Capability> supportedCapabilities();

Get model's supported capabilities.

Returns:

  • Set<Capability> - Supported capabilities (e.g., RESPONSE_FORMAT_JSON_SCHEMA)

listeners

public List<ChatModelListener> listeners();

Get registered listeners.

Returns:

  • List<ChatModelListener> - Registered listeners

provider

public ModelProvider provider();

Get model provider identifier.

Returns:

  • ModelProvider - Returns ModelProvider.GITHUB_MODELS

builder

public static Builder builder();

Create builder instance. Uses SPI factory if registered.

Returns:

  • Builder - Builder instance

Builder API

public static class Builder {
    // Required
    public Builder gitHubToken(String gitHubToken);
    public Builder modelName(String modelName);
    public Builder modelName(GitHubModelsChatModelName modelName);

    // Endpoint
    public Builder endpoint(String endpoint);
    public Builder serviceVersion(ModelServiceVersion serviceVersion);

    // Sampling
    public Builder temperature(Double temperature);
    public Builder topP(Double topP);
    public Builder maxTokens(Integer maxTokens);
    public Builder presencePenalty(Double presencePenalty);
    public Builder frequencyPenalty(Double frequencyPenalty);
    public Builder seed(Long seed);
    public Builder stop(List<String> stop);

    // Response Format
    public Builder responseFormat(ChatCompletionsResponseFormat responseFormat);
    public Builder strictJsonSchema(boolean strictJsonSchema);

    // Network
    public Builder timeout(Duration timeout);
    public Builder maxRetries(Integer maxRetries);
    public Builder proxyOptions(ProxyOptions proxyOptions);
    public Builder customHeaders(Map<String, String> customHeaders);

    // Advanced
    public Builder supportedCapabilities(Set<Capability> supportedCapabilities);
    public Builder chatCompletionsClient(ChatCompletionsClient chatCompletionsClient);
    public Builder listeners(List<ChatModelListener> listeners);
    public Builder logRequestsAndResponses(Boolean logRequestsAndResponses);
    public Builder userAgentSuffix(String userAgentSuffix);

    public GitHubModelsChatModel build();
}
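A hedged sketch of typical builder configuration, using only the setters listed above; the model name and all parameter values are illustrative, not recommended defaults:

```java
import java.time.Duration;
import java.util.List;
import dev.langchain4j.model.github.GitHubModelsChatModel;

GitHubModelsChatModel model = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")           // placeholder model id
        .temperature(0.2)                   // low temperature for focused output
        .topP(0.95)
        .maxTokens(512)
        .seed(42L)                          // deterministic sampling where supported
        .stop(List.of("\n\n"))              // stop on double newline
        .timeout(Duration.ofSeconds(30))
        .maxRetries(2)
        .logRequestsAndResponses(true)
        .build();
```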

gitHubToken

public Builder gitHubToken(String gitHubToken);

Set GitHub personal access token (required).

Parameters:

  • gitHubToken: String - GitHub token

Returns:

  • Builder

modelName

public Builder modelName(String modelName);
public Builder modelName(GitHubModelsChatModelName modelName);

Set model name (required).

Parameters:

  • modelName: String or GitHubModelsChatModelName - Model identifier

Returns:

  • Builder

endpoint

public Builder endpoint(String endpoint);

Set API endpoint (default: https://models.inference.ai.azure.com).

Parameters:

  • endpoint: String - Endpoint URL

Returns:

  • Builder

serviceVersion

public Builder serviceVersion(ModelServiceVersion serviceVersion);

Set Azure API service version.

Parameters:

  • serviceVersion: ModelServiceVersion - API version

Returns:

  • Builder

temperature

public Builder temperature(Double temperature);

Set sampling temperature (0.0-2.0).

Parameters:

  • temperature: Double - Temperature value

Returns:

  • Builder

topP

public Builder topP(Double topP);

Set nucleus sampling parameter (0.0-1.0).

Parameters:

  • topP: Double - Top-p value

Returns:

  • Builder

maxTokens

public Builder maxTokens(Integer maxTokens);

Set maximum tokens to generate.

Parameters:

  • maxTokens: Integer - Max token count

Returns:

  • Builder

presencePenalty

public Builder presencePenalty(Double presencePenalty);

Set presence penalty (-2.0 to 2.0).

Parameters:

  • presencePenalty: Double - Penalty value

Returns:

  • Builder

frequencyPenalty

public Builder frequencyPenalty(Double frequencyPenalty);

Set frequency penalty (-2.0 to 2.0).

Parameters:

  • frequencyPenalty: Double - Penalty value

Returns:

  • Builder

seed

public Builder seed(Long seed);

Set random seed for deterministic generation.

Parameters:

  • seed: Long - Seed value

Returns:

  • Builder

stop

public Builder stop(List<String> stop);

Set stop sequences.

Parameters:

  • stop: List<String> - Stop sequences

Returns:

  • Builder

responseFormat

public Builder responseFormat(ChatCompletionsResponseFormat responseFormat);

Set response format (e.g., JSON).

Parameters:

  • responseFormat: ChatCompletionsResponseFormat - Format specification

Returns:

  • Builder

strictJsonSchema

public Builder strictJsonSchema(boolean strictJsonSchema);

Enable strict JSON schema validation.

Parameters:

  • strictJsonSchema: boolean - Enable strict validation

Returns:

  • Builder
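To request JSON output, an Azure SDK response-format object is passed to the builder. This is a sketch under the assumption that `ChatCompletionsResponseFormatJsonObject` is available in `com.azure.ai.inference.models` in the SDK version on your classpath; verify against the Azure AI Inference SDK you depend on:

```java
import com.azure.ai.inference.models.ChatCompletionsResponseFormatJsonObject;
import dev.langchain4j.model.github.GitHubModelsChatModel;

GitHubModelsChatModel jsonModel = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")                            // placeholder model id
        .responseFormat(new ChatCompletionsResponseFormatJsonObject())
        .strictJsonSchema(true)                              // strict JSON schema validation
        .build();
```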

timeout

public Builder timeout(Duration timeout);

Set request timeout.

Parameters:

  • timeout: Duration - Timeout duration

Returns:

  • Builder

maxRetries

public Builder maxRetries(Integer maxRetries);

Set maximum retry attempts.

Parameters:

  • maxRetries: Integer - Max retries

Returns:

  • Builder

proxyOptions

public Builder proxyOptions(ProxyOptions proxyOptions);

Set HTTP proxy configuration.

Parameters:

  • proxyOptions: ProxyOptions - Proxy configuration

Returns:

  • Builder

customHeaders

public Builder customHeaders(Map<String, String> customHeaders);

Set custom HTTP headers.

Parameters:

  • customHeaders: Map<String, String> - Custom headers

Returns:

  • Builder
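The network-related setters can be combined as below; the proxy host, port, and custom header are hypothetical values for illustration (`ProxyOptions` comes from `com.azure.core.http`):

```java
import java.net.InetSocketAddress;
import java.time.Duration;
import java.util.Map;
import com.azure.core.http.ProxyOptions;
import dev.langchain4j.model.github.GitHubModelsChatModel;

// Hypothetical HTTP proxy for illustration
ProxyOptions proxy = new ProxyOptions(
        ProxyOptions.Type.HTTP,
        new InetSocketAddress("proxy.example.com", 8080));

GitHubModelsChatModel model = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")                            // placeholder model id
        .proxyOptions(proxy)
        .timeout(Duration.ofSeconds(60))
        .customHeaders(Map.of("X-Request-Source", "my-app")) // hypothetical header
        .build();
```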

supportedCapabilities

public Builder supportedCapabilities(Set<Capability> supportedCapabilities);

Set supported capabilities.

Parameters:

  • supportedCapabilities: Set<Capability> - Capabilities

Returns:

  • Builder

chatCompletionsClient

public Builder chatCompletionsClient(ChatCompletionsClient chatCompletionsClient);

Set custom Azure AI client (overrides other config).

Parameters:

  • chatCompletionsClient: ChatCompletionsClient - Custom client

Returns:

  • Builder

listeners

public Builder listeners(List<ChatModelListener> listeners);

Register model listeners.

Parameters:

  • listeners: List<ChatModelListener> - Listeners

Returns:

  • Builder

logRequestsAndResponses

public Builder logRequestsAndResponses(Boolean logRequestsAndResponses);

Enable request/response logging.

Parameters:

  • logRequestsAndResponses: Boolean - Enable logging

Returns:

  • Builder

userAgentSuffix

public Builder userAgentSuffix(String userAgentSuffix);

Set User-Agent suffix.

Parameters:

  • userAgentSuffix: String - Suffix

Returns:

  • Builder

build

public GitHubModelsChatModel build();

Build model instance.

Returns:

  • GitHubModelsChatModel

Throws:

  • IllegalStateException - If required fields missing

Types

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.Capability;
import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.ModelProvider;
import com.azure.ai.inference.ChatCompletionsClient;
import com.azure.ai.inference.ModelServiceVersion;
import com.azure.ai.inference.models.ChatCompletionsResponseFormat;
import com.azure.core.http.ProxyOptions;
import java.time.Duration;
import java.util.List;
import java.util.Set;
import java.util.Map;

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-github-models
