
tessl/maven-dev-langchain4j--langchain4j-github-models

This package provides a deprecated integration module that lets Java applications interact with GitHub Models through the LangChain4j framework. It offers synchronous and streaming chat models, embedding models, and support for AI services with tool integration, JSON schema responses, and responsible AI features. The module wraps the Azure AI Inference SDK to provide a unified API for language models hosted on GitHub Models, covering chat completion, embedding generation, and content filtering management. As of version 1.10.0 this module is deprecated and scheduled for removal; users should migrate to the langchain4j-openai-official module for enhanced functionality and better integration. The library is designed as a reusable foundation for LLM-powered Java applications that target GitHub-hosted AI models, offering builder patterns for configuration, proxy options, custom timeouts, and model service versioning.


docs/api/streaming-chat-model-api.md

Streaming Chat Model API Reference

API reference for GitHubModelsStreamingChatModel - streaming chat completion with real-time token generation.

Class

package dev.langchain4j.model.github;

public class GitHubModelsStreamingChatModel implements StreamingChatModel {
    public void chat(ChatRequest request, StreamingChatResponseHandler handler);
    public List<ChatModelListener> listeners();
    public ModelProvider provider();
    public static Builder builder();
}

Methods

chat

public void chat(ChatRequest request, StreamingChatResponseHandler handler);

Execute streaming chat completion with real-time token delivery.

Parameters:

  • request: ChatRequest - Messages, tools, and parameters
  • handler: StreamingChatResponseHandler - Handler for streaming events

Throws:

  • UnsupportedFeatureException - On unsupported features

Note: Runtime errors during streaming are delivered via handler.onError() rather than thrown from this method.
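A minimal usage sketch of the streaming call. This is illustrative only: it assumes a valid token in the GITHUB_TOKEN environment variable and uses "gpt-4o-mini" as a placeholder model name; it requires the langchain4j-github-models dependency on the classpath.

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;

public class StreamingChatExample {
    public static void main(String[] args) {
        GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
                .gitHubToken(System.getenv("GITHUB_TOKEN"))
                .modelName("gpt-4o-mini") // placeholder; use any model from the GitHub Models catalog
                .build();

        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Write a haiku about Java"))
                .build();

        model.chat(request, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String token) {
                System.out.print(token); // tokens arrive incrementally on a background thread
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                System.out.println("\n[streaming complete]");
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace(); // errors are reported here, not thrown from chat()
            }
        });
    }
}
```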

listeners

public List<ChatModelListener> listeners();

Get registered listeners.

Returns:

  • List<ChatModelListener> - Registered listeners

provider

public ModelProvider provider();

Get model provider identifier.

Returns:

  • ModelProvider - Returns ModelProvider.GITHUB_MODELS

builder

public static Builder builder();

Create builder instance. Uses SPI factory if registered.

Returns:

  • Builder - Builder instance

Builder API

public static class Builder {
    // Required
    public Builder gitHubToken(String gitHubToken);
    public Builder modelName(String modelName);
    public Builder modelName(GitHubModelsChatModelName modelName);

    // Endpoint
    public Builder endpoint(String endpoint);
    public Builder serviceVersion(ModelServiceVersion serviceVersion);

    // Sampling
    public Builder temperature(Double temperature);
    public Builder topP(Double topP);
    public Builder maxTokens(Integer maxTokens);
    public Builder presencePenalty(Double presencePenalty);
    public Builder frequencyPenalty(Double frequencyPenalty);
    public Builder seed(Long seed);
    public Builder stop(List<String> stop);

    // Response Format
    public Builder responseFormat(ChatCompletionsResponseFormat responseFormat);

    // Network
    public Builder timeout(Duration timeout);
    public Builder maxRetries(Integer maxRetries);
    public Builder proxyOptions(ProxyOptions proxyOptions);
    public Builder customHeaders(Map<String, String> customHeaders);

    // Advanced
    public Builder chatCompletionsAsyncClient(ChatCompletionsAsyncClient client);
    public Builder listeners(List<ChatModelListener> listeners);
    public Builder logRequestsAndResponses(boolean logRequestsAndResponses);
    public Builder userAgentSuffix(String userAgentSuffix);

    public GitHubModelsStreamingChatModel build();
}

Builder Methods

See Chat Model API for detailed parameter documentation. Streaming builder methods are identical except:

  • chatCompletionsAsyncClient: Uses async client instead of sync client
  • No strictJsonSchema method (not available for streaming)
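A hedged configuration sketch exercising the sampling and network options above. All values are illustrative, and the proxy host is hypothetical; the snippet assumes the langchain4j-github-models and azure-core dependencies.

```java
import com.azure.core.http.ProxyOptions;
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;
import java.net.InetSocketAddress;
import java.time.Duration;
import java.util.List;

// Illustrative builder configuration; tune values for your workload.
GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")                 // placeholder model name
        .temperature(0.2)                         // low randomness
        .topP(0.95)
        .maxTokens(512)
        .stop(List.of("\n\n"))                    // stop sequences
        .timeout(Duration.ofSeconds(30))
        .maxRetries(2)
        .proxyOptions(new ProxyOptions(           // hypothetical corporate proxy
                ProxyOptions.Type.HTTP, new InetSocketAddress("proxy.local", 8080)))
        .logRequestsAndResponses(true)
        .build();
```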

chatCompletionsAsyncClient

public Builder chatCompletionsAsyncClient(ChatCompletionsAsyncClient client);

Set custom Azure AI async client (overrides other config).

Parameters:

  • client: ChatCompletionsAsyncClient - Custom async client

Returns:

  • Builder

StreamingChatResponseHandler Interface

public interface StreamingChatResponseHandler {
    void onPartialResponse(String token);
    void onCompleteResponse(ChatResponse response);
    void onError(Throwable error);
}

onPartialResponse

void onPartialResponse(String token);

Called for each token as it's generated.

Parameters:

  • token: String - Generated token text

Threading: Called on background thread

onCompleteResponse

void onCompleteResponse(ChatResponse response);

Called when streaming completes successfully.

Parameters:

  • response: ChatResponse - Complete response with metadata

Threading: Called on background thread

onError

void onError(Throwable error);

Called if an error occurs during streaming.

Parameters:

  • error: Throwable - The error

Threading: Called on background thread
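Because all three callbacks fire on a background thread, a common pattern is to collect partial tokens and bridge the stream back to a blocking caller with a CompletableFuture. The sketch below uses a local stand-in interface with the same callback shape as StreamingChatResponseHandler (with a String in place of ChatResponse) so it runs without the library; the real handler would be wired the same way.

```java
import java.util.concurrent.CompletableFuture;

public class StreamingBridge {
    // Stand-in for StreamingChatResponseHandler (same callback shape, String response).
    interface Handler {
        void onPartialResponse(String token);
        void onCompleteResponse(String response);
        void onError(Throwable error);
    }

    // Accumulates partial tokens; completes the future when streaming ends.
    static class CollectingHandler implements Handler {
        private final StringBuilder buffer = new StringBuilder();
        final CompletableFuture<String> result = new CompletableFuture<>();

        @Override public void onPartialResponse(String token) { buffer.append(token); }
        @Override public void onCompleteResponse(String response) { result.complete(buffer.toString()); }
        @Override public void onError(Throwable error) { result.completeExceptionally(error); }
    }

    public static void main(String[] args) {
        CollectingHandler handler = new CollectingHandler();

        // Simulate a model emitting tokens from a background thread.
        new Thread(() -> {
            for (String token : new String[] {"Hello", ", ", "world"}) {
                handler.onPartialResponse(token);
            }
            handler.onCompleteResponse("done");
        }).start();

        // join() blocks the caller until the stream completes (or fails).
        System.out.println(handler.result.join()); // prints "Hello, world"
    }
}
```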

Types

import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.ModelProvider;
import com.azure.ai.inference.ChatCompletionsAsyncClient;
import com.azure.ai.inference.ModelServiceVersion;
import com.azure.ai.inference.models.ChatCompletionsResponseFormat;
import com.azure.core.http.ProxyOptions;
import java.time.Duration;
import java.util.List;
import java.util.Map;

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-github-models@1.11.0

docs/
  api/
    chat-model-api.md
    embedding-model-api.md
    model-names-api.md
    streaming-chat-model-api.md
  index.md
  quick-reference.md
tile.json