tessl/maven-dev-langchain4j--langchain4j-github-models

This package provides a deprecated integration module that lets Java applications interact with GitHub Models through the LangChain4j framework. It offers synchronous and streaming chat models, embedding models, and support for AI services with tool integration, JSON-schema responses, and responsible-AI features. The module wraps the Azure AI Inference SDK to expose a unified API for language models hosted on GitHub Models, covering chat completion, embedding generation, and content-filter management. As of version 1.10.0 the module is deprecated and scheduled for removal; users are advised to migrate to the langchain4j-openai-official module for better functionality and integration. The library is designed as a reusable foundational component for LLM-powered Java applications that need GitHub-hosted AI models, offering builder-based configuration, proxy options, custom timeouts, and model service versioning.


LangChain4j GitHub Models Integration

⚠️ DEPRECATED: Migrate to langchain4j-openai-official module. This module will be removed in a future release.

Java library integrating LangChain4j with GitHub Models (Azure AI Inference) for chat completion and text embeddings.

Installation

Maven:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-github-models</artifactId>
    <version>1.11.0</version>
</dependency>

Gradle:

implementation 'dev.langchain4j:langchain4j-github-models:1.11.0'

Core Imports

import dev.langchain4j.model.github.GitHubModelsChatModel;
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;
import dev.langchain4j.model.github.GitHubModelsEmbeddingModel;
import dev.langchain4j.model.github.GitHubModelsChatModelName;
import dev.langchain4j.model.github.GitHubModelsEmbeddingModelName;
// Supporting types used in the examples below
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

Quick Start

Synchronous Chat

// Reads the token from the GITHUB_TOKEN environment variable
GitHubModelsChatModel model = GitHubModelsChatModel.builder()
    .gitHubToken(System.getenv("GITHUB_TOKEN"))
    .modelName(GitHubModelsChatModelName.GPT_4_O)
    .build();

ChatResponse response = model.chat(ChatRequest.builder()
    .messages(UserMessage.from("Hello"))
    .build());

System.out.println(response.aiMessage().text());

Streaming Chat

GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
    .gitHubToken(System.getenv("GITHUB_TOKEN"))
    .modelName("gpt-4o")
    .build();

ChatRequest request = ChatRequest.builder()
    .messages(UserMessage.from("Hello"))
    .build();

model.chat(request, new StreamingChatResponseHandler() {
    @Override
    public void onPartialResponse(String token) { System.out.print(token); }

    @Override
    public void onCompleteResponse(ChatResponse response) { }

    @Override
    public void onError(Throwable error) { error.printStackTrace(); }
});

Embeddings

GitHubModelsEmbeddingModel model = GitHubModelsEmbeddingModel.builder()
    .gitHubToken(System.getenv("GITHUB_TOKEN"))
    .modelName(GitHubModelsEmbeddingModelName.TEXT_EMBEDDING_3_SMALL)
    .build();

List<TextSegment> segments = List.of(TextSegment.from("Hello world"));
Response<List<Embedding>> response = model.embedAll(segments);
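Beyond the token and model name, the builders accept additional configuration such as timeouts, retries, and request logging, as noted in the package description. A hedged sketch (the exact option names, e.g. timeout and logRequestsAndResponses, are assumed from the module's Azure-style builder API and may differ):

```java
import dev.langchain4j.model.github.GitHubModelsChatModel;
import dev.langchain4j.model.github.GitHubModelsChatModelName;
import java.time.Duration;

// Sketch: common builder options. Option names are assumptions,
// not confirmed against this module's builder.
GitHubModelsChatModel model = GitHubModelsChatModel.builder()
    .gitHubToken(System.getenv("GITHUB_TOKEN"))
    .modelName(GitHubModelsChatModelName.GPT_4_O)
    .timeout(Duration.ofSeconds(30))       // per-request timeout
    .maxRetries(2)                         // retry transient failures
    .logRequestsAndResponses(true)         // verbose HTTP logging
    .build();
```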

Documentation Structure

For quick task lookup: Quick Reference

  • API Reference: method signatures and parameters
  • Usage Guides: examples and patterns
  • Configuration
  • Reference

Key Limitations

  • Deprecated: Scheduled for removal, migrate to langchain4j-openai-official
  • Authentication: Requires GitHub personal access token
  • Batch Size: Embedding model processes max 16 segments per request
  • Tool Execution: ToolChoice.REQUIRED supports only single tool
  • Content Filtering: Returns FinishReason.CONTENT_FILTER on policy violations
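
Because the embedding model accepts at most 16 segments per request, callers embedding larger collections may need to partition their input first. A minimal, library-free sketch of such a partitioning helper (only the 16-segment limit is taken from the list above; the helper itself is illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class Batcher {
    static final int MAX_BATCH_SIZE = 16; // GitHub Models embedding batch limit

    // Splits a list into consecutive batches of at most MAX_BATCH_SIZE elements.
    static <T> List<List<T>> partition(List<T> items) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += MAX_BATCH_SIZE) {
            batches.add(items.subList(i, Math.min(i + MAX_BATCH_SIZE, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> input = new ArrayList<>();
        for (int i = 0; i < 40; i++) input.add(i);
        List<List<Integer>> batches = partition(input);
        System.out.println(batches.size());        // 3 batches for 40 items
        System.out.println(batches.get(2).size()); // last batch holds 8
    }
}
```

Each resulting batch can then be passed to embedAll in turn and the responses concatenated.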

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-github-models@1.11.0
Describes: mavenpkg:maven/dev.langchain4j/langchain4j-github-models@1.11.x