This package provides a deprecated integration module that lets Java applications use GitHub Models through the LangChain4j framework. It offers chat models (both synchronous and streaming), embedding models, and support for AI services with tool integration, JSON schema responses, and responsible AI features. The module wraps the Azure AI Inference SDK to provide a unified API for language models hosted on GitHub Models, covering chat completion, embedding generation, and content filtering management. As of version 1.10.0 the module is deprecated and scheduled for removal; users should migrate to the langchain4j-openai-official module for enhanced functionality and better integration. The library is designed as a reusable foundation for LLM-powered Java applications that target GitHub-hosted AI models, offering builder-pattern configuration, proxy options, custom timeouts, and model service versioning.
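The AI-services and tool-integration support mentioned above can be sketched as follows. This is a minimal illustration, not the module's documented example: the `Assistant` interface and `Calculator` tool are hypothetical names introduced here, a valid `GITHUB_TOKEN` environment variable is assumed, and the `AiServices` builder methods may differ slightly across LangChain4j versions.

```java
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.model.github.GitHubModelsChatModel;
import dev.langchain4j.model.github.GitHubModelsChatModelName;
import dev.langchain4j.service.AiServices;

public class ToolExample {

    // Hypothetical assistant interface; AiServices generates the implementation.
    interface Assistant {
        String chat(String userMessage);
    }

    // Hypothetical tool the model may invoke during a conversation.
    static class Calculator {
        @Tool("Adds two integers")
        int add(int a, int b) {
            return a + b;
        }
    }

    public static void main(String[] args) {
        // Assumes GITHUB_TOKEN is set in the environment.
        GitHubModelsChatModel model = GitHubModelsChatModel.builder()
                .gitHubToken(System.getenv("GITHUB_TOKEN"))
                .modelName(GitHubModelsChatModelName.GPT_4_O)
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .chatModel(model)
                .tools(new Calculator())
                .build();

        // The model can call Calculator.add(...) to answer this.
        System.out.println(assistant.chat("What is 2 + 3?"));
    }
}
```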
⚠️ DEPRECATED: Migrate to langchain4j-openai-official module. This module will be removed in a future release.
Java library integrating LangChain4j with GitHub Models (Azure AI Inference) for chat completion and text embeddings.
Maven:

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-github-models</artifactId>
    <version>1.11.0</version>
</dependency>
```

Gradle:

```groovy
implementation 'dev.langchain4j:langchain4j-github-models:1.11.0'
```

Imports:

```java
import java.util.List;

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.github.GitHubModelsChatModel;
import dev.langchain4j.model.github.GitHubModelsChatModelName;
import dev.langchain4j.model.github.GitHubModelsEmbeddingModel;
import dev.langchain4j.model.github.GitHubModelsEmbeddingModelName;
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;
import dev.langchain4j.model.output.Response;
```

Chat model:

```java
GitHubModelsChatModel model = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName(GitHubModelsChatModelName.GPT_4_O)
        .build();

ChatResponse response = model.chat(ChatRequest.builder()
        .messages(UserMessage.from("Hello"))
        .build());
```

Streaming chat model:

```java
GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o")
        .build();

model.chat(request, new StreamingChatResponseHandler() {
    @Override
    public void onPartialResponse(String token) { System.out.print(token); }
    @Override
    public void onCompleteResponse(ChatResponse response) { }
    @Override
    public void onError(Throwable error) { }
});
```

Embedding model:

```java
GitHubModelsEmbeddingModel model = GitHubModelsEmbeddingModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName(GitHubModelsEmbeddingModelName.TEXT_EMBEDDING_3_SMALL)
        .build();

Response<List<Embedding>> response = model.embedAll(segments);
```
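The custom timeouts, retries, and logging mentioned in the overview are configured through the same builder. The following is a minimal sketch: the specific builder methods shown (`timeout`, `maxRetries`, `logRequestsAndResponses`, `temperature`, `maxTokens`) are assumptions about the module's configuration surface and should be verified against the version you use.

```java
import java.time.Duration;

import dev.langchain4j.model.github.GitHubModelsChatModel;

public class ConfiguredModel {
    public static void main(String[] args) {
        // Builder options below are assumed; check your module version's builder.
        GitHubModelsChatModel model = GitHubModelsChatModel.builder()
                .gitHubToken(System.getenv("GITHUB_TOKEN"))  // assumes GITHUB_TOKEN is set
                .modelName("gpt-4o")
                .temperature(0.2)
                .maxTokens(512)
                .timeout(Duration.ofSeconds(30))   // custom request timeout
                .maxRetries(2)                     // retry transient failures
                .logRequestsAndResponses(true)     // request/response logging
                .build();

        System.out.println(model != null);
    }
}
```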
Migration target: langchain4j-openai-official

Limitations:
- ToolChoice.REQUIRED supports only a single tool
- FinishReason.CONTENT_FILTER is returned on content policy violations

Install with Tessl CLI:

```shell
npx tessl i tessl/maven-dev-langchain4j--langchain4j-github-models
```