This package provides a deprecated integration module that lets Java applications use GitHub Models through the LangChain4j framework. It offers synchronous and streaming chat models, embedding models, and AI-service support with tool integration, JSON schema responses, and responsible AI features. The module wraps the Azure AI Inference SDK to provide a unified API for language models hosted on GitHub Models, covering chat completion, embedding generation, and content filtering management. As of version 1.10.0 the module is deprecated and scheduled for removal; users should migrate to the langchain4j-openai-official module for enhanced functionality and better integration. The library is designed as a reusable building block for LLM-powered Java applications that target GitHub-hosted models, offering builder-pattern configuration, proxy options, custom timeouts, and model service versioning.
API reference for GitHubModelsEmbeddingModel - text embedding generation.
package dev.langchain4j.model.github;
public class GitHubModelsEmbeddingModel extends DimensionAwareEmbeddingModel {
public static final int BATCH_SIZE = 16;
public Response<List<Embedding>> embedAll(List<TextSegment> textSegments);
public String modelName();
public static Builder builder();
protected Integer knownDimension();
}

public static final int BATCH_SIZE = 16;
Maximum segments per API request. Lists larger than 16 are automatically split into multiple batches.
Value: 16
public Response<List<Embedding>> embedAll(List<TextSegment> textSegments);
Generate embeddings for the given text segments, automatically batching large lists.
Parameters:
textSegments: List<TextSegment> - Segments to embed
Returns: Response<List<Embedding>> - One embedding per input segment, with token usage
Note: Lists of more than 16 segments are split into multiple batches transparently
public String modelName();
Get the configured model name.
Returns: String - Model name
public static Builder builder();
Create a builder instance. Uses an SPI builder factory if one is registered.
Returns: Builder - New builder
protected Integer knownDimension();
Get the known embedding dimension (protected, internal use).
Returns: Integer - Known dimension, or null if not known for the model
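The transparent batching behavior described above can be sketched as follows. This is an illustrative partition helper under the documented BATCH_SIZE of 16, not the library's actual implementation; the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of how embedAll might split its input into batches of at most
// BATCH_SIZE (16) segments before issuing API requests. Hypothetical
// helper for illustration only.
public class BatchPartitionSketch {

    static final int BATCH_SIZE = 16;

    // Split a list into consecutive sublists of at most batchSize elements.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> segments = new ArrayList<>();
        for (int i = 0; i < 35; i++) {
            segments.add("segment-" + i);
        }
        // 35 segments -> batches of 16, 16, and 3
        List<List<String>> batches = partition(segments, BATCH_SIZE);
        System.out.println(batches.size()); // prints 3
    }
}
```

Each batch would then map to one API request, with the resulting embeddings concatenated back into a single list in input order.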
public static class Builder {
// Required
public Builder gitHubToken(String gitHubToken);
public Builder modelName(String modelName);
public Builder modelName(GitHubModelsEmbeddingModelName modelName);
// Endpoint
public Builder endpoint(String endpoint);
public Builder serviceVersion(ModelServiceVersion serviceVersion);
// Embedding
public Builder dimensions(Integer dimensions);
// Network
public Builder timeout(Duration timeout);
public Builder maxRetries(Integer maxRetries);
public Builder proxyOptions(ProxyOptions proxyOptions);
public Builder customHeaders(Map<String, String> customHeaders);
// Advanced
public Builder embeddingsClient(EmbeddingsClient embeddingsClient);
public Builder logRequestsAndResponses(boolean logRequestsAndResponses);
public Builder userAgentSuffix(String userAgentSuffix);
public GitHubModelsEmbeddingModel build();
}

public Builder gitHubToken(String gitHubToken);
Set the GitHub personal access token (required).
Parameters:
gitHubToken: String - GitHub token
Returns: Builder - This builder, for chaining

public Builder modelName(String modelName);
public Builder modelName(GitHubModelsEmbeddingModelName modelName);
Set the embedding model name (required).
Parameters:
modelName: String or GitHubModelsEmbeddingModelName - Model identifier
Returns: Builder - This builder, for chaining
public Builder endpoint(String endpoint);
Set the API endpoint (default: https://models.inference.ai.azure.com).
Parameters:
endpoint: String - Endpoint URL
Returns: Builder - This builder, for chaining

public Builder serviceVersion(ModelServiceVersion serviceVersion);
Set the Azure API service version.
Parameters:
serviceVersion: ModelServiceVersion - API version
Returns: Builder - This builder, for chaining

public Builder dimensions(Integer dimensions);
Set custom embedding dimensions (if the model supports them).
Parameters:
dimensions: Integer - Desired dimension
Returns: Builder - This builder, for chaining
Note: Not all models support custom dimensions
public Builder timeout(Duration timeout);
Set the request timeout.
Parameters:
timeout: Duration - Timeout duration
Returns: Builder - This builder, for chaining

public Builder maxRetries(Integer maxRetries);
Set the maximum number of retry attempts.
Parameters:
maxRetries: Integer - Max retries
Returns: Builder - This builder, for chaining

public Builder proxyOptions(ProxyOptions proxyOptions);
Set the HTTP proxy configuration.
Parameters:
proxyOptions: ProxyOptions - Proxy configuration
Returns: Builder - This builder, for chaining

public Builder customHeaders(Map<String, String> customHeaders);
Set custom HTTP headers.
Parameters:
customHeaders: Map<String, String> - Custom headers
Returns: Builder - This builder, for chaining
public Builder embeddingsClient(EmbeddingsClient embeddingsClient);
Set a custom Azure AI embeddings client (overrides other client configuration).
Parameters:
embeddingsClient: EmbeddingsClient - Custom client
Returns: Builder - This builder, for chaining

public Builder logRequestsAndResponses(boolean logRequestsAndResponses);
Enable request/response logging.
Parameters:
logRequestsAndResponses: boolean - Enable logging
Returns: Builder - This builder, for chaining

public Builder userAgentSuffix(String userAgentSuffix);
Set a User-Agent suffix.
Parameters:
userAgentSuffix: String - Suffix
Returns: Builder - This builder, for chaining

public GitHubModelsEmbeddingModel build();
Build the model instance.
Returns: GitHubModelsEmbeddingModel - Configured model
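Putting the builder together, a minimal usage sketch might look like the following. It assumes langchain4j-github-models and its Azure AI Inference dependencies are on the classpath and a GitHub token is available in the environment; the model name shown is an example, not a recommendation.

```java
// Illustrative sketch - requires a valid GitHub token and network access.
GitHubModelsEmbeddingModel model = GitHubModelsEmbeddingModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("text-embedding-3-small") // example model name
        .timeout(Duration.ofSeconds(30))
        .maxRetries(2)
        .build();

Response<List<Embedding>> response = model.embedAll(List.of(
        TextSegment.from("Hello, world"),
        TextSegment.from("Embeddings example")));
List<Embedding> embeddings = response.content();
```

Only gitHubToken and modelName are required; the remaining builder options shown are optional tuning.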
import dev.langchain4j.model.embedding.DimensionAwareEmbeddingModel;
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.output.Response;
import dev.langchain4j.model.output.TokenUsage;
import com.azure.ai.inference.EmbeddingsClient;
import com.azure.ai.inference.ModelServiceVersion;
import com.azure.core.http.ProxyOptions;
import com.azure.core.exception.HttpResponseException;
import java.time.Duration;
import java.util.List;
import java.util.Map;

Install with Tessl CLI
npx tessl i tessl/maven-dev-langchain4j--langchain4j-github-models@1.11.0