tessl/maven-dev-langchain4j--langchain4j-hugging-face

LangChain4j integration library for Hugging Face inference capabilities including chat, language, and embedding models


Migration Guide

Guide for migrating from deprecated classes to recommended alternatives.

Overview

As of LangChain4j 1.7.0-beta13, the following classes are deprecated and scheduled for removal:

Deprecated:

  • HuggingFaceChatModel - Chat-style interactions
  • HuggingFaceLanguageModel - Text generation
  • HuggingFaceModelName - Model name constants

Still Supported:

  • HuggingFaceEmbeddingModel - Vector embeddings (active)
  • HuggingFaceClient - Low-level client (active)

Recommended Migration Path

Replace the deprecated classes with OpenAiChatModel from the langchain4j-open-ai module, pointed at Hugging Face's OpenAI-compatible endpoint.

Migration Steps

Step 1: Add Dependency

Add langchain4j-open-ai module:

Maven:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>1.11.0</version>
</dependency>

Gradle:

implementation 'dev.langchain4j:langchain4j-open-ai:1.11.0'

Step 2: Update Imports

Old:

import dev.langchain4j.model.huggingface.HuggingFaceChatModel;
import dev.langchain4j.model.huggingface.HuggingFaceLanguageModel;
import dev.langchain4j.model.huggingface.HuggingFaceModelName;

New:

import dev.langchain4j.model.openai.OpenAiChatModel;

Step 3: Update Model Creation

Old (HuggingFaceChatModel):

HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("tiiuae/falcon-7b-instruct")
    .temperature(0.7)
    .maxNewTokens(200)
    .timeout(Duration.ofSeconds(30))
    .build();

New (OpenAiChatModel):

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .temperature(0.7)
    .maxTokens(200)
    .timeout(Duration.ofSeconds(30))
    .build();

Parameter Mapping

Common Parameters

| Old (HuggingFace) | New (OpenAI) | Notes |
|---|---|---|
| accessToken() | apiKey() | Same value, different name |
| modelId() | modelName() | Add :hf-inference suffix |
| baseUrl() | baseUrl() | Use https://router.huggingface.co/v1 |
| temperature() | temperature() | Same |
| maxNewTokens() | maxTokens() | Same semantics, different name |
| timeout() | timeout() | Same |
| waitForModel() | N/A | Not needed (handled by endpoint) |
| returnFullText() | N/A | Not applicable |

Model Name Format

Old Format:

tiiuae/falcon-7b-instruct

New Format:

tiiuae/falcon-7b-instruct:hf-inference

Add :hf-inference suffix to model names.
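If you are migrating many model IDs, a small helper can apply the suffix consistently. This is a hypothetical utility for illustration only (the class and constant names are made up, not part of either library):

```java
// Hypothetical helper for bulk-migrating model IDs to the new format.
public class ModelNames {

    static final String PROVIDER_SUFFIX = ":hf-inference";

    // Appends the provider suffix unless it is already present,
    // so re-running a migration script does not double-append it.
    static String toRouterName(String modelId) {
        return modelId.endsWith(PROVIDER_SUFFIX) ? modelId : modelId + PROVIDER_SUFFIX;
    }

    public static void main(String[] args) {
        System.out.println(toRouterName("tiiuae/falcon-7b-instruct"));
        // prints: tiiuae/falcon-7b-instruct:hf-inference
    }
}
```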

Migration Examples

Example 1: Simple Chat

Before:

import dev.langchain4j.model.huggingface.HuggingFaceChatModel;

HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("tiiuae/falcon-7b-instruct")
    .build();

String response = model.chat("What is Java?");
System.out.println(response);

After:

import dev.langchain4j.model.openai.OpenAiChatModel;

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .build();

String response = model.chat("What is Java?");
System.out.println(response);

Example 2: Chat with Messages

Before:

import dev.langchain4j.model.huggingface.HuggingFaceChatModel;
import dev.langchain4j.data.message.*;
import dev.langchain4j.model.chat.response.ChatResponse;
import java.util.List;

HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("tiiuae/falcon-7b-instruct")
    .temperature(0.7)
    .build();

List<ChatMessage> messages = List.of(
    SystemMessage.from("You are helpful"),
    UserMessage.from("What is AI?")
);

ChatResponse response = model.chat(messages);
String aiReply = response.aiMessage().text();

After:

import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.data.message.*;
import dev.langchain4j.model.chat.response.ChatResponse;
import java.util.List;

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .temperature(0.7)
    .build();

List<ChatMessage> messages = List.of(
    SystemMessage.from("You are helpful"),
    UserMessage.from("What is AI?")
);

ChatResponse response = model.chat(messages);
String aiReply = response.aiMessage().text();

Example 3: Language Model to Chat Model

Before:

import dev.langchain4j.model.huggingface.HuggingFaceLanguageModel;
import dev.langchain4j.model.output.Response;

HuggingFaceLanguageModel model = HuggingFaceLanguageModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("microsoft/Phi-3-mini-4k-instruct")
    .temperature(0.8)
    .maxNewTokens(150)
    .build();

Response<String> response = model.generate("Write a poem:");
String text = response.content();

After:

import dev.langchain4j.model.openai.OpenAiChatModel;

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("microsoft/Phi-3-mini-4k-instruct:hf-inference")
    .temperature(0.8)
    .maxTokens(150)
    .build();

String text = model.chat("Write a poem:");

Example 4: Using Model Name Constants

Before:

import dev.langchain4j.model.huggingface.HuggingFaceChatModel;
import dev.langchain4j.model.huggingface.HuggingFaceModelName;

HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId(HuggingFaceModelName.TII_UAE_FALCON_7B_INSTRUCT)
    .build();

After:

import dev.langchain4j.model.openai.OpenAiChatModel;

// Define your own constants or use strings directly
private static final String FALCON_7B = "tiiuae/falcon-7b-instruct:hf-inference";

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName(FALCON_7B)
    .build();

Example 5: Multi-turn Conversation

Before:

import dev.langchain4j.model.huggingface.HuggingFaceChatModel;
import dev.langchain4j.data.message.*;
import java.util.ArrayList;
import java.util.List;

HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("tiiuae/falcon-7b-instruct")
    .build();

List<ChatMessage> history = new ArrayList<>();
history.add(SystemMessage.from("You are helpful"));
history.add(UserMessage.from("What is ML?"));

ChatResponse response1 = model.chat(history);
history.add(AiMessage.from(response1.aiMessage().text()));

history.add(UserMessage.from("Give an example"));
ChatResponse response2 = model.chat(history);

After:

import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.data.message.*;
import java.util.ArrayList;
import java.util.List;

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("HF_API_KEY"))
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .build();

List<ChatMessage> history = new ArrayList<>();
history.add(SystemMessage.from("You are helpful"));
history.add(UserMessage.from("What is ML?"));

ChatResponse response1 = model.chat(history);
history.add(AiMessage.from(response1.aiMessage().text()));

history.add(UserMessage.from("Give an example"));
ChatResponse response2 = model.chat(history);

Configuration Migration

Environment Variables

Before:

export HF_API_KEY="hf_xxxxxxxxxxxx"

After:

# Same token, can use same variable name
export HF_API_KEY="hf_xxxxxxxxxxxx"
# Or use OpenAI-style naming
export OPENAI_API_KEY="hf_xxxxxxxxxxxx"
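If some of your code already reads the OpenAI-style variable, a lookup helper can accept either name. A minimal sketch; resolveApiKey and the ApiKeys class are illustrative, not library API:

```java
import java.util.Map;

// Hypothetical helper: prefer HF_API_KEY, fall back to OPENAI_API_KEY.
public class ApiKeys {

    static String resolveApiKey(Map<String, String> env) {
        String key = env.get("HF_API_KEY");
        if (key == null || key.isBlank()) {
            key = env.get("OPENAI_API_KEY");
        }
        if (key == null || key.isBlank()) {
            throw new IllegalStateException("Set HF_API_KEY or OPENAI_API_KEY");
        }
        return key;
    }

    public static void main(String[] args) {
        // Demo with an in-memory map; pass System.getenv() in real code.
        System.out.println(resolveApiKey(Map.of("HF_API_KEY", "hf_xxxxxxxxxxxx")));
        // prints: hf_xxxxxxxxxxxx
    }
}
```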

Spring Boot Configuration

Before:

huggingface:
  api-key: ${HF_API_KEY}
  model-id: tiiuae/falcon-7b-instruct
  temperature: 0.7

After:

openai:
  api-key: ${HF_API_KEY}
  base-url: https://router.huggingface.co/v1
  model-name: tiiuae/falcon-7b-instruct:hf-inference
  temperature: 0.7

Java Configuration Class

Before:

@Configuration
public class HuggingFaceConfig {

    @Value("${huggingface.api-key}")
    private String apiKey;

    @Bean
    public HuggingFaceChatModel chatModel() {
        return HuggingFaceChatModel.builder()
            .accessToken(apiKey)
            .modelId("tiiuae/falcon-7b-instruct")
            .build();
    }
}

After:

@Configuration
public class OpenAiConfig {

    @Value("${openai.api-key}")
    private String apiKey;

    @Bean
    public OpenAiChatModel chatModel() {
        return OpenAiChatModel.builder()
            .apiKey(apiKey)
            .baseUrl("https://router.huggingface.co/v1")
            .modelName("tiiuae/falcon-7b-instruct:hf-inference")
            .build();
    }
}

Common Migration Issues

Issue 1: Model Name Format

Problem: Model not found with old format

Solution: Add :hf-inference suffix

// ❌ Wrong
.modelName("tiiuae/falcon-7b-instruct")

// ✅ Correct
.modelName("tiiuae/falcon-7b-instruct:hf-inference")

Issue 2: Missing Base URL

Problem: Connects to OpenAI instead of Hugging Face

Solution: Always specify base URL

// ❌ Missing baseUrl - connects to OpenAI
OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(apiKey)
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .build();

// ✅ Correct - connects to Hugging Face
OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(apiKey)
    .baseUrl("https://router.huggingface.co/v1")
    .modelName("tiiuae/falcon-7b-instruct:hf-inference")
    .build();

Issue 3: Parameter Name Changes

Problem: Compilation errors for old parameter names

Solution: Update parameter names

// ❌ Old names
.accessToken(apiKey)
.modelId("model")
.maxNewTokens(200)

// ✅ New names
.apiKey(apiKey)
.modelName("model:hf-inference")
.maxTokens(200)

Testing Migration

Unit Tests

Before:

@Test
public void testChatModel() {
    HuggingFaceChatModel model = HuggingFaceChatModel.builder()
        .accessToken("test-key")
        .build();

    String response = model.chat("test");
    assertNotNull(response);
}

After:

@Test
public void testChatModel() {
    OpenAiChatModel model = OpenAiChatModel.builder()
        .apiKey("test-key")
        .baseUrl("https://router.huggingface.co/v1")
        .modelName("test-model:hf-inference")
        .build();

    String response = model.chat("test");
    assertNotNull(response);
}

Migration Checklist

  • Add langchain4j-open-ai dependency
  • Update all imports
  • Change HuggingFaceChatModel to OpenAiChatModel
  • Change HuggingFaceLanguageModel to OpenAiChatModel
  • Update parameter names (accessToken → apiKey, etc.)
  • Add baseUrl("https://router.huggingface.co/v1")
  • Add :hf-inference suffix to model names
  • Update configuration files (YAML, properties)
  • Update environment variables (if needed)
  • Run tests to verify functionality
  • Update documentation

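To confirm the checklist is complete, you can scan source text for leftover references to the deprecated classes. A minimal sketch under the assumption that a simple substring match is good enough; the DeprecationScan class and its method are made up for illustration:

```java
import java.util.List;

// Hypothetical check: does a piece of source still reference deprecated classes?
public class DeprecationScan {

    static final List<String> DEPRECATED = List.of(
            "HuggingFaceChatModel",
            "HuggingFaceLanguageModel",
            "HuggingFaceModelName");

    static boolean usesDeprecatedApi(String source) {
        // A substring match is crude but catches imports and usages alike.
        return DEPRECATED.stream().anyMatch(source::contains);
    }

    public static void main(String[] args) {
        String migrated = "import dev.langchain4j.model.openai.OpenAiChatModel;";
        String legacy = "import dev.langchain4j.model.huggingface.HuggingFaceChatModel;";
        System.out.println(usesDeprecatedApi(migrated)); // prints: false
        System.out.println(usesDeprecatedApi(legacy));   // prints: true
    }
}
```

In practice you would feed each .java file's contents through this check, or simply grep your source tree for the three class names.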
Rollback Plan

If migration causes issues, you can temporarily continue using deprecated classes:

// Still works, but deprecated
@SuppressWarnings("deprecation")
HuggingFaceChatModel model = HuggingFaceChatModel.builder()
    .accessToken(apiKey)
    .modelId("model")
    .build();

Note: Deprecated classes will be removed in a future version.

Benefits of Migration

  1. Long-term Support: OpenAI module is actively maintained
  2. More Features: Access to full OpenAI API features
  3. Better Compatibility: Standard OpenAI-compatible interface
  4. Community Support: Larger user base and better documentation
  5. Future-proof: Won't be removed in future versions

Embeddings (No Migration Needed)

HuggingFaceEmbeddingModel is NOT deprecated and requires no migration:

// ✅ Still supported - no changes needed
HuggingFaceEmbeddingModel model = HuggingFaceEmbeddingModel.builder()
    .accessToken(System.getenv("HF_API_KEY"))
    .modelId("sentence-transformers/all-MiniLM-L6-v2")
    .build();

Related Documentation

  • Quick Start Guide - Getting started with new approach
  • Configuration Guide - Configuration options
  • Error Handling - Error scenarios
  • Chat Model API - Old API (deprecated)
  • Language Model API - Old API (deprecated)

Install with Tessl CLI

npx tessl i tessl/maven-dev-langchain4j--langchain4j-hugging-face@1.11.0
