mavenpkg:maven/io.quarkiverse.langchain4j/quarkus-langchain4j-core@1.5.x

tessl install tessl/maven-io-quarkiverse-langchain4j--quarkus-langchain4j-core@1.5.0

Quarkus LangChain4j Core provides runtime integration for LangChain4j with the Quarkus framework, enabling declarative AI service creation through CDI annotations.

docs/guides/quick-start.md

Quick Start Guide

This guide walks you through creating your first AI-powered Quarkus application using LangChain4j.

Prerequisites

  • Java 17 or later
  • Maven 3.8+ or Gradle
  • Quarkus 3.2.0+
  • An OpenAI API key (or other LLM provider)

Step 1: Add Dependencies

Add Quarkus LangChain4j Core to your pom.xml:

<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-core</artifactId>
    <version>1.5.0</version>
</dependency>

<!-- Add a model provider (e.g., OpenAI) -->
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>1.5.0</version>
</dependency>
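If you build with Gradle instead, the same coordinates apply (a sketch using the Groovy DSL; adjust for your build setup):

```groovy
dependencies {
    implementation 'io.quarkiverse.langchain4j:quarkus-langchain4j-core:1.5.0'
    // Model provider (e.g., OpenAI)
    implementation 'io.quarkiverse.langchain4j:quarkus-langchain4j-openai:1.5.0'
}
```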

Step 2: Configure Your API Key

Add your API key to application.properties:

quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}

Set the environment variable:

export OPENAI_API_KEY=your-api-key-here

Step 3: Create an AI Service

Create a simple AI service interface:

package com.example;

import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.UserMessage;

@RegisterAiService
public interface AssistantService {
    
    @UserMessage("What is the capital of {country}?")
    String getCapital(String country);
    
    @UserMessage("Tell me a joke about {topic}")
    String tellJoke(String topic);
}

What's happening here:

  • @RegisterAiService creates a CDI bean automatically
  • @UserMessage defines the prompt template
  • {country} and {topic} are template variables that match parameter names
  • Return type String means you get the AI response as text

Step 4: Inject and Use

Inject your AI service into any CDI bean:

package com.example;

import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.QueryParam;

@Path("/assistant")
public class AssistantResource {

    @Inject
    AssistantService assistant;

    @GET
    @Path("/capital")
    public String getCapital(@QueryParam("country") String country) {
        return assistant.getCapital(country);
    }

    @GET
    @Path("/joke")
    public String getJoke(@QueryParam("topic") String topic) {
        return assistant.tellJoke(topic);
    }
}

Step 5: Run and Test

Start your application:

mvn quarkus:dev

Test the endpoints:

curl "http://localhost:8080/assistant/capital?country=France"
# Response: "The capital of France is Paris."

curl "http://localhost:8080/assistant/joke?topic=programming"
# Response: "Why do programmers prefer dark mode?..."

Going Further

Add System Messages

Provide context to guide the AI's behavior:

import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

@RegisterAiService
public interface CodeReviewer {

    @SystemMessage("You are an expert code reviewer. Provide constructive feedback.")
    @UserMessage("Review this code: {code}")
    String reviewCode(String code);
}

Add Conversation Memory

Enable the AI to remember previous messages:

import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.UserMessage;

@RegisterAiService
public interface ChatBot {

    String chat(@MemoryId String userId, @UserMessage String message);
}

// Each user gets their own conversation history
@Inject ChatBot bot;

bot.chat("user123", "My name is Alice");
bot.chat("user123", "What's my name?");  // Response: "Your name is Alice"

Add Tools (Function Calling)

Let the AI call Java methods:

import dev.langchain4j.agent.tool.Tool;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class WeatherTool {
    
    @Tool("Get current weather for a city")
    public String getWeather(String city) {
        // Call weather API
        return "Current weather in " + city + ": Sunny, 22°C";
    }
}

@RegisterAiService(tools = WeatherTool.class)
public interface WeatherAssistant {
    String chat(String message);
}

// Usage
@Inject WeatherAssistant assistant;

assistant.chat("What's the weather in Paris?");
// The model decides to call getWeather("Paris") and uses the result in its reply

Configure Different Models

Use multiple models in the same application:

@RegisterAiService(modelName = "gpt-4")
public interface AdvancedAssistant {
    String chat(String message);
}

@RegisterAiService(modelName = "gpt-3.5-turbo")
public interface QuickAssistant {
    String chat(String message);
}

Configure in application.properties:

quarkus.langchain4j.openai.gpt-4.api-key=${OPENAI_API_KEY}
quarkus.langchain4j.openai.gpt-4.model-name=gpt-4
quarkus.langchain4j.openai.gpt-4.temperature=0.7

quarkus.langchain4j.openai.gpt-3-5-turbo.api-key=${OPENAI_API_KEY}
quarkus.langchain4j.openai.gpt-3-5-turbo.model-name=gpt-3.5-turbo
quarkus.langchain4j.openai.gpt-3-5-turbo.temperature=0.3

Add Streaming Responses

Stream responses for better user experience:

import io.smallrye.mutiny.Multi;

@RegisterAiService
public interface StreamingAssistant {
    
    Multi<String> chatStreaming(String message);
}

// Usage
@Inject StreamingAssistant assistant;

assistant.chatStreaming("Tell me a long story")
    .subscribe().with(
        chunk -> System.out.print(chunk),  // Print each chunk as it arrives
        error -> System.err.println("Error: " + error),
        () -> System.out.println("\nComplete")
    );

Common Configuration Options

# Logging
quarkus.langchain4j.log-requests=true
quarkus.langchain4j.log-responses=true

# Timeouts
quarkus.langchain4j.timeout=60s

# Model parameters
quarkus.langchain4j.temperature=0.7
quarkus.langchain4j.openai.gpt-4.max-tokens=2000

# Chat memory
quarkus.langchain4j.chat-memory.type=in-memory
quarkus.langchain4j.chat-memory.max-messages=100

Troubleshooting

"AI Service not injected"

  • Ensure interface has @RegisterAiService annotation
  • Interface must be public
  • Interface must be in a package scanned by Quarkus

"No model found"

  • Check API key is configured correctly
  • Verify model name matches configuration
  • Check logs for initialization errors

"Template variable not found"

  • Ensure {variableName} matches parameter name exactly
  • If using Java 8+, compile with -parameters flag
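To check whether your build actually retains parameter names, you can run a small stdlib-only reflection probe (the nested Assistant interface here is a hypothetical stand-in for your AI service, not a real LangChain4j type). Template variables like {country} can only bind by name if the name survives compilation:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

public class ParamNameCheck {

    // Hypothetical mirror of an AI service method signature
    interface Assistant {
        String getCapital(String country);
    }

    static String firstParamName() throws NoSuchMethodException {
        Method m = Assistant.class.getMethod("getCapital", String.class);
        Parameter p = m.getParameters()[0];
        return p.getName();
    }

    public static void main(String[] args) throws Exception {
        // Prints "country" when compiled with -parameters,
        // a synthetic name like "arg0" otherwise
        System.out.println(firstParamName());
    }
}
```

If this prints a synthetic name such as arg0, template binding by parameter name will fail and the flag is missing from your compiler configuration.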

What You've Learned

✓ How to add Quarkus LangChain4j to your project
✓ How to create declarative AI services
✓ How to inject and use AI services
✓ How to add conversation memory
✓ How to enable function calling with tools
✓ How to configure multiple models
✓ How to stream responses

Next Steps

  • Real-World Scenarios - See complete examples
  • Tool Guardrails - Add validation and security
  • Observability - Monitor your AI services
  • RAG Guide - Add knowledge retrieval