jbaruch/quarkus-langchain4j

Build Quarkus applications with LangChain4j extensions - project setup, CDI services, REST endpoints, MCP, agentic, and dev mode


name: quarkus-langchain4j
description: Build Quarkus applications with LangChain4j extensions. Use when creating Quarkus projects with AI capabilities, configuring quarkus-langchain4j extensions, using RegisterAiService, adding MCP support, agentic workflows, or Quarkus dev mode for AI apps.

Quarkus LangChain4j Skill

This skill covers the Quarkus integration layer for LangChain4j. The LangChain4j AI APIs themselves (prompts, memory, chains) live in a separate tile.

Quick-Start Workflow

  1. Create project — quarkus create app with LangChain4j extension (see §1)
  2. Configure API key — set quarkus.langchain4j.<provider>.api-key in application.properties (see §6)
  3. Verify build compiles — mvn compile; fix any missing extension or BOM errors before continuing
  4. Create AI service — define interface with @RegisterAiService (see §2)
  5. Inject and expose — @Inject the AI service into a REST resource
  6. Run dev mode — quarkus dev; verify Dev UI shows your AI service at http://localhost:8080/q/dev-ui before testing endpoints
  7. Test — interact via Dev UI or HTTP endpoint; confirm tool/MCP connections in the Dev UI LangChain4j card
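Steps 4 and 5 above can be sketched as a minimal AI service plus REST resource. This is an illustrative sketch, not part of the skill itself; the class names, path, and package are assumptions:

```java
package com.example;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.inject.Inject;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;

// Step 4: the annotated interface becomes a CDI bean
@RegisterAiService
interface ChatService {
    @SystemMessage("You are a helpful assistant.")
    String chat(@UserMessage String message);
}

// Step 5: inject the AI service into a REST resource
@Path("/chat")
public class ChatResource {

    @Inject
    ChatService service;

    @POST
    public String chat(String message) {
        return service.chat(message);
    }
}
```

With dev mode running, exercise it with e.g. curl -X POST localhost:8080/chat -H 'Content-Type: text/plain' -d 'Hello'.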

1. Project Setup

Create project

quarkus create app com.example:my-ai-app \
  --extension='rest,rest-jackson,quarkus-langchain4j-anthropic'

Or add extensions to existing project:

quarkus ext add quarkus-langchain4j-anthropic quarkus-langchain4j-mcp

Maven BOM (pom.xml)

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.quarkiverse.langchain4j</groupId>
      <artifactId>quarkus-langchain4j-bom</artifactId>
      <version>1.8.4</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

All quarkus-langchain4j extensions share the same version. Always use the BOM.

Key extension artifacts (groupId: io.quarkiverse.langchain4j)

  • quarkus-langchain4j-anthropic - Claude models
  • quarkus-langchain4j-openai - OpenAI / compatible APIs
  • quarkus-langchain4j-ollama - Local Ollama models
  • quarkus-langchain4j-mcp - MCP client support
  • quarkus-langchain4j-agentic - Agent workflows
  • quarkus-langchain4j-pgvector - PgVector embeddings

2. AI Service Registration

@RegisterAiService on an interface produces a CDI bean — inject with @Inject anywhere.

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface MyAiService {

    @SystemMessage("You are a helpful assistant.")
    String chat(@UserMessage String message);
}

Key parameters

  • modelName - Named model config (default: "<default>")
  • tools - Array of CDI tool bean classes
  • toolProviderSupplier - Dynamic tool supplier
  • chatMemoryProviderSupplier - Custom memory provider
  • maxSequentialToolInvocations - Tool call limit per request (default 10)
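For illustration, several of these parameters combined on one service (the Calculator tool class and the "smart" named-model config are assumed to be defined elsewhere):

```java
@RegisterAiService(
    modelName = "smart",
    tools = {Calculator.class},
    maxSequentialToolInvocations = 5)
public interface BoundedAssistant {
    String chat(@UserMessage String msg);
}
```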

Named models

@RegisterAiService(modelName = "smart")
public interface SmartService { /* ... */ }

@RegisterAiService(modelName = "fast")
public interface FastService { /* ... */ }

Memory with user sessions

@RegisterAiService
public interface ChatBot {
    String chat(@MemoryId String sessionId, @UserMessage String msg);
}

Default memory: MessageWindowChatMemory with 10-message window.
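If the default window does not fit, the memory type and window size can typically be tuned in application.properties. These property names are an assumption based on the quarkus-langchain4j configuration reference; verify them against your version:

```properties
quarkus.langchain4j.chat-memory.type=message-window
quarkus.langchain4j.chat-memory.memory-window.max-messages=20
```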

3. Tool Calling in Quarkus

Tools are CDI bean methods annotated with @Tool.

@ApplicationScoped
public class Calculator {

    @Tool("Calculates the square root of a number")
    double sqrt(double number) {
        return Math.sqrt(number);
    }
}

Registering tools with AI services

Service-level (all methods get these tools):

@RegisterAiService(tools = {Calculator.class, WebSearch.class})

Method-level with @ToolBox:

@RegisterAiService
public interface Assistant {
    @ToolBox(Calculator.class)
    String chatWithCalc(@UserMessage String msg);
}

Execution model annotations on tool methods

  • @Blocking -- runs on worker thread (default for sync methods)
  • @NonBlocking -- runs on event loop; must not block
  • @RunOnVirtualThread -- runs on virtual thread (Java 21+)

Methods returning Uni or CompletionStage are automatically non-blocking.
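As a sketch of an automatically non-blocking tool, a method can return Mutiny's Uni and build its result lazily; the tool itself is hypothetical:

```java
import dev.langchain4j.agent.tool.Tool;
import io.smallrye.mutiny.Uni;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class AsyncTools {

    // Returning Uni makes the tool non-blocking without any annotation,
    // so the body must never perform blocking calls.
    @Tool("Uppercases a message")
    Uni<String> shout(String message) {
        return Uni.createFrom().item(() -> message.toUpperCase());
    }
}
```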

Tool guardrails

  • @ToolInputGuardrails -- validates parameters before tool execution
  • @ToolOutputGuardrails -- filters/transforms tool results after execution

4. MCP in Quarkus

Add quarkus-langchain4j-mcp extension. Configure MCP servers in application.properties.

STDIO transport (local process)

quarkus.langchain4j.mcp.myserver.transport-type=stdio
quarkus.langchain4j.mcp.myserver.command=npx,-y,@modelcontextprotocol/server-filesystem,/tmp

HTTP transport (remote)

quarkus.langchain4j.mcp.remote.transport-type=streamable-http
quarkus.langchain4j.mcp.remote.url=https://mcp.example.com/mcp

Using MCP tools in AI services

@RegisterAiService
public interface FileAssistant {

    @SystemMessage("You help users manage files.")
    @McpToolBox("myserver")
    String chat(@UserMessage String message);
}

@McpToolBox with no value uses all configured MCP servers. Pass an array for multiple: @McpToolBox({"github", "filesystem"}).

Verify connection: In quarkus dev, open Dev UI → LangChain4j card → MCP Clients. If a server shows as disconnected, check the command path or URL and review the Quarkus log for connection errors.

Programmatic MCP client access

@Inject
@McpClientName("github")
McpClient githubClient;

Global MCP properties

quarkus.langchain4j.mcp.enabled=true
quarkus.langchain4j.mcp.generate-tool-provider=true
quarkus.langchain4j.mcp.health.enabled=true

5. Agentic Workflows in Quarkus

Add quarkus-langchain4j-agentic extension. Agents are interfaces with exactly one method annotated @Agent.

public interface AnalysisAgent {

    @SystemMessage("You analyze data and produce reports.")
    @Agent(outputKey = "analysis", description = "Data analysis specialist")
    @ToolBox(DataTools.class)
    String analyze(String rawData);
}

Workflow composition patterns

Sequential — agents run one after another:

@SequenceAgent(outputKey = "result",
    subAgents = {AnalysisAgent.class, SummaryAgent.class})
MyResult processData(String input);

Parallel — agents run simultaneously:

@ParallelAgent(subAgents = {AgentA.class, AgentB.class})

Conditional — agents run only if activation condition is met:

@ConditionalAgent(subAgents = {AlertAgent.class})

Loop — agents repeat until a condition is met.

Shared state with AgenticScope

Workflow inputs populate AgenticScope. Each agent's outputKey stores its result in the scope for subsequent agents.

The @Output method extracts final results; parameter names match agent outputKey values:

@Output
static MyResult output(String analysis, String summary) {
    return new MyResult(analysis, summary);
}
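Putting the fragments above together, a sequential workflow interface might look like this sketch (assembled from the examples in this section; annotation placement may vary, so verify against your quarkus-langchain4j-agentic version):

```java
public interface ReportWorkflow {

    // Runs AnalysisAgent then SummaryAgent; their outputKey values
    // ("analysis", "summary") are stored in the shared AgenticScope
    @SequenceAgent(outputKey = "result",
        subAgents = {AnalysisAgent.class, SummaryAgent.class})
    MyResult processData(String input);

    // Parameter names match the agents' outputKey values
    @Output
    static MyResult output(String analysis, String summary) {
        return new MyResult(analysis, summary);
    }
}
```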

6. Configuration

Model provider config (application.properties)

# Anthropic
quarkus.langchain4j.anthropic.api-key=${ANTHROPIC_API_KEY}
quarkus.langchain4j.anthropic.chat-model.model-name=claude-sonnet-4-20250514

# OpenAI
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
quarkus.langchain4j.openai.chat-model.model-name=gpt-4o

# Ollama (local)
quarkus.langchain4j.ollama.chat-model.model-id=llama3

Named model configuration

quarkus.langchain4j.smart.chat-model.provider=anthropic
quarkus.langchain4j.anthropic.smart.api-key=${ANTHROPIC_API_KEY}
quarkus.langchain4j.anthropic.smart.chat-model.model-name=claude-sonnet-4-20250514

quarkus.langchain4j.fast.chat-model.provider=openai
quarkus.langchain4j.openai.fast.api-key=${OPENAI_API_KEY}

Common tuning properties

quarkus.langchain4j.anthropic.chat-model.temperature=0.7
quarkus.langchain4j.anthropic.chat-model.max-tokens=4096
quarkus.langchain4j.anthropic.log-requests=true
quarkus.langchain4j.anthropic.log-responses=true

Dev services

Quarkus auto-starts Ollama in dev mode if no API key is configured:

quarkus.langchain4j.devservices.enabled=true

7. Dev Mode

Start with:

mvn quarkus:dev
# or
quarkus dev

Features:

  • Live reload -- code changes apply instantly without restart
  • Dev UI at http://localhost:8080/q/dev-ui -- browse AI services, test tools, inspect MCP clients
  • Continuous testing -- tests re-run on code changes
  • Dev services -- auto-provisions Ollama, databases (PgVector), etc.

The LangChain4j Dev UI card lets you interact with registered AI services, view tool descriptions, and test MCP server connections directly from the browser.
