# Build Quarkus applications with LangChain4j extensions - project setup, CDI services, REST endpoints, MCP, agentic, and dev mode
This skill covers the Quarkus integration layer for LangChain4j. The LangChain4j AI APIs themselves (prompts, memory, chains) live in a separate tile.
## 1. Quick Start

1. `quarkus create app` with a LangChain4j extension (see §2)
2. Set `quarkus.langchain4j.<provider>.api-key` in `application.properties` (see §7)
3. `mvn compile`; fix any missing extension or BOM errors before continuing
4. Define an interface annotated with `@RegisterAiService` (see §3)
5. `@Inject` the AI service into a REST resource
6. `quarkus dev`; verify the Dev UI shows your AI service at http://localhost:8080/q/dev-ui before testing endpoints

## 2. Project Setup

```shell
quarkus create app com.example:my-ai-app \
    --extension='rest,rest-jackson,quarkus-langchain4j-anthropic'
```

Or add extensions to an existing project:

```shell
quarkus ext add quarkus-langchain4j-anthropic quarkus-langchain4j-mcp
```

Pin versions with the BOM:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.quarkiverse.langchain4j</groupId>
      <artifactId>quarkus-langchain4j-bom</artifactId>
      <version>1.8.4</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

All quarkus-langchain4j extensions share the same version. Always use the BOM.
Available extensions (groupId `io.quarkiverse.langchain4j`):

- `quarkus-langchain4j-anthropic` - Claude models
- `quarkus-langchain4j-openai` - OpenAI / compatible APIs
- `quarkus-langchain4j-ollama` - local Ollama models
- `quarkus-langchain4j-mcp` - MCP client support
- `quarkus-langchain4j-agentic` - agent workflows
- `quarkus-langchain4j-pgvector` - PgVector embeddings

## 3. AI Services

`@RegisterAiService` on an interface produces a CDI bean; inject it with `@Inject` anywhere.

```java
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface MyAiService {

    @SystemMessage("You are a helpful assistant.")
    String chat(@UserMessage String message);
}
```

`@RegisterAiService` parameters:

| Parameter | Purpose |
|---|---|
| `modelName` | Named model config (default: `"<default>"`) |
| `tools` | Array of CDI tool bean classes |
| `toolProviderSupplier` | Dynamic tool supplier |
| `chatMemoryProviderSupplier` | Custom memory provider |
| `maxSequentialToolInvocations` | Tool call limit per request (default 10) |
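Step 5 of the quick start is exposing the service over HTTP. A minimal sketch (the resource class, path, and method names here are illustrative, not part of the extension's API):

```java
import jakarta.inject.Inject;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;

// Hypothetical REST resource wrapping the AI service
@Path("/chat")
public class ChatResource {

    @Inject
    MyAiService ai; // CDI bean generated from @RegisterAiService

    @POST
    public String chat(String message) {
        // Sync return type, so this runs @Blocking on a worker thread
        return ai.chat(message);
    }
}
```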
Bind services to different named model configurations (see §7):

```java
@RegisterAiService(modelName = "smart")
public interface SmartService { /* ... */ }

@RegisterAiService(modelName = "fast")
public interface FastService { /* ... */ }
```

Per-conversation memory is keyed with `@MemoryId`:

```java
@RegisterAiService
public interface ChatBot {
    String chat(@MemoryId String sessionId, @UserMessage String msg);
}
```

Default memory: `MessageWindowChatMemory` with a 10-message window.
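A sketch of tuning that window in `application.properties` (property names follow the quarkus-langchain4j configuration reference; verify them against your installed version):

```properties
# Keep the last 20 messages per @MemoryId instead of the default 10
quarkus.langchain4j.chat-memory.type=message-window
quarkus.langchain4j.chat-memory.memory-window.max-messages=20
```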
## 4. Tools

Tools are CDI bean methods annotated with `@Tool`.

```java
import dev.langchain4j.agent.tool.Tool;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class Calculator {

    @Tool("Calculates the square root of a number")
    double sqrt(double number) {
        return Math.sqrt(number);
    }
}
```

Service-level (all methods get these tools):

```java
@RegisterAiService(tools = {Calculator.class, WebSearch.class})
```

Method-level with `@ToolBox`:

```java
@RegisterAiService
public interface Assistant {

    @ToolBox(Calculator.class)
    String chatWithCalc(@UserMessage String msg);
}
```

Tool execution model:

- `@Blocking` -- runs on worker thread (default for sync methods)
- `@NonBlocking` -- runs on event loop; must not block
- `@RunOnVirtualThread` -- runs on virtual thread (Java 21+)

Methods returning `Uni` or `CompletionStage` are automatically non-blocking.
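For example, an I/O-bound tool can opt into a virtual thread instead of tying up a worker (a sketch; `QuoteTools` and its lookup logic are hypothetical):

```java
import dev.langchain4j.agent.tool.Tool;
import io.smallrye.common.annotation.RunOnVirtualThread;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class QuoteTools {

    // Blocking I/O is cheap on a virtual thread (Java 21+)
    @Tool("Fetches the latest quote for a ticker symbol")
    @RunOnVirtualThread
    double fetchQuote(String ticker) {
        // ... a blocking HTTP call to a quote service would go here ...
        return 0.0; // placeholder
    }
}
```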
Tool guardrails:

- `@ToolInputGuardrails` -- validates parameters before tool execution
- `@ToolOutputGuardrails` -- filters/transforms tool results after execution

## 5. MCP

Add the `quarkus-langchain4j-mcp` extension. Configure MCP servers in `application.properties`.

```properties
# Local server over stdio
quarkus.langchain4j.mcp.myserver.transport-type=stdio
quarkus.langchain4j.mcp.myserver.command=npx,-y,@modelcontextprotocol/server-filesystem,/tmp

# Remote server over streamable HTTP
quarkus.langchain4j.mcp.remote.transport-type=streamable-http
quarkus.langchain4j.mcp.remote.url=https://mcp.example.com/mcp
```

```java
@RegisterAiService
public interface FileAssistant {

    @SystemMessage("You help users manage files.")
    @McpToolBox("myserver")
    String chat(@UserMessage String message);
}
```

`@McpToolBox` with no value uses all configured MCP servers. Pass an array for multiple: `@McpToolBox({"github", "filesystem"})`.
Verify the connection: in `quarkus dev`, open Dev UI → LangChain4j card → MCP Clients. If a server shows as disconnected, check the command path or URL and review the Quarkus log for connection errors.
Inject a specific MCP client directly:

```java
@Inject
@McpClientName("github")
McpClient githubClient;
```

Global MCP settings:

```properties
quarkus.langchain4j.mcp.enabled=true
quarkus.langchain4j.mcp.generate-tool-provider=true
quarkus.langchain4j.mcp.health.enabled=true
```

## 6. Agentic Workflows

Add the `quarkus-langchain4j-agentic` extension. Agents are interfaces with exactly one method annotated `@Agent`.

```java
public interface AnalysisAgent {

    @SystemMessage("You analyze data and produce reports.")
    @Agent(outputKey = "analysis", description = "Data analysis specialist")
    @ToolBox(DataTools.class)
    String analyze(String rawData);
}
```

Sequential -- agents run one after another:

```java
@SequenceAgent(outputKey = "result",
        subAgents = {AnalysisAgent.class, SummaryAgent.class})
MyResult processData(String input);
```

Parallel -- agents run simultaneously:

```java
@ParallelAgent(subAgents = {AgentA.class, AgentB.class})
```

Conditional -- agents run only if their activation condition is met:

```java
@ConditionalAgent(subAgents = {AlertAgent.class})
```

Loop -- agents repeat until a condition is met.

Workflow inputs populate the `AgenticScope`. Each agent's `outputKey` stores its result in the scope for subsequent agents.
The `@Output` method extracts the final result; parameter names match agent `outputKey` values:

```java
@Output
static MyResult output(String analysis, String summary) {
    return new MyResult(analysis, summary);
}
```

## 7. Configuration

```properties
# Anthropic
quarkus.langchain4j.anthropic.api-key=${ANTHROPIC_API_KEY}
quarkus.langchain4j.anthropic.chat-model.model-name=claude-sonnet-4-20250514

# OpenAI
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
quarkus.langchain4j.openai.chat-model.model-name=gpt-4o

# Ollama (local)
quarkus.langchain4j.ollama.chat-model.model-id=llama3
```

Named model configuration (matches `modelName` on the service):

```properties
quarkus.langchain4j.smart.chat-model.provider=anthropic
quarkus.langchain4j.anthropic.smart.api-key=${ANTHROPIC_API_KEY}
quarkus.langchain4j.anthropic.smart.chat-model.model-name=claude-sonnet-4-20250514

quarkus.langchain4j.fast.chat-model.provider=openai
quarkus.langchain4j.openai.fast.api-key=${OPENAI_API_KEY}
```

Model tuning and request logging:

```properties
quarkus.langchain4j.anthropic.chat-model.temperature=0.7
quarkus.langchain4j.anthropic.chat-model.max-tokens=4096
quarkus.langchain4j.anthropic.log-requests=true
quarkus.langchain4j.anthropic.log-responses=true
```

## 8. Dev Mode

Quarkus auto-starts Ollama in dev mode if no API key is configured:

```properties
quarkus.langchain4j.devservices.enabled=true
```

Start with:

```shell
mvn quarkus:dev
# or
quarkus dev
```

Features:

- http://localhost:8080/q/dev-ui -- browse AI services, test tools, inspect MCP clients

The LangChain4j Dev UI card lets you interact with registered AI services, view tool descriptions, and test MCP server connections directly from the browser.
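Once dev mode is running, you can also exercise an endpoint from the command line (assuming a hypothetical `/chat` resource accepting a plain-text POST, which this skill does not define):

```shell
curl -X POST -H 'Content-Type: text/plain' \
     -d 'What is Quarkus?' \
     http://localhost:8080/chat
```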