
tessl/maven-dev-langchain4j--langchain4j

Build LLM-powered applications in Java with support for chatbots, agents, RAG, tools, and much more


Service Provider Interfaces

Service Provider Interfaces (SPIs) for customization and framework integration. They allow downstream frameworks such as Quarkus and Spring Boot to provide custom implementations of core LangChain4j services.

Overview

LangChain4j uses Java's ServiceLoader mechanism to enable framework integrations without creating hard dependencies. SPIs allow:

  • Framework Integration: Seamless integration with Spring Boot, Quarkus, Micronaut
  • Customization: Replace default implementations with optimized versions
  • Extension: Add framework-specific features without modifying core library
  • Decoupling: Core library remains framework-agnostic

ServiceLoader Basics

Java's ServiceLoader discovers implementations at runtime:

  1. Interface defined in core library
  2. Implementation provided in separate module
  3. META-INF/services file declares implementation
  4. ServiceLoader.load() discovers and instantiates
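
The four steps above fit in a small, self-contained sketch. GreeterFactory and its default implementation are hypothetical stand-ins, not LangChain4j types; with no provider registered in META-INF/services, the loader finds nothing and the code falls back to the default:

```java
import java.util.ServiceLoader;

// Hypothetical SPI interface, standing in for a LangChain4j factory interface
interface GreeterFactory {
    String greet(String name);
}

// Default implementation used when no provider is registered on the classpath
class DefaultGreeterFactory implements GreeterFactory {
    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}

public class ServiceLoaderDemo {

    // The common lookup pattern: first registered provider wins,
    // otherwise fall back to the built-in default
    static GreeterFactory load() {
        return ServiceLoader.load(GreeterFactory.class)
            .findFirst()  // Java 9+; empty without a META-INF/services entry
            .orElseGet(DefaultGreeterFactory::new);
    }

    public static void main(String[] args) {
        // No provider is registered here, so the default is used
        System.out.println(load().greet("LangChain4j"));
    }
}
```

Adding a META-INF/services/GreeterFactory file naming another implementation would change what load() returns, with no change to this code.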

Capabilities

AI Services SPI

SPIs for customizing AI service creation and configuration.

package dev.langchain4j.spi.services;

/**
 * SPI factory interface for creating AI service contexts
 * Allows frameworks to customize AI service context creation
 */
public interface AiServiceContextFactory {
    /**
     * Create AI service context
     * @param config Configuration for the context
     * @return AiServiceContext instance
     */
    AiServiceContext create(AiServiceConfig config);
}

/**
 * SPI factory interface for creating AI services
 * Allows frameworks to customize how AI service implementations are created
 */
public interface AiServicesFactory {
    /**
     * Create AI service implementation
     * @param context AI service context
     * @return AI service implementation
     */
    <T> T create(AiServiceContext context);
}

/**
 * SPI adapter interface for token streams
 * Allows frameworks to customize token stream handling
 */
public interface TokenStreamAdapter {
    /**
     * Adapt token stream to framework-specific implementation
     * @param tokenStream Original token stream
     * @return Adapted token stream
     */
    TokenStream adapt(TokenStream tokenStream);
}

Thread Safety:

  • Factory implementations MUST be thread-safe (called concurrently)
  • create() methods should be stateless or properly synchronized
  • Returned objects' thread safety depends on implementation
  • A ServiceLoader instance caches the providers it instantiates (effectively singletons per loader)

Common Pitfalls:

  • DO NOT maintain mutable state in factory implementations
  • DO NOT assume single-threaded usage
  • DO NOT throw checked exceptions (wrap in RuntimeException)
  • DO NOT perform expensive initialization in constructor (lazy init instead)
  • DO NOT forget to register in META-INF/services

Edge Cases:

  • Multiple SPI implementations: First found is used (order undefined)
  • No SPI implementation: Default core implementation is used
  • SPI implementation throws exception: Falls back to default
  • ClassLoader isolation: Each classloader has separate ServiceLoader cache

Exception Handling:

  • SPI methods should throw RuntimeException for errors
  • LangChain4j catches and logs SPI failures, falls back to default
  • Critical errors should fail fast with clear error messages

Performance Notes:

  • ServiceLoader caches discovered implementations
  • Factory create() methods called per AI service instance
  • Keep factory operations lightweight (avoid I/O, heavy computation)
  • Consider lazy initialization for expensive resources
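
The lazy-initialization advice can be sketched with the initialization-on-demand holder idiom; all names here are illustrative, not part of the LangChain4j API:

```java
// Sketch of a stateless, thread-safe SPI factory that defers expensive setup
// with the initialization-on-demand holder idiom
public class LazyInitFactory {

    // Stands in for an expensive shared resource (HTTP client, codec, pool, ...)
    static class ExpensiveResource {
        ExpensiveResource() {
            // heavy setup would happen here, only on first use
        }
    }

    // The JVM initializes the holder class at most once, on first access,
    // with no explicit locking needed
    private static class Holder {
        static final ExpensiveResource INSTANCE = new ExpensiveResource();
    }

    // create() stays lightweight; nothing expensive runs until first call
    public ExpensiveResource create() {
        return Holder.INSTANCE;
    }
}
```

This keeps the factory's constructor trivial (important, since ServiceLoader instantiates it eagerly) while sharing one resource across all create() calls.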

Usage Example:

// Custom factory implementation
public class SpringAiServiceContextFactory implements AiServiceContextFactory {
    @Override
    public AiServiceContext create(AiServiceConfig config) {
        // Integrate with Spring
        return new SpringAwareAiServiceContext(config);
    }
}

// Register in META-INF/services/dev.langchain4j.spi.services.AiServiceContextFactory:
// com.mycompany.SpringAiServiceContextFactory

Integration Benefits:

  • Spring: Dependency injection for tools, transaction management
  • Quarkus: Build-time optimization, native image support
  • Micronaut: Compile-time DI, reactive streams
  • Custom: Any framework-specific behavior

Related APIs:

  • AiServices - Main API that uses these SPIs
  • AiServiceContext - Context object passed to services
  • AiServiceConfig - Configuration for service creation

Guardrail Services SPI

SPI for customizing guardrail service creation.

package dev.langchain4j.service.guardrail.spi;

/**
 * SPI factory interface for creating guardrail service builders
 * Allows frameworks to provide custom guardrail implementations
 */
public interface GuardrailServiceBuilderFactory {
    /**
     * Create guardrail service builder
     * @return GuardrailServiceBuilder instance
     */
    GuardrailServiceBuilder create();
}

Thread Safety:

  • Factory implementation MUST be thread-safe
  • create() can be called concurrently
  • Returned builder does NOT need to be thread-safe (single-use)

Common Pitfalls:

  • DO NOT return same builder instance (create new each time)
  • DO NOT assume create() is called once
  • DO NOT perform heavy initialization in create()

Edge Cases:

  • Multiple implementations: First found is used
  • No implementation: Default builder is used
  • Builder creation failure: Falls back to default

Exception Handling:

  • Throw RuntimeException for unrecoverable errors
  • Framework catches and logs, falls back to default

Performance Notes:

  • create() called once per guardrail configuration
  • Keep builder creation lightweight

Usage Example:

// Custom guardrail factory
public class CustomGuardrailFactory implements GuardrailServiceBuilderFactory {
    @Override
    public GuardrailServiceBuilder create() {
        return new CustomGuardrailBuilder();
    }
}

// Register in META-INF/services/dev.langchain4j.service.guardrail.spi.GuardrailServiceBuilderFactory:
// com.mycompany.CustomGuardrailFactory

Use Cases:

  • Custom content filtering
  • Framework-specific moderation services
  • Policy enforcement integration
  • Compliance logging

Related APIs:

  • @Moderate - Annotation that triggers guardrails
  • GuardrailService - Service interface for guardrails
  • ModerationException - Exception thrown by guardrails

Embedding Store SPI

SPI for customizing embedding store JSON serialization.

package dev.langchain4j.spi.store.embedding.inmemory;

/**
 * SPI factory interface for creating JSON codecs for in-memory embedding store
 * Allows custom serialization/deserialization implementations
 */
public interface InMemoryEmbeddingStoreJsonCodecFactory {
    /**
     * Create JSON codec instance
     * @return InMemoryEmbeddingStoreJsonCodec instance
     */
    InMemoryEmbeddingStoreJsonCodec create();
}

Thread Safety:

  • Factory MUST be thread-safe
  • Returned codec MUST be thread-safe (used concurrently)
  • Serialization/deserialization must handle concurrent calls

Common Pitfalls:

  • DO NOT use non-thread-safe JSON libraries without synchronization
  • DO NOT assume single-threaded serialization
  • DO NOT lose data during round-trip (serialize → deserialize)
  • DO NOT change format between versions (backward compatibility)

Edge Cases:

  • Large embedding stores (GB+): Consider streaming
  • Corrupted JSON: Throw clear exception
  • Version mismatches: Handle gracefully or fail clearly
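
One way to satisfy the last two points is to prefix the serialized payload with a format version and fail loudly on mismatch. A minimal sketch; the envelope scheme is illustrative, not part of the codec interface:

```java
// Sketch: prefix serialized payloads with a format version so a codec can fail
// clearly on corruption or version mismatch instead of silently mis-parsing
public class VersionedPayload {

    private static final String VERSION = "v1";

    public static String wrap(String json) {
        return VERSION + ":" + json;
    }

    public static String unwrap(String payload) {
        int sep = payload.indexOf(':');
        if (sep < 0) {
            // Corrupted input: fail with a clear message
            throw new IllegalArgumentException("Corrupted payload: missing version header");
        }
        String version = payload.substring(0, sep);
        if (!VERSION.equals(version)) {
            // Version mismatch: fail clearly rather than mis-parse
            throw new IllegalArgumentException(
                "Unsupported payload version '" + version + "', expected '" + VERSION + "'");
        }
        return payload.substring(sep + 1);
    }
}
```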

Exception Handling:

  • Throw IOException for I/O errors
  • Throw RuntimeException for format errors
  • Provide clear error messages with context

Performance Notes:

  • Serialization can be slow for large stores (millions of embeddings)
  • Consider compression (gzip) for large stores
  • Incremental serialization for very large stores
  • Binary formats (e.g., protobuf) faster than JSON
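
Compression can be layered on top of any codec without changing its JSON format. A small sketch using the JDK's gzip streams; the method names are illustrative, not part of the LangChain4j codec interface:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Sketch: gzip wrapper that can be layered around any JSON codec's output
public class GzipJson {

    public static byte[] compress(String json) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(json.getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }

    public static String decompress(byte[] compressed) {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

JSON with many repeated float values compresses well, so this often shrinks snapshots substantially at modest CPU cost.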

Usage Example:

// Custom codec using Gson
public class GsonCodecFactory implements InMemoryEmbeddingStoreJsonCodecFactory {
    @Override
    public InMemoryEmbeddingStoreJsonCodec create() {
        return new GsonEmbeddingStoreCodec();
    }
}

class GsonEmbeddingStoreCodec implements InMemoryEmbeddingStoreJsonCodec {
    private final Gson gson = new GsonBuilder()
        .setPrettyPrinting()
        .create();

    @Override
    public String toJson(InMemoryEmbeddingStore<?> store) {
        return gson.toJson(store);
    }

    @Override
    public InMemoryEmbeddingStore<?> fromJson(String json) {
        return gson.fromJson(json, InMemoryEmbeddingStore.class);
    }
}

// Register in META-INF/services/dev.langchain4j.spi.store.embedding.inmemory.InMemoryEmbeddingStoreJsonCodecFactory:
// com.mycompany.GsonCodecFactory

Alternative Serialization Formats:

  • JSON (default): Human-readable, large size
  • Binary JSON (BSON, MessagePack): Faster, smaller
  • Protobuf: Very fast, very compact, schema required
  • Custom: Optimized for specific use cases

Related APIs:

  • InMemoryEmbeddingStore - Store being serialized
  • Embedding - Vector data in store
  • TextSegment - Metadata in store

Class Loading SPI

SPIs for class metadata and reflection customization.

package dev.langchain4j.classloading;

/**
 * Utility class for returning metadata about a class and its methods
 * Allows downstream frameworks (Quarkus, Spring) to use their own mechanisms
 * for providing this information
 */
public class ClassMetadataProvider {
    /**
     * Retrieves implementation of ClassMetadataProviderFactory via ServiceLoader
     * @return ClassMetadataProviderFactory instance
     */
    public static <MethodKey> ClassMetadataProviderFactory<MethodKey> getClassMetadataProviderFactory();
}

/**
 * Implementation of ClassMetadataProviderFactory using Java Reflection
 * Default implementation for retrieving annotations and method metadata
 */
public class ReflectionBasedClassMetadataProviderFactory implements ClassMetadataProviderFactory<Method> {
    /**
     * Get annotation from method
     * @param method Method to inspect
     * @param annotationClass Annotation class to find
     * @return Optional containing annotation if present
     */
    public <T extends Annotation> Optional<T> getAnnotation(
        Method method,
        Class<T> annotationClass
    );

    /**
     * Get annotation from class
     * @param clazz Class to inspect
     * @param annotationClass Annotation class to find
     * @return Optional containing annotation if present
     */
    public <T extends Annotation> Optional<T> getAnnotation(
        Class<?> clazz,
        Class<T> annotationClass
    );

    /**
     * Get all non-static methods on class
     * @param clazz Class to inspect
     * @return Iterable of methods
     */
    public Iterable<Method> getNonStaticMethodsOnClass(Class<?> clazz);
}

Thread Safety:

  • ClassMetadataProvider MUST be thread-safe (used concurrently)
  • Reflection operations are generally thread-safe
  • Custom implementations must ensure thread safety

Common Pitfalls:

  • DO NOT cache Method objects in static fields (ClassLoader leak risk)
  • DO NOT assume annotations are present (always use Optional)
  • DO NOT perform expensive operations (called frequently)
  • DO NOT forget inherited methods (unless intentional)

Edge Cases:

  • Bridge methods: May need special handling
  • Inherited annotations: May or may not be visible depending on @Inherited
  • Interface methods: Consider when scanning
  • Private methods: Usually excluded from non-static methods
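
The filtering these edge cases call for can be sketched with plain reflection. Impl is an illustrative target class whose generic override forces the compiler to emit a bridge method:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.List;

// Sketch of the filtering a reflection-based metadata provider typically applies:
// skip static methods and compiler-generated bridge/synthetic methods
public class MethodScanner {

    // Illustrative target: overriding Comparable<Impl> makes the compiler emit
    // a bridge method compareTo(Object) that must be filtered out
    static class Impl implements Comparable<Impl> {
        public int compareTo(Impl other) { return 0; }
        static void helper() { }
    }

    public static List<Method> nonStaticDeclaredMethods(Class<?> clazz) {
        List<Method> result = new ArrayList<>();
        for (Method m : clazz.getDeclaredMethods()) {
            if (Modifier.isStatic(m.getModifiers())) continue; // skip static methods
            if (m.isBridge() || m.isSynthetic()) continue;     // skip compiler artifacts
            result.add(m);
        }
        return result;
    }
}
```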

Exception Handling:

  • Return empty Optional for missing annotations (not exception)
  • Throw RuntimeException for reflection errors
  • Provide clear error messages for security exceptions

Performance Notes:

  • Reflection is relatively slow - cache results when possible
  • Method scanning is O(n) where n is number of methods
  • Annotation lookup can trigger class loading
  • Quarkus build-time approach eliminates runtime reflection cost
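
A caching wrapper for annotation lookups might look like the following sketch. It keys the cache by a method-signature string rather than the Method object itself, per the pitfall about holding Method references in static fields; all names are illustrative:

```java
import java.lang.annotation.Annotation;
import java.lang.reflect.Method;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Sketch: memoize annotation lookups so repeated reflection on hot paths is
// avoided. In long-lived containers, prefer a per-deployment cache over a
// static one to avoid ClassLoader leaks.
public class AnnotationCache {

    private static final Map<String, Optional<? extends Annotation>> CACHE =
        new ConcurrentHashMap<>();

    @SuppressWarnings("unchecked")
    public static <T extends Annotation> Optional<T> get(Method method, Class<T> type) {
        String key = method.toString() + "#" + type.getName();
        return (Optional<T>) CACHE.computeIfAbsent(
            key, k -> Optional.ofNullable(method.getAnnotation(type)));
    }

    // Convenience lookup that wraps the checked NoSuchMethodException
    public static Method method(Class<?> clazz, String name) {
        try {
            return clazz.getDeclaredMethod(name);
        } catch (NoSuchMethodException e) {
            throw new IllegalStateException(e);
        }
    }

    // Illustrative target with one runtime-retained annotation
    static class Sample {
        @Deprecated
        void old() { }
        void fresh() { }
    }
}
```

Caching an empty Optional for absent annotations matters: it avoids re-scanning methods that have no annotation, which is the common case.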

Build-Time vs Runtime Reflection:

Runtime (Default):

  • Uses Java Reflection API
  • Works everywhere
  • Slower, not GraalVM native-friendly

Build-Time (Quarkus):

  • Processes annotations at build time
  • Generates metadata
  • Fast at runtime, GraalVM native compatible

Usage Example:

// Quarkus build-time metadata provider
// (BuildTimeMetadata is a hypothetical holder for metadata collected at build time)
public class QuarkusClassMetadataProviderFactory
    implements ClassMetadataProviderFactory<String> {

    private final BuildTimeMetadata buildTimeMetadata;

    public QuarkusClassMetadataProviderFactory(BuildTimeMetadata buildTimeMetadata) {
        this.buildTimeMetadata = buildTimeMetadata;
    }

    @Override
    public <T extends Annotation> Optional<T> getAnnotation(
        String methodKey,
        Class<T> annotationClass
    ) {
        // Look up method annotation from build-time metadata
        return buildTimeMetadata.getMethodAnnotation(methodKey, annotationClass);
    }

    @Override
    public <T extends Annotation> Optional<T> getAnnotation(
        Class<?> clazz,
        Class<T> annotationClass
    ) {
        // Look up class annotation from build-time metadata
        return buildTimeMetadata.getClassAnnotation(clazz, annotationClass);
    }

    @Override
    public Iterable<String> getNonStaticMethodsOnClass(Class<?> clazz) {
        // Return method keys recorded during the build-time scan
        return buildTimeMetadata.getMethodKeys(clazz);
    }
}

// Register in META-INF/services/dev.langchain4j.classloading.ClassMetadataProviderFactory:
// io.quarkus.langchain4j.QuarkusClassMetadataProviderFactory

GraalVM Native Image Support:

// At build time, register reflection metadata
RuntimeReflection.register(MyAiService.class);
for (Method method : MyAiService.class.getDeclaredMethods()) {
    RuntimeReflection.register(method);
}

// Custom provider uses pre-registered metadata
public class NativeImageMetadataProvider
    implements ClassMetadataProviderFactory<Method> {
    // Implementation uses registered metadata
}

Related APIs:

  • AiServices - Uses metadata provider to inspect interfaces
  • @UserMessage, @SystemMessage, etc. - Annotations to discover
  • @Tool - Tool discovery via reflection

Usage Examples

Implementing Custom AI Service Context Factory

import dev.langchain4j.spi.services.AiServiceContextFactory;
import dev.langchain4j.service.AiServiceContext;

// Custom factory implementation
public class MyCustomAiServiceContextFactory implements AiServiceContextFactory {
    @Override
    public AiServiceContext create(AiServiceConfig config) {
        // Custom context creation logic
        // Can integrate with framework-specific features
        return new CustomAiServiceContext(config);
    }
}

To register the SPI implementation, create a file: META-INF/services/dev.langchain4j.spi.services.AiServiceContextFactory

With content:

com.mycompany.MyCustomAiServiceContextFactory

Testing SPI Registration:

@Test
void testSpiDiscovery() {
    ServiceLoader<AiServiceContextFactory> loader =
        ServiceLoader.load(AiServiceContextFactory.class);

    boolean found = false;
    for (AiServiceContextFactory factory : loader) {
        if (factory instanceof MyCustomAiServiceContextFactory) {
            found = true;
            break;
        }
    }

    assertTrue(found, "Custom SPI implementation not discovered");
}

Implementing Custom Guardrail Service Builder

import dev.langchain4j.service.guardrail.spi.GuardrailServiceBuilderFactory;

// Custom guardrail factory for framework integration
public class SpringGuardrailServiceBuilderFactory implements GuardrailServiceBuilderFactory {
    @Override
    public GuardrailServiceBuilder create() {
        // Create builder that can inject Spring beans
        return new SpringAwareGuardrailServiceBuilder();
    }
}

Register via: META-INF/services/dev.langchain4j.service.guardrail.spi.GuardrailServiceBuilderFactory

Spring Integration Example:

public class SpringAwareGuardrailServiceBuilder implements GuardrailServiceBuilder {
    // Instances created via ServiceLoader are not Spring-managed, so field
    // @Autowired injection does not work here; fetch the context from a
    // startup-populated static holder instead (ApplicationContextHolder is
    // a hypothetical helper)
    private final ApplicationContext applicationContext =
        ApplicationContextHolder.get();

    @Override
    public GuardrailService build() {
        // Create guardrail that uses Spring beans
        return new GuardrailService() {
            @Override
            public void validate(String input) {
                // Use Spring-managed moderation service
                ModerationService moderationService =
                    applicationContext.getBean(ModerationService.class);
                moderationService.check(input);
            }
        };
    }
}

Implementing Custom Embedding Store Codec

import dev.langchain4j.spi.store.embedding.inmemory.InMemoryEmbeddingStoreJsonCodecFactory;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStoreJsonCodec;

// Custom JSON codec using a different JSON library
public class GsonEmbeddingStoreCodecFactory implements InMemoryEmbeddingStoreJsonCodecFactory {
    @Override
    public InMemoryEmbeddingStoreJsonCodec create() {
        return new GsonInMemoryEmbeddingStoreJsonCodec();
    }
}

class GsonInMemoryEmbeddingStoreJsonCodec implements InMemoryEmbeddingStoreJsonCodec {
    private final Gson gson = new Gson();

    @Override
    public String toJson(InMemoryEmbeddingStore<?> store) {
        // Serialize using Gson
        return gson.toJson(store);
    }

    @Override
    public InMemoryEmbeddingStore<?> fromJson(String json) {
        // Deserialize using Gson
        return gson.fromJson(json, InMemoryEmbeddingStore.class);
    }
}

Register via: META-INF/services/dev.langchain4j.spi.store.embedding.inmemory.InMemoryEmbeddingStoreJsonCodecFactory

Testing Codec:

@Test
void testCodecRoundTrip() {
    InMemoryEmbeddingStore<TextSegment> store = new InMemoryEmbeddingStore<>();
    store.add(embedding, textSegment);

    // Serialize
    String json = codec.toJson(store);

    // Deserialize
    InMemoryEmbeddingStore<?> restored = codec.fromJson(json);

    // Verify
    List<EmbeddingMatch<TextSegment>> results =
        restored.findRelevant(embedding, 1);
    assertEquals(1, results.size());
}

Using Custom Class Metadata Provider

import dev.langchain4j.classloading.ClassMetadataProvider;

// Custom implementation for build-time reflection (e.g., Quarkus)
public class QuarkusClassMetadataProviderFactory implements ClassMetadataProviderFactory<String> {
    @Override
    public <T extends Annotation> Optional<T> getAnnotation(
        String methodKey,
        Class<T> annotationClass
    ) {
        // Use Quarkus build-time metadata
        return quarkusRecorder.getMethodAnnotation(methodKey, annotationClass);
    }

    @Override
    public <T extends Annotation> Optional<T> getAnnotation(
        Class<?> clazz,
        Class<T> annotationClass
    ) {
        // Use Quarkus build-time metadata
        return quarkusRecorder.getClassAnnotation(clazz, annotationClass);
    }

    @Override
    public Iterable<String> getNonStaticMethodsOnClass(Class<?> clazz) {
        // Return method keys from build-time metadata
        return quarkusRecorder.getMethodKeys(clazz);
    }
}

Quarkus Build Step:

@BuildStep
public void processAiServices(
    BuildProducer<ReflectiveClassBuildItem> reflectiveClass,
    BuildProducer<ServiceProviderBuildItem> serviceProvider
) {
    // Register AI service interfaces for reflection
    reflectiveClass.produce(
        ReflectiveClassBuildItem.builder(MyAiService.class)
            .methods(true)
            .build()
    );

    // Register custom metadata provider
    serviceProvider.produce(
        ServiceProviderBuildItem.allProvidersFromClassPath(
            ClassMetadataProviderFactory.class.getName()
        )
    );
}

Framework Integration Example: Spring Boot

import dev.langchain4j.spi.services.AiServicesFactory;
import org.springframework.context.ApplicationContext;

// Spring Boot integration using SPI
// (constructor injection: the factory is created by the Spring configuration
// below, so field @Autowired injection is not needed)
public class SpringAiServicesFactory implements AiServicesFactory {

    private final ApplicationContext applicationContext;

    public SpringAiServicesFactory(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    @Override
    public <T> T create(AiServiceContext context) {
        // Create AI service with Spring bean injection
        // Tools can be Spring beans, automatically injected
        return createSpringManagedAiService(context);
    }

    private <T> T createSpringManagedAiService(AiServiceContext context) {
        // Implementation that integrates with Spring
        // - Inject Spring beans as tools
        // - Use Spring's transaction management
        // - Apply Spring AOP aspects
        // - Use Spring's event system
        return (T) Proxy.newProxyInstance(
            context.aiServiceClass().getClassLoader(),
            new Class[]{context.aiServiceClass()},
            new SpringAwareInvocationHandler(context, applicationContext)
        );
    }
}

Spring Configuration:

@Configuration
@ConditionalOnClass(AiServices.class)
public class LangChain4jAutoConfiguration {

    @Bean
    public AiServicesFactory springAiServicesFactory(ApplicationContext context) {
        return new SpringAiServicesFactory(context);
    }

    @Bean
    public ChatModel chatModel(@Value("${langchain4j.api-key}") String apiKey) {
        return OpenAiChatModel.builder()
            .apiKey(apiKey)
            .build();
    }

    // Auto-discover beans that expose @Tool methods
    // (@Tool is a method-level annotation, so scan bean methods instead of
    // relying on getBeansWithAnnotation, which matches class-level annotations)
    @Bean
    public List<Object> toolBeans(ApplicationContext context) {
        return Arrays.stream(context.getBeanDefinitionNames())
            .map(context::getBean)
            .filter(bean -> Arrays.stream(bean.getClass().getMethods())
                .anyMatch(m -> m.isAnnotationPresent(Tool.class)))
            .collect(Collectors.toList());
    }
}

Custom Token Stream Adapter

import dev.langchain4j.spi.services.TokenStreamAdapter;
import dev.langchain4j.service.TokenStream;

// Adapter for framework-specific reactive streams
public class ReactorTokenStreamAdapter implements TokenStreamAdapter {
    @Override
    public TokenStream adapt(TokenStream tokenStream) {
        // Adapt to Project Reactor Flux or other reactive type
        return new ReactorBackedTokenStream(tokenStream);
    }
}

class ReactorBackedTokenStream implements TokenStream {
    private final TokenStream delegate;
    // Reactor sink that bridges the callback-style TokenStream to a Flux
    private final Sinks.Many<String> sink =
        Sinks.many().unicast().onBackpressureBuffer();

    public ReactorBackedTokenStream(TokenStream delegate) {
        this.delegate = delegate;
    }

    public Flux<String> asFlux() {
        return sink.asFlux();
    }

    @Override
    public void onNext(String token) {
        // Forward token to the Flux
        sink.tryEmitNext(token);
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        // Complete the Flux
        sink.tryEmitComplete();
    }

    @Override
    public void onError(Throwable error) {
        // Propagate the error to the Flux
        sink.tryEmitError(error);
    }
}

Reactive Streams Integration:

// Use adapted token stream with reactive frameworks
@GetMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> streamResponse(@RequestParam String message) {
    return Flux.create(sink -> {
        TokenStream tokenStream = new TokenStream() {
            @Override
            public void onNext(String token) {
                sink.next(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                sink.complete();
            }

            @Override
            public void onError(Throwable error) {
                sink.error(error);
            }
        };

        assistant.chat(message, tokenStream);
    });
}

SPI Discovery

LangChain4j uses Java's ServiceLoader mechanism to discover SPI implementations:

  1. Create implementation class
  2. Create service file in META-INF/services/ directory
  3. File name is the fully qualified interface name
  4. File content is the fully qualified implementation class name(s)

Example structure:

src/main/resources/
  META-INF/
    services/
      dev.langchain4j.spi.services.AiServiceContextFactory
      dev.langchain4j.spi.services.AiServicesFactory
      dev.langchain4j.service.guardrail.spi.GuardrailServiceBuilderFactory

File Content Format:

# Comment lines start with #
com.mycompany.MyFactoryImpl
com.mycompany.AlternativeFactoryImpl  # Multiple implementations allowed

ServiceLoader Discovery Process:

  1. ServiceLoader.load(InterfaceClass.class) called
  2. Scans META-INF/services/fully.qualified.InterfaceName in classpath
  3. Reads implementation class names from file
  4. Loads and instantiates classes using no-arg constructor
  5. Caches instances (singleton per ServiceLoader)

Multiple Implementations:

// When multiple implementations exist
ServiceLoader<MyFactory> loader = ServiceLoader.load(MyFactory.class);
for (MyFactory factory : loader) {
    // Iterate all implementations
    // Usually, first one found is used
}

// LangChain4j pattern: use first found
Optional<MyFactory> factory = ServiceLoader.load(MyFactory.class)
    .findFirst();

ClassLoader Considerations:

// Use context class loader (default)
ServiceLoader<MyFactory> loader = ServiceLoader.load(MyFactory.class);

// Use specific class loader
ClassLoader classLoader = Thread.currentThread().getContextClassLoader();
ServiceLoader<MyFactory> loader = ServiceLoader.load(MyFactory.class, classLoader);

// In modular applications (Java 9+), module must declare "uses"
module my.module {
    uses dev.langchain4j.spi.services.AiServicesFactory;
    provides dev.langchain4j.spi.services.AiServicesFactory
        with com.mycompany.MyFactoryImpl;
}

Framework Integration Use Cases

Quarkus Integration

Quarkus can provide:

  • Build-time reflection and metadata collection
  • Native image compatibility
  • CDI bean injection for tools
  • Build-time tool discovery

// Quarkus provides custom ClassMetadataProvider for build-time processing
public class QuarkusLangChain4jExtension {
    @BuildStep
    public void registerAiServices(BuildProducer<ServiceProviderBuildItem> serviceProvider) {
        serviceProvider.produce(
            ServiceProviderBuildItem.allProvidersFromClassPath(
                AiServiceContextFactory.class.getName()
            )
        );
    }
}

Quarkus Build-Time Advantages:

@BuildStep
public void processAiServiceInterfaces(
    CombinedIndexBuildItem index,
    BuildProducer<UnremovableBeanBuildItem> unremovable,
    BuildProducer<AdditionalBeanBuildItem> additionalBeans,
    BuildProducer<ReflectiveClassBuildItem> reflective
) {
    // Scan for AI service interfaces at build time
    Collection<ClassInfo> aiServiceInterfaces = index.getIndex()
        .getKnownDirectImplementors(DotName.createSimple(AiService.class.getName()));

    for (ClassInfo classInfo : aiServiceInterfaces) {
        // Register for reflection (native image)
        reflective.produce(
            ReflectiveClassBuildItem.builder(classInfo.name().toString())
                .methods(true)
                .fields(true)
                .build()
        );

        // Process @Tool annotations at build time
        processToolAnnotations(classInfo);
    }
}

Benefits:

  • Fast Startup: No runtime reflection scanning
  • Small Binary: Only needed classes included
  • Native Image: GraalVM native-image compatible
  • Type Safety: Build-time validation of configurations

Spring Boot Integration

Spring Boot can provide:

  • Spring bean injection for tools and dependencies
  • Spring AOP integration
  • Spring transaction management
  • Spring event system integration

@Configuration
public class LangChain4jAutoConfiguration {
    @Bean
    public AiServicesFactory springAiServicesFactory(ApplicationContext context) {
        return new SpringAiServicesFactory(context);
    }
}

Spring Integration Benefits:

@Configuration
public class AiServiceConfiguration {

    // Tools as Spring beans
    @Bean
    public WeatherService weatherService() {
        return new WeatherService();
    }

    @Bean
    public Assistant assistant(
        ChatModel chatModel,
        @Qualifier("weatherService") WeatherService weatherService
    ) {
        return AiServices.builder(Assistant.class)
            .chatModel(chatModel)
            .tools(weatherService) // Spring bean injected as tool
            .build();
    }

    // Transactional tools
    @Service
    public class DatabaseTool {
        @Tool("Save data to database")
        @Transactional // Spring manages transaction
        public String saveData(@P("Data to save") String data) {
            repository.save(data);
            return "Saved";
        }
    }

    // AOP for logging/metrics
    @Aspect
    @Component
    public class AiServiceAspect {
        @Around("@annotation(dev.langchain4j.agent.tool.Tool)")
        public Object logToolExecution(ProceedingJoinPoint pjp) throws Throwable {
            log.info("Executing tool: {}", pjp.getSignature().getName());
            long start = System.currentTimeMillis();
            try {
                return pjp.proceed();
            } finally {
                log.info("Tool executed in {} ms",
                    System.currentTimeMillis() - start);
            }
        }
    }
}

Spring Boot Starter Example:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-spring-boot-starter</artifactId>
    <version>${langchain4j.version}</version>
</dependency>

# application.yml
langchain4j:
  open-ai:
    api-key: ${OPENAI_API_KEY}
    model-name: gpt-4
    temperature: 0.7
  chat-memory:
    max-messages: 10

Micronaut Integration

Micronaut can provide:

  • Compile-time dependency injection
  • Bean validation integration
  • Reactive streams support
  • AOT (Ahead-of-Time) compilation

@Factory
public class LangChain4jFactory {
    @Singleton
    public AiServicesFactory micronautAiServicesFactory(BeanContext beanContext) {
        return new MicronautAiServicesFactory(beanContext);
    }
}

Micronaut Advantages:

// Compile-time DI (no reflection)
@Singleton
public class MyAiService {
    private final ChatModel chatModel;
    private final WeatherTool weatherTool;

    // Constructor injection (compile-time)
    public MyAiService(ChatModel chatModel, WeatherTool weatherTool) {
        this.chatModel = chatModel;
        this.weatherTool = weatherTool;
    }
}

// Bean validation integration
public class ValidatedTool {
    @Tool("Process validated input")
    public String process(
        @NotNull @Size(min = 1, max = 100) String input
    ) {
        // Micronaut validates before method execution
        return "Processed: " + input;
    }
}

// Reactive streams
public class ReactiveAssistant {
    @Post("/chat")
    public Flux<String> chat(@Body String message) {
        return Flux.from(aiService.chatStream(message));
    }
}

Custom Framework Integration

// Generic framework integration pattern
public class CustomFrameworkIntegration {

    // 1. Implement SPI factories
    public class CustomAiServicesFactory implements AiServicesFactory {
        @Override
        public <T> T create(AiServiceContext context) {
            return createWithCustomFeatures(context);
        }
    }

    // 2. Register via ServiceLoader
    // META-INF/services/dev.langchain4j.spi.services.AiServicesFactory

    // 3. Provide framework-specific features
    private <T> T createWithCustomFeatures(AiServiceContext context) {
        return (T) Proxy.newProxyInstance(
            context.aiServiceClass().getClassLoader(),
            new Class[]{context.aiServiceClass()},
            new CustomInvocationHandler(context)
        );
    }

    // 4. Custom invocation handler
    private class CustomInvocationHandler implements InvocationHandler {
        private final AiServiceContext context;

        public CustomInvocationHandler(AiServiceContext context) {
            this.context = context;
        }

        @Override
        public Object invoke(Object proxy, Method method, Object[] args) {
            // Add custom behavior:
            // - Dependency injection
            // - Transaction management
            // - Security checks
            // - Metrics/logging
            // - Error handling
            return invokeWithCustomBehavior(method, args);
        }
    }
}
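Step 4 above can be sketched end to end with a plain dynamic proxy. The `Assistant` interface and the echoing delegate below are stand-ins invented for this example; the timing logic plays the role of the "custom behavior" an invocation handler would add:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Hypothetical service interface standing in for an AI service class
interface Assistant {
    String chat(String message);
}

public class ProxyTimingDemo {
    public static void main(String[] args) {
        // Plain delegate standing in for the real AI service implementation
        Assistant delegate = message -> "echo: " + message;

        // Custom behavior (here: timing) wrapped around every method call,
        // mirroring step 4 of the integration pattern above
        InvocationHandler handler = (proxy, method, methodArgs) -> {
            long start = System.nanoTime();
            try {
                return method.invoke(delegate, methodArgs);
            } finally {
                long elapsedMicros = (System.nanoTime() - start) / 1_000;
                System.err.println(method.getName() + " took " + elapsedMicros + "µs");
            }
        };

        Assistant proxied = (Assistant) Proxy.newProxyInstance(
            Assistant.class.getClassLoader(),
            new Class<?>[]{Assistant.class},
            handler);

        System.out.println(proxied.chat("hello"));
    }
}
```

The same structure carries dependency injection, security checks, or transaction management: each concern is another layer inside `invoke`.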

Best Practices

  1. Use ServiceLoader: Follow Java's standard SPI pattern for discovery

    • Create META-INF/services files correctly
    • Use fully qualified class names
    • Ensure no-arg constructor exists
  2. Document Requirements: Clearly document what your SPI implementation provides

    • List dependencies
    • Describe features added
    • Document configuration options
  3. Version Compatibility: Ensure SPI implementations are compatible with LangChain4j version

    • Test against specific versions
    • Document version requirements
    • Handle API changes gracefully
  4. Fallback Behavior: Provide sensible defaults if custom SPI is not available

    • Graceful degradation
    • Clear error messages
    • Don't break core functionality
  5. Testing: Test SPI implementations thoroughly with ServiceLoader

    • Unit tests for implementation
    • Integration tests with ServiceLoader
    • Test with and without SPI present
  6. Thread Safety: Ensure SPI implementations are thread-safe

    • Factories called concurrently
    • Stateless or properly synchronized
    • Document thread safety guarantees
  7. Performance: Be mindful of performance impact, especially for frequently called SPIs

    • Avoid heavy initialization
    • Cache when appropriate
    • Profile critical paths
  8. Error Handling: Provide clear error messages when SPI implementation fails

    • Use descriptive exception messages
    • Include context information
    • Log failures appropriately
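Practices 1 and 4 combine naturally: discover via ServiceLoader, then fall back to a default when no provider is registered. `GreetingFactory` and its default below are hypothetical stand-ins for `AiServicesFactory`:

```java
import java.util.ServiceLoader;

// Hypothetical factory interface standing in for AiServicesFactory
interface GreetingFactory {
    String create();
}

// Default used when no SPI implementation is on the classpath
class DefaultGreetingFactory implements GreetingFactory {
    public String create() { return "default"; }
}

public class SpiFallbackDemo {
    public static void main(String[] args) {
        // findFirst() is empty when no provider is declared in
        // META-INF/services, so the built-in default is used instead
        GreetingFactory factory = ServiceLoader.load(GreetingFactory.class)
            .findFirst()
            .orElseGet(DefaultGreetingFactory::new);

        System.out.println(factory.create());
    }
}
```

Because nothing registers a provider here, the default wins; dropping a provider JAR on the classpath changes the outcome without any code change.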

Advanced Topics

SPI Priority and Ordering

// When multiple implementations exist, control priority.
// Note: ServiceLoader itself does not interpret @Priority; honoring it
// requires framework support (e.g. CDI) or explicit sorting by the caller.
@Priority(100) // Lower number = higher priority
public class HighPriorityFactory implements AiServicesFactory {
    // This will be selected over lower priority factories
}

@Priority(500)
public class LowPriorityFactory implements AiServicesFactory {
    // Fallback implementation
}
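One way to implement that selection is to sort the candidate provider types by their annotation value. A self-contained sketch using a local stand-in annotation (the real Jakarta `@Priority` works the same way for this purpose):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Comparator;
import java.util.List;

// Local stand-in for jakarta.annotation.Priority
@Retention(RetentionPolicy.RUNTIME)
@interface Priority { int value(); }

interface Factory {}

@Priority(100) class HighPriorityFactory implements Factory {}
@Priority(500) class LowPriorityFactory implements Factory {}

public class PriorityDemo {
    // Lower @Priority value wins, matching the convention above;
    // unannotated providers sort last
    static int priorityOf(Class<?> type) {
        Priority p = type.getAnnotation(Priority.class);
        return p != null ? p.value() : Integer.MAX_VALUE;
    }

    public static void main(String[] args) {
        // Stand-in for ServiceLoader.load(...).stream(): a fixed candidate list
        List<Class<? extends Factory>> candidates =
            List.of(LowPriorityFactory.class, HighPriorityFactory.class);

        Class<?> selected = candidates.stream()
            .min(Comparator.comparingInt(PriorityDemo::priorityOf))
            .orElseThrow();

        System.out.println(selected.getSimpleName());
    }
}
```

With real providers, `ServiceLoader.stream()` yields `Provider` handles whose `type()` can be inspected the same way before any provider is instantiated.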

Conditional SPI Loading

// Load SPI only if condition met. Note: a constructor failure surfaces to
// callers as a ServiceConfigurationError during iteration, so either catch
// it or use ServiceLoader.stream(), which defers instantiation.
public class ConditionalFactory implements AiServicesFactory {
    public ConditionalFactory() {
        // Check if conditions met
        if (!isEnvironmentReady()) {
            throw new UnsupportedOperationException(
                "Required environment not available"
            );
        }
    }

    private boolean isEnvironmentReady() {
        // Check for required classes, system properties, etc.
        try {
            Class.forName("com.required.Dependency");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
}
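One subtlety in the check above: `Class.forName(name)` also initializes the class, running its static initializers as a side effect. The three-argument overload with `initialize=false` only checks for presence; a minimal sketch:

```java
public class ConditionCheckDemo {
    // Presence check without triggering class initialization
    static boolean isPresent(String className) {
        try {
            Class.forName(className, false, ConditionCheckDemo.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isPresent("java.util.List"));          // present in the JDK
        System.out.println(isPresent("com.required.Dependency")); // hypothetical, absent
    }
}
```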

SPI with Configuration

// SPI that reads configuration
public class ConfigurableFactory implements AiServicesFactory {
    private final Config config;

    public ConfigurableFactory() {
        // Load configuration
        this.config = loadConfig();
    }

    private Config loadConfig() {
        // From system properties
        String prop = System.getProperty("langchain4j.factory.config");

        // From environment
        String env = System.getenv("LANGCHAIN4J_CONFIG");

        // From config file
        // ...

        return new Config(prop, env);
    }

    @Override
    public <T> T create(AiServiceContext context) {
        // Use configuration
        return createConfigured(context, config);
    }
}
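The lookup order implied above (system property first, then environment variable, then a built-in default) can be captured in one resolver. The `demo.*` keys below are hypothetical, set only to demonstrate precedence:

```java
public class ConfigLookupDemo {
    // Resolve a setting: system property wins, then the environment
    // variable, then the built-in default
    static String resolve(String propKey, String envKey, String defaultValue) {
        String fromProp = System.getProperty(propKey);
        if (fromProp != null) return fromProp;
        String fromEnv = System.getenv(envKey);
        if (fromEnv != null) return fromEnv;
        return defaultValue;
    }

    public static void main(String[] args) {
        // With the property set, it takes precedence
        System.setProperty("demo.factory.config", "from-property");
        System.out.println(resolve("demo.factory.config", "DEMO_FACTORY_CONFIG", "defaults"));

        // With neither the property nor the env var set, the default applies
        System.clearProperty("demo.factory.config");
        System.out.println(resolve("demo.factory.config", "DEMO_FACTORY_CONFIG", "defaults"));
    }
}
```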

Testing SPI Implementations

@Test
public void testSpiDiscovery() {
    // Test that SPI is discovered
    ServiceLoader<AiServicesFactory> loader =
        ServiceLoader.load(AiServicesFactory.class);

    boolean found = false;
    for (AiServicesFactory factory : loader) {
        if (factory instanceof MyCustomFactory) {
            found = true;
            break;
        }
    }

    assertTrue(found, "Custom factory not discovered");
}

@Test
public void testSpiImplementation() {
    // Test SPI behavior (an AiServiceContextFactory: create() takes a
    // config and returns a context)
    MyCustomContextFactory factory = new MyCustomContextFactory();

    AiServiceConfig config = createTestConfig();
    AiServiceContext context = factory.create(config);

    assertNotNull(context);
    // Test custom behavior
    assertTrue(context.hasCustomFeature());
}

@Test
public void testSpiFallback() {
    // Test fallback when SPI not available
    // Remove SPI from classpath temporarily
    // Verify default implementation is used
}

@Test
public void testSpiThreadSafety() throws Exception {
    MyCustomFactory factory = new MyCustomFactory();

    // Concurrent access test
    int threadCount = 10;
    CountDownLatch latch = new CountDownLatch(threadCount);
    List<Throwable> errors = new CopyOnWriteArrayList<>();

    for (int i = 0; i < threadCount; i++) {
        new Thread(() -> {
            try {
                factory.create(createTestContext());
            } catch (Throwable t) {
                errors.add(t);
            } finally {
                latch.countDown();
            }
        }).start();
    }

    latch.await();
    assertTrue(errors.isEmpty(), "Thread safety issues: " + errors);
}

Available SPIs Summary

| SPI Interface | Purpose | Package |
|---|---|---|
| AiServiceContextFactory | Custom AI service context creation | dev.langchain4j.spi.services |
| AiServicesFactory | Custom AI service implementation | dev.langchain4j.spi.services |
| TokenStreamAdapter | Custom token stream handling | dev.langchain4j.spi.services |
| GuardrailServiceBuilderFactory | Custom guardrail services | dev.langchain4j.service.guardrail.spi |
| InMemoryEmbeddingStoreJsonCodecFactory | Custom embedding store serialization | dev.langchain4j.spi.store.embedding.inmemory |
| ClassMetadataProvider | Custom class metadata retrieval | dev.langchain4j.classloading |

These SPIs enable deep integration with frameworks while keeping the core LangChain4j library framework-agnostic.

Related Resources

Official Integrations

  • langchain4j-spring-boot-starter: Spring Boot integration
  • langchain4j-quarkus: Quarkus integration
  • langchain4j-micronaut: Micronaut integration

Documentation

  • ServiceLoader JavaDoc: Java Platform documentation
  • LangChain4j GitHub: Examples and integration guides
  • Framework-specific integration docs

Community

  • GitHub Discussions: Framework integration questions
  • Stack Overflow: Technical Q&A
  • Discord/Slack: Community support

This comprehensive SPI documentation provides production-grade details for coding agents to implement custom integrations with LangChain4j.
