tessl/maven-com-embabel-agent--embabel-agent-openai

OpenAI compatible model factory for the Embabel Agent Framework


API Reference

Complete API documentation for Embabel Agent OpenAI.

OpenAiCompatibleModelFactory

Factory class for creating OpenAI-compatible LLM and embedding services.

/**
 * Generic support for OpenAI compatible models.
 *
 * @param baseUrl The base URL of the OpenAI API. Null for OpenAI default (https://api.openai.com).
 * @param apiKey The API key for the OpenAI compatible provider, or null for no authentication.
 * @param completionsPath Custom path for the completions endpoint, or null to use the provider default.
 * @param embeddingsPath Custom path for the embeddings endpoint, or null to use the provider default.
 * @param observationRegistry Micrometer observation registry for observability.
 * @param requestFactory Optional HTTP request factory provider (defaults to empty).
 */
open class OpenAiCompatibleModelFactory(
    val baseUrl: String?,
    private val apiKey: String?,
    private val completionsPath: String?,
    private val embeddingsPath: String?,
    private val observationRegistry: ObservationRegistry,
    private val requestFactory: ObjectProvider<ClientHttpRequestFactory> = ObjectProviders.empty()
)
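
As a sketch of how the constructor parameters fit together, the factory could be pointed at a local OpenAI-compatible server such as Ollama. The URL and registry setup below are illustrative assumptions, not defaults of this package:

```kotlin
// Sketch: wiring the factory to a local OpenAI-compatible server (e.g. Ollama).
// Micrometer's ObservationRegistry.create() supplies a registry outside Spring.
val factory = OpenAiCompatibleModelFactory(
    baseUrl = "http://localhost:11434",   // assumed local Ollama endpoint
    apiKey = null,                        // local server: no authentication
    completionsPath = null,               // null: use the provider's default path
    embeddingsPath = null,
    observationRegistry = ObservationRegistry.create()
)
```

In a Spring application you would typically inject the `ObservationRegistry` and declare the factory as a bean instead of constructing it by hand.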

openAiCompatibleLlm

Creates an LLM service for OpenAI-compatible models.

/**
 * Creates an LLM service for OpenAI-compatible models
 *
 * @param model Model name/identifier (e.g., "gpt-4", "gpt-3.5-turbo", "llama-3-70b")
 * @param pricingModel Pricing model for cost tracking (use PricingModel.usdPer1MTokens or PricingModel.ALL_YOU_CAN_EAT)
 * @param provider Provider name (e.g., "OpenAI", "Azure OpenAI", "Ollama", "Custom Provider")
 * @param knowledgeCutoffDate Knowledge cutoff date (optional, null if unknown)
 * @param optionsConverter Options converter (defaults to OpenAiChatOptionsConverter)
 * @param retryTemplate Retry template for resilience (defaults to Spring AI default)
 * @return LlmService configured with the specified model
 */
@JvmOverloads
fun openAiCompatibleLlm(
    model: String,
    pricingModel: PricingModel,
    provider: String,
    knowledgeCutoffDate: LocalDate?,
    optionsConverter: OptionsConverter<*> = OpenAiChatOptionsConverter,
    retryTemplate: RetryTemplate = RetryUtils.DEFAULT_RETRY_TEMPLATE
): LlmService<*>

Parameters:

  • model: The model identifier. For OpenAI: "gpt-4", "gpt-4-turbo", "gpt-3.5-turbo", "gpt-5". For custom providers: use whatever identifier your API expects.
  • pricingModel: Use PricingModel.usdPer1MTokens(inputCost, outputCost) for pay-per-token models or PricingModel.ALL_YOU_CAN_EAT for free/fixed-price models.
  • provider: Human-readable provider name for logging and observability. Can be any string.
  • knowledgeCutoffDate: Last date the model was trained on. Use null if unknown or not applicable.
  • optionsConverter: Converts portable LlmOptions to OpenAI-specific options. Choose based on model capabilities:
    • OpenAiChatOptionsConverter (default): Safe default for most models
    • StandardOpenAiOptionsConverter: For models supporting all parameters
    • Gpt5ChatOptionsConverter: For GPT-5 models (no temperature support)
  • retryTemplate: Spring RetryTemplate for handling transient failures. Defaults to reasonable retry policy.

Returns: LlmService<*> instance (specifically SpringAiLlmService)

Example:

val service = factory.openAiCompatibleLlm(
    model = "gpt-4-turbo",
    pricingModel = PricingModel.usdPer1MTokens(10.0, 30.0),
    provider = "OpenAI",
    knowledgeCutoffDate = LocalDate.of(2023, 12, 1),
    optionsConverter = StandardOpenAiOptionsConverter
)

openAiCompatibleEmbeddingService

Creates an embedding service for OpenAI-compatible models.

/**
 * Creates an embedding service for OpenAI-compatible models
 *
 * @param model Embedding model name/identifier (e.g., "text-embedding-3-small", "text-embedding-3-large")
 * @param provider Provider name (e.g., "OpenAI", "Azure OpenAI")
 * @return EmbeddingService configured with the specified model
 */
fun openAiCompatibleEmbeddingService(
    model: String,
    provider: String
): EmbeddingService

Parameters:

  • model: The embedding model identifier. For OpenAI: "text-embedding-3-small", "text-embedding-3-large", "text-embedding-ada-002".
  • provider: Human-readable provider name for logging and observability.

Returns: EmbeddingService instance (specifically SpringAiEmbeddingService)

Example:

val service = factory.openAiCompatibleEmbeddingService(
    model = "text-embedding-3-large",
    provider = "OpenAI"
)

Protected Members (for Subclassing)

These members are available when extending OpenAiCompatibleModelFactory:

/**
 * Logger instance for the factory class
 */
protected val logger: Logger

/**
 * The configured OpenAI API instance
 */
protected val openAiApi: OpenAiApi

/**
 * Creates a Spring AI ChatModel with the configured API
 *
 * @param model Model name/identifier
 * @param retryTemplate Retry template for resilience
 * @return ChatModel configured with the specified model
 */
protected fun chatModelOf(
    model: String,
    retryTemplate: RetryTemplate
): ChatModel

See Extending the Factory for usage examples.
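
As a rough sketch (not taken from the library's own examples), a subclass might expose a provider-specific model via chatModelOf; the provider URL and model name below are placeholders:

```kotlin
// Sketch of a provider-specific factory built on the protected members above.
class MyProviderModelFactory(
    apiKey: String?,
    observationRegistry: ObservationRegistry
) : OpenAiCompatibleModelFactory(
    baseUrl = "https://api.example.com",  // placeholder provider URL
    apiKey = apiKey,
    completionsPath = null,
    embeddingsPath = null,
    observationRegistry = observationRegistry
) {
    fun myModel(): ChatModel {
        logger.info("Creating my-model chat model")
        return chatModelOf("my-model", RetryUtils.DEFAULT_RETRY_TEMPLATE)
    }
}
```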

Options Converters

Options converters transform portable LlmOptions into provider-specific ChatOptions.

OptionsConverter Interface

/**
 * Interface for converting portable LlmOptions to provider-specific ChatOptions
 */
fun interface OptionsConverter<O : ChatOptions> {
    fun convertOptions(options: LlmOptions): O
}
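
To make the contract concrete, here is a self-contained sketch of a custom converter. LlmOptions, ChatOptions, and MyProviderOptions below are simplified local stand-ins for the real framework types, used for illustration only:

```kotlin
// Simplified local stand-ins for the real framework types (illustration only).
data class LlmOptions(val temperature: Double? = null, val maxTokens: Int? = null)
interface ChatOptions
data class MyProviderOptions(val temperature: Double, val maxTokens: Int) : ChatOptions

fun interface OptionsConverter<O : ChatOptions> {
    fun convertOptions(options: LlmOptions): O
}

// A converter that clamps temperature to the 0.0..1.0 range this
// hypothetical provider accepts and applies a default token budget.
object MyProviderOptionsConverter : OptionsConverter<MyProviderOptions> {
    override fun convertOptions(options: LlmOptions) = MyProviderOptions(
        temperature = (options.temperature ?: 1.0).coerceIn(0.0, 1.0),
        maxTokens = options.maxTokens ?: 1024
    )
}
```

A real converter would map LlmOptions onto the provider's ChatOptions subtype (e.g. OpenAiChatOptions) in the same shape.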

OpenAiChatOptionsConverter

Default options converter for OpenAI models.

/**
 * Default options converter for OpenAI models.
 * Safe default that works with most OpenAI models.
 * Some models may not support all options.
 */
object OpenAiChatOptionsConverter : OptionsConverter<OpenAiChatOptions> {
    override fun convertOptions(options: LlmOptions): OpenAiChatOptions
}

Use when: You're using standard OpenAI models and want a safe default.

StandardOpenAiOptionsConverter

Options converter for models that support all OpenAI parameters.

/**
 * Options converter for OpenAI models that support all parameters.
 * Explicitly supports: temperature, topP, maxTokens, presencePenalty, frequencyPenalty.
 */
object StandardOpenAiOptionsConverter : OptionsConverter<OpenAiChatOptions> {
    override fun convertOptions(options: LlmOptions): OpenAiChatOptions
}

Use when: You're certain your model supports all OpenAI parameters and want explicit control.

Supported parameters:

  • temperature: Controls randomness (0.0-2.0)
  • topP: Nucleus sampling parameter
  • maxTokens: Maximum tokens to generate
  • presencePenalty: Penalty for new topics
  • frequencyPenalty: Penalty for repetition

Gpt5ChatOptionsConverter

Options converter for GPT-5 models that don't support temperature adjustment.

/**
 * Options converter for GPT-5 models that don't support temperature adjustment.
 * Logs a warning if temperature is set to a non-default value (anything other than 1.0).
 * Supports: topP, maxTokens, presencePenalty, frequencyPenalty.
 * Does NOT support: temperature
 */
object Gpt5ChatOptionsConverter : OptionsConverter<OpenAiChatOptions> {
    override fun convertOptions(options: LlmOptions): OpenAiChatOptions
}

Use when: You're using GPT-5 models.

Behavior: If temperature != 1.0, logs a warning and ignores the parameter.

Supported parameters:

  • topP: Nucleus sampling parameter
  • maxTokens: Maximum tokens to generate
  • presencePenalty: Penalty for new topics
  • frequencyPenalty: Penalty for repetition

See Options Converters for detailed comparison and usage guidance.

Return Types

LlmService

Framework-agnostic LLM service abstraction from embabel-agent-api.

/**
 * LLM service interface
 */
interface LlmService<THIS : LlmService<THIS>> {
    fun createMessageSender(options: LlmOptions): LlmMessageSender
    fun withKnowledgeCutoffDate(date: LocalDate): THIS
    fun withPromptContributor(promptContributor: PromptContributor): THIS
}

Implementation: openAiCompatibleLlm returns SpringAiLlmService which implements this interface.

EmbeddingService

Interface for embedding text in vector space from embabel-agent-common.

/**
 * Embedding service interface
 */
interface EmbeddingService {
    fun embed(text: String): FloatArray
    fun embed(texts: List<String>): List<FloatArray>
    val dimensions: Int
}

Implementation: openAiCompatibleEmbeddingService returns SpringAiEmbeddingService which implements this interface.
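
A toy, self-contained implementation can illustrate the contract and a typical cosine-similarity use of the vectors. The character-histogram "embedding" below is purely for demonstration and bears no relation to real embedding models:

```kotlin
import kotlin.math.sqrt

// Local copy of the interface for illustration; the real one lives in embabel-agent-common.
interface EmbeddingService {
    fun embed(text: String): FloatArray
    fun embed(texts: List<String>): List<FloatArray>
    val dimensions: Int
}

// Toy deterministic embedder: a 26-bucket letter-frequency histogram.
class ToyEmbeddingService(override val dimensions: Int = 26) : EmbeddingService {
    override fun embed(text: String): FloatArray {
        val v = FloatArray(dimensions)
        for (c in text.lowercase()) if (c in 'a'..'z') v[c - 'a'] += 1f
        return v
    }
    override fun embed(texts: List<String>): List<FloatArray> = texts.map(::embed)
}

// Cosine similarity between two embedding vectors of equal length.
fun cosine(a: FloatArray, b: FloatArray): Double {
    var dot = 0.0; var na = 0.0; var nb = 0.0
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return dot / (sqrt(na) * sqrt(nb))
}
```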

PricingModel

Pricing models for cost tracking (from embabel-agent-common).

/**
 * Factory methods for creating pricing models
 */
object PricingModel {
    /**
     * Per-million-tokens pricing
     *
     * @param usdPer1mInputTokens Cost in USD per 1 million input tokens
     * @param usdPer1mOutputTokens Cost in USD per 1 million output tokens
     * @return PricingModel instance
     */
    fun usdPer1MTokens(usdPer1mInputTokens: Double, usdPer1mOutputTokens: Double): PricingModel

    /**
     * All-you-can-eat pricing (free or fixed cost, no per-token tracking)
     */
    val ALL_YOU_CAN_EAT: PricingModel
}

Examples:

// GPT-4 pricing
val gpt4Pricing = PricingModel.usdPer1MTokens(30.0, 60.0)

// Free local model
val freePricing = PricingModel.ALL_YOU_CAN_EAT
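
The per-1M-token semantics imply a simple cost formula. The helper below is a hypothetical illustration of that arithmetic, not part of the PricingModel API:

```kotlin
// Hypothetical helper showing the usdPer1MTokens arithmetic:
// cost = inputTokens/1M * inputRate + outputTokens/1M * outputRate.
fun costUsd(
    inputTokens: Long,
    outputTokens: Long,
    usdPer1mInput: Double,
    usdPer1mOutput: Double
): Double =
    inputTokens / 1_000_000.0 * usdPer1mInput +
    outputTokens / 1_000_000.0 * usdPer1mOutput
```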

Dependencies

Required:

  • com.embabel.agent:embabel-agent-api - Core LLM service interfaces
  • org.springframework.ai:spring-ai-openai - Spring AI OpenAI integration
  • io.micrometer:micrometer-observation - Observability support
  • org.springframework:spring-beans - Spring dependency injection
  • org.springframework:spring-web - HTTP client support

Runtime:

  • org.slf4j:slf4j-api - Logging
  • com.fasterxml.jackson.core:jackson-databind - JSON serialization

Install with Tessl CLI

npx tessl i tessl/maven-com-embabel-agent--embabel-agent-openai
