LangChain4j integration for Google Vertex AI models including chat, language, embedding, image, and scoring capabilities
—
Text generation using Google Vertex AI language models (text-bison). Implements the LangChain4j LanguageModel interface for simple, single-turn text completion.
```java
public class VertexAiLanguageModel implements LanguageModel {
    public Response<String> generate(String prompt);
    public static Builder builder();
}
```

```java
import dev.langchain4j.model.vertexai.VertexAiLanguageModel;
import dev.langchain4j.model.language.LanguageModel;
import dev.langchain4j.model.output.Response;

VertexAiLanguageModel model = VertexAiLanguageModel.builder()
    .endpoint("https://us-central1-aiplatform.googleapis.com/v1/")
    .project("your-project-id")
    .location("us-central1")
    .publisher("google")
    .modelName("text-bison@001")
    .build();

Response<String> response = model.generate("Write a short poem about clouds");
System.out.println(response.content());
```

Supported models:
- text-bison@001 - PaLM 2 text generation
- text-bison@002 - PaLM 2 text generation (updated)
- text-bison-32k - Extended context version (32k tokens)

Required builder parameters:
- endpoint - API endpoint URL
- project - Google Cloud project ID
- location - GCP region
- publisher - Model publisher ("google")
- modelName - Model name/version

Optional builder parameters:
- temperature (Double) - Randomness, 0.0-1.0 (default: varies by model)
- maxOutputTokens (Integer) - Maximum response length (default: 200)
- topK (Integer) - Top-K sampling
- topP (Double) - Nucleus sampling, 0.0-1.0
- maxRetries (Integer) - Retry attempts (default: 3)

LanguageModel: Simple text completion with the generate(String prompt) method. Single-turn generation.
ChatModel: Conversation interface with message history. Multi-turn conversations.
Use LanguageModel for single-turn tasks such as one-off text completion. Use ChatModel for multi-turn conversations that need message history.
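The optional builder parameters listed above plug into the same builder chain as the required ones. A minimal sketch (project ID, endpoint, and prompt are placeholders) tuning generation for a longer, low-randomness completion:

```java
import dev.langchain4j.model.vertexai.VertexAiLanguageModel;
import dev.langchain4j.model.output.Response;

public class TunedGenerationExample {
    public static void main(String[] args) {
        // All values below are illustrative; substitute your own
        // GCP project ID and region before running.
        VertexAiLanguageModel model = VertexAiLanguageModel.builder()
                .endpoint("https://us-central1-aiplatform.googleapis.com/v1/")
                .project("your-project-id")
                .location("us-central1")
                .publisher("google")
                .modelName("text-bison@001")
                .temperature(0.2)       // low randomness for factual output
                .maxOutputTokens(512)   // raise the 200-token default
                .topP(0.95)             // nucleus sampling threshold
                .maxRetries(3)          // retry transient API failures
                .build();

        Response<String> response =
                model.generate("Summarize what Vertex AI is in two sentences.");
        System.out.println(response.content());
    }
}
```

Note that running this requires the langchain4j-vertex-ai dependency on the classpath and Google Cloud application-default credentials configured for the target project.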
Install with Tessl CLI
npx tessl i tessl/maven-dev-langchain4j--langchain4j-vertex-ai
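If you manage dependencies with Maven directly, the coordinates implied by the package name are dev.langchain4j:langchain4j-vertex-ai (the version element below is a placeholder; use a current release):

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-vertex-ai</artifactId>
    <version><!-- use the latest release version --></version>
</dependency>
```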