tessl/pypi-langchain-google-genai

An integration package connecting Google's genai package and LangChain

Workspace: tessl
Visibility: Public
Describes: pkg:pypi/langchain-google-genai@2.1.x

To install, run

npx @tessl/cli install tessl/pypi-langchain-google-genai@2.1.0


LangChain Google GenAI

A Python integration package that connects Google's Generative AI models with the LangChain framework. It provides access to Gemini chat models, text embeddings, a managed vector store, and advanced features such as attributed question answering (AQA) and multimodal inputs.

Package Information

  • Package Name: langchain-google-genai
  • Package Type: pypi
  • Language: Python
  • Installation: pip install langchain-google-genai

Core Imports

from langchain_google_genai import ChatGoogleGenerativeAI, GoogleGenerativeAI

For embeddings:

from langchain_google_genai import GoogleGenerativeAIEmbeddings

For vector store and AQA:

from langchain_google_genai import GoogleVectorStore, GenAIAqa, AqaInput, AqaOutput

For safety and configuration:

from langchain_google_genai import HarmBlockThreshold, HarmCategory, Modality

Basic Usage

Chat Model

from langchain_google_genai import ChatGoogleGenerativeAI

# Initialize with API key from environment (GOOGLE_API_KEY)
llm = ChatGoogleGenerativeAI(model="gemini-2.5-pro")

# Simple text generation
response = llm.invoke("Explain quantum computing in simple terms")
print(response.content)

# With streaming
for chunk in llm.stream("Write a short story about AI"):
    print(chunk.content, end="", flush=True)

LLM Model

from langchain_google_genai import GoogleGenerativeAI

llm = GoogleGenerativeAI(model="gemini-2.5-pro")
result = llm.invoke("Once upon a time in the world of AI...")
print(result)

Embeddings

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/gemini-embedding-001")

# Single query embedding
query_vector = embeddings.embed_query("What is machine learning?")

# Batch document embeddings
doc_vectors = embeddings.embed_documents([
    "Machine learning is a subset of AI",
    "Deep learning uses neural networks",
    "Natural language processing handles text"
])

Architecture

The package provides several key components that integrate with Google's Generative AI services:

  • Chat Models & LLMs: Direct interfaces to Google's Gemini models for conversational AI and text generation
  • Embeddings: High-quality text vectorization using Google's embedding models
  • Vector Store: Managed semantic search and retrieval using Google's infrastructure
  • AQA (Attributed Question Answering): Grounded question answering with source attribution
  • Safety Controls: Comprehensive content filtering and safety settings
  • Multimodal Support: Integration with Google's multimodal capabilities for text, images, and audio (see the sketch after this section)

The package maintains full compatibility with LangChain's ecosystem (runnables, streaming, tool calling) while providing access to Google's newer models, including Gemini 2.0 Flash and its advanced reasoning capabilities.
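
As a sketch of the multimodal support noted above, an image can be passed alongside text in a single message using LangChain's content-block format (the model name and image URL here are placeholders):

from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-2.5-pro")

# One message combining a text part and an image reference
message = HumanMessage(content=[
    {"type": "text", "text": "Describe what is shown in this image."},
    {"type": "image_url", "image_url": {"url": "https://example.com/photo.png"}},
])

response = llm.invoke([message])
print(response.content)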

Capabilities

Chat Models

Advanced conversational AI with support for tool calling, structured outputs, streaming, safety controls, and multimodal inputs including text, images, and audio.

class ChatGoogleGenerativeAI:
    def __init__(
        self,
        *,
        model: str,
        google_api_key: Optional[SecretStr] = None,
        temperature: float = 0.7,
        max_output_tokens: Optional[int] = None,
        top_p: Optional[float] = None,
        top_k: Optional[int] = None,
        safety_settings: Optional[Dict[HarmCategory, HarmBlockThreshold]] = None,
        **kwargs
    )
    
    def invoke(self, input: LanguageModelInput, config: Optional[RunnableConfig] = None, **kwargs) -> BaseMessage
    def stream(self, input: LanguageModelInput, config: Optional[RunnableConfig] = None, **kwargs) -> Iterator[ChatGenerationChunk]
    def bind_tools(self, tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]], **kwargs) -> Runnable
    def with_structured_output(self, schema: Union[Dict, Type[BaseModel]], **kwargs) -> Runnable
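
A brief sketch of structured output, assuming a user-defined Pydantic schema (the Movie class and its fields are illustrative):

from pydantic import BaseModel, Field
from langchain_google_genai import ChatGoogleGenerativeAI

# Hypothetical schema for illustration
class Movie(BaseModel):
    title: str = Field(description="The movie title")
    year: int = Field(description="Release year")

llm = ChatGoogleGenerativeAI(model="gemini-2.5-pro")
structured_llm = llm.with_structured_output(Movie)

# The model's reply is parsed into a Movie instance
result = structured_llm.invoke("Name one classic science fiction film.")
print(result.title, result.year)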

See docs/chat-models.md for the full reference.

LLM Models

Simple text generation interface providing direct access to Google's Gemini models for completion-style tasks.

class GoogleGenerativeAI:
    def __init__(
        self,
        *,
        model: str,
        google_api_key: Optional[SecretStr] = None,
        temperature: float = 0.7,
        max_output_tokens: Optional[int] = None,
        **kwargs
    )
    
    def invoke(self, input: Union[str, List[BaseMessage]], config: Optional[RunnableConfig] = None, **kwargs) -> str
    def stream(self, input: Union[str, List[BaseMessage]], config: Optional[RunnableConfig] = None, **kwargs) -> Iterator[str]
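
Streaming mirrors the chat model but yields plain strings; a minimal sketch:

from langchain_google_genai import GoogleGenerativeAI

llm = GoogleGenerativeAI(model="gemini-2.5-pro")

# Each chunk is a string fragment of the completion
for chunk in llm.stream("List three uses of text embeddings."):
    print(chunk, end="", flush=True)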

See docs/llm-models.md for the full reference.

Embeddings

High-quality text embeddings for semantic search, similarity analysis, and machine learning applications with batching support and configurable task types.

class GoogleGenerativeAIEmbeddings:
    def __init__(
        self,
        *,
        model: str,
        task_type: Optional[str] = None,
        google_api_key: Optional[SecretStr] = None,
        **kwargs
    )
    
    def embed_query(self, text: str, **kwargs) -> List[float]
    def embed_documents(self, texts: List[str], **kwargs) -> List[List[float]]
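
Setting task_type tunes the embeddings for a particular workload; a sketch using the retrieval task types from Google's API (the exact string values are an assumption here):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

# Separate instances tuned for indexing documents vs. embedding queries
doc_embedder = GoogleGenerativeAIEmbeddings(
    model="models/gemini-embedding-001",
    task_type="RETRIEVAL_DOCUMENT",
)
query_embedder = GoogleGenerativeAIEmbeddings(
    model="models/gemini-embedding-001",
    task_type="RETRIEVAL_QUERY",
)

doc_vectors = doc_embedder.embed_documents(["LangChain integrates many model providers."])
query_vector = query_embedder.embed_query("Which providers does LangChain support?")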

See docs/embeddings.md for the full reference.

Vector Store

Managed semantic search and document retrieval using Google's vector store infrastructure with support for corpus and document management.

class GoogleVectorStore:
    def __init__(self, *, corpus_id: str, document_id: Optional[str] = None)
    
    def add_texts(self, texts: Iterable[str], metadatas: Optional[List[Dict]] = None, **kwargs) -> List[str]
    def similarity_search(self, query: str, k: int = 4, **kwargs) -> List[Document]
    def similarity_search_with_score(self, query: str, k: int = 4, **kwargs) -> List[Tuple[Document, float]]
    
    @classmethod
    def create_corpus(cls, corpus_id: Optional[str] = None, display_name: Optional[str] = None) -> "GoogleVectorStore"
    
    @classmethod
    def from_texts(cls, texts: List[str], **kwargs) -> "GoogleVectorStore"
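
A sketch built from the signatures above, assuming your Google project has access to the semantic retrieval service (the display name is illustrative, and texts may need to be attached to a document within the corpus):

from langchain_google_genai import GoogleVectorStore

# Create a managed corpus on Google's infrastructure
store = GoogleVectorStore.create_corpus(display_name="demo-corpus")

# Add texts for semantic indexing
store.add_texts([
    "Gemini models support multimodal inputs.",
    "Embeddings map text to dense vectors.",
])

# Retrieve the closest matches with relevance scores
for doc, score in store.similarity_search_with_score("What do embeddings do?", k=2):
    print(f"{score:.3f}  {doc.page_content}")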

See docs/vector-store.md for the full reference.

Attributed Question Answering (AQA)

Grounded question answering that provides responses based exclusively on provided source passages with full attribution.

class AqaInput:
    prompt: str
    source_passages: List[str]

class AqaOutput:
    answer: str
    attributed_passages: List[str]
    answerable_probability: float

class GenAIAqa:
    def __init__(self, *, answer_style: int = 1)
    def invoke(self, input: AqaInput, config: Optional[RunnableConfig] = None, **kwargs) -> AqaOutput
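
A minimal sketch that pairs a question with the passages it must be answered from:

from langchain_google_genai import GenAIAqa, AqaInput

aqa = GenAIAqa()

result = aqa.invoke(AqaInput(
    prompt="What is the boiling point of water at sea level?",
    source_passages=[
        "At sea level, water boils at 100 degrees Celsius.",
        "The freezing point of water is 0 degrees Celsius.",
    ],
))

print(result.answer)
print(result.answerable_probability)  # how likely the passages answer the prompt
print(result.attributed_passages)     # the passages that grounded the answer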

See docs/aqa.md for the full reference.

Safety and Configuration

Content safety controls and configuration options for responsible AI deployment with comprehensive filtering capabilities.

# Enums from Google AI
HarmCategory  # Categories of potentially harmful content
HarmBlockThreshold  # Threshold levels for content filtering
Modality  # Generation modality options

# Exception classes  
class DoesNotExistsException(Exception): ...
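
A sketch of applying safety settings at construction time, using the enums above (the category/threshold pairs chosen here are only an example):

from langchain_google_genai import (
    ChatGoogleGenerativeAI,
    HarmBlockThreshold,
    HarmCategory,
)

# Relax filtering for one category and tighten it for another
safety_settings = {
    HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_ONLY_HIGH,
    HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
}

llm = ChatGoogleGenerativeAI(
    model="gemini-2.5-pro",
    safety_settings=safety_settings,
)
response = llm.invoke("Summarize common lab safety guidelines.")
print(response.content)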

See docs/safety-config.md for the full reference.

Types

# Input/Output Types
LanguageModelInput = Union[str, List[BaseMessage], Dict]
SafetySettingDict = TypedDict('SafetySettingDict', {
    'category': HarmCategory,
    'threshold': HarmBlockThreshold
})

# Authentication
SecretStr = pydantic.SecretStr

# LangChain Integration Types
BaseMessage = langchain_core.messages.BaseMessage
Document = langchain_core.documents.Document
VectorStore = langchain_core.vectorstores.VectorStore
Embeddings = langchain_core.embeddings.Embeddings
BaseChatModel = langchain_core.language_models.chat_models.BaseChatModel
BaseLLM = langchain_core.language_models.llms.BaseLLM
Runnable = langchain_core.runnables.Runnable
RunnableConfig = langchain_core.runnables.config.RunnableConfig

# Additional Type Helpers
Optional = typing.Optional
Union = typing.Union
List = typing.List
Dict = typing.Dict
Any = typing.Any
Sequence = typing.Sequence
Tuple = typing.Tuple