tessl/npm-langchain--core

Core LangChain.js abstractions and schemas for building applications with Large Language Models

Caching

Caching framework that stores and retrieves previously computed results so repeated operations are not recomputed. In LangChain applications this improves latency and reduces API costs by short-circuiting duplicate LLM calls.

Capabilities

Base Cache

Abstract base class for all cache implementations.

/**
 * Abstract base class for caching implementations
 * @template T - Type of cached values
 */
abstract class BaseCache<T = any> {
  constructor();
  
  /** Look up cached value by prompt and model key */
  abstract lookup(prompt: string, llmKey: string): Promise<T | null>;
  
  /** Store value in cache */
  abstract update(prompt: string, llmKey: string, value: T): Promise<void>;
  
  /** Set default key encoder function */
  makeDefaultKeyEncoder(keyEncoderFn: HashKeyEncoder): void;
}
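The abstract class can be extended to add policies such as expiry. Below is a minimal sketch of a TTL cache; it uses a local stand-in with the same `lookup`/`update` shape so the snippet runs on its own (in an application you would `import { BaseCache } from "@langchain/core/caches"` instead), and the `TtlCache` name and TTL logic are illustrative, not part of the library:

```typescript
// Local stand-in mirroring the BaseCache shape shown above, so this
// sketch is self-contained.
abstract class BaseCache<T = any> {
  abstract lookup(prompt: string, llmKey: string): Promise<T | null>;
  abstract update(prompt: string, llmKey: string, value: T): Promise<void>;
}

// Illustrative subclass: entries expire after `ttlMs` milliseconds.
class TtlCache<T = any> extends BaseCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number = 60_000) {
    super();
  }

  async lookup(prompt: string, llmKey: string): Promise<T | null> {
    const key = `${prompt}:${llmKey}`;
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expires) {
      // Entry has aged out: evict it and report a miss
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }

  async update(prompt: string, llmKey: string, value: T): Promise<void> {
    this.store.set(`${prompt}:${llmKey}`, {
      value,
      expires: Date.now() + this.ttlMs,
    });
  }
}
```

The key is derived from `prompt` and `llmKey` together, matching the two-part lookup contract of the abstract class.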

In-Memory Cache

Simple in-memory cache implementation.

/**
 * In-memory cache implementation
 * @template T - Type of cached values
 */
class InMemoryCache<T = any> extends BaseCache<T> {
  /** Internal cache storage */
  private cache: Map<string, T>;
  
  constructor();
  
  /** Get cached value */
  async lookup(prompt: string, llmKey: string): Promise<T | null>;
  
  /** Store value in cache */
  async update(prompt: string, llmKey: string, value: T): Promise<void>;
  
  /** Get global cache instance */
  static global(): InMemoryCache;
  
  /** Clear all cached values */
  clear(): void;
}

Usage Examples:

import { InMemoryCache } from "@langchain/core/caches";

// Create cache instance
const cache = new InMemoryCache<string>();

// Store value
await cache.update("What is 2+2?", "gpt-3.5-turbo", "2+2 equals 4");

// Retrieve value
const cached = await cache.lookup("What is 2+2?", "gpt-3.5-turbo");
console.log(cached); // "2+2 equals 4"

// Use global cache instance
const globalCache = InMemoryCache.global();
await globalCache.update("Hello", "model-key", "Hi there!");

// Clear cache
cache.clear();

Cache Utilities

Utility functions for cache key generation and serialization.

/**
 * Generate cache key from strings (deprecated)
 * @deprecated Prefer a custom key encoder set via makeDefaultKeyEncoder
 */
function getCacheKey(...strings: string[]): string;

/**
 * Serialize generation for caching
 */
function serializeGeneration(generation: Generation): StoredGeneration;

/**
 * Deserialize stored generation from cache
 */
function deserializeStoredGeneration(storedGeneration: StoredGeneration): Generation;

/**
 * Hash key encoder function type
 */
type HashKeyEncoder = (key: string) => string;

Usage Examples:

import { serializeGeneration, deserializeStoredGeneration } from "@langchain/core/caches";

// Serialize generation for storage
const generation = {
  text: "Hello world",
  generationInfo: { model: "gpt-3.5-turbo", tokens: 10 }
};

const stored = serializeGeneration(generation);
console.log(stored); // Serialized format suitable for caching

// Deserialize from cache
const restored = deserializeStoredGeneration(stored);
console.log(restored.text); // "Hello world"
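A `HashKeyEncoder` is just a `(key: string) => string` function. One way to build one is with Node's built-in `crypto` module; the SHA-256 choice here is illustrative, and any stable hash works:

```typescript
import { createHash } from "node:crypto";

// Encoder producing fixed-length keys regardless of prompt size
const sha256Encoder = (key: string): string =>
  createHash("sha256").update(key).digest("hex");

console.log(sha256Encoder("What is 2+2?:gpt-3.5-turbo")); // 64 hex characters
```

An encoder like this could be passed to `makeDefaultKeyEncoder` so that long prompts do not produce unbounded cache keys.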

Cache Integration

Using Cache with Language Models

// Example of wiring a cache into a language model
import { InMemoryCache } from "@langchain/core/caches";

// Shared global cache instance
const cache = InMemoryCache.global();

// A model configured with this cache (e.g. via its `cache` option)
// checks it before making an API call, so repeated prompts with the
// same model key are served from the cache
const result1 = await model.invoke("Explain quantum physics"); // API call, result cached
const result2 = await model.invoke("Explain quantum physics"); // served from cache
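The lookup-then-update flow a model performs internally can be sketched end to end. Everything here is illustrative stand-in code, not a `@langchain/core` API: `DemoCache` mirrors the `lookup`/`update` shape of `BaseCache`, and `fakeModelCall` stands in for a real provider call:

```typescript
type Gen = { text: string };

// Stand-in cache with the BaseCache lookup/update shape
class DemoCache {
  private store = new Map<string, Gen[]>();
  async lookup(prompt: string, llmKey: string): Promise<Gen[] | null> {
    return this.store.get(`${prompt}:${llmKey}`) ?? null;
  }
  async update(prompt: string, llmKey: string, value: Gen[]): Promise<void> {
    this.store.set(`${prompt}:${llmKey}`, value);
  }
}

let apiCalls = 0;
async function fakeModelCall(prompt: string): Promise<Gen[]> {
  apiCalls++; // count how often we actually "hit the API"
  return [{ text: `answer to: ${prompt}` }];
}

// Look up first; only call the model (and store the result) on a miss
async function cachedInvoke(
  cache: DemoCache,
  prompt: string,
  llmKey: string
): Promise<Gen[]> {
  const hit = await cache.lookup(prompt, llmKey);
  if (hit !== null) return hit;
  const result = await fakeModelCall(prompt);
  await cache.update(prompt, llmKey, result);
  return result;
}
```

Calling `cachedInvoke` twice with the same prompt and model key triggers only one model call; the second invocation is served from the cache.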

Types

interface Generation {
  /** Generated text */
  text: string;
  /** Additional generation metadata */
  generationInfo?: Record<string, unknown>;
}

interface StoredGeneration {
  /** Serialized text */
  text: string;
  /** Serialized generation info */
  generationInfo?: Record<string, unknown>;
}

interface CacheInterface<T = any> {
  lookup(prompt: string, llmKey: string): Promise<T | null>;
  update(prompt: string, llmKey: string, value: T): Promise<void>;
}
