Universal Sentence Encoder for generating text embeddings using TensorFlow.js
—
Specialized Universal Sentence Encoder for question-answering applications, generating 100-dimensional embeddings optimized for matching questions with answers. The dot product of query and response embeddings measures how well the answer fits the question.
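The dot-product scoring can be sketched with plain arrays. This is illustrative only: the values below are made up, and the real model emits 100-dimensional `tf.Tensor`s rather than 4-dimensional number arrays.

```ts
// Hypothetical 4-dimensional embeddings (the real model produces 100 dims)
const queryEmbedding: number[] = [0.1, 0.8, 0.0, 0.2];
const responseEmbeddings: number[][] = [
  [0.0, 0.1, 0.9, 0.1], // off-topic answer
  [0.2, 0.7, 0.1, 0.3]  // relevant answer
];

// Dot product of two vectors of equal length
function dot(a: number[], b: number[]): number {
  return a.reduce((sum, v, i) => sum + v * b[i], 0);
}

// A higher dot product means the answer fits the question better
const scores = responseEmbeddings.map(r => dot(queryEmbedding, r));
const bestIndex = scores.indexOf(Math.max(...scores));
```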
Creates and loads the Universal Sentence Encoder QnA model specifically trained for question-answering tasks.
```ts
/**
 * Load the Universal Sentence Encoder QnA model
 * @returns Promise that resolves to a UniversalSentenceEncoderQnA instance
 */
function loadQnA(): Promise<UniversalSentenceEncoderQnA>;
```

Usage Example:
```ts
import * as use from '@tensorflow-models/universal-sentence-encoder';

// Load the QnA model
const qnaModel = await use.loadQnA();
```

Specialized class for generating question and answer embeddings with 100-dimensional output vectors optimized for Q&A matching.
```ts
class UniversalSentenceEncoderQnA {
  /**
   * Load the TensorFlow.js GraphModel from TFHub
   * @returns Promise that resolves to the loaded QnA GraphModel
   */
  loadModel(): Promise<tf.GraphModel>;

  /**
   * Initialize the QnA model and tokenizer
   */
  load(): Promise<void>;

  /**
   * Generate query and response embeddings for question answering
   * @param input - Object containing queries, responses, and optional contexts
   * @returns Object with queryEmbedding and responseEmbedding tensors
   */
  embed(input: ModelInput): ModelOutput;
}
```

The QnA model processes questions and answers simultaneously to generate paired embeddings.
```ts
interface ModelInput {
  /** Array of question strings */
  queries: string[];
  /** Array of answer strings */
  responses: string[];
  /** Optional array of context strings for the responses (must match responses length) */
  contexts?: string[];
}

interface ModelOutput {
  /** Tensor containing embeddings for all queries, shape [queries.length, 100] */
  queryEmbedding: tf.Tensor;
  /** Tensor containing embeddings for all responses, shape [responses.length, 100] */
  responseEmbedding: tf.Tensor;
}
```

Usage Examples:
```ts
import * as use from '@tensorflow-models/universal-sentence-encoder';
import * as tf from '@tensorflow/tfjs-core';

// Load the QnA model
const model = await use.loadQnA();

// Basic Q&A embedding
const input = {
  queries: ['How are you feeling today?', 'What is the capital of China?'],
  responses: [
    'I\'m not feeling very well.',
    'Beijing is the capital of China.',
    'You have five fingers on your hand.'
  ]
};
const embeddings = model.embed(input);
// embeddings.queryEmbedding shape: [2, 100]
// embeddings.responseEmbedding shape: [3, 100]

// Calculate similarity scores: dot products of every query with every response
const scores = tf.matMul(
  embeddings.queryEmbedding,
  embeddings.responseEmbedding,
  false, // do not transpose queryEmbedding
  true   // transpose responseEmbedding
);
const scoresData = await scores.data();
console.log('Q&A matching scores:', scoresData);
```

With Context Example:
```ts
// Q&A with context: each context string conditions the matching response
const inputWithContext = {
  queries: ['What is the main ingredient?'],
  responses: ['It is tomatoes'],
  contexts: ['We are discussing pizza sauce recipes']
};
const contextEmbeddings = model.embed(inputWithContext);
```

The QnA model loads from a specific TensorFlow Hub model trained for question-answering tasks.
Model URL: https://tfhub.dev/google/tfjs-model/universal-sentence-encoder-qa-ondevice/1
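The flat `Float32Array` returned by `scores.data()` in the example above is row-major with shape [queries.length, responses.length]. Recovering the per-query best answer can be sketched without tfjs; the score values below are hypothetical.

```ts
// Hypothetical flat scores for 2 queries x 3 responses, row-major order
const numQueries = 2;
const numResponses = 3;
const flatScores: number[] = [0.1, 0.9, 0.2, 0.8, 0.1, 0.3];

// Rebuild the [numQueries, numResponses] score matrix row by row
const scoreMatrix: number[][] = [];
for (let q = 0; q < numQueries; q++) {
  scoreMatrix.push(flatScores.slice(q * numResponses, (q + 1) * numResponses));
}

// Best response index per query, equivalent to tf.argMax(scores, 1)
const bestPerQuery = scoreMatrix.map(row => row.indexOf(Math.max(...row)));
```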
Key Features:
Batch processing: embed multiple question-answer pairs efficiently in a single `embed` call, then rank the responses for each query:
```ts
const batchInput = {
  queries: [
    'What is machine learning?',
    'How does AI work?',
    'What is deep learning?'
  ],
  responses: [
    'Machine learning is a subset of AI that enables computers to learn.',
    'AI works by processing data through algorithms.',
    'Deep learning uses neural networks with multiple layers.',
    'Python is a programming language.',
    'The weather is nice today.'
  ]
};
const result = model.embed(batchInput);

// Find best matches for each query
const similarityMatrix = tf.matMul(
  result.queryEmbedding,
  result.responseEmbedding,
  false,
  true
);

// Get best match indices
const bestMatches = tf.argMax(similarityMatrix, 1);
console.log('Best answer indices:', await bestMatches.data());
```

Type Definitions:

```ts
import * as tf from '@tensorflow/tfjs-core';

interface ModelInput {
  queries: string[];
  responses: string[];
  contexts?: string[];
}

interface ModelOutput {
  queryEmbedding: tf.Tensor;
  responseEmbedding: tf.Tensor;
}
```

Install with Tessl CLI:

```shell
npx tessl i tessl/npm-tensorflow-models--universal-sentence-encoder
```
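Because `contexts`, when supplied, must match `responses` in length, a caller-side guard can catch mismatches before calling `embed`. The `validateInput` helper below is a hypothetical sketch, not part of the library API.

```ts
interface ModelInput {
  queries: string[];
  responses: string[];
  contexts?: string[];
}

// Hypothetical caller-side guard; returns a list of problems found
function validateInput(input: ModelInput): string[] {
  const errors: string[] = [];
  if (input.queries.length === 0) errors.push('queries must be non-empty');
  if (input.responses.length === 0) errors.push('responses must be non-empty');
  if (input.contexts && input.contexts.length !== input.responses.length) {
    errors.push(
      `contexts length ${input.contexts.length} must match ` +
      `responses length ${input.responses.length}`
    );
  }
  return errors;
}

// A mismatched input: 2 responses but only 1 context
const bad: ModelInput = {
  queries: ['What is the main ingredient?'],
  responses: ['It is tomatoes', 'It is cheese'],
  contexts: ['We are discussing pizza sauce recipes']
};
```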