# Question-Answering Embeddings
A specialized Universal Sentence Encoder for question-answering applications. It generates 100-dimensional embeddings optimized for matching questions with answers: the dot product of a query embedding and a response embedding measures how well the answer fits the question.
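To make the scoring rule concrete, here is a minimal sketch of dot-product ranking using plain arrays. The 3-dimensional vectors and their values are made up for illustration; the real model produces 100-dimensional tensors.

```typescript
// Hypothetical 3-dimensional embeddings standing in for the model's
// 100-dimensional vectors; the values are invented for illustration.
const queryEmbedding = [0.9, 0.1, 0.2];
const responseEmbeddings = [
  [0.8, 0.2, 0.1], // relevant answer
  [0.1, 0.9, 0.3]  // unrelated answer
];

// Dot product of two vectors: sum of element-wise products
function dot(a: number[], b: number[]): number {
  return a.reduce((sum, ai, i) => sum + ai * b[i], 0);
}

const scores = responseEmbeddings.map((r) => dot(queryEmbedding, r));
console.log(scores); // higher score = better answer for the query
```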
## Capabilities
### Load QnA Model
Creates and loads the Universal Sentence Encoder QnA model specifically trained for question-answering tasks.
```typescript { .api }
/**
 * Load the Universal Sentence Encoder QnA model
 * @returns Promise that resolves to UniversalSentenceEncoderQnA instance
 */
function loadQnA(): Promise<UniversalSentenceEncoderQnA>;
```
**Usage Example:**
```typescript
import * as use from '@tensorflow-models/universal-sentence-encoder';

// Load the QnA model
const qnaModel = await use.loadQnA();
```
### Universal Sentence Encoder QnA Class
Specialized class for generating question and answer embeddings with 100-dimensional output vectors optimized for Q&A matching.
```typescript { .api }
class UniversalSentenceEncoderQnA {
  /**
   * Load the TensorFlow.js GraphModel from TFHub
   * @returns Promise that resolves to the loaded QnA GraphModel
   */
  loadModel(): Promise<tf.GraphModel>;

  /**
   * Initialize the QnA model and tokenizer
   */
  load(): Promise<void>;

  /**
   * Generate query and response embeddings for question-answering
   * @param input - Object containing queries, responses, and optional contexts
   * @returns Object with queryEmbedding and responseEmbedding tensors
   */
  embed(input: ModelInput): ModelOutput;
}
```
### QnA Model Input and Output
The QnA model processes questions and answers simultaneously to generate paired embeddings.
```typescript { .api }
interface ModelInput {
  /** Array of question strings */
  queries: string[];
  /** Array of answer strings */
  responses: string[];
  /** Optional array of context strings for answers (must match responses length) */
  contexts?: string[];
}

interface ModelOutput {
  /** Tensor containing embeddings for all queries [queries.length, 100] */
  queryEmbedding: tf.Tensor;
  /** Tensor containing embeddings for all responses [responses.length, 100] */
  responseEmbedding: tf.Tensor;
}
```
**Usage Examples:**
```typescript
import * as use from '@tensorflow-models/universal-sentence-encoder';
import * as tf from '@tensorflow/tfjs-core';

// Load the QnA model
const model = await use.loadQnA();

// Basic Q&A embedding
const input = {
  queries: ['How are you feeling today?', 'What is the capital of China?'],
  responses: [
    'I\'m not feeling very well.',
    'Beijing is the capital of China.',
    'You have five fingers on your hand.'
  ]
};

const embeddings = model.embed(input);
// embeddings.queryEmbedding shape: [2, 100]
// embeddings.responseEmbedding shape: [3, 100]

// Calculate similarity scores: dot product of every query with every response
const scores = tf.matMul(
  embeddings.queryEmbedding,
  embeddings.responseEmbedding,
  false,
  true // transpose the response embeddings
);

const scoresData = await scores.data();
console.log('Q&A matching scores:', scoresData);
```
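`scores.data()` returns the similarity matrix as a flat array in row-major order, one row per query. A small helper (hypothetical, not part of the library) can recover the best response index for each query; the numbers below are made up to match a 2-query, 3-response shape:

```typescript
// scoresData is [numQueries, numResponses] flattened row-major.
function bestResponseIndices(
  scoresData: ArrayLike<number>,
  numQueries: number,
  numResponses: number
): number[] {
  const best: number[] = [];
  for (let q = 0; q < numQueries; q++) {
    let bestIdx = 0;
    for (let r = 1; r < numResponses; r++) {
      if (scoresData[q * numResponses + r] > scoresData[q * numResponses + bestIdx]) {
        bestIdx = r;
      }
    }
    best.push(bestIdx);
  }
  return best;
}

// Invented scores: query 0 matches response 0 best, query 1 matches response 1.
const example = Float32Array.from([12.2, 3.1, 2.0, 1.5, 14.9, 0.7]);
console.log(bestResponseIndices(example, 2, 3)); // [0, 1]
```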
**With Context Example:**
```typescript
// Q&A with context
const inputWithContext = {
  queries: ['What is the main ingredient?'],
  responses: ['It is tomatoes'],
  contexts: ['We are discussing pizza sauce recipes']
};

const contextEmbeddings = model.embed(inputWithContext);
```
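When `contexts` is provided it must line up one-to-one with `responses`. A small guard (a hypothetical helper, not part of the library; `QnAInput` here just mirrors the `ModelInput` shape) can catch mismatches before calling `embed`:

```typescript
interface QnAInput {
  queries: string[];
  responses: string[];
  contexts?: string[];
}

// Throws if contexts is present but its length differs from responses.
function validateQnAInput(input: QnAInput): void {
  if (input.contexts && input.contexts.length !== input.responses.length) {
    throw new Error(
      `contexts length (${input.contexts.length}) must match ` +
      `responses length (${input.responses.length})`
    );
  }
}

validateQnAInput({
  queries: ['What is the main ingredient?'],
  responses: ['It is tomatoes'],
  contexts: ['We are discussing pizza sauce recipes']
}); // OK: one context per response
```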
### Model Configuration
The QnA model loads from a specific TensorFlow Hub model trained for question-answering tasks.
**Model URL:** `https://tfhub.dev/google/tfjs-model/universal-sentence-encoder-qa-ondevice/1`
**Key Features:**
- **Input Limit**: 192 tokens per input string
- **Embedding Dimension**: 100 (vs 512 for standard USE)
- **Vocabulary**: 8k SentencePiece vocabulary
- **Context Support**: Optional context strings for enhanced answer embeddings
## Advanced Usage
### Batch Processing
Process multiple question-answer pairs efficiently:
```typescript
const batchInput = {
  queries: [
    'What is machine learning?',
    'How does AI work?',
    'What is deep learning?'
  ],
  responses: [
    'Machine learning is a subset of AI that enables computers to learn.',
    'AI works by processing data through algorithms.',
    'Deep learning uses neural networks with multiple layers.',
    'Python is a programming language.',
    'The weather is nice today.'
  ]
};

const result = model.embed(batchInput);

// Find best matches for each query
const similarityMatrix = tf.matMul(
  result.queryEmbedding,
  result.responseEmbedding,
  false,
  true
);

// Get best match indices
const bestMatches = tf.argMax(similarityMatrix, 1);
console.log('Best answer indices:', await bestMatches.data());
```
## Types
```typescript { .api }
import * as tf from '@tensorflow/tfjs-core';

interface ModelInput {
  queries: string[];
  responses: string[];
  contexts?: string[];
}

interface ModelOutput {
  queryEmbedding: tf.Tensor;
  responseEmbedding: tf.Tensor;
}
```