# Iterator Support

Lazy evaluation support for processing large datasets with JavaScript iterators and async iterators, keeping memory usage low even for massive data collections.
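
As a quick orientation, here is a minimal round-trip sketch that composes the two core functions documented below (packing an array lazily, then unpacking the resulting buffers):

```javascript
import { packIter, unpackIter } from "msgpackr";

// Pack three objects lazily, collecting the resulting MessagePack buffers
const buffers = [...packIter([{ id: 1 }, { id: 2 }, { id: 3 }])];

// Unpack them back into objects, one at a time
for (const object of unpackIter(buffers)) {
  console.log(object); // { id: 1 }, then { id: 2 }, then { id: 3 }
}
```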

## Capabilities

### packIter Function

Converts an iterable source of objects into an iterable of MessagePack buffers, supporting both synchronous and asynchronous iteration patterns.

```javascript { .api }
/**
 * Creates an iterable that packs objects from the source iterable to MessagePack buffers
 * @param objectIterator - Source iterable/iterator (sync or async)
 * @param options - msgpackr pack options
 * @returns IterableIterator for sync sources, AsyncIterableIterator for async sources
 */
function packIter(
  objectIterator: Iterable<any> | Iterator<any> | AsyncIterable<any> | AsyncIterator<any>,
  options?: Options
): IterableIterator<Buffer> | AsyncIterableIterator<Buffer>;
```

**Usage Examples:**

```javascript
import { packIter } from "msgpackr";

// Basic array iteration
const data = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 3, name: "Charlie" }
];

for (const buffer of packIter(data)) {
  console.log('Packed buffer size:', buffer.length);
  // Send buffer over network, write to file, etc.
}

// Generator function source
function* generateUsers() {
  for (let i = 1; i <= 1000; i++) {
    yield { id: i, name: `User${i}`, created: new Date() };
  }
}

// Memory-efficient processing of a large dataset
for (const buffer of packIter(generateUsers(), { useRecords: true })) {
  // Process each buffer individually - only one object in memory at a time
  await sendToServer(buffer);
}

// Set iteration
const userSet = new Set([
  { role: "admin", name: "Alice" },
  { role: "user", name: "Bob" }
]);

for (const buffer of packIter(userSet)) {
  console.log('Packed set item');
}
```

### Async Iterator Support

```javascript
import { packIter } from "msgpackr";

// Async generator source
async function* fetchUserPages() {
  let page = 1;
  while (page <= 10) {
    const response = await fetch(`/api/users?page=${page}`);
    const users = await response.json();
    for (const user of users) {
      yield user;
    }
    page++;
  }
}

// Process async iterator
const asyncIterable = packIter(fetchUserPages(), {
  useRecords: true,
  structuredClone: true
});

for await (const buffer of asyncIterable) {
  console.log('Processed async buffer:', buffer.length);
}

// A promise that resolves to an async iterator is also accepted as a source
const promiseIterator = Promise.resolve(someAsyncIterator);
const packedIterable = packIter(promiseIterator);

for await (const buffer of packedIterable) {
  // Handle packed buffers from the promise-resolved iterator
}
```
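
The async iterable returned by packIter can also be piped straight to a file with Node's stream utilities. This is a sketch assuming Node 16+ (`Readable.from` accepts any sync or async iterable; `output.msgpack` is an illustrative path):

```javascript
import { createWriteStream } from "fs";
import { Readable } from "stream";
import { pipeline } from "stream/promises";
import { packIter } from "msgpackr";

async function* records() {
  for (let i = 0; i < 1000; i++) yield { id: i };
}

// Wrap the packed-buffer iterable in a Readable and stream it to disk,
// so only one buffer needs to be held in memory at a time
await pipeline(
  Readable.from(packIter(records())),
  createWriteStream("output.msgpack")
);
```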

### unpackIter Function

Converts an iterable source of MessagePack buffers into an iterable of JavaScript objects, handling incomplete data and buffer boundaries automatically.

```javascript { .api }
/**
 * Creates an iterable that unpacks MessagePack buffers to objects
 * @param bufferIterator - Source iterable/iterator of buffers (sync or async)
 * @param options - msgpackr unpack options
 * @returns IterableIterator for sync sources, AsyncIterableIterator for async sources
 */
function unpackIter(
  bufferIterator: Iterable<Buffer> | Iterator<Buffer> | AsyncIterable<Buffer> | AsyncIterator<Buffer>,
  options?: Options
): IterableIterator<any> | AsyncIterableIterator<any>;
```

**Usage Examples:**

```javascript
import { unpackIter } from "msgpackr";
import { createReadStream } from "fs";

// Unpack from buffer array
const buffers = [buffer1, buffer2, buffer3]; // MessagePack buffers

for (const object of unpackIter(buffers)) {
  console.log('Unpacked object:', object);
}

// File reading with chunks
async function* readFileChunks(filename) {
  const stream = createReadStream(filename);
  for await (const chunk of stream) {
    yield chunk;
  }
}

// Process file chunks containing multiple MessagePack values
for await (const object of unpackIter(readFileChunks("data.msgpack"))) {
  // Each object is automatically unpacked from the stream
  console.log('Object from file:', object);
}

// Network data processing
async function* receiveNetworkData() {
  const socket = await connectToServer();
  for await (const chunk of socket) {
    yield Buffer.from(chunk);
  }
}

for await (const data of unpackIter(receiveNetworkData(), { useRecords: true })) {
  console.log('Received network data:', data);
}
```
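
Since Node read streams are themselves async iterables, the wrapper generator above is optional; the stream can be handed to unpackIter directly (a minimal sketch):

```javascript
import { createReadStream } from "fs";
import { unpackIter } from "msgpackr";

// A ReadStream is an AsyncIterable of Buffer chunks, so it is a valid source
for await (const object of unpackIter(createReadStream("data.msgpack"))) {
  console.log('Object from file:', object);
}
```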

### Handling Incomplete Data

The unpackIter function automatically handles incomplete MessagePack data across buffer boundaries.

```javascript
import { pack, unpackIter } from "msgpackr";

// Generator that yields partial buffers
function* partialBuffers() {
  const fullBuffer = pack({ id: 1, name: "Alice" }); // any complete MessagePack buffer

  // Yield buffer in small chunks that may split MessagePack boundaries
  for (let i = 0; i < fullBuffer.length; i += 10) {
    yield fullBuffer.slice(i, i + 10);
  }
}

try {
  for (const object of unpackIter(partialBuffers())) {
    // Objects are correctly reconstructed even when
    // MessagePack data spans multiple chunks
    console.log('Reconstructed object:', object);
  }
} catch (error) {
  if (error.incomplete) {
    console.log('Incomplete data at end:', error.values);
    console.log('Last position:', error.lastPosition);
  }
}
```
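
The catch branch above only fires when the source actually ends mid-value. A sketch that forces it by truncating the final chunk (the error shape follows the description above; pack is msgpackr's standard pack export):

```javascript
import { pack, unpackIter } from "msgpackr";

function* truncated() {
  const buf = pack({ id: 1, name: "a reasonably long string value" });
  yield buf.slice(0, buf.length - 4); // drop the last 4 bytes, never send the rest
}

try {
  for (const object of unpackIter(truncated())) {
    console.log(object); // never reached: the only value is incomplete
  }
} catch (error) {
  console.log(error.incomplete); // true
}
```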

### Encode/Decode Aliases

Alternative naming conventions for the iterator functions.

```javascript { .api }
/**
 * Alias for packIter - encodes objects to MessagePack buffers
 */
const encodeIter: typeof packIter;

/**
 * Alias for unpackIter - decodes MessagePack buffers to objects
 */
const decodeIter: typeof unpackIter;
```

**Usage Examples:**

```javascript
import { encodeIter, decodeIter } from "msgpackr";

// Using encode/decode terminology
const objects = [{ a: 1 }, { b: 2 }];
const buffers = [];

for (const buffer of encodeIter(objects)) {
  buffers.push(buffer);
}

for (const object of decodeIter(buffers)) {
  console.log('Decoded:', object);
}
```

## Advanced Usage Patterns

### Pipeline Processing

Combining iterators for complex data processing pipelines.

```javascript
import { packIter, unpackIter } from "msgpackr";

// Transform pipeline: objects -> buffers -> objects
async function* transformPipeline(sourceData) {
  // Pack to buffers
  const packedIterator = packIter(sourceData, { useRecords: true });

  // Simulate network transmission or storage
  const networkBuffers = [];
  for (const buffer of packedIterator) {
    // Could send over network, save to disk, etc.
    networkBuffers.push(buffer);
  }

  // Unpack back to objects
  for (const object of unpackIter(networkBuffers, { useRecords: true })) {
    // Apply transformations
    yield {
      ...object,
      processed: true,
      timestamp: new Date()
    };
  }
}

// Use the pipeline
const sourceData = generateLargeDataset();
for await (const processedObject of transformPipeline(sourceData)) {
  console.log('Processed:', processedObject);
}
```

### Memory-Efficient Batch Processing

```javascript
import { packIter, unpackIter } from "msgpackr";

// Process large datasets in chunks without loading everything into memory
async function* batchProcessor(dataSource, batchSize = 100) {
  let batch = [];

  for await (const item of dataSource) {
    batch.push(item);

    if (batch.length >= batchSize) {
      // Pack the batch
      const packedBatch = [];
      for (const buffer of packIter(batch, { useRecords: true })) {
        packedBatch.push(buffer);
      }

      // Process and yield results
      for (const result of unpackIter(packedBatch, { useRecords: true })) {
        yield result;
      }

      batch = []; // Clear batch to free memory
    }
  }

  // Process remaining items
  if (batch.length > 0) {
    for (const buffer of packIter(batch, { useRecords: true })) {
      for (const result of unpackIter([buffer], { useRecords: true })) {
        yield result;
      }
    }
  }
}
```
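
Usage is an ordinary for await loop; for instance, feeding it the fetchUserPages generator from earlier (250 is an arbitrary batch size):

```javascript
for await (const result of batchProcessor(fetchUserPages(), 250)) {
  console.log('Batch result:', result);
}
```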

### Error Handling in Iterator Chains

```javascript
import { packIter, unpackIter } from "msgpackr";

function* safeIteratorChain(source) {
  try {
    // Pack with error handling
    const packedIterator = packIter(source, { useRecords: true });
    const buffers = [];

    for (const buffer of packedIterator) {
      buffers.push(buffer);
    }

    // Unpack with error recovery
    for (const object of unpackIter(buffers, { useRecords: true })) {
      yield object;
    }
  } catch (error) {
    console.error('Iterator chain error:', error);

    if (error.incomplete) {
      // Recover whatever was decoded before the data ran out
      console.log('Recovered partial data:', error.values);
      for (const partialValue of error.values || []) {
        yield partialValue;
      }
    } else {
      throw error; // Re-throw if not recoverable
    }
  }
}
```

## Performance Considerations

- **Memory Efficiency**: Iterators process one item at a time, keeping memory usage constant
- **Record Structures**: Use `useRecords: true` for optimal performance with repeated structures
- **Async vs Sync**: Choose the iterator type that matches your data source
- **Buffer Sizes**: Consider buffer sizes when dealing with network or file I/O (see the chunk-size sketch after the configuration example below)
- **Error Recovery**: Implement proper error handling for production use cases

```javascript
import { packIter } from "msgpackr";

// High-performance iterator configuration
const performantOptions = {
  useRecords: true,
  sequential: true,
  bundleStrings: true
};

for (const buffer of packIter(largeDataset, performantOptions)) {
  // Optimized processing
}
```
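
For stream sources, the chunk size that feeds unpackIter is set by the stream, not by msgpackr. A sketch of tuning it with Node's highWaterMark option (64 KiB is an arbitrary illustrative value):

```javascript
import { createReadStream } from "fs";
import { unpackIter } from "msgpackr";

// Larger chunks mean fewer iterations and fewer values split across
// chunk boundaries, at the cost of more buffered memory per read
const stream = createReadStream("data.msgpack", { highWaterMark: 64 * 1024 });

for await (const object of unpackIter(stream)) {
  // process each object
}
```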