# Files & Uploads

Comprehensive file management and large file upload operations for the OpenAI Node.js SDK. Handle individual file uploads, retrieve file metadata, manage file lifecycle, and upload large files using multipart uploads.

## Capabilities

### Files

Core file management operations including upload, retrieval, listing, deletion, and content access. Supports multiple file purposes (assistants, fine-tuning, batch processing, vision).

```typescript { .api }
// Upload a file
create(params: FileCreateParams): Promise<FileObject>;

// Retrieve file metadata
retrieve(fileID: string): Promise<FileObject>;

// List all files with pagination
list(params?: FileListParams): Promise<FileObjectsPage>;

// Delete a file
delete(fileID: string): Promise<FileDeleted>;

// Get file contents
content(fileID: string): Promise<Response>;

// Poll until file processing completes
waitForProcessing(id: string, options?: WaitForProcessingOptions): Promise<FileObject>;

interface FileObject {
  id: string;
  filename: string;
  bytes: number;
  created_at: number;
  purpose: FilePurpose;
  status: 'uploaded' | 'processed' | 'error';
  expires_at?: number;
  object: 'file';
}

type FilePurpose = 'assistants' | 'batch' | 'fine-tune' | 'vision' | 'user_data' | 'evals';

interface FileDeleted {
  id: string;
  deleted: boolean;
  object: 'file';
}
```

[File Operations](./files-uploads.md#file-operations)

### Uploads

Multipart upload API for handling large files (up to 8 GB). Break files into parts of up to 64 MB and upload them in parallel for improved reliability and performance.

```typescript { .api }
// Initiate a multipart upload
create(params: UploadCreateParams): Promise<Upload>;

// Cancel an in-progress upload
cancel(uploadID: string): Promise<Upload>;

// Complete a multipart upload
complete(uploadID: string, params: UploadCompleteParams): Promise<Upload>;

// Upload a file part
parts.create(uploadID: string, params: PartCreateParams): Promise<UploadPart>;

interface Upload {
  id: string;
  filename: string;
  bytes: number;
  created_at: number;
  expires_at: number;
  purpose: FilePurpose;
  status: 'pending' | 'completed' | 'cancelled' | 'expired';
  file?: FileObject | null;
  object: 'upload';
}

interface UploadPart {
  id: string;
  created_at: number;
  upload_id: string;
  object: 'upload.part';
}
```

[Upload Operations](./files-uploads.md#upload-operations)

### toFile Helper

Convert various data types (strings, buffers, streams, fetch responses) into File objects for upload operations.

```typescript { .api }
async function toFile(
  value: ToFileInput | PromiseLike<ToFileInput>,
  name?: string | null | undefined,
  options?: FilePropertyBag | undefined,
): Promise<File>;

type ToFileInput =
  | FileLike
  | ResponseLike
  | Exclude<BlobLikePart, string>
  | AsyncIterable<BlobLikePart>;

type FilePropertyBag = {
  type?: string;
  lastModified?: number;
};
```

[toFile Helper](./files-uploads.md#tofile-helper)

---

## File Operations

Upload, retrieve, list, and manage files in the OpenAI platform.

### Create (Upload File)

Upload a single file to OpenAI. Supports individual files up to 512 MB and multiple file purposes.

```typescript { .api }
create(params: FileCreateParams): Promise<FileObject>;

interface FileCreateParams {
  /** The File object to upload */
  file: Uploadable;

  /** Intended purpose: 'assistants', 'batch', 'fine-tune', 'vision', 'user_data', 'evals' */
  purpose: FilePurpose;

  /** Optional expiration policy */
  expires_after?: {
    anchor: 'created_at';
    seconds: number; // 3600 (1 hour) to 2592000 (30 days)
  };
}
```

#### Example: Simple Text File Upload

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload from string
const file1 = await client.files.create({
  file: await toFile("Training data content here", "training.txt", { type: "text/plain" }),
  purpose: "fine-tune",
});

console.log(`Uploaded file: ${file1.id}`);
```

#### Example: Upload from Buffer

```typescript
import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

// Upload from file system
const fileBuffer = fs.readFileSync("./training-data.jsonl");
const file = await client.files.create({
  file: await toFile(fileBuffer, "training-data.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

console.log(`Uploaded: ${file.filename} (ID: ${file.id})`);
```

#### Example: Upload with Expiration

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// File expires 24 hours after creation
const file = await client.files.create({
  file: await toFile("batch job data", "batch.jsonl", { type: "application/x-ndjson" }),
  purpose: "batch",
  expires_after: {
    anchor: "created_at",
    seconds: 86400, // 24 hours
  },
});

console.log(`File will expire at: ${new Date(file.expires_at * 1000).toISOString()}`);
```

#### Example: Upload from Fetch Response

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload from remote URL
const response = await fetch("https://example.com/training-data.jsonl");
const file = await client.files.create({
  file: await toFile(response, "remote-data.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

console.log(`Uploaded from remote: ${file.id}`);
```

### Retrieve

Get metadata for a specific file.

```typescript { .api }
retrieve(fileID: string): Promise<FileObject>;
```

#### Example: Get File Metadata

```typescript
const file = await client.files.retrieve("file-123abc");
console.log({
  id: file.id,
  name: file.filename,
  size: `${(file.bytes / 1024 / 1024).toFixed(2)} MB`,
  created: new Date(file.created_at * 1000).toISOString(),
  purpose: file.purpose,
  status: file.status,
});
```

### List

List all files with pagination support. Filter by purpose.

```typescript { .api }
list(params?: FileListParams): Promise<FileObjectsPage>;

interface FileListParams extends CursorPageParams {
  order?: 'asc' | 'desc'; // Sort by created_at
  purpose?: string; // Filter by purpose
  after?: string; // Pagination cursor
  limit?: number; // Items per page (max 100)
}
```

#### Example: List All Files

```typescript
// Iterate all files (auto-pagination)
for await (const file of client.files.list()) {
  console.log(`${file.filename} (${file.purpose})`);
}
```

#### Example: List with Pagination

```typescript
// Get first page
const firstPage = await client.files.list({ limit: 10 });

// Manually get next page
if (firstPage.hasNextPage()) {
  const nextPage = await firstPage.getNextPage();
  console.log(`Next page has ${nextPage.data.length} files`);
}

// Or iterate pages
for await (const page of (await client.files.list()).iterPages()) {
  console.log(`Processing page with ${page.data.length} files`);
}
```

#### Example: Filter by Purpose

```typescript
// List only fine-tuning files
for await (const file of client.files.list({ purpose: "fine-tune" })) {
  console.log(`Fine-tuning file: ${file.filename}`);
}

// List only assistant files
const assistantFiles = await client.files.list({ purpose: "assistants" });
console.log(`${assistantFiles.data.length} assistant files found`);
```
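
The list endpoint filters by purpose but not by filename. A hypothetical helper (`findFileByName` is illustrative, not part of the SDK) can scan any async-iterable source of file objects, such as the auto-paginating iterator returned by `client.files.list()`:

```typescript
// Hypothetical helper: return the first file whose filename matches,
// scanning an async-iterable source (e.g. the SDK's auto-paginating list).
async function findFileByName<T extends { filename: string }>(
  files: AsyncIterable<T>,
  name: string,
): Promise<T | undefined> {
  for await (const file of files) {
    if (file.filename === name) return file;
  }
  return undefined; // not found after exhausting all pages
}

// Usage sketch: await findFileByName(client.files.list(), "training.jsonl");
```

Note that this walks pages until a match is found, so it is best suited to accounts with modest file counts.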

#### Example: Sort and Pagination

```typescript
// Get newest files first
const newestFiles = await client.files.list({
  order: "desc",
  limit: 20,
});

console.log(`Latest 20 files (newest first):`);
for (const file of newestFiles.data) {
  console.log(`  - ${file.filename} (${new Date(file.created_at * 1000).toLocaleDateString()})`);
}
```

### Delete

Delete a file and remove it from all vector stores.

```typescript { .api }
delete(fileID: string): Promise<FileDeleted>;
```

#### Example: Delete File

```typescript
const result = await client.files.delete("file-123abc");
if (result.deleted) {
  console.log(`File ${result.id} successfully deleted`);
}
```

### Content

Retrieve the actual file contents.

```typescript { .api }
content(fileID: string): Promise<Response>;
```

#### Example: Download File Content

```typescript
import * as fs from "fs";

// Get file content as a fetch-style Response
const response = await client.files.content("file-123abc");
const buffer = await response.arrayBuffer();

// Decode as text
const text = new TextDecoder().decode(buffer);
console.log(text);

// Or save to disk (Node.js)
fs.writeFileSync("./downloaded-file.jsonl", Buffer.from(buffer));
```
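
Downloaded batch and fine-tuning files are newline-delimited JSON. A small parser (a sketch — `parseJsonl` is not part of the SDK) turns the decoded text into records:

```typescript
// Parse newline-delimited JSON (JSONL) into an array of objects.
// Blank lines are skipped so a trailing newline doesn't cause a parse error.
function parseJsonl<T = unknown>(text: string): T[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as T);
}

// Hypothetical usage with a downloaded batch output file:
// const response = await client.files.content("file-123abc");
// const records = parseJsonl(await response.text());
```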

### Wait for Processing

Poll until file processing completes. Useful for batch and fine-tuning files.

```typescript { .api }
waitForProcessing(
  id: string,
  options?: { pollInterval?: number; maxWait?: number }
): Promise<FileObject>;
```

Default poll interval: 5 seconds; default max wait: 30 minutes. Resolves with the file once its status becomes 'processed', 'error', or 'deleted'.

#### Example: Wait for Processing

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload a fine-tuning file and wait for processing
const jsonlData = '{"messages":[{"role":"user","content":"Hello"}]}\n'; // example training record
const file = await client.files.create({
  file: await toFile(jsonlData, "training.jsonl", { type: "application/x-ndjson" }),
  purpose: "fine-tune",
});

// Wait until processing completes
const processedFile = await client.files.waitForProcessing(file.id);

if (processedFile.status === "processed") {
  console.log("File processed successfully!");
  // Now safe to use in fine-tuning
} else if (processedFile.status === "error") {
  console.error("File processing failed:", processedFile.status_details);
}
```

#### Example: Custom Poll Interval

```typescript
// Check every 2 seconds, time out after 5 minutes
const file = await client.files.waitForProcessing("file-123abc", {
  pollInterval: 2000, // 2 seconds
  maxWait: 5 * 60 * 1000, // 5 minutes
});

console.log("File ready!");
```

---

## Upload Operations

Use multipart uploads for large files (up to 8 GB). Enables parallel part uploads and better reliability.
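
The limits above (8 GB per file, 64 MB per part) can be checked before opening a session. A minimal sketch — `planUpload` is an illustrative helper, not an SDK function:

```typescript
// Documented limits: files up to 8 GB, parts up to 64 MB each.
const MAX_FILE_BYTES = 8 * 1024 * 1024 * 1024;
const MAX_PART_BYTES = 64 * 1024 * 1024;

// Validate a planned upload and return the number of parts it will need.
function planUpload(fileBytes: number, partBytes: number): number {
  if (fileBytes > MAX_FILE_BYTES) throw new Error("file exceeds the 8 GB upload limit");
  if (partBytes > MAX_PART_BYTES) throw new Error("parts must be 64 MB or smaller");
  if (partBytes <= 0) throw new Error("part size must be positive");
  return Math.ceil(fileBytes / partBytes);
}
```

For example, a 150 MB file split into 50 MB parts needs three parts.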

### Create Upload Session

Initiate a multipart upload session.

```typescript { .api }
create(params: UploadCreateParams): Promise<Upload>;

interface UploadCreateParams {
  /** Total size in bytes of the file being uploaded */
  bytes: number;

  /** Filename for the resulting file */
  filename: string;

  /** MIME type (must match file purpose requirements) */
  mime_type: string;

  /** Intended purpose: 'assistants', 'batch', 'fine-tune', 'vision', 'user_data' */
  purpose: FilePurpose;

  /** Optional expiration policy */
  expires_after?: {
    anchor: 'created_at';
    seconds: number;
  };
}
```

#### Example: Start Large File Upload

```typescript
import OpenAI from "openai";
import * as fs from "fs";

const client = new OpenAI();

// Get file size
const fileSize = fs.statSync("./large-dataset.jsonl").size;

// Initiate upload
const upload = await client.uploads.create({
  filename: "large-dataset.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

console.log(`Upload session created: ${upload.id}`);
console.log(`Expires at: ${new Date(upload.expires_at * 1000).toISOString()}`);
```

### Upload Parts

Upload file chunks (max 64 MB each). Parts can be uploaded in parallel.

```typescript { .api }
parts.create(uploadID: string, params: PartCreateParams): Promise<UploadPart>;

interface PartCreateParams {
  /** The chunk of bytes for this part */
  data: Uploadable;
}
```

#### Example: Upload File in Parts

```typescript
import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

const filePath = "./large-file.jsonl";
const fileSize = fs.statSync(filePath).size;
const PART_SIZE = 50 * 1024 * 1024; // 50 MB

// Create upload session
const upload = await client.uploads.create({
  filename: "large-file.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

console.log(`Starting upload: ${upload.id}`);

// Upload the file in parts; highWaterMark caps each chunk at PART_SIZE,
// though the stream may emit smaller chunks
const partIds: string[] = [];
const fileStream = fs.createReadStream(filePath, {
  highWaterMark: PART_SIZE,
});

let partNum = 0;
for await (const chunk of fileStream) {
  const part = await client.uploads.parts.create(upload.id, {
    data: await toFile(chunk, `part-${partNum}`, {
      type: "application/octet-stream",
    }),
  });

  partIds.push(part.id);
  console.log(`Uploaded part ${partNum + 1}/${Math.ceil(fileSize / PART_SIZE)}`);
  partNum++;
}

console.log(`All parts uploaded: ${partIds.length} parts`);
```

#### Example: Parallel Part Uploads

```typescript
import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

const filePath = "./large-file.jsonl";
const fileSize = fs.statSync(filePath).size;
const PART_SIZE = 50 * 1024 * 1024; // 50 MB
const MAX_CONCURRENT = 4; // 4 parallel uploads

// Create upload session
const upload = await client.uploads.create({
  filename: "large-file.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

// Read file into chunks
const chunks: Buffer[] = [];
const fileData = fs.readFileSync(filePath);
for (let i = 0; i < fileData.length; i += PART_SIZE) {
  chunks.push(fileData.subarray(i, i + PART_SIZE));
}

console.log(`Uploading ${chunks.length} parts in parallel (${MAX_CONCURRENT} concurrent)`);

// Upload with concurrency limit
const partIds: string[] = [];
for (let i = 0; i < chunks.length; i += MAX_CONCURRENT) {
  const batch = chunks.slice(i, i + MAX_CONCURRENT);
  const uploadPromises = batch.map(async (chunk, idx) => {
    const partNum = i + idx + 1;
    const part = await client.uploads.parts.create(upload.id, {
      data: await toFile(chunk, `part-${partNum}`, {
        type: "application/octet-stream",
      }),
    });
    console.log(`Completed part ${partNum}/${chunks.length}`);
    return part.id;
  });

  const batchIds = await Promise.all(uploadPromises);
  partIds.push(...batchIds);
}

console.log(`All ${partIds.length} parts uploaded successfully`);
```

### Complete Upload

Finalize the multipart upload and create the file.

```typescript { .api }
complete(uploadID: string, params: UploadCompleteParams): Promise<Upload>;

interface UploadCompleteParams {
  /** Ordered list of part IDs */
  part_ids: Array<string>;

  /** Optional MD5 checksum for verification */
  md5?: string;
}
```

#### Example: Complete Upload and Verify

```typescript
import OpenAI from "openai";
import * as fs from "fs";
import * as crypto from "crypto";

const client = new OpenAI();

// ... (previous upload code defining `upload` and `partIds`)

// Calculate MD5 for verification
const fileData = fs.readFileSync("./large-file.jsonl");
const md5 = crypto.createHash("md5").update(fileData).digest("hex");

// Complete the upload
const completedUpload = await client.uploads.complete(upload.id, {
  part_ids: partIds,
  md5: md5,
});

if (completedUpload.status === "completed" && completedUpload.file) {
  console.log(`Upload completed!`);
  console.log(`File ID: ${completedUpload.file.id}`);
  console.log(`File name: ${completedUpload.file.filename}`);
  console.log(`File size: ${completedUpload.file.bytes} bytes`);

  // File is now ready to use
} else {
  console.error("Upload failed to complete");
}
```
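
For files large enough to need multipart upload, `fs.readFileSync` may be impractical just for checksumming. A streaming MD5 (a sketch using Node's `crypto` module, not an SDK helper) keeps memory usage flat regardless of file size:

```typescript
import * as crypto from "crypto";
import * as fs from "fs";

// Hash a file via a read stream so the whole file never sits in memory.
function md5OfFile(filePath: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const hash = crypto.createHash("md5");
    fs.createReadStream(filePath)
      .on("data", (chunk) => hash.update(chunk))
      .on("error", reject)
      .on("end", () => resolve(hash.digest("hex")));
  });
}

// Usage sketch: const md5 = await md5OfFile("./large-file.jsonl");
```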

### Cancel Upload

Cancel an in-progress upload.

```typescript { .api }
cancel(uploadID: string): Promise<Upload>;
```

#### Example: Cancel Upload

```typescript
const upload = await client.uploads.cancel(uploadID);
if (upload.status === "cancelled") {
  console.log("Upload cancelled successfully");
}
```

---

## Complete Multipart Upload Workflow

Full end-to-end example of handling a large file upload.

```typescript
import OpenAI, { toFile } from "openai";
import * as fs from "fs";
import * as path from "path";
import * as crypto from "crypto";

const client = new OpenAI();

async function uploadLargeFile(filePath: string, purpose: string) {
  const fileSize = fs.statSync(filePath).size;
  const fileName = path.basename(filePath);
  const PART_SIZE = 50 * 1024 * 1024; // 50 MB
  const MAX_CONCURRENT = 4;

  console.log(`Uploading ${fileName} (${(fileSize / 1024 / 1024).toFixed(2)} MB)`);

  // Step 1: Create upload session
  const upload = await client.uploads.create({
    filename: fileName,
    mime_type: "application/x-ndjson",
    bytes: fileSize,
    purpose: purpose,
  });

  console.log(`Session created: ${upload.id}`);

  try {
    // Step 2: Upload parts in parallel
    const fileData = fs.readFileSync(filePath);
    const chunks: Buffer[] = [];
    for (let i = 0; i < fileData.length; i += PART_SIZE) {
      chunks.push(fileData.subarray(i, i + PART_SIZE));
    }

    const partIds: string[] = [];
    for (let i = 0; i < chunks.length; i += MAX_CONCURRENT) {
      const batch = chunks.slice(i, i + MAX_CONCURRENT);
      const uploadPromises = batch.map(async (chunk, idx) => {
        const partNum = i + idx + 1;
        const part = await client.uploads.parts.create(upload.id, {
          data: await toFile(chunk, `part-${partNum}`),
        });
        console.log(`Part ${partNum}/${chunks.length} uploaded`);
        return part.id;
      });

      const batchIds = await Promise.all(uploadPromises);
      partIds.push(...batchIds);
    }

    // Step 3: Complete upload with MD5
    const md5 = crypto.createHash("md5").update(fileData).digest("hex");

    const completed = await client.uploads.complete(upload.id, {
      part_ids: partIds,
      md5: md5,
    });

    if (completed.status === "completed" && completed.file) {
      console.log(`Upload successful!`);
      console.log(`File ID: ${completed.file.id}`);
      console.log(`File size: ${completed.file.bytes} bytes`);
      return completed.file;
    } else {
      throw new Error("Upload completed but file not ready");
    }
  } catch (error) {
    console.error("Upload failed, cancelling...");
    await client.uploads.cancel(upload.id);
    throw error;
  }
}

// Usage
const file = await uploadLargeFile("./training-data.jsonl", "batch");
console.log(`Ready to use file: ${file.id}`);
```

---

## toFile Helper

Convert various data sources into File objects for upload operations.

### Function Signature

```typescript { .api }
async function toFile(
  value: ToFileInput | PromiseLike<ToFileInput>,
  name?: string | null | undefined,
  options?: FilePropertyBag | undefined,
): Promise<File>;

type ToFileInput =
  | FileLike // File, Blob, etc.
  | ResponseLike // fetch Response
  | Exclude<BlobLikePart, string>
  | AsyncIterable<BlobLikePart>;

type FilePropertyBag = {
  type?: string; // MIME type
  lastModified?: number; // Timestamp
};
```

### Supported Input Types

- **File objects**: Passed through directly
- **Blob objects**: Converted to File
- **Buffers**: `Buffer`, `Uint8Array`, `ArrayBuffer`
- **Strings**: Text content
- **Fetch Responses**: Downloaded and converted
- **Streams**: Node.js `ReadableStream`, async iterables
- **Promises**: Resolved before conversion

### Example: Convert String to File

```typescript
import OpenAI, { toFile } from "openai";

const client = new OpenAI();

const file = await toFile("Hello, World!", "greeting.txt", {
  type: "text/plain",
});

const response = await client.files.create({
  file: file,
  purpose: "assistants",
});
```

### Example: Convert Buffer to File

```typescript
import { toFile } from "openai";
import * as fs from "fs";

const buffer = fs.readFileSync("./data.jsonl");
const file = await toFile(buffer, "data.jsonl", {
  type: "application/x-ndjson",
});
```

### Example: Convert Stream to File

```typescript
import { toFile } from "openai";
import * as fs from "fs";

const stream = fs.createReadStream("./large-file.bin");
const file = await toFile(stream, "large-file.bin", {
  type: "application/octet-stream",
});
```

### Example: Convert Fetch Response to File

```typescript
import { toFile } from "openai";

const response = await fetch("https://example.com/data.jsonl");
const file = await toFile(response, "data.jsonl", {
  type: "application/x-ndjson",
});
```

### Example: Convert with Name Inference

```typescript
import { toFile } from "openai";
import * as fs from "fs";

// toFile will infer the name from the response URL
const response = await fetch("https://cdn.example.com/data.jsonl?token=abc123");
const file = await toFile(response); // Name: "data.jsonl"

// Or from the stream path (Node.js)
const stream = fs.createReadStream("./training.jsonl");
const file2 = await toFile(stream); // Name: "training.jsonl"
```

---

## File Purposes Reference

Different purposes support different file types and have different processing requirements.

### assistants

For use with the Assistants API. Supports various document types for file search and analysis.

```typescript
const file = await client.files.create({
  file: await toFile(docContent, "document.pdf"),
  purpose: "assistants",
});
```

### batch

For batch processing jobs. Uses `.jsonl` format with a specific request structure.

```typescript
const file = await client.files.create({
  file: await toFile(batchData, "requests.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "batch",
  expires_after: {
    anchor: "created_at",
    seconds: 30 * 24 * 60 * 60, // 30 days
  },
});
```
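
Each line of a batch input file is a self-contained JSON request. A sketch of building such a file (the `custom_id`, endpoint, and model values are illustrative):

```typescript
// Build batch input: one JSON request per line, each with a unique
// custom_id, an HTTP method, a target endpoint, and the request body.
const requests = [
  {
    custom_id: "request-1",
    method: "POST",
    url: "/v1/chat/completions",
    body: {
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "Hello" }],
    },
  },
];

const batchData = requests.map((r) => JSON.stringify(r)).join("\n");
```

The resulting `batchData` string is what the example above passes to `toFile`.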

### fine-tune

For fine-tuning training data. Requires `.jsonl` format with a specific message structure.

```typescript
const trainingData = [
  {
    messages: [
      { role: "system", content: "You are helpful" },
      { role: "user", content: "Hello" },
      { role: "assistant", content: "Hi there!" },
    ],
  },
  // more examples...
].map((obj) => JSON.stringify(obj)).join("\n");

const file = await client.files.create({
  file: await toFile(trainingData, "training.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});
```

### vision

Images for vision model fine-tuning.

```typescript
const file = await client.files.create({
  file: await toFile(imageBuffer, "image.png", { type: "image/png" }),
  purpose: "vision",
});
```

### user_data

Flexible general-purpose file storage.

```typescript
const file = await client.files.create({
  file: await toFile(customData, "data.txt"),
  purpose: "user_data",
});
```

### evals

For evaluation datasets.

```typescript
const file = await client.files.create({
  file: await toFile(evalData, "eval-set.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "evals",
});
```

---

## Error Handling

Handle common file operation errors gracefully.

```typescript
import OpenAI, { APIError, NotFoundError, BadRequestError } from "openai";

const client = new OpenAI();

try {
  const file = await client.files.retrieve("invalid-id");
} catch (error) {
  if (error instanceof NotFoundError) {
    console.log("File not found");
  } else if (error instanceof BadRequestError) {
    console.log("Invalid file parameters");
  } else if (error instanceof APIError) {
    console.log(`API error: ${error.status} - ${error.message}`);
  }
}
```

### Example: Retry Failed Uploads

```typescript
import OpenAI from "openai";
import * as fs from "fs";

const client = new OpenAI();

async function uploadWithRetry(
  filePath: string,
  maxRetries: number = 3
): Promise<OpenAI.FileObject> {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await client.files.create({
        file: fs.createReadStream(filePath),
        purpose: "batch",
      });
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      console.log(`Upload attempt ${i + 1} failed, retrying...`);
      // Linear backoff: wait 1s, then 2s, then 3s, ...
      await new Promise((resolve) => setTimeout(resolve, 1000 * (i + 1)));
    }
  }
  throw new Error("Upload failed after max retries");
}
```

---

## Type Reference

### FileObject

```typescript { .api }
interface FileObject {
  id: string; // File identifier
  filename: string; // Original filename
  bytes: number; // File size in bytes
  created_at: number; // Unix timestamp (seconds)
  object: 'file';
  purpose: FilePurpose; // 'assistants' | 'batch' | 'fine-tune' | 'vision' | 'user_data' | 'evals'
  status: 'uploaded' | 'processed' | 'error';
  expires_at?: number; // Unix timestamp when the file expires
  status_details?: string; // Deprecated: error details for fine-tuning
}
```

### FileDeleted

```typescript { .api }
interface FileDeleted {
  id: string;
  deleted: boolean;
  object: 'file';
}
```

### FileListParams

```typescript { .api }
interface FileListParams {
  order?: 'asc' | 'desc'; // Sort by created_at
  purpose?: string; // Filter by purpose
  after?: string; // Cursor for pagination
  limit?: number; // Results per page (max 100)
}
```

### Upload

```typescript { .api }
interface Upload {
  id: string; // Upload session identifier
  filename: string; // Resulting filename
  bytes: number; // Total bytes being uploaded
  created_at: number; // Unix timestamp (seconds)
  expires_at: number; // Expiration time (1 hour after creation)
  purpose: FilePurpose;
  status: 'pending' | 'completed' | 'cancelled' | 'expired';
  file?: FileObject | null; // Resulting file (when completed)
  object: 'upload';
}
```

### UploadPart

```typescript { .api }
interface UploadPart {
  id: string; // Part identifier
  created_at: number; // Unix timestamp (seconds)
  upload_id: string; // Associated upload session ID
  object: 'upload.part';
}
```

### UploadCreateParams

```typescript { .api }
interface UploadCreateParams {
  bytes: number; // Total file size
  filename: string; // Filename for resulting file
  mime_type: string; // MIME type (e.g., 'application/x-ndjson')
  purpose: FilePurpose;
  expires_after?: {
    anchor: 'created_at';
    seconds: number; // 3600 to 2592000
  };
}
```

### UploadCompleteParams

```typescript { .api }
interface UploadCompleteParams {
  part_ids: Array<string>; // Ordered list of part IDs
  md5?: string; // Optional MD5 checksum for verification
}
```

---

## Related Resources

- [Assistants API](./assistants.md) - Using files with assistants
- [Batch API](./batches.md) - Batch file processing
- [Fine-tuning API](./fine-tuning.md) - Training file formats
- [Vector Stores](./vector-stores.md) - File storage and retrieval