tessl/npm-openai

The official TypeScript library for the OpenAI API

docs/files-uploads.md

Files & Uploads

Comprehensive file management and large file upload operations for the OpenAI Node.js SDK. Handle individual file uploads, retrieve file metadata, manage file lifecycle, and upload large files using multipart uploads.

Capabilities

Files

Core file management operations including upload, retrieval, listing, deletion, and content access. Supports multiple file purposes (assistants, fine-tuning, batch processing, vision).

// Upload a file
create(params: FileCreateParams): Promise<FileObject>;

// Retrieve file metadata
retrieve(fileID: string): Promise<FileObject>;

// List all files with pagination
list(params?: FileListParams): Promise<FileObjectsPage>;

// Delete a file
delete(fileID: string): Promise<FileDeleted>;

// Get file contents
content(fileID: string): Promise<Response>;

// Poll until file processing completes
waitForProcessing(id: string, options?: WaitForProcessingOptions): Promise<FileObject>;

interface FileObject {
  id: string;
  filename: string;
  bytes: number;
  created_at: number;
  purpose: FilePurpose;
  status: 'uploaded' | 'processed' | 'error';
  expires_at?: number;
  object: 'file';
}

type FilePurpose = 'assistants' | 'batch' | 'fine-tune' | 'vision' | 'user_data' | 'evals';

interface FileDeleted {
  id: string;
  deleted: boolean;
  object: 'file';
}


Uploads

Multipart upload API for handling large files (up to 8 GB). Break files into 64 MB parts and upload in parallel for improved reliability and performance.

// Initiate a multipart upload
create(params: UploadCreateParams): Promise<Upload>;

// Cancel an in-progress upload
cancel(uploadID: string): Promise<Upload>;

// Complete a multipart upload
complete(uploadID: string, params: UploadCompleteParams): Promise<Upload>;

// Upload a file part
parts.create(uploadID: string, params: PartCreateParams): Promise<UploadPart>;

interface Upload {
  id: string;
  filename: string;
  bytes: number;
  created_at: number;
  expires_at: number;
  purpose: FilePurpose;
  status: 'pending' | 'completed' | 'cancelled' | 'expired';
  file?: FileObject | null;
  object: 'upload';
}

interface UploadPart {
  id: string;
  created_at: number;
  upload_id: string;
  object: 'upload.part';
}


toFile Helper

Convert various data types (strings, buffers, streams, fetch responses) into File objects for upload operations.

async function toFile(
  value: ToFileInput | PromiseLike<ToFileInput>,
  name?: string | null | undefined,
  options?: FilePropertyBag | undefined,
): Promise<File>;

type ToFileInput =
  | FileLike
  | ResponseLike
  | Exclude<BlobLikePart, string>
  | AsyncIterable<BlobLikePart>;

type FilePropertyBag = {
  type?: string;
  lastModified?: number;
};



File Operations

Upload, retrieve, list, and manage files in the OpenAI platform.

Create (Upload File)

Upload a single file to OpenAI. Supports up to 512 MB individual files and multiple file purposes.

create(params: FileCreateParams): Promise<FileObject>;

interface FileCreateParams {
  /** The File object to upload */
  file: Uploadable;

  /** Intended purpose: 'assistants', 'batch', 'fine-tune', 'vision', 'user_data', 'evals' */
  purpose: FilePurpose;

  /** Optional expiration policy */
  expires_after?: {
    anchor: 'created_at';
    seconds: number; // 3600 (1 hour) to 2592000 (30 days)
  };
}
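The `expires_after.seconds` window is bounded to the range noted above. A minimal validation sketch (illustrative only, not part of the SDK) makes the documented range concrete:

```typescript
// Illustrative helper (not part of the SDK): check that an
// expires_after window falls inside the documented range.
const MIN_EXPIRY_SECONDS = 3_600;     // 1 hour
const MAX_EXPIRY_SECONDS = 2_592_000; // 30 days

function isValidExpiryWindow(seconds: number): boolean {
  return (
    Number.isInteger(seconds) &&
    seconds >= MIN_EXPIRY_SECONDS &&
    seconds <= MAX_EXPIRY_SECONDS
  );
}
```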

Example: Simple Text File Upload

import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload from string
const file1 = await client.files.create({
  file: await toFile("Training data content here", "training.txt", { type: "text/plain" }),
  purpose: "fine-tune",
});

console.log(`Uploaded file: ${file1.id}`);

Example: Upload from Buffer

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

// Upload from file system
const fileBuffer = fs.readFileSync("./training-data.jsonl");
const file = await client.files.create({
  file: await toFile(fileBuffer, "training-data.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

console.log(`Uploaded: ${file.filename} (ID: ${file.id})`);

Example: Upload with Expiration

import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// File expires 24 hours after creation
const file = await client.files.create({
  file: await toFile("batch job data", "batch.jsonl", { type: "application/x-ndjson" }),
  purpose: "batch",
  expires_after: {
    anchor: "created_at",
    seconds: 86400, // 24 hours
  },
});

console.log(`File will expire at: ${new Date(file.expires_at! * 1000).toISOString()}`);

Example: Upload from Fetch Response

import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload from remote URL
const response = await fetch("https://example.com/training-data.jsonl");
const file = await client.files.create({
  file: await toFile(response, "remote-data.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

console.log(`Uploaded from remote: ${file.id}`);

Retrieve

Get metadata for a specific file.

retrieve(fileID: string): Promise<FileObject>;

Example: Get File Metadata

const file = await client.files.retrieve("file-123abc");
console.log({
  id: file.id,
  name: file.filename,
  size: `${(file.bytes / 1024 / 1024).toFixed(2)} MB`,
  created: new Date(file.created_at * 1000).toISOString(),
  purpose: file.purpose,
  status: file.status,
});

List

List all files with pagination support. Filter by purpose.

list(params?: FileListParams): Promise<FileObjectsPage>;

interface FileListParams extends CursorPageParams {
  order?: 'asc' | 'desc';  // Sort by created_at
  purpose?: string;         // Filter by purpose
  after?: string;           // Pagination cursor
  limit?: number;           // Items per page (max 100)
}
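Under the hood, cursor pagination repeatedly passes the last seen cursor as `after` until no further pages remain. A generic sketch of that loop (the `Page` shape here is illustrative, not the SDK's actual types):

```typescript
// Generic cursor-pagination loop (illustrative shape, not the SDK's types).
interface Page<T> {
  data: T[];
  nextCursor?: string; // cursor to pass as `after` on the next request
}

async function collectAll<T>(
  fetchPage: (after?: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    all.push(...page.data);
    cursor = page.nextCursor;
  } while (cursor !== undefined);
  return all;
}
```

The SDK's `for await` iteration and `getNextPage()` shown below wrap this same pattern for you.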

Example: List All Files

// Iterate all files
for await (const file of await client.files.list()) {
  console.log(`${file.filename} (${file.purpose})`);
}

Example: List with Pagination

// Get first page
const firstPage = await client.files.list({ limit: 10 });

// Manually get next page
if (firstPage.hasNextPage()) {
  const nextPage = await firstPage.getNextPage();
  console.log(`Next page has ${nextPage.data.length} files`);
}

// Or iterate pages
for await (const page of (await client.files.list()).iterPages()) {
  console.log(`Processing page with ${page.data.length} files`);
}

Example: Filter by Purpose

// List only fine-tuning files
for await (const file of await client.files.list({ purpose: "fine-tune" })) {
  console.log(`Fine-tuning file: ${file.filename}`);
}

// List only assistant files
const assistantFiles = await client.files.list({ purpose: "assistants" });
console.log(`${assistantFiles.data.length} assistant files found`);

Example: Sort and Pagination

// Get newest files first
const newestFiles = await client.files.list({
  order: "desc",
  limit: 20,
});

console.log(`Latest 20 files (newest first):`);
for (const file of newestFiles.data) {
  console.log(`  - ${file.filename} (${new Date(file.created_at * 1000).toLocaleDateString()})`);
}

Delete

Delete a file and remove it from all vector stores.

delete(fileID: string): Promise<FileDeleted>;

Example: Delete File

const result = await client.files.delete("file-123abc");
if (result.deleted) {
  console.log(`File ${result.id} successfully deleted`);
}

Content

Retrieve the actual file contents.

content(fileID: string): Promise<Response>;

Example: Download File Content

import * as fs from "fs";

// Get file content as a fetch-style Response
const response = await client.files.content("file-123abc");
const buffer = await response.arrayBuffer();
const text = new TextDecoder().decode(buffer);
console.log(text);

// Save to disk (Node.js)
fs.writeFileSync("./downloaded-file.jsonl", Buffer.from(buffer));

Wait for Processing

Poll until file processing completes. Useful for batch and fine-tuning files.

async waitForProcessing(
  id: string,
  options?: { pollInterval?: number; maxWait?: number }
): Promise<FileObject>;

Default poll interval: 5 seconds, max wait: 30 minutes. Returns the processed file when status is 'processed', 'error', or 'deleted'.
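The polling strategy can be sketched generically as below (a hedged illustration of the behavior described above, not the SDK's implementation; names are ours):

```typescript
// Illustrative poll-until-terminal loop: re-check at a fixed interval,
// give up once the maximum wait is exhausted.
type Status = "uploaded" | "processed" | "error";

async function pollUntilTerminal(
  check: () => Promise<Status>,
  pollInterval = 5_000,          // default: 5 seconds
  maxWait = 30 * 60 * 1_000,     // default: 30 minutes
): Promise<Status> {
  const deadline = Date.now() + maxWait;
  while (true) {
    const status = await check();
    if (status === "processed" || status === "error") return status;
    if (Date.now() + pollInterval > deadline) {
      throw new Error("Giving up after maximum wait");
    }
    await new Promise((resolve) => setTimeout(resolve, pollInterval));
  }
}
```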

Example: Wait for Processing

import OpenAI, { toFile } from "openai";

const client = new OpenAI();

// Upload a fine-tuning file and wait for processing
// (jsonlData is assumed to hold the JSONL training content)
const file = await client.files.create({
  file: await toFile(jsonlData, "training.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

// Wait until processing completes
const processedFile = await client.files.waitForProcessing(file.id);

if (processedFile.status === "processed") {
  console.log("File processed successfully!");
  // Now safe to use in fine-tuning
} else if (processedFile.status === "error") {
  console.error("File processing failed:", processedFile.status_details);
}

Example: Custom Poll Interval

// Check every 2 seconds, timeout after 5 minutes
const file = await client.files.waitForProcessing("file-123abc", {
  pollInterval: 2000,    // 2 seconds
  maxWait: 5 * 60 * 1000, // 5 minutes
});

console.log("File ready!");

Upload Operations

Use multipart uploads for large files (up to 8 GB). Enables parallel uploads and better reliability.

Create Upload Session

Initiate a multipart upload session.

create(params: UploadCreateParams): Promise<Upload>;

interface UploadCreateParams {
  /** Total size in bytes of the file being uploaded */
  bytes: number;

  /** Filename for the resulting file */
  filename: string;

  /** MIME type (must match file purpose requirements) */
  mime_type: string;

  /** Intended purpose: 'assistants', 'batch', 'fine-tune', 'vision', 'user_data' */
  purpose: FilePurpose;

  /** Optional expiration policy */
  expires_after?: {
    anchor: 'created_at';
    seconds: number;
  };
}

Example: Start Large File Upload

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

// Get file size
const fileSize = fs.statSync("./large-dataset.jsonl").size;

// Initiate upload
const upload = await client.uploads.create({
  filename: "large-dataset.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

console.log(`Upload session created: ${upload.id}`);
console.log(`Expires at: ${new Date(upload.expires_at * 1000).toISOString()}`);

Upload Parts

Upload file chunks (max 64 MB each); parts can be uploaded in parallel.

parts.create(uploadID: string, params: PartCreateParams): Promise<UploadPart>;

interface PartCreateParams {
  /** The chunk of bytes for this part */
  data: Uploadable;
}
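Before calling `parts.create`, the file has to be split into chunks that respect the 64 MB limit. A self-contained splitting sketch (illustrative, not part of the SDK):

```typescript
// Illustrative helper (not part of the SDK): split a Buffer into
// parts no larger than the documented 64 MB per-part limit.
const MAX_PART_BYTES = 64 * 1024 * 1024;

function splitIntoParts(data: Buffer, partSize = MAX_PART_BYTES): Buffer[] {
  if (partSize <= 0 || partSize > MAX_PART_BYTES) {
    throw new Error(`partSize must be between 1 and ${MAX_PART_BYTES} bytes`);
  }
  const parts: Buffer[] = [];
  for (let offset = 0; offset < data.length; offset += partSize) {
    // subarray creates a view, so no data is copied here
    parts.push(data.subarray(offset, offset + partSize));
  }
  return parts;
}
```

The examples below achieve the same thing with streams (`highWaterMark`) or manual slicing.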

Example: Upload File in Parts

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

const filePath = "./large-file.jsonl";
const fileSize = fs.statSync(filePath).size;
const PART_SIZE = 50 * 1024 * 1024; // 50 MB

// Create upload session
const upload = await client.uploads.create({
  filename: "large-file.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

console.log(`Starting upload: ${upload.id}`);

// Upload file in parts
const partIds: string[] = [];
const fileStream = fs.createReadStream(filePath, {
  highWaterMark: PART_SIZE,
});

let partNum = 0;
for await (const chunk of fileStream) {
  const part = await client.uploads.parts.create(upload.id, {
    data: await toFile(chunk, `part-${partNum}`, {
      type: "application/octet-stream",
    }),
  });

  partIds.push(part.id);
  console.log(`Uploaded part ${partNum + 1}/${Math.ceil(fileSize / PART_SIZE)}`);
  partNum++;
}

console.log(`All parts uploaded: ${partIds.length} parts`);

Example: Parallel Part Uploads

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const client = new OpenAI();

const filePath = "./large-file.jsonl";
const fileSize = fs.statSync(filePath).size;
const PART_SIZE = 50 * 1024 * 1024; // 50 MB
const MAX_CONCURRENT = 4; // 4 parallel uploads

// Create upload session
const upload = await client.uploads.create({
  filename: "large-file.jsonl",
  mime_type: "application/x-ndjson",
  bytes: fileSize,
  purpose: "batch",
});

// Read file into chunks
const chunks: Buffer[] = [];
const fileData = fs.readFileSync(filePath);
for (let i = 0; i < fileData.length; i += PART_SIZE) {
  chunks.push(fileData.slice(i, i + PART_SIZE));
}

console.log(`Uploading ${chunks.length} parts in parallel (${MAX_CONCURRENT} concurrent)`);

// Upload with concurrency limit
const partIds: string[] = [];
for (let i = 0; i < chunks.length; i += MAX_CONCURRENT) {
  const batch = chunks.slice(i, i + MAX_CONCURRENT);
  const uploadPromises = batch.map(async (chunk, idx) => {
    const partNum = i + idx + 1;
    const part = await client.uploads.parts.create(upload.id, {
      data: await toFile(chunk, `part-${partNum}`, {
        type: "application/octet-stream",
      }),
    });
    console.log(`Completed part ${partNum}/${chunks.length}`);
    return part.id;
  });

  const batchIds = await Promise.all(uploadPromises);
  partIds.push(...batchIds);
}

console.log(`All ${partIds.length} parts uploaded successfully`);

Complete Upload

Finalize the multipart upload and create the file.

complete(uploadID: string, params: UploadCompleteParams): Promise<Upload>;

interface UploadCompleteParams {
  /** Ordered list of part IDs */
  part_ids: Array<string>;

  /** Optional MD5 checksum for verification */
  md5?: string;
}

Example: Complete Upload and Verify

import OpenAI, { toFile } from "openai";
import * as fs from "fs";
import * as crypto from "crypto";

const client = new OpenAI();

// ... (previous upload code)

// Calculate MD5 for verification
const fileData = fs.readFileSync("./large-file.jsonl");
const md5 = crypto.createHash("md5").update(fileData).digest("hex");

// Complete the upload
const completedUpload = await client.uploads.complete(upload.id, {
  part_ids: partIds,
  md5: md5,
});

if (completedUpload.status === "completed" && completedUpload.file) {
  console.log(`Upload completed!`);
  console.log(`File ID: ${completedUpload.file.id}`);
  console.log(`File name: ${completedUpload.file.filename}`);
  console.log(`File size: ${completedUpload.file.bytes} bytes`);

  // File is now ready to use
} else {
  console.error("Upload failed to complete");
}

Cancel Upload

Cancel an in-progress upload.

cancel(uploadID: string): Promise<Upload>;

Example: Cancel Upload

const upload = await client.uploads.cancel(uploadID);
if (upload.status === "cancelled") {
  console.log("Upload cancelled successfully");
}

Complete Multipart Upload Workflow

Full end-to-end example of handling a large file upload.

import OpenAI, { toFile } from "openai";
import * as fs from "fs";
import * as path from "path";
import * as crypto from "crypto";

const client = new OpenAI();

async function uploadLargeFile(
  filePath: string,
  purpose: "assistants" | "batch" | "fine-tune" | "vision" | "user_data",
) {
  const fileSize = fs.statSync(filePath).size;
  const fileName = path.basename(filePath);
  const PART_SIZE = 50 * 1024 * 1024; // 50 MB (stays under the 64 MB part limit)
  const MAX_CONCURRENT = 4;

  console.log(`Uploading ${fileName} (${(fileSize / 1024 / 1024).toFixed(2)} MB)`);

  // Step 1: Create upload session
  const upload = await client.uploads.create({
    filename: fileName,
    mime_type: "application/x-ndjson",
    bytes: fileSize,
    purpose: purpose,
  });

  console.log(`Session created: ${upload.id}`);

  try {
    // Step 2: Upload parts in parallel
    const fileData = fs.readFileSync(filePath);
    const chunks: Buffer[] = [];
    for (let i = 0; i < fileData.length; i += PART_SIZE) {
      chunks.push(fileData.slice(i, i + PART_SIZE));
    }

    const partIds: string[] = [];
    for (let i = 0; i < chunks.length; i += MAX_CONCURRENT) {
      const batch = chunks.slice(i, i + MAX_CONCURRENT);
      const uploadPromises = batch.map(async (chunk, idx) => {
        const partNum = i + idx + 1;
        const part = await client.uploads.parts.create(upload.id, {
          data: await toFile(chunk, `part-${partNum}`),
        });
        console.log(`Part ${partNum}/${chunks.length} uploaded`);
        return part.id;
      });

      const batchIds = await Promise.all(uploadPromises);
      partIds.push(...batchIds);
    }

    // Step 3: Complete upload with MD5
    const md5 = crypto
      .createHash("md5")
      .update(fileData)
      .digest("hex");

    const completed = await client.uploads.complete(upload.id, {
      part_ids: partIds,
      md5: md5,
    });

    if (completed.status === "completed" && completed.file) {
      console.log(`Upload successful!`);
      console.log(`File ID: ${completed.file.id}`);
      console.log(`File size: ${completed.file.bytes} bytes`);
      return completed.file;
    } else {
      throw new Error("Upload completed but file not ready");
    }
  } catch (error) {
    console.error("Upload failed, cancelling...");
    await client.uploads.cancel(upload.id);
    throw error;
  }
}

// Usage
const file = await uploadLargeFile("./training-data.jsonl", "batch");
console.log(`Ready to use file: ${file.id}`);

toFile Helper

Convert various data sources into File objects for upload operations.

Function Signature

async function toFile(
  value: ToFileInput | PromiseLike<ToFileInput>,
  name?: string | null | undefined,
  options?: FilePropertyBag | undefined,
): Promise<File>;

type ToFileInput =
  | FileLike          // File, Blob, etc.
  | ResponseLike      // fetch Response
  | Exclude<BlobLikePart, string>
  | AsyncIterable<BlobLikePart>;

type FilePropertyBag = {
  type?: string;         // MIME type
  lastModified?: number; // Timestamp
};

Supported Input Types

  • File objects: Passed through directly
  • Blob objects: Converted to File
  • Buffers: Buffer, Uint8Array, ArrayBuffer
  • Strings: Text content
  • Fetch Responses: Downloaded and converted
  • Streams: Node.js ReadableStream, async iterables
  • Promises: Resolved before conversion

Example: Convert String to File

import OpenAI, { toFile } from "openai";

const client = new OpenAI();

const file = await toFile("Hello, World!", "greeting.txt", {
  type: "text/plain",
});

const response = await client.files.create({
  file: file,
  purpose: "assistants",
});

Example: Convert Buffer to File

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const buffer = fs.readFileSync("./data.jsonl");
const file = await toFile(buffer, "data.jsonl", {
  type: "application/x-ndjson",
});

Example: Convert Stream to File

import OpenAI, { toFile } from "openai";
import * as fs from "fs";

const stream = fs.createReadStream("./large-file.bin");
const file = await toFile(stream, "large-file.bin", {
  type: "application/octet-stream",
});

Example: Convert Fetch Response to File

import OpenAI, { toFile } from "openai";

const response = await fetch("https://example.com/data.jsonl");
const file = await toFile(response, "data.jsonl", {
  type: "application/x-ndjson",
});

Example: Convert with Name Inference

import { toFile } from "openai";
import * as fs from "fs";

// toFile will infer the name from the response URL
const response = await fetch("https://cdn.example.com/data.jsonl?token=abc123");
const file = await toFile(response); // Name: "data.jsonl"

// Or from the stream's file path (Node.js)
const stream = fs.createReadStream("./training.jsonl");
const file2 = await toFile(stream); // Name: "training.jsonl"

File Purposes Reference

Different purposes support different file types and have different processing requirements.

assistants

For use with the Assistants API. Supports various document types for file search and analysis.

const file = await client.files.create({
  file: await toFile(docContent, "document.pdf"),
  purpose: "assistants",
});

batch

For batch processing jobs. Uses .jsonl format with specific structure.

const file = await client.files.create({
  file: await toFile(batchData, "requests.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "batch",
  expires_after: {
    anchor: "created_at",
    seconds: 30 * 24 * 60 * 60, // 30 days default
  },
});

fine-tune

For fine-tuning training data. Requires .jsonl format with specific message structure.

const trainingData = [
  { messages: [
    { role: "system", content: "You are helpful" },
    { role: "user", content: "Hello" },
    { role: "assistant", content: "Hi there!" }
  ]},
  // more examples...
].map(obj => JSON.stringify(obj)).join("\n");

const file = await client.files.create({
  file: await toFile(trainingData, "training.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "fine-tune",
});

vision

Images for vision model fine-tuning.

const file = await client.files.create({
  file: await toFile(imageBuffer, "image.png", { type: "image/png" }),
  purpose: "vision",
});

user_data

Flexible general-purpose file storage.

const file = await client.files.create({
  file: await toFile(customData, "data.txt"),
  purpose: "user_data",
});

evals

For evaluation datasets.

const file = await client.files.create({
  file: await toFile(evalData, "eval-set.jsonl", {
    type: "application/x-ndjson",
  }),
  purpose: "evals",
});

Error Handling

Handle common file operation errors gracefully.

import OpenAI, { APIError, NotFoundError, BadRequestError } from "openai";

const client = new OpenAI();

try {
  const file = await client.files.retrieve("invalid-id");
} catch (error) {
  if (error instanceof NotFoundError) {
    console.log("File not found");
  } else if (error instanceof BadRequestError) {
    console.log("Invalid file parameters");
  } else if (error instanceof APIError) {
    console.log(`API error: ${error.status} - ${error.message}`);
  }
}

Example: Retry Failed Uploads

import OpenAI from "openai";
import * as fs from "fs";
import type { FileObject } from "openai/resources/files";

const client = new OpenAI();

async function uploadWithRetry(
  filePath: string,
  maxRetries: number = 3
): Promise<FileObject> {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await client.files.create({
        file: fs.createReadStream(filePath),
        purpose: "batch",
      });
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      console.log(`Upload attempt ${i + 1} failed, retrying...`);
      await new Promise((resolve) => setTimeout(resolve, 1000 * (i + 1)));
    }
  }
  throw new Error("Upload failed after max retries");
}

Type Reference

FileObject

interface FileObject {
  id: string;                    // File identifier
  filename: string;              // Original filename
  bytes: number;                 // File size in bytes
  created_at: number;            // Unix timestamp (seconds)
  object: 'file';
  purpose: FilePurpose;          // 'assistants' | 'batch' | 'fine-tune' | 'vision' | 'user_data' | 'evals'
  status: 'uploaded' | 'processed' | 'error';
  expires_at?: number;           // Unix timestamp when file expires
  status_details?: string;       // Deprecated: error details for fine-tuning
}

FileDeleted

interface FileDeleted {
  id: string;
  deleted: boolean;
  object: 'file';
}

FileListParams

interface FileListParams {
  order?: 'asc' | 'desc';       // Sort by created_at
  purpose?: string;              // Filter by purpose
  after?: string;                // Cursor for pagination
  limit?: number;                // Results per page (max 100)
}

Upload

interface Upload {
  id: string;                    // Upload session identifier
  filename: string;              // Resulting filename
  bytes: number;                 // Total bytes being uploaded
  created_at: number;            // Unix timestamp (seconds)
  expires_at: number;            // Expiration time (1 hour from creation)
  purpose: FilePurpose;
  status: 'pending' | 'completed' | 'cancelled' | 'expired';
  file?: FileObject | null;      // Resulting file (when completed)
  object: 'upload';
}

UploadPart

interface UploadPart {
  id: string;                    // Part identifier
  created_at: number;            // Unix timestamp (seconds)
  upload_id: string;             // Associated upload session ID
  object: 'upload.part';
}

UploadCreateParams

interface UploadCreateParams {
  bytes: number;                 // Total file size
  filename: string;              // Filename for resulting file
  mime_type: string;             // MIME type (e.g., 'application/x-ndjson')
  purpose: FilePurpose;
  expires_after?: {
    anchor: 'created_at';
    seconds: number;             // 3600 to 2592000
  };
}

UploadCompleteParams

interface UploadCompleteParams {
  part_ids: Array<string>;       // Ordered list of part IDs
  md5?: string;                  // Optional MD5 checksum for verification
}
