Standalone utilities for converting FormData instances to multipart/form-data Blobs. These utilities are useful for HTTP libraries that need explicit Blob handling, or in environments where automatic FormData conversion isn't available.
A pure function that converts any FormData instance to a properly formatted multipart/form-data Blob with correct boundary generation and field encoding.
/**
* Converts FormData to a multipart/form-data Blob
* @param formData - FormData instance to convert (any FormData-like object)
* @param BlobClass - Optional Blob constructor to use (defaults to global Blob)
* @returns Blob with proper multipart/form-data content type and boundary
*/
function formDataToBlob(formData: FormData, BlobClass?: typeof Blob): Blob;

Usage Examples:
import { formDataToBlob } from "formdata-polyfill/formdata-to-blob.js";
// Convert any FormData instance
const fd = new FormData();
fd.append("name", "John Doe");
fd.append("message", "Hello\nWorld"); // Line breaks are properly handled
const blob = formDataToBlob(fd);
console.log(blob.type); // "multipart/form-data; boundary=----formdata-polyfill-..."
console.log(blob.size); // Size of the encoded multipart data
// Use with fetch
fetch("/api/submit", {
  method: "POST",
  body: blob,
});

Advanced usage with custom Blob class:
import { formDataToBlob } from "formdata-polyfill/formdata-to-blob.js";
import { Blob } from "fetch-blob"; // Custom Blob implementation
const fd = new FormData();
fd.append("data", "test");
// Use custom Blob implementation
const blob = formDataToBlob(fd, Blob);

The conversion utility properly handles File objects, Blob objects, and string values with correct encoding and metadata preservation.
String Field Handling:
const fd = new FormData();
fd.append("text", "Hello\nWorld");
fd.append("special", 'Contains "quotes" and \r\n line breaks');
const blob = formDataToBlob(fd);
// String values are properly encoded with:
// - Line break normalization (\n → \r\n, \r → \r\n)
// - Special character escaping in field names
// - Proper Content-Disposition headers

File Field Handling:
const fd = new FormData();
// File with metadata
const file = new File(["file content"], "document.txt", {
  type: "text/plain",
  lastModified: Date.now(),
});
fd.append("document", file);
// Blob with custom filename
const blob = new Blob(["binary data"], { type: "application/octet-stream" });
fd.append("data", blob, "custom-filename.bin");
const multipartBlob = formDataToBlob(fd);
// File objects preserve:
// - Filename in Content-Disposition header
// - Content-Type from file.type
// - Binary data without encoding

The utility generates unique boundaries for each conversion to ensure proper multipart separation.
Boundary Format:
const fd = new FormData();
fd.append("test", "value");
const blob = formDataToBlob(fd);
// Generated boundary format: ----formdata-polyfill-{random}
// Example: ----formdata-polyfill-0.8234567123456789
console.log(blob.type); // "multipart/form-data; boundary=----formdata-polyfill-..."

Field names, filenames, and values are encoded according to the multipart/form-data specification.
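The escaping rules described in the sections below can be sketched as a small helper. This is an illustration of the documented behavior (normalize line endings in field names to CRLF, then percent-escape; filenames skip the normalization step), not the library's internal API:

```javascript
// Illustrative sketch of the name/filename escaping rules.
// Field names get \n and \r normalized to \r\n before escaping;
// filenames are escaped as-is.
const escapeName = (value, isFilename = false) =>
  (isFilename ? value : value.replace(/\r?\n|\r/g, "\r\n"))
    .replace(/\n/g, "%0A")
    .replace(/\r/g, "%0D")
    .replace(/"/g, "%22");

console.log(escapeName("field\nname"));          // "field%0D%0Aname"
console.log(escapeName("file\nname.txt", true)); // "file%0Aname.txt"
console.log(escapeName('say "hi"'));             // "say %22hi%22"
```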
Field Name Escaping:
const fd = new FormData();
// Special characters in field names are escaped
fd.append('field\nname', 'value'); // \n becomes %0D%0A
fd.append('field\rname', 'value'); // \r becomes %0D%0A
fd.append('field"name', 'value'); // " becomes %22
const blob = formDataToBlob(fd);
// Produces properly escaped Content-Disposition headers

Filename Escaping:
const fd = new FormData();
const file = new File(["content"], 'file\nname.txt');
fd.append("upload", file);
// Filename special characters are escaped:
// \n → %0A, \r → %0D, " → %22
const blob = formDataToBlob(fd);

Value Normalization:
const fd = new FormData();
// Line endings in string values are normalized
fd.append("textarea", "Line 1\nLine 2\rLine 3\r\nLine 4");
const blob = formDataToBlob(fd);
// All line endings become \r\n in the final output

Node.js http.request Usage:
import { formDataToBlob } from "formdata-polyfill/formdata-to-blob.js";
import { Readable } from "node:stream";
import http from "node:http";
const fd = new FormData();
fd.append("name", "John");
fd.append("file", new File(["content"], "test.txt"));
const blob = formDataToBlob(fd);
// Create HTTP request with proper headers
const stream = Readable.from(blob.stream());
const req = http.request("http://example.com/upload", {
  method: "POST",
  headers: {
    "Content-Type": blob.type,
    "Content-Length": blob.size,
  },
});
stream.pipe(req);

Axios Integration:
import axios from "axios";
import { formDataToBlob } from "formdata-polyfill/formdata-to-blob.js";
const fd = new FormData();
fd.append("data", "test");
const blob = formDataToBlob(fd);
// Convert blob to buffer for axios
const buffer = Buffer.from(await blob.arrayBuffer());
const response = await axios.post("https://httpbin.org/post", buffer, {
  headers: {
    "Content-Type": blob.type,
  },
});

Undici/Node.js Fetch Usage:
import { formDataToBlob } from "formdata-polyfill/formdata-to-blob.js";
const fd = new FormData();
fd.append("message", "Hello World");
const blob = formDataToBlob(fd);
// Works directly with Node.js fetch (18+) or undici
const response = await fetch("https://httpbin.org/post", {
  method: "POST",
  body: blob, // fetch derives Content-Type (including the boundary) from blob.type
});

The conversion utility is designed for efficiency:
Streaming Example:
const fd = new FormData();
// Large file (won't be loaded into memory immediately)
const largeFile = new File([/* large data */], "large-file.dat");
fd.append("upload", largeFile);
const blob = formDataToBlob(fd);
// File content is only read when blob is consumed
const stream = blob.stream();
// Stream can be piped directly without memory concerns
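A minimal sketch of consuming the stream incrementally in Node.js 18+, where the web `ReadableStream` returned by `blob.stream()` is async-iterable; the same pattern applies to a Blob returned by `formDataToBlob`:

```javascript
// Consume a Blob's stream chunk by chunk instead of buffering it whole.
const blob = new Blob(["chunked ", "payload"], { type: "text/plain" });

let total = 0;
for await (const chunk of blob.stream()) {
  total += chunk.byteLength; // chunks arrive as Uint8Arrays
}
console.log(total === blob.size); // true
```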