High-performance streaming CSV parsing library for Node.js with configurable options and transformation support
Factory functions for creating CSV parser streams from different input sources. Each function returns a configured CsvParserStream instance that can be used with Node.js stream APIs.
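Because a CsvParserStream is a regular Node.js Transform stream, a returned parser can also be consumed with standard stream utilities such as piping and async iteration. A minimal sketch of that pattern, using the `parse` function documented below and assuming a local `data.csv` with a header row (the `readRows` helper is illustrative, not part of the library):

```typescript
import * as fs from "fs";
import { parse } from "@fast-csv/parse";

// Pipe a file stream into a parser and consume rows with async iteration.
async function readRows(path: string): Promise<void> {
  const parser = fs.createReadStream(path).pipe(parse({ headers: true }));
  for await (const row of parser) {
    console.log(row); // each row is an object keyed by header name
  }
}

readRows("data.csv").catch(error => console.error(error));
```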
Creates a new CSV parser stream with the specified options.
```typescript
/**
* Creates a new CSV parser stream with configurable options
* @param args - Parser configuration options
* @returns CsvParserStream instance for processing CSV data
*/
function parse<I extends Row, O extends Row>(args?: ParserOptionsArgs): CsvParserStream<I, O>;
```

Usage Examples:

```typescript
import { parse } from "@fast-csv/parse";
// Create parser with headers
const parser = parse({ headers: true });
parser
.on("data", row => console.log(row))
.on("end", () => console.log("Parsing complete"));
// Pipe data to parser
someReadableStream.pipe(parser);
```
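Beyond `headers`, `ParserOptionsArgs` accepts a number of tuning options. The sketch below shows a few commonly used ones; the option names are drawn from the fast-csv parser options, so verify them against the `ParserOptionsArgs` type in your installed version:

```typescript
import { parse } from "@fast-csv/parse";

// Hedged example: configure several parser options at once.
const parser = parse({
  headers: true,     // treat the first row as column names
  delimiter: ";",    // parse semicolon-separated input
  trim: true,        // trim whitespace around values
  ignoreEmpty: true, // skip rows that are completely empty
  maxRows: 1000      // stop after the first 1000 rows
});

parser
  .on("data", row => console.log(row))
  .on("end", (rowCount: number) => console.log(`Read ${rowCount} rows`));
```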
Creates a CSV parser stream that reads from a file path.

```typescript
/**
* Parse CSV data from a file
* @param location - File path to read from
* @param options - Parser configuration options
* @returns CsvParserStream connected to file input
*/
function parseFile<I extends Row, O extends Row>(
location: string,
options: ParserOptionsArgs = {}
): CsvParserStream<I, O>;
```

Usage Examples:

```typescript
import { parseFile } from "@fast-csv/parse";
// Parse file with headers
parseFile("data.csv", { headers: true })
.on("error", error => console.error(error))
.on("data", row => console.log(row))
.on("end", rowCount => console.log(`Parsed ${rowCount} rows`));
// Parse with custom delimiter
parseFile("data.tsv", {
headers: true,
delimiter: "\t"
})
.on("data", row => console.log(row));Creates a CSV parser stream that reads from an existing readable stream.
Creates a CSV parser stream that reads from an existing readable stream.

```typescript
/**
* Parse CSV data from a readable stream
* @param stream - Input readable stream
* @param options - Parser configuration options
* @returns CsvParserStream connected to input stream
*/
function parseStream<I extends Row, O extends Row>(
stream: NodeJS.ReadableStream,
options?: ParserOptionsArgs
): CsvParserStream<I, O>;
```

Usage Examples:

```typescript
import * as fs from "fs";
import { Readable } from "stream";
import { parseStream } from "@fast-csv/parse";
// Parse from file stream
const stream = fs.createReadStream("data.csv");
parseStream(stream, { headers: true })
.on("data", row => console.log(row));
// Parse from an HTTP response: fetch() returns a web ReadableStream,
// which must be converted to a Node.js stream first (Node 18+)
fetch("https://example.com/data.csv")
.then(response => parseStream(Readable.fromWeb(response.body), { headers: true }))
.then(parser => {
parser.on("data", row => console.log(row));
});
```
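`parseStream` accepts any Node.js readable stream, so preprocessing steps such as decompression can be chained in front of the parser. A sketch for a gzip-compressed CSV, assuming a local `data.csv.gz`:

```typescript
import * as fs from "fs";
import * as zlib from "zlib";
import { parseStream } from "@fast-csv/parse";

// Decompress on the fly, then hand the plain-text stream to the parser.
const gunzipped = fs.createReadStream("data.csv.gz").pipe(zlib.createGunzip());

parseStream(gunzipped, { headers: true })
  .on("error", error => console.error(error))
  .on("data", row => console.log(row))
  .on("end", (rowCount: number) => console.log(`Parsed ${rowCount} rows`));
```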
Creates a CSV parser stream that reads from a string.

```typescript
/**
* Parse CSV data from a string
* @param string - CSV string content to parse
* @param options - Parser configuration options
* @returns CsvParserStream processing the string data
*/
function parseString<I extends Row, O extends Row>(
string: string,
options?: ParserOptionsArgs
): CsvParserStream<I, O>;
```

Usage Examples:

```typescript
import { parseString } from "@fast-csv/parse";
const csvData = `name,age,city
Alice,25,New York
Bob,30,London`;
parseString(csvData, { headers: true })
.on("data", row => console.log(row))
.on("end", () => console.log("Done"));
// Output: { name: "Alice", age: "25", city: "New York" }
// Output: { name: "Bob", age: "30", city: "London" }
// Parse without headers (returns arrays)
parseString(csvData, { headers: false })
.on("data", row => console.log(row));
// Output: ["name", "age", "city"]
// Output: ["Alice", "25", "New York"]
// Output: ["Bob", "30", "London"]All parsing functions return CsvParserStream instances that support method chaining with transform and validate operations:
All parsing functions return CsvParserStream instances that support method chaining with transform and validate operations:

```typescript
import { parseFile } from "@fast-csv/parse";
parseFile("data.csv", { headers: true })
.transform(row => ({
id: parseInt(row.id),
name: row.name.toUpperCase(),
active: row.active === "true"
}))
.validate(row => row.id > 0)
.on("data", row => console.log(row))
.on("error", error => console.error(error));