File edition helpers working on top of mem-fs
- Does it follow best practices? Pending
- Impact: Pending (no eval scenarios have been run)
- Risk profile of this skill: Pending
Stream-based transformation utilities for processing files during commit operations, enabling custom file-processing pipelines with async support.

createCommitTransform() creates a Node.js Transform stream that processes files during commit operations, with support for asynchronous file writing.
```typescript
/**
 * Create a Transform stream for committing files asynchronously
 * @returns Transform stream that processes MemFsEditorFile objects
 */
function createCommitTransform(): Transform;

interface Transform extends NodeJS.ReadWriteStream {
  /** Transform stream in object mode for processing file objects */
  objectMode: true;
  /** Process each file through commitFileAsync and pass it to the next stream */
  _transform(file: MemFsEditorFile, encoding: string, callback: TransformCallback): void;
}

interface TransformCallback {
  /** Callback function to signal completion or an error */
  (error?: Error | null, data?: MemFsEditorFile): void;
}
```

```javascript
import { createCommitTransform } from "mem-fs-editor/transform";
import { create as createMemFs } from "mem-fs";
import { create as createEditor } from "mem-fs-editor";
import { Transform, pipeline } from "stream";
import { promisify } from "util";

const pipelineAsync = promisify(pipeline);

const store = createMemFs();
const fs = createEditor(store);

// Create some files
fs.write("file1.txt", "Content 1");
fs.write("file2.txt", "Content 2");

// Basic usage: a stream pipeline that commits files to disk
await pipelineAsync(
  store.stream(),          // Source stream of files
  createCommitTransform(), // Transform files and commit to disk
  // Additional transforms can be added here
);

// Advanced usage with a custom transform chain:
// create a custom transform to modify files before commit
const customTransform = new Transform({
  objectMode: true,
  transform(file, encoding, callback) {
    // Add a header to all text files
    if (file.path.endsWith(".txt") && file.contents) {
      const header = Buffer.from("// Auto-generated header\n");
      file.contents = Buffer.concat([header, file.contents]);
    }
    callback(null, file);
  },
});

// Chain transforms: custom processing -> commit to disk
await pipelineAsync(
  store.stream(),
  customTransform,
  createCommitTransform()
);
```

The transform module integrates seamlessly with the commit system to provide streaming file processing:
```javascript
import { createCommitTransform } from "mem-fs-editor/transform";
import { create as createMemFs } from "mem-fs";
import { create as createEditor } from "mem-fs-editor";

const store = createMemFs();
const fs = createEditor(store);

// Write multiple files
fs.write("src/index.js", "console.log('Hello');");
fs.write("src/utils.js", "export const helper = () => {};");
fs.write("README.md", "# My Project");

// Use the transform in the commit process
const commitTransform = createCommitTransform();

// Process files in streaming fashion
const fileStream = store.stream();
fileStream.pipe(commitTransform);

// Handle completion and errors
commitTransform.on("finish", () => {
  console.log("All files committed successfully");
});
commitTransform.on("error", (error) => {
  console.error("Error during commit:", error);
});
```

The createCommitTransform function creates a Transform stream with these characteristics:
- Operates in object mode, processing MemFsEditorFile objects
- Uses commitFileAsync for proper async disk operations

This enables building complex file processing pipelines where files can be transformed, validated, and committed in a streaming fashion with full async support and proper error handling.
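As a sketch of the "validate, then commit" pattern described above: a validation stage can sit between custom transforms and `createCommitTransform()`, aborting the pipeline before anything reaches disk. To keep the example self-contained it uses only Node's stream module; the plain objects stand in for MemFsEditorFile entries, and the `fakeCommit` sink is a hypothetical stand-in for the real commit transform.

```javascript
import { Readable, Transform, Writable } from "stream";
import { pipeline } from "stream/promises";

// Hypothetical validation stage: reject files with no contents.
// Passing an error to the callback aborts the whole pipeline.
const validate = new Transform({
  objectMode: true,
  transform(file, encoding, callback) {
    if (!file.contents || file.contents.length === 0) {
      callback(new Error(`Refusing to commit empty file: ${file.path}`));
      return;
    }
    callback(null, file);
  },
});

// Stand-in for createCommitTransform(); a real pipeline would
// write each file to disk via commitFileAsync here.
const committed = [];
const fakeCommit = new Writable({
  objectMode: true,
  write(file, encoding, callback) {
    committed.push(file.path);
    callback();
  },
});

// Plain objects shaped like MemFsEditorFile entries
const files = [
  { path: "a.txt", contents: Buffer.from("Content A") },
  { path: "b.txt", contents: Buffer.from("Content B") },
];

await pipeline(Readable.from(files), validate, fakeCommit);
console.log(committed); // [ 'a.txt', 'b.txt' ]
```

Because `pipeline` propagates errors from any stage and destroys the remaining streams, a validation failure surfaces as a single rejected promise instead of a half-committed store.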