tessl/npm-i18next-scanner

Scan your code, extract translation keys/values, and merge them into i18n resource files.

Transform Stream API

Stream-based interface for processing files in build pipelines. Creates transform streams that can be used with vinyl-fs and build tools like Gulp and Grunt for automated translation key extraction during the build process.

Capabilities

Create Stream

Creates a transform stream for parsing i18n keys from files in a pipeline.

/**
 * Creates a transform stream for parsing i18n keys from files
 * @param options - Parser configuration options
 * @param customTransform - Optional custom transform function
 * @param customFlush - Optional custom flush function
 * @returns Node.js transform stream
 */
function createStream(
  options: ParserOptions,
  customTransform?: CustomTransform,
  customFlush?: CustomFlush
): NodeJS.ReadWriteStream

Usage Examples:

const scanner = require('i18next-scanner');
const vfs = require('vinyl-fs');
const sort = require('gulp-sort');

// Basic stream usage
const options = {
  lngs: ['en', 'de'],
  ns: ['translation'],
  defaultNs: 'translation',
  resource: {
    loadPath: 'i18n/{{lng}}/{{ns}}.json',
    savePath: 'i18n/{{lng}}/{{ns}}.json'
  }
};

vfs.src(['src/**/*.{js,jsx}'])
  .pipe(sort())
  .pipe(scanner.createStream(options))
  .pipe(vfs.dest('./'));

Default Export Stream

Convenience factory function that creates a transform stream (same as createStream).

/**
 * Convenience API that creates a scanner transform stream
 * @param options - Parser configuration options
 * @param customTransform - Optional custom transform function
 * @param customFlush - Optional custom flush function
 * @returns Node.js transform stream
 */
function scanner(
  options: ParserOptions,
  customTransform?: CustomTransform,
  customFlush?: CustomFlush
): NodeJS.ReadWriteStream

Usage Examples:

const scanner = require('i18next-scanner');
const vfs = require('vinyl-fs');

// Using the default export (same as createStream);
// `options` is a ParserOptions object as shown in the Create Stream example above
vfs.src(['src/**/*.js'])
  .pipe(scanner(options))
  .pipe(vfs.dest('./'));

Custom Transform Function

A custom transform runs for each file in the stream and has access to the parser instance (via this.parser) for advanced file processing and key-extraction customization.

/**
 * Custom transform function signature
 * @param file - Vinyl file object being processed
 * @param encoding - File encoding
 * @param done - Callback to signal completion
 */
type CustomTransform = (file: VinylFile, encoding: string, done: () => void) => void

Usage Examples:

const fs = require('fs');
const chalk = require('chalk');

const customTransform = function(file, enc, done) {
  const parser = this.parser; // Access to parser instance
  const content = fs.readFileSync(file.path, enc);
  let count = 0;

  // Custom parsing with additional function names
  parser.parseFuncFromString(content, { 
    list: ['i18next._', 'i18next.__', '$t'] 
  }, (key, options) => {
    parser.set(key, Object.assign({}, options, {
      nsSeparator: false,
      keySeparator: false
    }));
    ++count;
  });

  if (count > 0) {
    console.log(`Found ${chalk.cyan(count)} keys in ${chalk.yellow(file.relative)}`);
  }

  done();
};

// Use custom transform
vfs.src(['src/**/*.js'])
  .pipe(scanner(options, customTransform))
  .pipe(vfs.dest('./'));

Custom Flush Function

A custom flush runs once after all files have been processed and controls the final output generation and file-writing behavior.

/**
 * Custom flush function signature
 * @param done - Callback to signal completion
 */
type CustomFlush = (done: () => void) => void

Usage Examples:

const VinylFile = require('vinyl');

const customFlush = function(done) {
  const parser = this.parser;
  const resStore = parser.get({ sort: true });
  
  // Custom output processing
  Object.keys(resStore).forEach((lng) => {
    const namespaces = resStore[lng];
    
    Object.keys(namespaces).forEach((ns) => {
      const obj = namespaces[ns];
      const outputPath = `custom-output/${lng}/${ns}.json`;
      
      // Create custom formatted output
      const customFormat = {
        metadata: {
          generated: new Date().toISOString(),
          version: '1.0.0'
        },
        translations: obj
      };
      
      const contents = Buffer.from(JSON.stringify(customFormat, null, 2));
      
      this.push(new VinylFile({
        path: outputPath,
        contents: contents
      }));
    });
  });
  
  done();
};

// Use custom flush
vfs.src(['src/**/*.js'])
  .pipe(scanner(options, null, customFlush))
  .pipe(vfs.dest('./'));

Gulp Integration

Complete example of integrating with Gulp build system.

Usage Examples:

const gulp = require('gulp');
const sort = require('gulp-sort');
const scanner = require('i18next-scanner');

gulp.task('i18n', function() {
  return gulp.src(['src/**/*.{js,jsx}'])
    .pipe(sort()) // Sort files in stream by path
    .pipe(scanner({
      debug: true,
      func: {
        list: ['i18next.t', 'i18n.t', 't'],
        extensions: ['.js', '.jsx']
      },
      trans: {
        component: 'Trans',
        i18nKey: 'i18nKey',
        defaultsKey: 'defaults',
        extensions: ['.jsx'],
        fallbackKey: function(ns, value) {
          return value;
        }
      },
      lngs: ['en', 'de', 'fr'],
      ns: ['common', 'validation'],
      defaultLng: 'en',
      defaultNs: 'common',
      defaultValue: '__STRING_NOT_TRANSLATED__',
      resource: {
        loadPath: 'locales/{{lng}}/{{ns}}.json',
        savePath: 'locales/{{lng}}/{{ns}}.json',
        jsonIndent: 2,
        lineEnding: '\n'
      },
      nsSeparator: false,
      keySeparator: false
    }))
    .pipe(gulp.dest('./'));
});

Grunt Integration

Integration with Grunt task runner using the provided Grunt task.

Usage Examples:

// Gruntfile.js
module.exports = function(grunt) {
  grunt.initConfig({
    i18next: {
      dev: {
        src: ['src/**/*.{js,jsx}'],
        dest: './',
        options: {
          debug: true,
          func: {
            list: ['i18next.t', 'i18n.t'],
            extensions: ['.js', '.jsx']
          },
          lngs: ['en', 'de'],
          ns: ['translation'],
          defaultNs: 'translation',
          resource: {
            loadPath: 'i18n/{{lng}}/{{ns}}.json',
            savePath: 'i18n/{{lng}}/{{ns}}.json'
          }
        }
      }
    }
  });

  grunt.loadNpmTasks('i18next-scanner');
  grunt.registerTask('default', ['i18next']);
};

Vinyl File Processing

The transform stream works with Vinyl file objects and processes them based on file extensions configured in the parser options.

File Processing Flow:

  1. File Filtering: Files are routed to parsers based on the configured extensions:

    • attr.extensions: HTML files for attribute parsing
    • func.extensions: JavaScript files for function-call parsing
    • trans.extensions: JSX files for Trans component parsing

  2. Content Extraction: File contents are read and passed to the matching parsers.

  3. Key Extraction: Translation keys are extracted and stored in the parser instance.

  4. Resource Generation: The final flush generates the translation resource files.
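The extension-based routing above can be sketched as a plain options fragment; the extension lists and function names here are illustrative assumptions, not library defaults:

```javascript
// Illustrative ParserOptions fragment showing how each parser
// filters files by extension before processing (values are examples).
const options = {
  attr: {
    list: ['data-i18n'],          // attributes scanned in markup
    extensions: ['.html', '.htm'] // only these files reach the attr parser
  },
  func: {
    list: ['i18next.t', 't'],     // function names scanned in scripts
    extensions: ['.js']           // only these files reach the func parser
  },
  trans: {
    component: 'Trans',           // JSX component name
    extensions: ['.jsx']          // only these files reach the trans parser
  }
};

// A .jsx file is handed to the trans parser but skipped by the attr parser.
const scannedByTrans = options.trans.extensions.includes('.jsx'); // true
const scannedByAttr = options.attr.extensions.includes('.jsx');   // false
console.log(scannedByTrans, scannedByAttr);
```

Because each parser filters independently, a single file can be processed by more than one parser if its extension appears in multiple lists.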

Types

interface VinylFile {
  path: string;
  contents: Buffer | NodeJS.ReadableStream | null;
  relative: string;
  base: string;
  cwd: string;
}

type CustomTransform = (
  file: VinylFile, 
  encoding: string, 
  done: () => void
) => void;

type CustomFlush = (done: () => void) => void;

interface StreamContext {
  parser: Parser;  // Access to parser instance
  push: (file: VinylFile) => void;  // Add file to output stream
}
