@csstools/css-tokenizer

A specification-compliant CSS tokenizer that follows the W3C CSS Syntax Level 3 specification for parsing CSS into tokens. Provides both streaming tokenization through the tokenizer() function and batch tokenization through the tokenize() helper function, enabling developers to process CSS source code at a low level with strong ties to the CSS specification.

Package Information

  • Package Name: @csstools/css-tokenizer
  • Package Type: npm
  • Language: TypeScript
  • Installation: npm install @csstools/css-tokenizer
  • Node.js: Requires Node.js >= 18

Core Imports

import { 
  tokenize, 
  tokenizer, 
  CSSToken, 
  TokenType,
  ParseError,
  ParseErrorMessage,
  mirrorVariant,
  mirrorVariantType
} from "@csstools/css-tokenizer";

For CommonJS:

const { 
  tokenize, 
  tokenizer, 
  TokenType,
  ParseError,
  ParseErrorMessage,
  mirrorVariant,
  mirrorVariantType
} = require("@csstools/css-tokenizer");

Basic Usage

import { tokenize, tokenizer } from "@csstools/css-tokenizer";

// Batch tokenization - convert entire CSS string to array of tokens
const myCSS = `@media only screen and (min-width: 768rem) {
  .foo {
    content: 'Some content!' !important;
  }
}`;

const tokens = tokenize({
  css: myCSS,
});

console.log(tokens);

// Streaming tokenization - process tokens one by one
const t = tokenizer({
  css: myCSS,
});

while (!t.endOfFile()) {
  const token = t.nextToken();
  console.log(token);
}

Architecture

@csstools/css-tokenizer is built around several key components:

  • Core Tokenization: Two main functions (tokenize, tokenizer) providing batch and streaming approaches
  • Token Types: Comprehensive type system covering all CSS token types from the W3C specification
  • Type Predicates: Utility functions for type checking and token validation
  • Token Utilities: Functions for token manipulation, cloning, and string conversion
  • Error Handling: Detailed parse error reporting with source position information
  • Specification Compliance: Strict adherence to W3C CSS Syntax Level 3 tokenization rules

Capabilities

Core Tokenization

Primary tokenization functions for converting CSS strings into structured token arrays or creating streaming tokenizers for processing CSS token by token.

function tokenize(
  input: { css: { valueOf(): string }, unicodeRangesAllowed?: boolean },
  options?: { onParseError?: (error: ParseError) => void }
): Array<CSSToken>;

function tokenizer(
  input: { css: { valueOf(): string }, unicodeRangesAllowed?: boolean },
  options?: { onParseError?: (error: ParseError) => void }
): { nextToken: () => CSSToken, endOfFile: () => boolean };
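
The input also accepts unicodeRangesAllowed, shown in the signatures above. The following is a minimal sketch (the CSS snippet is illustrative); it assumes that unicode-range tokens are only produced when this flag is set, since the syntax is ambiguous outside of unicode-range descriptors:

import { tokenize } from "@csstools/css-tokenizer";

// Opt in to unicode-range tokenization for input where it is valid.
const tokens = tokenize({
  css: "U+0025-00FF",
  unicodeRangesAllowed: true,
});

console.log(tokens);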


Token Types and Interfaces

Complete type definitions for all CSS token types following the W3C specification, including enums, interfaces, and utility types.

type CSSToken = TokenAtKeyword | TokenBadString | TokenBadURL | TokenCDC | TokenCDO |
  TokenColon | TokenComma | TokenComment | TokenDelim | TokenDimension | TokenEOF |
  TokenFunction | TokenHash | TokenIdent | TokenNumber | TokenPercentage |
  TokenSemicolon | TokenString | TokenURL | TokenWhitespace | TokenOpenParen |
  TokenCloseParen | TokenOpenSquare | TokenCloseSquare | TokenOpenCurly |
  TokenCloseCurly | TokenUnicodeRange;

enum TokenType {
  Comment = 'comment',
  AtKeyword = 'at-keyword-token',
  BadString = 'bad-string-token',
  // ... all token types
}
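
Every token carries its TokenType as the first tuple element, so tokens can be branched on by type. A small sketch assuming the enum members shown above (Dimension is part of the full enum elided by the comment):

import { tokenize, TokenType } from "@csstools/css-tokenizer";

const tokens = tokenize({ css: ".foo { width: 10px }" });

for (const token of tokens) {
  // token[0] is the TokenType; token[1] is the raw source representation.
  if (token[0] === TokenType.Dimension) {
    console.log("dimension:", token[1]);
  }
}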


Type Predicates

Type guard functions for runtime token type checking and validation, providing type-safe ways to work with tokens.

function isToken(value: any): value is CSSToken;
function isTokenNumeric(token: CSSToken): token is NumericToken;
function isTokenAtKeyword(token: CSSToken): token is TokenAtKeyword;
// ... all type predicates
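
A sketch that uses isTokenNumeric to narrow tokens before reading their parsed values, assuming the predicate signatures above:

import { tokenize, isTokenNumeric } from "@csstools/css-tokenizer";

const tokens = tokenize({ css: "margin: 10px 2.5em 0 auto;" });

// The predicate narrows CSSToken to NumericToken (number, dimension, percentage),
// so token[4].value is typed as a number.
for (const token of tokens.filter(isTokenNumeric)) {
  console.log(token[1], "->", token[4].value);
}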


Token Utilities

Utility functions for token manipulation, cloning, stringification, and mutation operations.

function stringify(...tokens: Array<CSSToken>): string;
function cloneTokens(tokens: Array<CSSToken>): Array<CSSToken>;
function mutateIdent(ident: TokenIdent, newValue: string): void;
function mutateUnit(ident: TokenDimension, newUnit: string): void;
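
For example, mutateIdent rewrites an identifier in place and stringify re-serializes the token list. A sketch assuming the signatures above; isTokenIdent is taken to be one of the package's type predicates (part of the full set noted in the previous section):

import { tokenize, stringify, mutateIdent, isTokenIdent } from "@csstools/css-tokenizer";

const tokens = tokenize({ css: ".foo { color: red }" });

// Rename the class name ident from "foo" to "bar", then re-serialize.
for (const token of tokens) {
  if (isTokenIdent(token) && token[4].value === "foo") {
    mutateIdent(token, "bar");
  }
}

console.log(stringify(...tokens)); // ".bar { color: red }"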


Token Mirror Functions

Utility functions that return the mirrored counterpart of a bracket or parenthesis token or token type (for example, the closing token that matches an opening one).

function mirrorVariant(token: CSSToken): CSSToken | null;
function mirrorVariantType(type: TokenType): TokenType | null;
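
A sketch using mirrorVariantType to find the matching closing type for an opening bracket; it assumes, consistent with the nullable return types above, that token types without an opening/closing counterpart yield null:

import { mirrorVariantType, TokenType } from "@csstools/css-tokenizer";

// The mirror of an opening parenthesis type is the closing parenthesis type.
console.log(mirrorVariantType(TokenType.OpenParen) === TokenType.CloseParen); // true

// Types without an opening/closing counterpart have no mirror.
console.log(mirrorVariantType(TokenType.Comma)); // null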


Error Handling

Parse error classes and error handling mechanisms for tokenization errors with detailed source position information.

class ParseError {
  sourceStart: number;
  sourceEnd: number;
  parserState: Array<string>;
}

class ParseErrorWithToken extends ParseError {
  token: CSSToken;
}
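
Errors are reported through the onParseError callback rather than thrown. A sketch that distinguishes ParseErrorWithToken from a plain ParseError using the fields listed above; the CSS snippet is illustrative (an unterminated string is a tokenizer-level parse error in CSS Syntax 3):

import { tokenize, ParseError, ParseErrorWithToken } from "@csstools/css-tokenizer";

tokenize(
  { css: "a { content: 'missing closing quote }" },
  {
    onParseError: (error: ParseError) => {
      console.log(error.sourceStart, error.sourceEnd, error.parserState);

      // Some errors also carry the token that was being produced.
      if (error instanceof ParseErrorWithToken) {
        console.log(error.token);
      }
    },
  },
);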


Types

interface Token<T extends TokenType, U> extends Array<T | string | number | U> {
  /** The type of token */
  0: T;
  /** The token representation (string used when stringifying) */
  1: string;
  /** Start position of representation */
  2: number;
  /** End position of representation */
  3: number;
  /** Extra data specific to the token type (parsed value, unescaped/unquoted string, unit, etc.) */
  4: U;
}

type NumericToken = TokenNumber | TokenDimension | TokenPercentage;

enum NumberType {
  Integer = 'integer',
  Number = 'number'
}

enum HashType {
  Unrestricted = 'unrestricted',
  ID = 'id'
}
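
As an illustration of the tuple layout, a sketch that reads the parts of a dimension token; it assumes the dimension token's extra data carries { value, unit, type } with type being a NumberType:

import { tokenize, TokenType, NumberType } from "@csstools/css-tokenizer";

const [first] = tokenize({ css: "1.5rem" });

if (first[0] === TokenType.Dimension) {
  // Tuple layout: [type, representation, start, end, extra data]
  const data = first[4];
  console.log(first[1]);                         // "1.5rem"
  console.log(data.value, data.unit);            // 1.5 "rem"
  console.log(data.type === NumberType.Number);  // true: has a fractional part
}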