Primary tokenization functions for converting CSS strings into structured token arrays, or for creating streaming tokenizers that process CSS token by token.

The tokenize() function converts an entire CSS string into an array of tokens.
/**
 * Tokenize a CSS string into a list of tokens.
 * @param input - Object containing CSS string and options
 * @param input.css - CSS string or object with valueOf() method
 * @param input.unicodeRangesAllowed - Whether to allow unicode range tokens (default: false)
 * @param options - Optional configuration
 * @param options.onParseError - Callback for parse errors
 * @returns Array of CSS tokens
 */
function tokenize(
  input: {
    css: { valueOf(): string },
    unicodeRangesAllowed?: boolean,
  },
  options?: {
    onParseError?: (error: ParseError) => void
  },
): Array<CSSToken>;

Usage Examples:
import { tokenize } from "@csstools/css-tokenizer";

// Basic tokenization
const tokens = tokenize({
  css: ".foo { color: red; }"
});

// With unicode ranges allowed
const tokensWithUnicode = tokenize({
  css: "@font-face { unicode-range: U+0025-00FF; }",
  unicodeRangesAllowed: true
});

// With error handling
const tokensWithErrors = tokenize({
  css: "invalid 'unclosed string"
}, {
  onParseError: (error) => {
    console.error('Parse error:', error);
  }
});
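Each element of the returned array is a token tuple: index 0 holds the token type and index 4 the parsed value object (the same positions the streaming examples below rely on). The sketch below also logs the raw source text and source offsets, assuming the full [type, raw, start, end, value] tuple layout of CSSToken:

import { tokenize } from "@csstools/css-tokenizer";

const tokens = tokenize({
  css: ".foo { color: red; }"
});

// Log each token's type, raw source text, source offsets, and parsed value.
for (const [type, raw, start, end, value] of tokens) {
  console.log(type, JSON.stringify(raw), start, end, value);
}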
The tokenizer() function creates a tokenizer instance for processing CSS tokens one by one, providing streaming access to tokens.

/**
 * Create a tokenizer for a CSS string.
 * @param input - Object containing CSS string and options
 * @param input.css - CSS string or object with valueOf() method
 * @param input.unicodeRangesAllowed - Whether to allow unicode range tokens (default: false)
 * @param options - Optional configuration
 * @param options.onParseError - Callback for parse errors
 * @returns Tokenizer instance with nextToken and endOfFile methods
 */
function tokenizer(
  input: {
    css: { valueOf(): string },
    unicodeRangesAllowed?: boolean,
  },
  options?: {
    onParseError?: (error: ParseError) => void
  },
): { nextToken: () => CSSToken, endOfFile: () => boolean };

Usage Examples:
import { tokenizer } from "@csstools/css-tokenizer";

// Basic streaming tokenization
const t = tokenizer({
  css: ".foo { color: red; }"
});

const tokens = [];
while (!t.endOfFile()) {
  tokens.push(t.nextToken());
}
// Get the final EOF token
tokens.push(t.nextToken());

// Process tokens as they're generated
const processor = tokenizer({
  css: "body { margin: 0; padding: 10px; }"
});

while (!processor.endOfFile()) {
  const token = processor.nextToken();
  if (token[0] === 'ident-token') {
    console.log('Found identifier:', token[4].value);
  } else if (token[0] === 'dimension-token') {
    console.log('Found dimension:', token[4].value, token[4].unit);
  }
}

The tokenizer function returns an object with two methods:
interface TokenizerResult {
  /** Get the next token from the CSS string */
  nextToken(): CSSToken;
  /** Check if all tokens have been consumed */
  endOfFile(): boolean;
}

Returns the next token from the CSS string. Always returns a valid token, including an EOF token when the end of the input is reached.
/**
 * Get the next token from the CSS string
 * @returns The next CSS token
 */
nextToken(): CSSToken;
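Because nextToken() never returns nothing, a consuming loop can also terminate on the EOF token itself instead of polling endOfFile(). A minimal sketch, assuming the EOF token's type string is 'EOF-token' (consistent with the 'ident-token' / 'dimension-token' names above):

import { tokenizer } from "@csstools/css-tokenizer";

const t = tokenizer({
  css: "a { color: blue; }"
});

const tokens = [];
while (true) {
  const token = t.nextToken();
  tokens.push(token);
  // Stop once the EOF token appears (assumed type string: 'EOF-token').
  if (token[0] === 'EOF-token') {
    break;
  }
}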
Checks if the tokenizer has reached the end of the CSS string.

/**
 * Check if all tokens have been consumed
 * @returns true if at end of file, false otherwise
 */
endOfFile(): boolean;

Both functions accept the same input configuration:
interface TokenizerInput {
  /** CSS string or object with valueOf() method returning CSS string */
  css: { valueOf(): string };
  /** Whether unicode range tokens are allowed (default: false) */
  unicodeRangesAllowed?: boolean;
}
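A plain string satisfies the css field's { valueOf(): string } shape, but any object whose valueOf() returns the CSS text also works. A minimal sketch; the wrapper object here is purely illustrative:

import { tokenize } from "@csstools/css-tokenizer";

// Illustrative wrapper: any object with a valueOf() returning the CSS source.
const cssSource = {
  valueOf() {
    return ".bar { display: flex; }";
  }
};

const tokens = tokenize({ css: cssSource });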
interface TokenizerOptions {
  /** Callback function called when parse errors occur */
  onParseError?: (error: ParseError) => void;
}

Both tokenization functions support error handling through the onParseError callback:
import { tokenize, ParseError } from "@csstools/css-tokenizer";

const tokens = tokenize({
  css: "invalid CSS with 'unclosed string"
}, {
  onParseError: (error: ParseError) => {
    console.error(`Parse error at position ${error.sourceStart}-${error.sourceEnd}`);
    console.error('Parser state:', error.parserState);
  }
});

The tokenizer will continue processing after errors, generating appropriate error tokens (like bad-string-token or bad-url-token) as defined by the CSS specification.
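For example, an unescaped newline inside a quoted string is a parse error under the CSS Syntax specification and surfaces as a bad-string-token while tokenization continues. A minimal sketch, assuming the type string 'bad-string-token' (matching the token names above):

import { tokenize } from "@csstools/css-tokenizer";

const tokens = tokenize({
  css: ".foo { content: 'broken\nstring'; }"
}, {
  onParseError: (error) => {
    console.warn('Recovered from parse error:', error.message);
  }
});

// Tokenization continued past the error; the broken string appears as a bad-string-token.
const badStrings = tokens.filter((token) => token[0] === 'bad-string-token');
console.log(badStrings.length); // expected: at least 1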