Recognise bots, crawlers, and spiders using the user-agent string.
Create a small utility that inspects user-agent strings with the dependency's bot-detection patterns and returns match details for debugging and analytics. Match and pattern data should come directly from the dependency's detection output, not hardcoded pattern fragments.
- Given a bot user agent such as `Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)`, return `isBot: true`, capture the first matched substring (e.g., `Googlebot`), list all matched substrings, and surface the first matching pattern plus all matching patterns from the dependency. @test
- Given a non-bot user agent, return `isBot: false` with empty match information (no matched substring, empty match/pattern arrays). @test
- Given a mixed list of user agents (e.g., `Googlebot`, `""`, `null`, `Chrome`, `Bingbot`), return an ordered array of analyses where bot entries include match and pattern details, and non-bot or missing entries are marked `isBot: false` with empty match data. @test

@generates
```ts
export interface BotAnalysis {
  userAgent: string | null | undefined;
  isBot: boolean;
  matchedSubstring?: string;
  matchedSubstrings: string[];
  matchedPattern?: string;
  matchedPatterns: string[];
}

export function analyzeUserAgent(userAgent: string | null | undefined): BotAnalysis;

export function summarizeAgents(userAgents: Array<string | null | undefined>): BotAnalysis[];
```

Detects bots and exposes bot pattern data for inspection.
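A minimal sketch of how these two functions could behave. It substitutes a tiny illustrative pattern list for the dependency's real detection data (the `BOT_PATTERNS` array below is a placeholder, not the actual isbot pattern set); a real implementation would take its match and pattern details from the dependency's output instead.

```typescript
interface BotAnalysis {
  userAgent: string | null | undefined;
  isBot: boolean;
  matchedSubstring?: string;
  matchedSubstrings: string[];
  matchedPattern?: string;
  matchedPatterns: string[];
}

// Illustrative stand-in for the dependency's pattern list.
const BOT_PATTERNS = ["googlebot", "bingbot", "crawler", "spider"];

function analyzeUserAgent(userAgent: string | null | undefined): BotAnalysis {
  const matchedSubstrings: string[] = [];
  const matchedPatterns: string[] = [];
  if (userAgent) {
    for (const pattern of BOT_PATTERNS) {
      // Record both the matched substring and the pattern that produced it.
      const match = userAgent.match(new RegExp(pattern, "i"));
      if (match) {
        matchedSubstrings.push(match[0]);
        matchedPatterns.push(pattern);
      }
    }
  }
  return {
    userAgent,
    isBot: matchedPatterns.length > 0,
    matchedSubstring: matchedSubstrings[0],
    matchedSubstrings,
    matchedPattern: matchedPatterns[0],
    matchedPatterns,
  };
}

function summarizeAgents(userAgents: Array<string | null | undefined>): BotAnalysis[] {
  // Preserve input order; null/undefined entries fall through as non-bots.
  return userAgents.map(analyzeUserAgent);
}

const report = summarizeAgents(["Googlebot", null, "Chrome"]);
console.log(report[0].isBot, report[0].matchedSubstring); // true "Googlebot"
console.log(report[2].isBot, report[2].matchedSubstrings.length); // false 0
```

Non-bot and missing entries keep their empty arrays rather than `undefined`, so callers can iterate match data without null checks.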
Install with the Tessl CLI:

```sh
npx tessl i tessl/npm-isbot
```