Recognise bots/crawlers/spiders using the user agent string.
Build a detector that customizes the dependency's bot pattern list so teams can toggle known crawlers on or off while adding their own identifiers. The detector should expose the final pattern list and simple helpers for boolean checks and match inspection.
- Building a detector with remove: ["googlebot"] returns false for "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" while still returning true for "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)", and returns true for "PartnerCrawler/1.0" when "partnercrawler" is provided in add. @test
- Passing null or undefined user agents to the detector returns false without throwing. @test
- patterns() after building a detector with remove: ["googlebot"] and add: ["partnercrawler"] returns an array that omits any base entries containing "googlebot" (case-insensitive) and appends "partnercrawler" as the last entry. @test
- For "PartnerCrawler/1.0", match() returns the pattern string that triggers detection ("partnercrawler"). For "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36", match() returns null. @test

@generates
```ts
export interface DetectorConfig {
  add?: string[];
  remove?: string[];
}

export interface BotDetector {
  detect(userAgent?: string | null): boolean;
  match(userAgent?: string | null): string | null;
  patterns(): string[];
}

export function buildBotDetector(config?: DetectorConfig): BotDetector;
```

Provides the base bot pattern list and detection helpers used to build custom detectors.
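A minimal sketch of how a detector satisfying this API could work. The `BASE_PATTERNS` list below is a tiny hypothetical stand-in for the dependency's full pattern list, and the case-insensitive substring matching is an assumption for illustration, not the dependency's actual implementation.

```ts
// Hypothetical stand-in for the dependency's base bot pattern list.
const BASE_PATTERNS: string[] = ["googlebot", "bingbot", "yandexbot"];

export interface DetectorConfig {
  add?: string[];
  remove?: string[];
}

export interface BotDetector {
  detect(userAgent?: string | null): boolean;
  match(userAgent?: string | null): string | null;
  patterns(): string[];
}

export function buildBotDetector(config: DetectorConfig = {}): BotDetector {
  const removed = (config.remove ?? []).map((p) => p.toLowerCase());

  // Drop base entries containing any removed substring (case-insensitive),
  // then append the custom additions in order, so they end up last.
  const finalPatterns = BASE_PATTERNS.filter(
    (p) => !removed.some((r) => p.toLowerCase().includes(r))
  ).concat(config.add ?? []);

  const match = (userAgent?: string | null): string | null => {
    if (!userAgent) return null; // null/undefined never throws
    const ua = userAgent.toLowerCase();
    return finalPatterns.find((p) => ua.includes(p.toLowerCase())) ?? null;
  };

  return {
    detect: (userAgent) => match(userAgent) !== null,
    match,
    patterns: () => [...finalPatterns], // copy: callers cannot mutate internals
  };
}
```

Usage mirrors the tests above: `buildBotDetector({ remove: ["googlebot"], add: ["partnercrawler"] })` yields a detector whose `patterns()` omits the "googlebot" entry and ends with "partnercrawler".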
@satisfied-by
Install with Tessl CLI
npx tessl i tessl/npm-isbot