Use this skill when the user asks to "set up parsing", "create parsing rule", "extract fields from logs", "regex extraction", "log parsing", "enrich logs", "add context to logs", "custom enrichment table", "lookup table", "geo enrichment", "create metric from logs", "events to metrics", "convert logs to metrics", "generate metrics from events", "recording rule", "precomputed metrics", "PromQL recording", "configure data pipeline", "transform log data", "data processing rules", "rule group", "enrichment settings", "E2M definition", "labels cardinality", "bulk delete rules", "enrichment limits", "search enrichment table", or wants to configure how Coralogix processes, enriches, or transforms ingested data.
Use this skill when configuring how Coralogix processes, enriches, and transforms data. It covers parsing rules (extract structured fields from raw logs), enrichments (add context from lookup tables), Events2Metrics (derive metrics from log/span events), and recording rules (precompute PromQL expressions).
| Command | Subcommands | Purpose |
|---|---|---|
| `cx parsing-rules` | list, get, create, update, delete, bulk-delete, usage-limits | Manage log parsing rules |
| `cx enrichments` | list, add, remove, overwrite, limit, settings | Manage enrichment rules |
| `cx enrichments custom` | list, get, create, update, delete, search | Manage custom enrichment tables |
| `cx e2m` | list, get, create, update, delete, labels-cardinality, limits | Manage Events2Metrics definitions |
| `cx recording-rules` | list, get, create, update, delete | Manage Prometheus recording rule groups |
Key flags:

- `--from-file <path>` (or `-` for stdin)
- `-o json` for structured output and `-p <profile>` for profile selection
- `cx parsing-rules update` and `cx recording-rules update` require both `--from-file` and the rule group ID
- `cx enrichments custom search` requires `--id <table-id>` and `--query <text>`
- `cx parsing-rules bulk-delete` requires `--ids <id1> <id2> ...`

These commands use complex JSON structures. Always template from an existing resource to avoid format errors:
```shell
# 1. Get an existing resource as a template
cx parsing-rules get <rule-group-id> -o json > template.json

# 2. Modify the template (change fields, remove the ID for create operations)

# 3. Create or update
cx parsing-rules create --from-file template.json
cx parsing-rules update --from-file template.json <rule-group-id>
```

This pattern applies to all create/update operations across all four commands. It prevents the payload format errors that are the most common cause of failed attempts.
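The edit in step 2 can often be scripted. A minimal sketch using `jq`, assuming the payload carries top-level `id` and `name` fields (the sample JSON below is hypothetical; real templates from `cx parsing-rules get` are larger):

```shell
# Hypothetical template standing in for real `cx parsing-rules get` output
cat > template.json <<'EOF'
{"id": "abc-123", "name": "checkout-parsing", "enabled": true}
EOF

# Drop the server-assigned id and rename the group before create
jq 'del(.id) | .name = "my-service-parsing"' template.json > new-rule-group.json
cat new-rule-group.json
```

The result can be passed via `--from-file new-rule-group.json`, or piped straight to stdin with `--from-file -`.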
## Parsing rules workflow

List existing rule groups and inspect their structure:

```shell
cx parsing-rules list -o json
cx parsing-rules list -o json | jq '[.[] | {id, name, enabled, rule_count: (.rules | length)}]'
```

Template from an existing rule group:

```shell
cx parsing-rules get <existing-rule-group-id> -o json > rule-template.json
```

Edit the template for your new service, then:

```shell
cx parsing-rules create --from-file rule-template.json
```

Query recent logs to confirm fields are extracted (load cx-telemetry-querying for log querying):
```shell
cx logs 'source logs | filter $d.subsystem == "my-service" | limit 10' -o json
```

Check capacity before adding more rule groups:

```shell
cx parsing-rules usage-limits -o json
```

## Enrichments workflow

Review current enrichment rules, settings, and limits:

```shell
cx enrichments list -o json
cx enrichments settings -o json
cx enrichments limit -o json
```

Manage custom enrichment tables and attach enrichment rules:

```shell
cx enrichments custom list -o json
cx enrichments custom create --from-file table-definition.json
cx enrichments add --from-file enrichment-rules.json
cx enrichments custom search --id <table-id> --query "search term"
```

Query logs on hot storage (FrequentSearch tier) to confirm enriched fields appear. Avoid querying the archive for verification; ingestion delays can cause false negatives.
```shell
cx logs 'source logs | filter $d.enriched_field != null | limit 5' -o json
```

## Events2Metrics workflow

Decide the metric name, labels, and aggregation type before creating. Check limits and label cardinality first:

```shell
cx e2m limits -o json
cx e2m labels-cardinality -o json
```

Template from an existing definition, then create:

```shell
cx e2m list -o json
cx e2m get <existing-e2m-id> -o json > e2m-template.json
cx e2m create --from-file e2m-definition.json
```

Confirm the new metric appears (load cx-telemetry-querying for metrics querying):
```shell
cx metrics search --name "new_metric_name"
```

## Recording rules workflow

List existing rule groups and template from one:

```shell
cx recording-rules list -o json
cx recording-rules list -o json | jq '[.[] | {id, name, rules: [.rules[]?.record]}]'
cx recording-rules get <existing-id> -o json > recording-rule-template.json
cx recording-rules create --from-file recording-rule-group.json
```

Confirm the precomputed metric is available (load cx-telemetry-querying for metrics querying):

```shell
cx metrics query "new_precomputed_metric" --time now
```

## Best practices

- Run `cx <command> get <id> -o json > template.json` before any create.
- Use `-o json` for all payload inspection and creation.
- Check `cx parsing-rules usage-limits` and `cx e2m limits` before creating to avoid hitting caps.
- Use `cx parsing-rules bulk-delete --ids` for cleanup, not individual deletes.
- Pair with `cx-telemetry-querying`: discover what data is available before configuring the pipeline, and verify parsing results and enriched fields via log/metrics queries.
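If you author `recording-rule-group.json` from scratch instead of templating, Coralogix recording rule groups follow the familiar Prometheus recording-rule shape. A hedged sketch of a minimal group (the field names and interval unit here are assumptions; prefer `cx recording-rules get` output as the authoritative template):

```json
{
  "name": "my-service-aggregations",
  "interval": 60,
  "rules": [
    {
      "record": "job:http_requests_total:rate5m",
      "expr": "sum by (job) (rate(http_requests_total[5m]))"
    }
  ]
}
```

Naming each `record` as `level:metric:operations`, per the Prometheus convention, keeps precomputed series easy to find later with `cx metrics search`.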