CtrlK
BlogDocsLog inGet started
Tessl Logo

databricks-pipelines

Develop Lakeflow Spark Declarative Pipelines (formerly Delta Live Tables) on Databricks. Use when building batch or streaming data pipelines with Python or SQL. Invoke BEFORE starting implementation.

89

Quality

87%

Does it follow best practices?

Impact

No eval scenarios have been run

SecuritybySnyk

Advisory

Suggest reviewing before use

SKILL.md
Quality
Evals
Security

Security

2 findings — 2 medium severity. This skill can be installed but you should review these findings before use.

Medium

W011: Third-party content exposure detected (indirect prompt injection risk)

What this means

The skill exposes the agent to untrusted, user-generated content from public third-party sources, creating a risk of indirect prompt injection. This includes browsing arbitrary URLs, reading social media posts or forum comments, and analyzing content from unknown websites.

Why it was flagged

Third-party content exposure detected (high risk: 0.80). The skill’s workflow and API docs explicitly require reading from external/untrusted sources (e.g., Auto Loader/read_files "s3://bucket/path" in references/auto-loader(-sql).md, read_kafka/read_pubsub/read_pulsar in references/sql-basics.md, and streaming reads like spark.readStream.table(...) in SKILL.md), so the agent is expected to ingest and act on arbitrary third‑party/user-provided data as part of pipeline flows.

Report incorrect finding
Medium

W012: Unverifiable external dependency detected (runtime URL that controls agent)

What this means

The skill fetches instructions or code from an external URL at runtime, and the fetched content directly controls the agent’s prompts or executes code. This dynamic dependency allows the external source to modify the agent’s behavior without any changes to the skill itself.

Why it was flagged

Potentially malicious external URL detected (high risk: 1.00). The skill includes a prerequisite install command that pipes remote code to a shell (curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh) which would fetch and execute remote code and is presented as a required dependency (Databricks CLI) for using the skill.

Repository
databricks/databricks-agent-skills
Audited
Security analysis
Snyk

Is this your skill?

If you maintain this skill, you can claim it as your own. Once claimed, you can manage eval scenarios, bundle related skills, attach documentation or rules, and ensure cross-agent compatibility.