Databricks Lakebase Postgres: projects, scaling, connectivity, Lakebase synced tables, and Data API. Use when asked about Lakebase databases, OLTP storage, or connecting apps to Postgres on Databricks.
Overall score: 89
Does it follow best practices? 87% — Passed, no known issues
Impact: — (no eval scenarios have been run)
Quality

Discovery — 89%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.
This is a solid skill description with excellent trigger terms and completeness, clearly identifying both what the skill covers and when to use it. The main weakness is that the 'what' portion lists topic areas rather than concrete actions, so it reads more like a table of contents than a capability list. Adding action verbs would strengthen the specificity dimension.
Suggestions
Replace topic labels with concrete actions, e.g., 'Creates and manages Lakebase Postgres projects, configures scaling and connectivity, sets up synced tables, and queries via the Data API.'
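As an illustration only, the rewritten description could sit in the skill's YAML frontmatter like this (the `name` value and field layout are assumptions for the sketch, not taken from the skill itself):

```yaml
name: databricks-lakebase-postgres
description: >
  Creates and manages Lakebase Postgres projects, configures scaling and
  connectivity, sets up synced tables, and queries via the Data API.
  Use when asked about Lakebase databases, OLTP storage, or connecting
  apps to Postgres on Databricks.
```

The action verbs up front give an agent concrete capabilities to match against, while the second sentence preserves the existing trigger terms.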
| Dimension | Reasoning | Score |
|---|---|---|
| Specificity | The description names the domain (Databricks Lakebase Postgres) and lists several topic areas (projects, scaling, connectivity, synced tables, Data API), but these are more like category labels than concrete actions. It doesn't use action verbs like 'configure', 'create', or 'troubleshoot'. | 2 / 3 |
| Completeness | Clearly answers both 'what' (Databricks Lakebase Postgres covering projects, scaling, connectivity, synced tables, Data API) and 'when' with an explicit 'Use when asked about Lakebase databases, OLTP storage, or connecting apps to Postgres on Databricks.' | 3 / 3 |
| Trigger Term Quality | Includes strong natural keywords users would say: 'Lakebase', 'Postgres', 'Databricks', 'OLTP', 'synced tables', 'Data API', 'connecting apps'. These cover multiple variations of how a user might phrase questions about this technology. | 3 / 3 |
| Distinctiveness / Conflict Risk | Highly distinctive with a clear niche around Databricks Lakebase Postgres specifically. The combination of 'Lakebase', 'Postgres on Databricks', and 'OLTP' creates a very specific trigger profile unlikely to conflict with general database or Databricks skills. | 3 / 3 |
| Total | | 11 / 12 — Passed |
Implementation — 85%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.
This is a strong, well-structured skill that provides highly actionable CLI commands and workflows for Lakebase Postgres management. Progressive disclosure is excellent, with clear references to supporting files, and workflow clarity is strong, with validation steps and explicit warnings for destructive operations. The main area for improvement is conciseness: some sections (the capabilities list, the comparison table, and the troubleshooting table) could be trimmed to reduce token usage without losing essential information.
| Dimension | Reasoning | Score |
|---|---|---|
| Conciseness | The skill is generally efficient and avoids explaining basic concepts Claude already knows (e.g., what Postgres is), but it's quite long (~300 lines) with some sections that could be tightened: the capabilities bullet list, the provisioned vs autoscaling comparison table, and the troubleshooting table are comprehensive but border on verbose. The resource hierarchy explanation is useful but slightly over-documented for Claude. | 2 / 3 |
| Actionability | Excellent actionability throughout: concrete CLI commands with exact syntax, copy-paste ready bash scripts, specific JSON payloads, SQL grant statements, and a scriptable single-command version for connecting to Postgres. The 'CLI Discovery' section correctly instructs to discover commands dynamically rather than guessing. | 3 / 3 |
| Workflow Clarity | Multi-step workflows are clearly sequenced with explicit validation checkpoints. The project creation flow includes verification steps, the app deployment workflow has a clear 'deploy first' mandate with error recovery guidance for the common mistake of running locally first, and the SQL connection workflow is numbered with clear dependencies between steps. Destructive operations (delete project, drop schema) have explicit warnings and permission checks. | 3 / 3 |
| Progressive Disclosure | The skill provides a clear overview with well-signaled one-level-deep references to three specific reference files (computes-and-scaling.md, connectivity.md, synced-tables.md) for detailed topics. It also appropriately delegates to the databricks-core and databricks-apps skills. The main content covers the essential workflows inline while pointing to references for deeper dives. | 3 / 3 |
| Total | | 11 / 12 — Passed |
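The connection workflow credited above can be sketched in application code. This is a minimal illustration, not the skill's actual script: it assumes a libpq-style DSN with TLS required and a short-lived OAuth token supplied as the password, and all hostnames and credentials below are placeholders.

```python
def lakebase_dsn(host: str, database: str, user: str, token: str) -> str:
    """Build a libpq-style DSN for a Lakebase Postgres endpoint.

    Assumptions (not verified against the skill): the endpoint requires
    TLS (sslmode=require) and accepts a short-lived OAuth token as the
    password, so the DSN should be rebuilt whenever the token rotates.
    """
    return (
        f"host={host} port=5432 dbname={database} "
        f"user={user} password={token} sslmode=require"
    )


# Placeholder values; a real app would fetch the token at runtime
# rather than hard-coding it.
dsn = lakebase_dsn(
    "instance.example.cloud.databricks.com",
    "databricks_postgres",
    "app_user",
    "<oauth-token>",
)
print(dsn)
```

Because the token is short-lived, keeping DSN construction in one helper makes rotation a single-call concern for the app.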
Validation — 90%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.

Validation for skill structure — 10 / 11 Passed
| Criteria | Description | Result |
|---|---|---|
| frontmatter_unknown_keys | Unknown frontmatter key(s) found; consider removing or moving to metadata | Warning |
| Total | | 10 / 11 Passed |
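The one warning above usually means a top-level frontmatter key the spec does not recognize. A sketch of the fix, assuming the spec allows a free-form `metadata` map (the `license` key is purely illustrative; the skill's actual offending key is not named in this report):

```yaml
# Before: an unrecognized top-level key (e.g. `license`) triggers
# frontmatter_unknown_keys.
# After: move it under `metadata`, which accepts arbitrary keys.
name: databricks-lakebase-postgres
description: ...
metadata:
  license: MIT
```

Clearing the warning would bring validation to 11 / 11.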
Version: f1c9cf7