Author and maintain DevHub templates published at `dev.databricks.com/templates`. A template is the public name for any of three internal entry kinds — atomic snippets, multi-step end-to-end walkthroughs, and full deployable example apps. Use when creating, updating, or reorganizing any template-tier content.
Use this skill to add or update DevHub templates with consistent structure, metadata, and writing quality. Treat each template as both an execution prompt for AI coding agents and a learning walkthrough for human developers.
Internally the catalog is built from three kinds of content that compose into each other:
- **recipe** — an atomic, single-outcome snippet under `content/recipes/<slug>/`.
- **cookbook** — an end-to-end walkthrough that composes multiple recipes with additional narrative.
- **example** — narrative markdown plus a full `examples/<slug>/template/` codebase. Bundles recipes and cookbook narrative around runnable app code.

So: recipes are the atoms, cookbooks compose recipes with additional context, and examples are cookbooks with shipped code.
To users, all three kinds are presented as a single thing: the template. The site at dev.databricks.com/templates, navigation, filters, copy-pasted prompts, llms.txt, and the /templates.md markdown index all say "template(s)" — never "guide", "recipe", or "cookbook". When you write user-facing or agent-facing copy (titles, descriptions, intros, prerequisites, references), say "template".
The internal kind names (recipe, cookbook, example) live only in code, file paths, and this skill — they never appear in shipped UI, markdown content, or generated indexes.
| Internal kind | Source location | Route at runtime | When to use |
|---|---|---|---|
| `recipe` | `content/recipes/<slug>/{content,prerequisites,deployment}.md` + entry in `recipes` | `/templates/<slug>` | One atomic outcome, copy-pasteable in a single agent prompt. |
| `cookbook` | Entry in `cookbooks` (composes recipes) + manual page `src/pages/templates/<slug>.tsx` | `/templates/<slug>` | End-to-end walkthrough composed from multiple recipes. |
| `example` | `content/examples/<slug>/content.md` + full app source under `examples/<slug>/template/` | `/templates/<slug>` | Full deployable codebase that bundles cookbooks/recipes plus runnable app. |
All three are registered in src/lib/recipes/recipes.ts, share a flat /templates/<slug> URL hierarchy, and must have globally unique slugs (the content-entries plugin asserts this at build time). Choose the kind that matches the shape of the work, not the customer-facing label.
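To make the registration concrete, here is a hedged sketch of what a recipe entry and the build-time uniqueness check might look like. The field names come from this skill's notes; the actual contracts live in `src/lib/recipes/recipes.ts`, so treat every shape below as an assumption.

```typescript
// Illustrative only: the real Recipe/Cookbook/Example contracts live in
// src/lib/recipes/recipes.ts. Field names are taken from this skill's notes.
type Recipe = {
  id: string; // globally unique slug, served at /templates/<id>
  name: string;
  description: string;
  tags: string[];
  services: string[];
  prerequisites?: string[]; // set only when strictly required
};

const recipes: Recipe[] = [
  {
    id: "spin-up-databricks-app", // hypothetical slug
    name: "Spin Up a Databricks App",
    description: "Deploy a minimal Databricks App from a template.",
    tags: ["apps"],
    services: ["apps"],
  },
];

// The content-entries plugin asserts global slug uniqueness at build time;
// conceptually it does something like this across all three kinds:
function assertUniqueSlugs(slugs: string[]): void {
  const seen = new Set<string>();
  for (const slug of slugs) {
    if (seen.has(slug)) throw new Error(`Duplicate template slug: ${slug}`);
    seen.add(slug);
  }
}

assertUniqueSlugs(recipes.map((r) => r.id));
```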
- One atomic outcome, copy-pasteable in a single agent prompt → author a **recipe**.
- End-to-end walkthrough composed from multiple recipes → author a **cookbook** composing existing recipes.
- Full deployable codebase (`template/` source tree, README runbook, optional pipelines and seed) → author an **example**.

## recipe

Create `content/recipes/<slug>/content.md`. Optionally add `prerequisites.md` and `deployment.md` siblings.

Structure `content.md` with:
- A single `## <Template Title>` H2 at the top.
- Numbered step headings (`### 1. ...`, `### 2. ...`).
- Commands in `bash` blocks.
- `<PROFILE>` and `<workspace-url>` placeholders for user-specific values.
- A closing `#### References` containing only high-signal links.

Register the recipe in `src/lib/recipes/recipes.ts`:
- Add an entry to `recipes` with `id`, `name`, `description`, `tags`, `services`.
- Set `prerequisites` only when strictly required.
- Add the slug to `recipesInOrder`.

## `prerequisites.md` — Never in `content.md`

Prerequisites have a single home: `content/<recipes|examples>/<slug>/prerequisites.md`. The route plugin renders that file under a `## Prerequisites` H2 above the content body, and the agent-prompt composer attaches it ahead of the steps. Putting prerequisite text inside `content.md` duplicates the section, breaks step numbering when the duplicate is removed later, and produces noisy "Copy as Markdown" payloads.
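The split matters because prerequisites are composed back in at serve time. A minimal sketch of that composition, with hypothetical function and parameter names (the real logic lives in the route plugin and agent-prompt composer):

```typescript
// Illustrative sketch: prerequisites.md is attached above the content body
// under a "## Prerequisites" H2; content.md itself never repeats it.
// Function and parameter names here are hypothetical.
function composeTemplatePayload(opts: {
  prerequisites?: string; // contents of prerequisites.md, if the file exists
  body: string; // contents of content.md, steps starting at "### 1."
}): string {
  const parts: string[] = [];
  if (opts.prerequisites) {
    // The composer owns this heading; a duplicate inside content.md
    // would therefore render twice.
    parts.push("## Prerequisites", opts.prerequisites.trim());
  }
  parts.push(opts.body.trim());
  return parts.join("\n\n");
}
```

Because the composer owns the `## Prerequisites` heading, a duplicate heading in `content.md` renders twice, which is exactly what the rules below ban.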
Rules:
- Never add `### Prerequisites`, `:::info[Prerequisites]`, `### N. Follow the prerequisite templates first`, or any equivalent block to `content.md`. Move that content into `prerequisites.md` instead.
- `content.md` step numbering starts at `### 1.` with the first real action — never with a "do these other templates first" preamble.
- Link each prerequisite template from `prerequisites.md` via `/templates/<slug>` (relative — see Link Style).
- Express checks in `prerequisites.md` as `databricks` CLI commands the user can run to verify each capability — not as prose in `content.md`.

## cookbook

Register the cookbook in `src/lib/recipes/recipes.ts`:
- Add an entry to `cookbooks` with `id`, `name`, `description`, `recipeIds`.
- Use `createCookbook()` to derive `tags` and `services`.

Then create a manual page at `src/pages/templates/<slug>.tsx` following the existing pattern:
- Import `CookbookDetail`, `cookbooks`, `useAllRecipeSections`, `useCookbookIntro`, and `composeCookbookMarkdown`.
- Load each recipe body from `@site/content/recipes/<slug>/content.md`.
- Look up the cookbook via `cookbooks.find((c) => c.id === "<slug>")`.
- Compose `rawMarkdown` from the `cookbook.recipeIds` bodies joined with `\n\n---\n\n`.
- Render recipe sections in `recipeIds` order, separated with `<hr />`.

## example

An example is a full working codebase plus narrative markdown. It bundles cookbooks/recipes as included resources and ships deployable code.
Create examples/<slug>/ with:
```
examples/<slug>/
  template/              # full runnable tree (AppKit app + optional pipelines/seed/provisioning)
    README.md            # canonical provisioning, SQL, seed, and deploy runbook
    databricks.yml       # bundle config with REPLACE_ME placeholders
    app.yaml             # runtime env from bundle resources
    package.json         # app dependencies
    appkit.plugins.json  # plugin manifest
    server/              # Express backend
    client/              # React frontend
    config/queries/      # SQL query files
    provisioning/sql/    # baseline SQL (Unity Catalog, Postgres, etc.)
    pipelines/           # Lakeflow pipelines (optional)
      <pipeline-name>/
        databricks.yml
        resources/*.yml
        src/**/*.sql or *.py
    seed/                # seed script for demo data (optional)
      seed.ts
      package.json
```

Key conventions:
- The app lives in `template/` (not `app/`) so `databricks apps init --template` works.
- All deployable assets (`pipelines/`, `seed/`, `provisioning/sql/`) live under `template/`. Do not leave `pipelines/` or `seed/` at the example root — `template/README.md` must describe the full path from zero to deployed app.
- Use `REPLACE_ME` placeholders for workspace-specific values (host, warehouse ID, catalog name, Lakebase project, etc.).
- Never commit `.databricks/`, `node_modules/`, or `.env`.
- Write pipeline SQL with schema-qualified names (e.g. `silver.users`); rely on the pipeline YAML catalog setting for catalog resolution.
- Add an `.npmrc` pointing to `https://npm-proxy.dev.databricks.com/` if the app uses `@databricks/appkit`.
- Queries in `config/queries/` run against the Databricks SQL Warehouse (Spark SQL dialect), NOT Lakebase Postgres. Use `CURRENT_DATE()` not `NOW()`, `DATE_ADD(d, n)` not `d + INTERVAL`, `SUM(CASE WHEN ... THEN 1 ELSE 0 END)` not `COUNT(*) FILTER (WHERE ...)`. Reference Unity Catalog three-part names (e.g., `catalog.schema.table`).

### `template/README.md` (canonical runbook)

This file is the single source of truth for operators and coding agents. The example detail page on DevHub points users here via clone + `cd` into `template/`; it must be complete enough to deploy without guessing.
Include, as appropriate:
- An architecture overview of `client/`, `server/`, `pipelines/`, `seed/`, `provisioning/sql/`.
- Provisioning steps that point at `template/provisioning/sql/`.
- Pipeline deploys: which directory to `cd` into, `databricks bundle deploy` targets, dependencies between pipelines and app.
- Seeding from `template/seed/` (`cd seed`, `npm install`, `DATABASE_URL=... npm run seed`). Note Postgres prerequisites (e.g. `REPLICA IDENTITY FULL`).
- App deploy from `template/`: install, build, `databricks bundle deploy`. Link pipeline deploys before/after as required.
- `databricks apps init --template https://github.com/databricks/devhub/tree/main/examples/<slug>` for users who scaffold instead of cloning.

Do not maintain a separate long-form `provisioning/README.md` next to the SQL — duplicate instructions drift. Keep narrative in `template/README.md` only.
For examples with no Unity Catalog DDL, still add template/provisioning/sql/ with a comment-only file (e.g. 00_no_unity_catalog_ddl.sql) so every example has a predictable place for SQL.
Create content/examples/<slug>/content.md:
- Open with `## <Template Title>`.
- Keep it narrative and defer operational detail to `template/README.md` — do not duplicate the runbook in this markdown.

Update `src/lib/recipes/recipes.ts`:
- Add an entry to `examples` using `createExample()`.
- Set `id`, `name`, `description`, `githubPath`, `initCommand`, `templateIds`, `recipeIds`.
- `templateIds` references cookbooks the example builds upon.
- `recipeIds` references standalone recipes not already pulled in via a cookbook.
- `createExample()` derives `tags` and `services`.
- `initCommand` format: `git clone --depth 1 https://github.com/databricks/devhub.git` then `cd devhub/examples/<slug>/template`. Optional CLI scaffold: `databricks apps init --template https://github.com/databricks/devhub/tree/main/examples/<slug>`.
- `githubPath` is `examples/<slug>`.

## Images

Images are optional. When omitted, the UI falls back to the generic card art. When you add them, they must conform to the DevHub image contracts that `npm run verify:images` enforces (the pre-commit hook runs this script).
Contract for template preview/gallery images:
- Files live at `static/img/examples/<slug>-<slot>-<theme>.png` (e.g. `saas-tracker-dashboard-light.png` and `saas-tracker-dashboard-dark.png`).
- Always ship both a light and a dark variant. The site picks the matching image based on the visitor's color mode.
- Capture the same screen twice at the same viewport, once with the app in light mode and once in dark mode, so the carousel slides align.
Schema fields (all optional, same contract for all three template kinds):
- `previewImageLightUrl` / `previewImageDarkUrl` — single theme-aware preview used on landing carousels, the `/templates` list card, and the detail hero when no `galleryImages` are set.
- `galleryImages?: Array<{ lightUrl: string; darkUrl: string }>` — themed slides for the example detail-page carousel.

Style the app in the Databricks brand palette before capturing screenshots. Theme tokens live in `src/css/custom.css`; reuse the same hex values in the example app's own CSS / Tailwind config:
| Token | Hex | Role |
|---|---|---|
| `--db-navy` | `#0b2026` | Primary dark surface (dark-mode page background, sidebars, headers) |
| `--db-navy-light` | `#1b3139` | Secondary dark surface (dark-mode cards, raised panels) |
| `--db-lava` | `#ff3621` | Primary brand orange (buttons, highlights, focus states, badges) |
| `--db-lava-dark` | `#eb1600` | Hover / pressed state for the primary orange |
| `--db-lava-light` | `#ff5542` | Primary orange in dark mode (keeps contrast against navy) |
| `--db-oat-medium` | `#eeede9` | Cream accent (secondary buttons, muted rows, light chips) |
| `--db-bg` | `#f9f7f4` | Light-mode page background (soft off-white) |
| `--db-card` | `#ffffff` | Light-mode cards / raised surfaces |
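Carried into a Tailwind setup, the palette above might look like the following sketch of `template/client/tailwind.config.ts`. The hex values mirror the table; the overall config shape and `content` glob are plain Tailwind conventions, not copied from the repo.

```typescript
// Sketch of template/client/tailwind.config.ts using the brand tokens above.
// Hex values mirror src/css/custom.css; the surrounding structure is a
// standard Tailwind convention, assumed rather than copied from the repo.
const config = {
  content: ["./src/**/*.{ts,tsx}"],
  darkMode: "class", // assumption: class-based dark mode toggling
  theme: {
    extend: {
      colors: {
        "db-navy": "#0b2026",
        "db-navy-light": "#1b3139",
        "db-lava": "#ff3621",
        "db-lava-dark": "#eb1600",
        "db-lava-light": "#ff5542",
        "db-oat-medium": "#eeede9",
        "db-bg": "#f9f7f4",
        "db-card": "#ffffff",
      },
    },
  },
};

export default config;
```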
Screenshot guidance:
- Light mode: `--db-bg` + `--db-card` surfaces, navy text, orange accents.
- Dark mode: `--db-navy` + `--db-navy-light` surfaces, `--db-lava-light` accents, near-white text. Avoid pure-black CSS defaults.
- Use orange (`--db-lava` / `--db-lava-light`) sparingly — primary CTAs, active state, single accents. Avoid saturating whole regions.
- Bake the palette into `template/client/tailwind.config.ts` so new examples are on-brand by default.

Run `npm run fmt && npm run typecheck && npm run build && npm run test` from the repo root. The content-entries plugin validates slug uniqueness across the whole catalog and generates routes automatically.
Two directories, two purposes. examples/ is committed source code with REPLACE_ME placeholders. ../../demos/<slug>/ (outside the repo) is the scratch workspace for installing, configuring, and deploying.
NEVER npm install, deploy, or write workspace-specific values inside examples/. ALWAYS work from the demos folder outside the repo.
NEVER reuse existing workspace resources (Lakebase projects, Genie spaces, apps, UC catalogs) unless the developer explicitly says to. Always create fresh resources for the dry run to avoid corrupting or overwriting existing data.
The demos folder must be outside the git repo because databricks bundle deploy respects .gitignore and will skip files in gitignored directories.
```bash
# 1. Copy the template tree into demos (outside the repo)
mkdir -p ../../demos/<slug>
cp -r examples/<slug>/template/* ../../demos/<slug>/

# 2. Fill in workspace-specific values
# Edit ../../demos/<slug>/databricks.yml — replace REPLACE_ME with real IDs

# 3. Install and build
cd ../../demos/<slug>
npm install
npm run build

# 4. Create required Databricks resources (use databricks-core and databricks-lakebase skills)
# - Lakebase project, branch, database
# - SQL Warehouse (or use default)
# - Genie Space (if used)
# - Unity Catalog table (if analytics queries need warehouse-accessible data)

# 5. Seed data (if the example has a seed script)
cd <devhub-repo>/examples/<slug>/template/seed
npm install
DATABASE_URL="postgresql://..." npm run seed

# 6. Deploy from demos
cd ../../demos/<slug>
databricks apps deploy --profile <PROFILE>

# 7. Get the app URL
databricks apps get <app-name> --profile <PROFILE>
```

When something fails during the dry run:

- Fix code in `examples/<slug>/` in the repo, then re-copy to `../../demos/<slug>/` and retry.
- Fix docs in `content/examples/<slug>/content.md` or the relevant recipe under `content/recipes/`.
- Fix seed issues in `examples/<slug>/template/seed/seed.ts`.

After verifying the deployed app works, delete `../../demos/<slug>/`. Optionally tear down test resources if they were created just for testing.
## solution

Solutions live at `dev.databricks.com/solutions/<slug>` and are launch posts, deep-dive write-ups, or curated perspectives on the Databricks developer stack. They sit alongside the linked Databricks Blog posts that the registry hand-picks.
A native (DevHub-authored) solution has two pieces:
- A registry entry in `src/lib/solutions/solutions.ts` with `id`, `title`, `description`, `tags`, `authors`, and `publishedAt`. This is the single source of truth for the page title, summary, byline, and date — every render path (detail page, served `.md`, frontmatter for MCP consumers) reads from here.
- A markdown body at `content/solutions/<slug>.md` that contains only the article body.

The detail page (`src/components/solutions/solution-detail.tsx`) renders `solution.title` as the page H1 with the registry description below it, then renders the markdown body underneath. To keep the rendered page from showing two stacked titles and to keep the registry as the single source of truth:
- Never open `content/solutions/<slug>.md` with a `#` H1 heading. The first line of the file must be the opening paragraph (lede).
- Never use a setext H1 (`===` underline). Same reason.
- Section headings start at `##` and may go deeper (`###`, `####`).
- Never embed frontmatter in the source file. `prependSolutionFrontmatter` (in `api/content-markdown.ts`) builds the served frontmatter entirely from the registry whenever the markdown is fetched as `.md` or via the docs MCP server. Any frontmatter in the source file is stripped before serving, so embedding it just creates drift.

These rules are enforced mechanically by `scripts/validate-content.mjs`, which fails the pre-commit hook if any solution markdown contains a `#` ATX heading or a setext H1 underline.
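A sketch of that rule as code. The authoritative implementation is `scripts/validate-content.mjs`; this regex version only illustrates the two banned forms.

```typescript
// Illustrative check for the solution-markdown H1 rule: flag an ATX H1
// ("# Title") or a setext H1 underline ("===" directly under a text line).
// The real validator lives in scripts/validate-content.mjs.
function violatesH1Rule(markdown: string): boolean {
  const lines = markdown.split("\n");
  for (let i = 0; i < lines.length; i++) {
    // ATX H1: "#" followed by whitespace ("##" and deeper are allowed).
    if (/^#\s/.test(lines[i])) return true;
    // Setext H1: a run of "=" directly under a non-blank line.
    if (/^=+\s*$/.test(lines[i]) && i > 0 && lines[i - 1].trim() !== "") {
      return true;
    }
  }
  return false;
}
```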
1. Add the registry entry in `src/lib/solutions/solutions.ts` with `type: "native"`, the canonical title / description, tags, authors (IDs from `src/lib/solutions/authors.ts`), and an ISO `publishedAt` (`YYYY-MM-DD`).
2. Write `content/solutions/<slug>.md`. Open with the lede paragraph (no heading), then organize the rest with `##` and deeper headings.
3. Run `npm run validate:content && npm run typecheck && npm run build` to confirm the registry, slug, and H1 rule all pass before committing.

All templates share a flat URL hierarchy:
| Internal kind | Public URL | Route source |
|---|---|---|
| `recipe` | `/templates/<slug>` | Generated by content-entries plugin from `content/recipes/<slug>/content.md` |
| `cookbook` | `/templates/<slug>` | Manual page in `src/pages/templates/<slug>.tsx` |
| `example` | `/templates/<slug>` | Generated by content-entries plugin from `content/examples/<slug>/content.md` |
Slugs must be globally unique. The plugin throws at build time if any collision exists.
Never hand-edit generated indexes such as `llms.txt` or `/templates.md`.

## Link Style

When linking to another DevHub page (`/templates/...`, `/docs/...`, `/solutions/...`) from any markdown content (`content/**/*.md`, `docs/**/*.md`, intent files, dev-guidelines, about), use a root-relative path. Never hardcode `https://dev.databricks.com/<path>` inside markdown link or autolink syntax.
Do:

- `[Spin Up a Databricks App](/templates/spin-up-databricks-app)`
- `</llms.txt>` and `[ref]: /docs/start-here`

Don't:

- `[Spin Up a Databricks App](https://dev.databricks.com/templates/spin-up-databricks-app)`
- `<https://dev.databricks.com/llms.txt>` and `[ref]: https://dev.databricks.com/docs/start-here`

`absolutizeMarkdown` in `src/lib/copy-preamble.ts` rewrites every root-relative link to the caller's origin when a page or markdown payload is served (Vercel functions, MCP server, in-browser "Copy as Markdown"), so relative links work transparently in `localhost:3001`, preview deployments, and production. Hardcoding the canonical origin makes in-site navigation a full reload and sends local-dev visitors to prod.
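A minimal sketch of that rewrite for the inline-link case, with a hypothetical function name (the real `absolutizeMarkdown` also handles autolinks and reference definitions):

```typescript
// Illustrative: rewrite root-relative inline markdown links, "](/path)",
// to the caller's origin. The production implementation additionally covers
// autolinks (</llms.txt>) and reference definitions ([ref]: /docs/start-here).
function absolutizeInlineLinks(markdown: string, origin: string): string {
  return markdown.replace(
    /\]\((\/[^)\s]*)\)/g,
    (_match, path: string) => `](${origin}${path})`,
  );
}
```

External absolute links are untouched because the pattern only matches targets that start with `/`.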
scripts/validate-content.mjs enforces this rule and fails the build on https://dev.databricks.com/(templates|docs|solutions)/... references inside markdown link, autolink, or reference-definition syntax.
Allowed exceptions (the validator skips these):
- Plain-text URL mentions outside link syntax (e.g. "Website: https://dev.databricks.com", "fetch the index from https://dev.databricks.com/llms.txt"). `rewriteOrigin` substitutes the canonical origin at copy time, so these still resolve correctly.
- MCP install commands (`npx add-mcp https://dev.databricks.com/api/mcp` — the install command must be canonical).
- External links (`github.com/...`, `docs.databricks.com/...`, etc.) — always absolute.

Before committing, verify:

- `recipeIds` order matches the rendered JSX order.
- No `content.md` file contains a Prerequisites heading, admonition, or "follow these templates first" step — that content lives only in `prerequisites.md`.
- `npm run validate:content && npm run fmt && npm run typecheck && npm run build && npm run test` all pass.
- `examples/<slug>/` contains only `REPLACE_ME` placeholders — no real workspace hosts, warehouse IDs, Lakebase project names, or Genie space IDs.
- `examples/<slug>/template/README.md` covers provisioning (manual vs SQL), seeding, pipeline deploys, and app deploy end-to-end.

Key references:

- `content/recipes/set-up-your-local-dev-environment/content.md` for atomic-template structure and tone.
- `content/examples/agentic-support-console/content.md` for example-template markdown style.
- `src/lib/recipes/recipes.ts` for all type contracts (`Recipe`, `Cookbook`, `Example`).
- `src/pages/templates/app-with-lakebase.tsx` for the cookbook composition pattern.
- `src/components/examples/example-detail.tsx` for example detail rendering.
- `examples/agentic-support-console/template/README.md` for a full example runbook (provisioning, SQL, seed, pipelines, deploy).
- `plugins/content-entries.ts` for slug parity and uniqueness validation.