This skill should be used when the user asks to "create Dataverse tables", "set up the data model", "setup dataverse", "create tables for my site", "setup dataverse schema", "create the database", "build my data model", or wants to create Dataverse tables, columns, and relationships for their Power Pages site based on a data model proposal.
Guide the user through creating Dataverse tables, columns, and relationships for their Power Pages site. Follow a systematic approach: verify prerequisites, obtain a data model (via AI analysis or user-provided diagram), review and approve, then create all schema objects via OData API.
Initial request: $ARGUMENTS
## Phase 1: Verify Prerequisites

Goal: Confirm PAC CLI authentication, acquire an Azure CLI token, and verify API access
Actions:
Follow ${CLAUDE_PLUGIN_ROOT}/references/dataverse-prerequisites.md to verify PAC CLI auth, acquire an Azure CLI token, and confirm API access. Store the environment URL as $envUrl.

Output: Verified PAC CLI auth, valid Azure CLI token, confirmed API access, $envUrl stored
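As a condensed sketch, the prerequisite checks amount to something like the following (the reference file is authoritative; the environment URL shown is a placeholder):

```powershell
# Confirm PAC CLI is authenticated against the target environment
pac org who

# Environment URL taken from the PAC CLI output (placeholder value shown)
$envUrl = "https://org12345.crm.dynamics.com"

# Acquire a bearer token scoped to the environment
$token = az account get-access-token --resource "$envUrl" --query accessToken -o tsv

# WhoAmI is a cheap call that proves the token works against the Web API
Invoke-RestMethod -Uri "$envUrl/api/data/v9.2/WhoAmI" -Headers @{ Authorization = "Bearer $token" }
```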
## Phase 2: Choose Data Model Source

Goal: Determine whether the user will upload an existing ER diagram or let AI analyze the site
Actions:
Ask the user how they want to define the data model using the AskUserQuestion tool:
Question: "How would you like to define the data model for your site?"
| Option | Description |
|---|---|
| Upload an existing ER diagram | Provide an image (PNG/JPG) or Mermaid diagram of your existing data model |
| Let the Data Model Architect figure it out | The Data Model Architect will analyze your site's source code and propose a data model automatically |
Route to the appropriate path:
If the user chooses to upload an existing diagram:
Ask the user to provide their ER diagram. Supported formats: an image (PNG/JPG) or a Mermaid diagram in text form.

Use the Read tool to view the image and extract tables, columns, relationships, and cardinalities from it. Parse the diagram into the same structured format used by the data-model-architect agent:

- Publisher prefix (from `pac env who`)
- Tables: logicalName, displayName, status (new/modified/reused), columns, relationships
- Columns: logicalName, displayName, type, required

Query existing Dataverse tables (same as Phase 3 would) to mark each table as new, modified, or reused.
Generate a Mermaid ER diagram from the parsed data (if the user provided an image or text) for visual confirmation.
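For example, a minimal Mermaid ER diagram of the kind generated here (table, column, and prefix names are illustrative, not taken from any real proposal):

```mermaid
erDiagram
    cr123_project ||--o{ cr123_task : "has"
    cr123_project {
        string cr123_name PK
        string cr123_description
    }
    cr123_task {
        string cr123_name PK
        string cr123_projectid FK
    }
```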
Proceed directly to Phase 4: Review Proposal with the parsed data model.
If the user chooses to let the Data Model Architect figure it out, proceed to Phase 3: Invoke Data Model Architect (the existing automated flow).
Output: Data model source chosen and, for Path A, parsed data model ready for review
## Phase 3: Invoke Data Model Architect

Goal: Spawn the data-model-architect agent to autonomously analyze the site and propose a data model
Actions:
Use the Task tool to spawn the data-model-architect agent, which autonomously analyzes the project and Dataverse environment and proposes a data model.
Spawn the agent:
Task tool:

```yaml
subagent_type: general-purpose
prompt: |
  You are the data-model-architect agent. Follow the instructions in
  the agent definition file at:
  ${CLAUDE_PLUGIN_ROOT}/agents/data-model-architect.md
  Analyze the current project and Dataverse environment, then propose
  a complete data model. Return:
  1. Publisher prefix
  2. Table definitions (logicalName, displayName, status, columns, relationships)
  3. Mermaid ER diagram
```

Wait for the agent to return its structured proposal before proceeding.
Output: Structured data model proposal from the agent (publisher prefix, table definitions, ER diagram)
## Phase 4: Review Proposal

Goal: Present the data model proposal to the user and get explicit approval before creating anything
Actions:
Present the data model proposal directly to the user as a formatted message, including the publisher prefix, the table definitions (with columns and relationships), and the Mermaid ER diagram.
Use AskUserQuestion to get approval:
| Question | Header | Options |
|---|---|---|
| Does this data model look correct? | Data Model Proposal | Approve and create tables (Recommended), Request changes, Cancel |
Only proceed to creation after explicit user approval.
Output: User-approved data model proposal
## Phase 5: Pre-Creation Checks

Goal: Refresh the token, verify what already exists in Dataverse, and build the creation plan to avoid duplicates
Actions:
Re-acquire the Azure CLI token (tokens expire after ~60 minutes):

```powershell
$token = az account get-access-token --resource "$envUrl" --query accessToken -o tsv
```

For each table in the approved proposal marked as new, check whether it already exists:

```powershell
$headers = @{ Authorization = "Bearer $token"; Accept = "application/json" }
Invoke-RestMethod -Uri "$envUrl/api/data/v9.2/EntityDefinitions(LogicalName='<table_logical_name>')" -Headers $headers
```

For tables marked as modified, verify the table exists (it should) and check which columns are missing.
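The existence check returns HTTP 404 for a table that does not exist yet, so a sketch of how to fold it into the plan might look like this (the function name is illustrative, not part of this skill):

```powershell
function Test-TableExists {
    param([string]$LogicalName)
    try {
        Invoke-RestMethod -Uri "$envUrl/api/data/v9.2/EntityDefinitions(LogicalName='$LogicalName')" `
            -Headers $headers | Out-Null
        return $true    # table already exists -> mark as skip
    } catch {
        if ($_.Exception.Response.StatusCode.value__ -eq 404) {
            return $false   # not found -> safe to create
        }
        throw           # any other error (auth, network) should surface, not be swallowed
    }
}
```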
From the pre-creation checks, build a list of tables to create, tables to skip (already existing), columns to add to existing tables, and relationships to create.
Inform the user of any skipped items.
Output: Finalized creation plan with tables, columns, and relationships to create or skip
## Phase 6: Create Tables and Columns

Goal: Create each approved table and its columns using the Dataverse OData Web API
Actions:
Refer to references/odata-api-patterns.md for full JSON body templates.
For each new table, POST to the EntityDefinitions endpoint:
```powershell
$body = <JSON body from references/odata-api-patterns.md>
$headers = @{
    Authorization  = "Bearer $token"
    "Content-Type" = "application/json"
    Accept         = "application/json"
}
Invoke-RestMethod -Method Post -Uri "$envUrl/api/data/v9.2/EntityDefinitions" -Headers $headers -Body $body
```

Use the deep-insert pattern to create the table and its columns in a single POST request. See references/odata-api-patterns.md for the complete JSON structure.
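The authoritative templates live in references/odata-api-patterns.md; as an illustrative sketch only, a deep-insert body creating a table together with its primary-name column follows the standard Dataverse EntityMetadata shape (the cr123 prefix and labels here are examples):

```powershell
# Illustrative deep-insert body: table plus primary-name column in one POST.
# Schema names, labels, and prefix are examples; use the real templates from odata-api-patterns.md.
$body = @'
{
  "@odata.type": "Microsoft.Dynamics.CRM.EntityMetadata",
  "SchemaName": "cr123_Project",
  "DisplayName": {
    "@odata.type": "Microsoft.Dynamics.CRM.Label",
    "LocalizedLabels": [
      { "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", "Label": "Project", "LanguageCode": 1033 }
    ]
  },
  "DisplayCollectionName": {
    "@odata.type": "Microsoft.Dynamics.CRM.Label",
    "LocalizedLabels": [
      { "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", "Label": "Projects", "LanguageCode": 1033 }
    ]
  },
  "OwnershipType": "UserOwned",
  "HasActivities": false,
  "HasNotes": false,
  "Attributes": [
    {
      "@odata.type": "Microsoft.Dynamics.CRM.StringAttributeMetadata",
      "SchemaName": "cr123_Name",
      "IsPrimaryName": true,
      "MaxLength": 100,
      "RequiredLevel": { "Value": "ApplicationRequired" },
      "DisplayName": {
        "@odata.type": "Microsoft.Dynamics.CRM.Label",
        "LocalizedLabels": [
          { "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", "Label": "Name", "LanguageCode": 1033 }
        ]
      }
    }
  ]
}
'@
```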
For tables marked as modified, add new columns one at a time:
```powershell
$body = <column JSON from references/odata-api-patterns.md>
Invoke-RestMethod -Method Post -Uri "$envUrl/api/data/v9.2/EntityDefinitions(LogicalName='<table>')/Attributes" -Headers $headers -Body $body
```

Track each creation attempt and its result (success/failure/skipped). Do NOT attempt automated rollback on failure — report failures and continue with remaining items.
If creating many tables, refresh the token between batches (every 3–4 tables) to avoid expiration:
```powershell
$token = az account get-access-token --resource "$envUrl" --query accessToken -o tsv
```

Output: All approved tables and columns created (or failures reported)
## Phase 7: Create Relationships

Goal: Create all relationships between the newly created and existing tables
Actions:
Create lookup columns that establish 1:N relationships:
```powershell
$body = <relationship JSON from references/odata-api-patterns.md>
Invoke-RestMethod -Method Post -Uri "$envUrl/api/data/v9.2/RelationshipDefinitions" -Headers $headers -Body $body
```

Create M:N relationships (intersect tables are created automatically):

```powershell
$body = <M:N relationship JSON from references/odata-api-patterns.md>
Invoke-RestMethod -Method Post -Uri "$envUrl/api/data/v9.2/RelationshipDefinitions" -Headers $headers -Body $body
```

Track each relationship creation attempt. Report failures without rolling back.
Output: All approved relationships created (or failures reported)
## Phase 8: Publish and Verify

Goal: Publish all customizations, verify tables exist, write the manifest, and present a summary
Actions:
Publish all customizations so the new tables and columns become available:
```powershell
$publishBody = @{
    ParameterXml = "<importexportxml><entities><entity>$( ($tables | ForEach-Object { $_.logicalName }) -join '</entity><entity>' )</entity></entities></importexportxml>"
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "$envUrl/api/data/v9.2/PublishXml" -Headers $headers -Body $publishBody -ContentType "application/json"
```

See references/odata-api-patterns.md for the full PublishXml pattern.
For each created table, run a verification query:
```powershell
Invoke-RestMethod -Uri "$envUrl/api/data/v9.2/EntityDefinitions(LogicalName='<table>')?`$select=LogicalName,DisplayName" -Headers $headers
```

After successful verification, write .datamodel-manifest.json to the project root. This file records which tables and columns were verified to exist, and is used by the validation hook.
```json
{
  "environmentUrl": "https://org12345.crm.dynamics.com",
  "tables": [
    {
      "logicalName": "cr123_project",
      "displayName": "Project",
      "status": "new",
      "columns": [
        { "logicalName": "cr123_name", "type": "String" },
        { "logicalName": "cr123_description", "type": "Memo" }
      ]
    }
  ]
}
```

Use the Write tool to create this file at <PROJECT_ROOT>/.datamodel-manifest.json. Only include tables and columns that were confirmed to exist in Step 8.2. See ${CLAUDE_PLUGIN_ROOT}/references/datamodel-manifest-schema.md for the full schema specification.
Reference:
${CLAUDE_PLUGIN_ROOT}/references/skill-tracking-reference.md
Follow the skill tracking instructions in the reference to record this skill's usage. Use --skillName "SetupDatamodel".
Present a summary to the user:
| Table | Status | Columns | Relationships |
|---|---|---|---|
| cr123_project (Project) | Created | 5 columns | 2 relationships |
| contact (Contact) | Reused | 1 column added | — |
| cr123_task (Task) | Created | 4 columns | 1 relationship |
Include:
- The path to the manifest file (.datamodel-manifest.json)

After the summary, suggest these follow-up commands:
- /power-pages:add-sample-data
- /power-pages:integrate-webapi
- /power-pages:create-site
- /power-pages:deploy-site

Output: Published customizations, verified tables, manifest written, summary presented
Before starting Phase 1, create a task list with all phases using TaskCreate:
| Task subject | activeForm | Description |
|---|---|---|
| Verify prerequisites | Verifying prerequisites | Confirm PAC CLI auth, acquire Azure CLI token, verify API access |
| Choose data model source | Choosing data model source | Ask user to upload ER diagram or let AI analyze the site |
| Invoke data model architect | Invoking data model architect | Spawn agent to analyze site and propose data model |
| Review and approve proposal | Reviewing proposal | Present data model proposal to user, get explicit approval |
| Pre-creation checks | Running pre-creation checks | Refresh token, query existing tables, build creation plan |
| Create tables and columns | Creating tables and columns | POST to OData API to create tables and columns |
| Create relationships | Creating relationships | POST to OData API to create 1:N and M:N relationships |
| Publish and verify | Publishing and verifying | Publish customizations, verify tables, write manifest, present summary |
Mark each task in_progress when starting it and completed when done via TaskUpdate. This gives the user visibility into progress and keeps the workflow deterministic.
Begin with Phase 1: Verify Prerequisites