Azure Storage Services including Blob Storage, File Shares, Queue Storage, Table Storage, and Data Lake. Provides object storage, SMB file shares, async messaging, NoSQL key-value, and big data analytics capabilities. Includes access tiers (hot, cool, archive) and lifecycle management. USE FOR: blob storage, file shares, queue storage, table storage, data lake, upload files, download blobs, storage accounts, access tiers, lifecycle management. DO NOT USE FOR: SQL databases, Cosmos DB (use azure-prepare), messaging with Event Hubs or Service Bus (use azure-messaging).
Install with the Tessl CLI:

```shell
npx tessl i github:microsoft/github-copilot-for-azure --skill azure-storage95
```
Does it follow best practices?
Evaluation — 99%
↑ 1.08× agent success when using this skill
Validation for skill structure
**Production auth credential selection and Python blob SDK best practices**

| Check | Without context | With context |
|---|---|---|
| Correct blob package | 100% | 100% |
| Identity package installed | 100% | 100% |
| DefaultAzureCredential for dev | 0% | 100% |
| ManagedIdentityCredential for production | 100% | 100% |
| Environment-aware branch | 70% | 100% |
| No hardcoded credentials | 100% | 100% |
| overwrite=True on upload | 100% | 100% |
| Content type set | 100% | 100% |
| Memory-efficient download | 83% | 100% |
| BlobServiceClient constructor | 100% | 100% |
| Correct import structure | 100% | 100% |
Without context: $0.2237 · 1m · 17 turns · 24 in / 3,242 out tokens
With context: $0.4714 · 1m 22s · 26 turns · 614 in / 4,499 out tokens
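The checks in this scenario map to a small amount of code. Below is a minimal sketch, not the skill's canonical implementation: the `RUNNING_IN_AZURE` flag and the account URL are hypothetical placeholders, and the Azure SDK imports are kept inside the functions so the environment-selection helper can be exercised without the SDKs or live credentials.

```python
import os


def credential_kind(env: dict) -> str:
    """Environment-aware branch: managed identity in Azure, the developer
    credential chain everywhere else. RUNNING_IN_AZURE is a hypothetical
    flag; real code might key off an App Service/VM environment variable."""
    return "managed_identity" if env.get("RUNNING_IN_AZURE") == "1" else "default"


def make_blob_service(account_url: str):
    # Imports are local so this module loads without the Azure SDKs installed.
    from azure.identity import DefaultAzureCredential, ManagedIdentityCredential
    from azure.storage.blob import BlobServiceClient

    if credential_kind(dict(os.environ)) == "managed_identity":
        credential = ManagedIdentityCredential()  # production: no secrets on disk
    else:
        credential = DefaultAzureCredential()     # dev: env vars, CLI login, etc.
    return BlobServiceClient(account_url=account_url, credential=credential)


def upload_and_download(service, container: str, name: str, src: str, dst: str):
    from azure.storage.blob import ContentSettings

    blob = service.get_blob_client(container=container, blob=name)
    with open(src, "rb") as f:
        # overwrite=True makes re-runs idempotent; setting the content type
        # lets browsers and CDNs serve the blob correctly.
        blob.upload_blob(
            f,
            overwrite=True,
            content_settings=ContentSettings(content_type="text/csv"),
        )
    # Memory-efficient download: stream chunks instead of readall().
    with open(dst, "wb") as f:
        for chunk in blob.download_blob().chunks():
            f.write(chunk)
```

The `credential_kind` helper isolates the environment branch as pure logic, so it can be unit-tested without touching Azure.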
**TypeScript queue worker with poison message handling and visibility management**

| Check | Without context | With context |
|---|---|---|
| Correct queue package | 100% | 100% |
| Identity package installed | 100% | 100% |
| DefaultAzureCredential used | 0% | 100% |
| No hardcoded credentials | 100% | 100% |
| Batch receive | 100% | 100% |
| JSON parsing | 100% | 100% |
| Delete after processing | 100% | 100% |
| dequeueCount check | 100% | 100% |
| Dead-letter routing | 100% | 100% |
| Visibility timeout set | 100% | 100% |
| QueueServiceClient or QueueClient usage | 100% | 100% |
| Message content serialization | 33% | 66% |
Without context: $0.2886 · 1m 36s · 15 turns · 21 in / 5,833 out tokens
With context: $0.3172 · 1m 11s · 18 turns · 21 in / 3,763 out tokens
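The eval scenario targets TypeScript (`@azure/storage-queue`); to keep one language across this page, here is the same poison-message and visibility pattern sketched with the Python `azure-storage-queue` SDK. The `MAX_DEQUEUE` threshold and the 60-second visibility timeout are illustrative assumptions, not values mandated by the skill.

```python
import json

MAX_DEQUEUE = 5  # assumption: after 5 delivery attempts, treat as poison


def is_poison(dequeue_count: int, max_dequeue: int = MAX_DEQUEUE) -> bool:
    """dequeueCount check: True once a message has been delivered too often."""
    return dequeue_count >= max_dequeue


def drain_queue(queue_client, poison_client, handle) -> None:
    # Batch receive with an explicit visibility timeout: other workers
    # will not see these messages for 60s while we process them.
    for msg in queue_client.receive_messages(
        messages_per_page=32, visibility_timeout=60
    ):
        if is_poison(msg.dequeue_count):
            # Dead-letter routing: park the raw content on a poison queue,
            # then remove it from the main queue so it stops cycling.
            poison_client.send_message(msg.content)
            queue_client.delete_message(msg)
            continue
        try:
            body = json.loads(msg.content)  # messages are JSON-serialized
            handle(body)
        except Exception:
            # Leave the message in place; it reappears after the visibility
            # timeout and dequeue_count climbs toward the poison threshold.
            continue
        queue_client.delete_message(msg)  # delete only after success
```

Deleting only after `handle` succeeds gives at-least-once processing, with the dequeue-count check preventing a bad message from retrying forever.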
**Azure Table Storage partition key design and Data Lake large-file upload pattern**

| Check | Without context | With context |
|---|---|---|
| Correct Tables package | 100% | 100% |
| Data Lake package | 100% | 100% |
| Identity package installed | 100% | 100% |
| Partition key for building | 100% | 100% |
| Batch/transaction operations | 100% | 100% |
| upsert_entity for idempotent writes | 100% | 100% |
| Parameterized query filter | 100% | 100% |
| Data Lake append_data | 100% | 100% |
| flush_data after append | 100% | 100% |
| DefaultAzureCredential for dev | 100% | 100% |
| DataLakeServiceClient endpoint | 100% | 100% |
| TableClient construction | 100% | 100% |
Without context: $0.2851 · 1m 19s · 16 turns · 21 in / 4,536 out tokens
With context: $0.6411 · 1m 49s · 31 turns · 1,098 in / 6,446 out tokens
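As a rough illustration of what these checks look for, here is a sketch using `azure-data-tables` and `azure-storage-file-datalake`. The entity shape (building ID as partition key, timestamp as row key) and the 4 MiB chunk size are assumptions for the example; the SDK calls live inside the functions so the chunking helper runs standalone.

```python
def chunk_ranges(total_size: int, chunk_size: int):
    """(offset, length) pairs for uploading a large file in pieces."""
    return [
        (off, min(chunk_size, total_size - off))
        for off in range(0, total_size, chunk_size)
    ]


def upsert_reading(table_client, building_id: str, reading: dict):
    from azure.data.tables import UpdateMode

    entity = {
        "PartitionKey": building_id,     # one partition per building
        "RowKey": reading["timestamp"],  # assumption: a sortable timestamp key
        **reading,
    }
    # upsert_entity makes the write idempotent: safe to retry.
    table_client.upsert_entity(entity, mode=UpdateMode.MERGE)


def query_building(table_client, building_id: str):
    # Parameterized filter avoids injection via string formatting.
    return table_client.query_entities(
        query_filter="PartitionKey eq @pk", parameters={"pk": building_id}
    )


def upload_large_file(file_client, data: bytes, chunk_size: int = 4 * 1024 * 1024):
    file_client.create_file()
    for offset, length in chunk_ranges(len(data), chunk_size):
        file_client.append_data(
            data[offset:offset + length], offset=offset, length=length
        )
    # flush_data commits everything appended so far; without it the
    # appended chunks are never visible in the file.
    file_client.flush_data(len(data))
```

Note the append/flush split: `append_data` stages bytes at an offset, and a single `flush_data` with the total length commits them, which is why the eval checks for both calls.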
If you maintain this skill, you can claim it as your own. Once claimed, you can manage eval scenarios, bundle related skills, attach documentation or rules, and ensure cross-agent compatibility.