PostgreSQL patterns for Python with psycopg and asyncpg — connection pooling,
Does it follow best practices? Impact: 99% — 1.15x average score across 5 eval scenarios. Passed; no known issues.
{
"context": "Tests whether the agent uses asyncpg pool with lifecycle management, bulk insert/upsert for catalog import, transactions for stock transfers, server-side cursors for large exports, $1/$2 parameterized queries, and async context managers. The task describes business features without naming these patterns.",
"type": "weighted_checklist",
"checklist": [
{
"name": "asyncpg pool with lifecycle",
"description": "Uses 'await asyncpg.create_pool()' with min_size/max_size in a FastAPI lifespan or startup handler, and 'await pool.close()' on shutdown",
"max_score": 12
},
{
"name": "Bulk upsert for catalog import",
"description": "Product catalog import uses an efficient bulk pattern (executemany with ON CONFLICT, or copy_records_to_table with a temp table merge) -- NOT individual INSERT/UPDATE in a loop for 500-5000 products",
"max_score": 14
},
{
"name": "Transaction for stock transfer",
"description": "Stock transfer wraps the decrease, increase, and movement record in 'async with conn.transaction():' so all three operations succeed or fail atomically",
"max_score": 14
},
{
"name": "Stock validation before transfer",
"description": "Transfer checks source warehouse quantity within the transaction (using SELECT FOR UPDATE or equivalent) and raises an error if insufficient -- preventing negative stock",
"max_score": 8
},
{
"name": "Async context managers for connections",
"description": "All database operations use 'async with pool.acquire() as conn:' context manager -- no bare acquire() without guaranteed release",
"max_score": 10
},
{
"name": "$1/$2 parameterized queries",
"description": "All queries use $1, $2, $N numbered placeholders -- no f-strings, .format(), or string concatenation in SQL",
"max_score": 12
},
{
"name": "Server-side cursor for warehouse export",
"description": "Warehouse inventory export uses a cursor or streaming pattern (conn.cursor with async iteration, or batched fetch) for 50,000+ products -- not loading all into memory with fetch()",
"max_score": 10
},
{
"name": "DATABASE_URL from environment",
"description": "Connection string comes from environment variable, not hardcoded",
"max_score": 7
},
{
"name": "Pool timeout configuration",
"description": "Pool has command_timeout or max_inactive_connection_lifetime configured",
"max_score": 6
},
{
"name": "No SQL injection vectors",
"description": "No dynamic SQL built with string interpolation anywhere -- all values passed through parameterized queries",
"max_score": 7
}
]
}