PostgreSQL patterns for Python with psycopg and asyncpg — connection pooling, parameterized queries, transactions, and bulk operations
Average score across 5 eval scenarios: 99%
Impact: 1.15x
Status: Passed, no known issues
Eval question: Does it follow best practices?
{
"context": "Tests whether the agent uses psycopg_pool with proper configuration, %s parameterized queries, transactions for atomic bulk notification creation, executemany/COPY for bulk inserts, server-side cursors for large patient histories, context managers for connections, and pool cleanup. The task describes healthcare notification requirements without naming these patterns.",
"type": "weighted_checklist",
"checklist": [
{
"name": "psycopg_pool ConnectionPool",
"description": "Uses psycopg_pool.ConnectionPool with min_size, max_size, and timeout -- not creating new connections per function call",
"max_score": 12
},
{
"name": "Context managers for connections",
"description": "All functions use 'with pool.connection() as conn:' or 'with get_conn() as conn:' context manager to ensure connection return to pool",
"max_score": 12
},
{
"name": "Transaction for bulk scheduling",
"description": "The schedule_bulk function wraps the template lookup + bulk notification insert in 'with conn.transaction():' for atomicity",
"max_score": 14
},
{
"name": "Efficient bulk insert",
"description": "Bulk notification creation uses executemany or COPY for inserting 200-1000 notifications -- not individual INSERT calls in a for loop",
"max_score": 12
},
{
"name": "%s parameterized queries",
"description": "All queries use %s placeholders with tuple args -- no f-strings, .format(), or string concatenation for SQL values",
"max_score": 12
},
{
"name": "Server-side cursor for patient history",
"description": "Patient notification history uses a server-side cursor (cursor with name parameter) or batched fetchmany for large result sets -- not fetchall()",
"max_score": 10
},
{
"name": "Pool shutdown cleanup",
"description": "Pool is closed on shutdown via atexit.register(pool.close) or equivalent lifecycle hook",
"max_score": 8
},
{
"name": "DATABASE_URL from environment",
"description": "Connection string comes from os.getenv('DATABASE_URL') or environment variable",
"max_score": 7
},
{
"name": "Dict row factory for query results",
"description": "Query functions returning results use psycopg.rows.dict_row for readable dict access",
"max_score": 7
},
{
"name": "max_lifetime or max_idle on pool",
"description": "Pool is configured with max_idle or max_lifetime to handle connection recycling",
"max_score": 6
}
]
}

evals
scenario-1
scenario-2
scenario-3
scenario-4
scenario-5
skills
postgresql-python-best-practices