Understand and manage Clerk rate limits and quotas. Use when hitting rate limits, optimizing API usage, or planning for high-traffic scenarios. Trigger with phrases like "clerk rate limit", "clerk quota", "clerk API limits", "clerk throttling".
Understand Clerk's rate limiting system and implement strategies to avoid hitting limits. Covers Backend API rate limits, retry logic, batching, caching, and monitoring.
The Clerk Backend API enforces rate limits per API key:
| Plan | Rate Limit | Burst |
|---|---|---|
| Free | 20 req/10s | 40 |
| Pro | 100 req/10s | 200 |
| Enterprise | Custom | Custom |
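Rather than reacting to 429s, the client can track its own request timestamps and stay under the window proactively. A minimal sliding-window limiter sketch — the class and its name are illustrative, not part of Clerk's SDK; the 20 req/10 s figures mirror the Free plan row above:

```typescript
// Client-side sliding-window limiter: allow at most `limit` calls
// per `windowMs` milliseconds, judged against recorded timestamps.
class SlidingWindowLimiter {
  private timestamps: number[] = []

  constructor(private limit: number, private windowMs: number) {}

  // Returns true and records the call if budget remains; false otherwise.
  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have aged out of the window
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs)
    if (this.timestamps.length >= this.limit) return false
    this.timestamps.push(now)
    return true
  }
}

// Free plan budget: 20 requests per 10-second window
const limiter = new SlidingWindowLimiter(20, 10_000)
```

Before each Backend API call, check `limiter.tryAcquire()` and delay or queue the request when it returns false.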
Rate limit headers returned on every response:
- `X-RateLimit-Limit` — max requests per window
- `X-RateLimit-Remaining` — remaining requests in the current window
- `X-RateLimit-Reset` — seconds until the window resets

```typescript
// lib/clerk-api.ts
import { createClerkClient } from '@clerk/backend'

const clerk = createClerkClient({ secretKey: process.env.CLERK_SECRET_KEY! })

async function withRetry<T>(fn: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn()
    } catch (err: any) {
      if (err.status === 429 && attempt < maxRetries) {
        // Honor the Retry-After header when present, otherwise back off exponentially
        const retryAfter = err.headers?.['retry-after']
        const waitMs = retryAfter ? parseInt(retryAfter, 10) * 1000 : Math.pow(2, attempt) * 1000
        console.warn(`Rate limited. Retrying in ${waitMs}ms (attempt ${attempt + 1}/${maxRetries})`)
        await new Promise((resolve) => setTimeout(resolve, waitMs))
        continue
      }
      throw err
    }
  }
  throw new Error('Max retries exceeded')
}

// Usage
export async function getUser(userId: string) {
  return withRetry(() => clerk.users.getUser(userId))
}
```

To fetch many users, batch the requests and pause between batches:

```typescript
// lib/clerk-batch.ts
import { createClerkClient } from '@clerk/backend'

const clerk = createClerkClient({ secretKey: process.env.CLERK_SECRET_KEY! })

async function batchGetUsers(userIds: string[], batchSize = 10) {
  const results = []
  for (let i = 0; i < userIds.length; i += batchSize) {
    const batch = userIds.slice(i, i + batchSize)
    const users = await Promise.all(batch.map((id) => clerk.users.getUser(id)))
    results.push(...users)
    // Respect rate limits between batches
    if (i + batchSize < userIds.length) {
      await new Promise((resolve) => setTimeout(resolve, 500))
    }
  }
  return results
}
```
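The exponential backoff inside `withRetry` can be factored into a pure helper so the schedule is easy to unit-test in isolation. A sketch — the 30 s cap and the helper's name are illustrative assumptions, not values specified by Clerk:

```typescript
// Delay before retry `attempt` (0-based): baseMs * 2^attempt, capped at capMs.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt)
}
// backoffDelayMs(0) → 1000, backoffDelayMs(1) → 2000, backoffDelayMs(2) → 4000
```

Adding random jitter (e.g. scaling the result by `Math.random()`) spreads retries from many concurrent workers so they don't all hit the API again at the same instant.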
```typescript
// lib/clerk-batch.ts (continued)
// For listing users, use pagination instead of fetching everything at once
async function getAllUsers() {
  const allUsers = []
  let offset = 0
  const limit = 100
  while (true) {
    const batch = await clerk.users.getUserList({ limit, offset })
    allUsers.push(...batch.data)
    if (batch.data.length < limit) break
    offset += limit
    await new Promise((resolve) => setTimeout(resolve, 200)) // Pause to respect rate limits
  }
  return allUsers
}
```

Cache responses to avoid repeated lookups for the same user:

```typescript
// lib/clerk-cache.ts
const userCache = new Map<string, { user: any; cachedAt: number }>()
const CACHE_TTL = 60_000 // 1 minute

export async function getCachedUser(userId: string) {
  const cached = userCache.get(userId)
  if (cached && Date.now() - cached.cachedAt < CACHE_TTL) {
    return cached.user
  }
  const { createClerkClient } = await import('@clerk/backend')
  const clerk = createClerkClient({ secretKey: process.env.CLERK_SECRET_KEY! })
  const user = await clerk.users.getUser(userId)
  userCache.set(userId, { user, cachedAt: Date.now() })
  return user
}

// Invalidate cache on webhook events (e.g. user.updated)
export function invalidateUserCache(userId: string) {
  userCache.delete(userId)
}
```

For production, use Redis instead of an in-memory cache:
```typescript
import { createClerkClient } from '@clerk/backend'
import { Redis } from '@upstash/redis'

const redis = Redis.fromEnv()
const clerk = createClerkClient({ secretKey: process.env.CLERK_SECRET_KEY! })

export async function getCachedUserRedis(userId: string) {
  const cached = await redis.get(`clerk:user:${userId}`)
  if (cached) return cached
  const user = await clerk.users.getUser(userId)
  await redis.set(`clerk:user:${userId}`, JSON.stringify(user), { ex: 60 }) // 60s TTL
  return user
}
```

Monitor how close you are to the limit on every response:

```typescript
// lib/clerk-monitor.ts
let rateLimitHits = 0

export function trackRateLimit(response: Response) {
  const remaining = parseInt(response.headers.get('X-RateLimit-Remaining') || '999', 10)
  const limit = parseInt(response.headers.get('X-RateLimit-Limit') || '0', 10)
  if (remaining < limit * 0.1) {
    console.warn(`[Clerk] Rate limit warning: ${remaining}/${limit} remaining`)
  }
  if (remaining === 0) {
    rateLimitHits++
    console.error(`[Clerk] Rate limit hit! Total hits this session: ${rateLimitHits}`)
  }
}
```

| Error | Cause | Solution |
|---|---|---|
| 429 Too Many Requests | Rate limit exceeded | Implement retry with backoff, add caching |
| `quota_exceeded` | Monthly MAU quota hit | Upgrade plan or reduce active users |
| Concurrent limit hit | Too many parallel requests | Queue requests, reduce `batchSize` |
| Stale cache data | Cache not invalidated | Invalidate on `user.updated` webhook |
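The retry and monitoring advice above both hinge on reading the rate-limit headers; a pure helper that turns them into a throttling decision can be tested without touching the network. A sketch — the function name and its shape are illustrative, not a Clerk API:

```typescript
type RateLimitDecision = { proceed: boolean; waitMs: number }

// Reads X-RateLimit-* headers from any object with a `get` method
// (e.g. a fetch Response's `headers`) and decides whether to pause.
function decideThrottle(headers: { get(name: string): string | null }): RateLimitDecision {
  const remaining = parseInt(headers.get('X-RateLimit-Remaining') ?? '1', 10)
  const resetSec = parseInt(headers.get('X-RateLimit-Reset') ?? '0', 10)
  if (remaining <= 0) {
    // Window exhausted: wait until it resets before sending more requests
    return { proceed: false, waitMs: resetSec * 1000 }
  }
  return { proceed: true, waitMs: 0 }
}
```

Call it after each Backend API response and `setTimeout` for `waitMs` before the next request when `proceed` is false.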
```shell
# Check current rate limit status
curl -s -D - -H "Authorization: Bearer $CLERK_SECRET_KEY" \
  "https://api.clerk.com/v1/users?limit=1" 2>&1 | grep -i x-ratelimit
```

Proceed to clerk-security-basics for security best practices.