Real-World Scenarios

Usage examples for common real-world scenarios with @cfkit/r2.

Client-Side File Upload

Generate a pre-signed URL on the server and let the client upload directly:

// Server-side: Generate pre-signed URL
const uploadUrl = await bucket.presignedUploadUrl({
  key: `uploads/${userId}/${file.name}`,
  contentType: file.type,
  metadata: {
    'uploaded-by': userId,
    'uploaded-at': new Date().toISOString(),
    'original-filename': file.name
  },
  expiresIn: 3600, // 1 hour
  allowedContentTypes: ['image/*', 'application/pdf']
});

// Return uploadUrl.url to client

// Client-side: Upload file
const response = await fetch(uploadUrl.url, {
  method: 'PUT',
  headers: {
    'Content-Type': file.type,
    'x-amz-meta-uploaded-by': userId,
    'x-amz-meta-uploaded-at': new Date().toISOString(),
    'x-amz-meta-original-filename': file.name
  },
  body: file
});

if (!response.ok) {
  throw new Error(`Upload failed with status ${response.status}`);
}

Server-Side File Processing Pipeline

Upload, process, and store processed files:

// Upload original file
const original = await bucket.uploadFile(`originals/${id}`, file, {
  contentType: file.type,
  metadata: { 'upload-id': id }
});

// Process file (e.g., resize image)
const processed = await processImage(file);

// Upload processed version
const processedResult = await bucket.uploadFile(`processed/${id}`, processed, {
  contentType: 'image/jpeg',
  metadata: {
    'original-key': original.key,
    'processed-at': new Date().toISOString()
  }
});

// Store metadata in database
await db.save({
  id,
  originalKey: original.key,
  processedKey: processedResult.key,
  uploadedAt: new Date()
});

Batch Operations

Process multiple files concurrently:

// Upload multiple files
const files = [
  { key: 'file1.jpg', content: file1 },
  { key: 'file2.jpg', content: file2 },
  { key: 'file3.jpg', content: file3 }
];

const results = await Promise.all(
  files.map(file =>
    bucket.uploadFile(file.key, file.content, {
      contentType: 'image/jpeg',
      metadata: { 'batch-id': batchId }
    })
  )
);

console.log(`Uploaded ${results.length} files`);
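Note that `Promise.all` rejects as soon as any one upload fails, abandoning the results of the others. When partial success is acceptable, `Promise.allSettled` collects both outcomes. A minimal sketch, using a stand-in `uploadFile` callback rather than the real `bucket.uploadFile` so it is self-contained (`uploadAll` and its result shape are illustrative names, not part of @cfkit/r2):

```typescript
type UploadResult = { key: string };

// Upload all files, collecting successes and failures separately
// instead of failing fast on the first rejection.
async function uploadAll(
  files: { key: string; content: unknown }[],
  uploadFile: (key: string, content: unknown) => Promise<UploadResult>
): Promise<{ succeeded: UploadResult[]; failed: { key: string; error: unknown }[] }> {
  const settled = await Promise.allSettled(
    files.map(f => uploadFile(f.key, f.content))
  );
  const succeeded: UploadResult[] = [];
  const failed: { key: string; error: unknown }[] = [];
  settled.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      succeeded.push(result.value);
    } else {
      failed.push({ key: files[i].key, error: result.reason });
    }
  });
  return { succeeded, failed };
}
```

In a real batch you would pass `(key, content) => bucket.uploadFile(key, content, { contentType: 'image/jpeg' })` as the callback, then retry or report the `failed` entries.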

Conditional Download

Check existence before downloading:

const key = 'important-file.pdf';

// Note: the object can still be deleted between this check and the
// get, so callers should also handle getObject failures.
if (await bucket.objectExists(key)) {
  const obj = await bucket.getObject(key);
  const blob = await obj.body.blob();
  
  // Process file
  console.log(`Downloaded ${obj.size} bytes`);
} else {
  console.log('File does not exist');
}

Secure File Sharing

Generate time-limited download URLs:

// Generate short-lived download URL for sensitive file
const downloadUrl = await bucket.presignedDownloadUrl('private-doc.pdf', {
  expiresIn: 300 // 5 minutes
});

// Send URL to authorized user via secure channel
await sendSecureMessage(userId, {
  message: 'Your document is ready',
  downloadUrl: downloadUrl.url,
  expiresIn: 300
});

File Cleanup

Delete old files based on metadata:

// List all files (requires custom implementation or listing API)
// For this example, assume we track keys in database

const oldFiles = await db.findFilesOlderThan(30); // 30 days

await Promise.all(
  oldFiles.map(file => bucket.deleteObject(file.key))
);

console.log(`Deleted ${oldFiles.length} old files`);
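For large cleanups, an unbounded `Promise.all` over thousands of keys can open too many concurrent requests at once. One way to cap concurrency is to delete in fixed-size chunks; this is a sketch with illustrative helper names (`chunk`, `deleteInChunks`), taking the delete operation as a callback so it stands alone:

```typescript
// Split an array into consecutive chunks of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Delete keys chunk by chunk, so at most `chunkSize` deletes
// are in flight at any time. Returns the number deleted.
async function deleteInChunks(
  keys: string[],
  deleteObject: (key: string) => Promise<void>,
  chunkSize = 50
): Promise<number> {
  let deleted = 0;
  for (const batch of chunk(keys, chunkSize)) {
    await Promise.all(batch.map(deleteObject));
    deleted += batch.length;
  }
  return deleted;
}
```

With the bucket from the example above, the callback would be `key => bucket.deleteObject(key)`.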

Metadata Tracking

Store and retrieve custom metadata:

// Upload with metadata
const result = await bucket.uploadFile('photo.jpg', file, {
  contentType: 'image/jpeg',
  metadata: {
    'user-id': userId,
    'album-id': albumId,
    'upload-timestamp': Date.now().toString(),
    'file-size': file.size.toString()
  }
});

// Later, retrieve metadata
const obj = await bucket.getObject('photo.jpg');
if (obj.metadata) {
  const userId = obj.metadata['user-id'];
  const albumId = obj.metadata['album-id'];
  const uploadTime = parseInt(obj.metadata['upload-timestamp'], 10);
  
  console.log(`File uploaded by user ${userId} at ${new Date(uploadTime)}`);
}

Error Recovery

Handle errors with retry logic:

async function uploadWithRetry(
  key: string,
  file: Blob,
  options: UploadFileOptions,
  maxRetries = 3
): Promise<UploadResult> {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await bucket.uploadFile(key, file, options);
    } catch (error) {
      if (attempt === maxRetries) {
        throw error;
      }
      
      // Exponential backoff
      await new Promise(resolve => 
        setTimeout(resolve, Math.pow(2, attempt) * 1000)
      );
    }
  }
  
  throw new Error('Upload failed after retries');
}
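A common refinement to the fixed exponential backoff used in `uploadWithRetry` is to add jitter, so that many clients retrying at once do not all wake up simultaneously. A sketch of the delay calculation as a pure function (`backoffDelayMs` is an illustrative name, not part of @cfkit/r2):

```typescript
// Exponential backoff with "full jitter": pick a uniformly random
// delay between 0 and the capped exponential bound.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 30_000): number {
  const bound = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.floor(Math.random() * bound);
}
```

Inside the retry loop, `setTimeout(resolve, Math.pow(2, attempt) * 1000)` would become `setTimeout(resolve, backoffDelayMs(attempt))`.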

Content Type Validation

Validate file types before upload:

async function uploadImage(
  key: string,
  file: File
): Promise<UploadResult> {
  // Validate content type
  const allowedTypes = ['image/jpeg', 'image/png', 'image/gif'];
  if (!allowedTypes.includes(file.type)) {
    throw new Error(`Invalid file type: ${file.type}`);
  }
  
  // For client uploads: generate a pre-signed URL that also enforces
  // the allowed content types server-side, and return uploadUrl.url
  // to the client
  const uploadUrl = await bucket.presignedUploadUrl({
    key,
    contentType: file.type,
    allowedContentTypes: ['image/*'],
    expiresIn: 3600
  });

  // For server-side uploads: upload directly
  return await bucket.uploadFile(key, file, {
    contentType: file.type
  });
}
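The exact-match check in `uploadImage` rejects types like `image/webp` even though the pre-signed URL's `allowedContentTypes` pattern `'image/*'` would accept them. If you want the local check to understand the same wildcard form, a small matcher can be sketched as follows (`matchesContentType` is an illustrative helper, not part of @cfkit/r2, and assumes patterns are either exact types, `type/*`, or `*/*`):

```typescript
// Return true if a concrete MIME type matches any of the given
// patterns, where patterns may be exact ('image/png'),
// subtype wildcards ('image/*'), or the universal '*/*'.
function matchesContentType(type: string, patterns: string[]): boolean {
  return patterns.some(pattern => {
    if (pattern === '*/*' || pattern === type) return true;
    if (pattern.endsWith('/*')) {
      // Keep the trailing '/' so 'image/*' does not match 'imagex/foo'.
      return type.startsWith(pattern.slice(0, -1));
    }
    return false;
  });
}
```

With this, the validation step could use the same pattern list as the pre-signed URL: `matchesContentType(file.type, ['image/*'])`.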

Multi-Environment Configuration

Handle different environments:

const config = {
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  accessKeyId: process.env.R2_ACCESS_KEY_ID!,
  secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!
};

const r2 = new R2Client(config);

// Use different buckets per environment
const bucketName = process.env.NODE_ENV === 'production'
  ? 'production-bucket'
  : 'development-bucket';

const bucket = r2.bucket(bucketName);
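The non-null assertions (`!`) in the config above only silence the TypeScript compiler; at runtime a missing environment variable still flows through as `undefined`. A hedged sketch of failing fast instead (`requireEnv` is an illustrative helper, not part of @cfkit/r2):

```typescript
// Read a required environment variable, throwing a descriptive
// error at startup if it is missing or empty.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

The config could then be built as `accountId: requireEnv('CLOUDFLARE_ACCOUNT_ID')`, and so on, so a misconfigured environment fails at startup rather than on the first request.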