Common usage patterns and workflows for r2put.
While r2put is designed for interactive use, it can be used in scripts:
#!/bin/bash
set -e

export CLOUDFLARE_ACCOUNT_ID="your-account-id"
export R2_ACCESS_KEY_ID="your-access-key-id"
export R2_SECRET_ACCESS_KEY="your-secret-access-key"

# Upload file with timestamped key. With set -e active, checking $?
# after the command would never reach the failure branch (the script
# exits first), so test the command directly in the if condition.
if r2put --file ./backup.tar.gz --bucket backups --key "backups/$(date +%Y%m%d).tar.gz"; then
  echo "Upload successful"
else
  echo "Upload failed"
  exit 1
fi

Script Usage Considerations:
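One consideration worth handling explicitly is missing credentials. The sketch below assumes r2put reads credentials only from the three environment variables shown above, and exits early if any of them is unset:

# Guard against missing credentials before invoking r2put
for var in CLOUDFLARE_ACCOUNT_ID R2_ACCESS_KEY_ID R2_SECRET_ACCESS_KEY; do
  if [ -z "${!var}" ]; then
    echo "Error: $var is not set" >&2
    exit 1
  fi
done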
r2put can be used in CI/CD pipelines, but with limitations:
# GitHub Actions example
- name: Upload to R2
  env:
    CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
    R2_ACCESS_KEY_ID: ${{ secrets.R2_ACCESS_KEY_ID }}
    R2_SECRET_ACCESS_KEY: ${{ secrets.R2_SECRET_ACCESS_KEY }}
  run: |
    npx r2put --file ./dist/app.zip --bucket artifacts --key "builds/${{ github.sha }}.zip"

# GitLab CI example
upload_to_r2:
  # CI/CD variables set in the project settings are already exported to
  # the job environment, so no manual re-export is needed in script
  script:
    - npx r2put --file ./dist/app.zip --bucket artifacts --key "builds/$CI_COMMIT_SHA.zip"
  variables:
    CLOUDFLARE_ACCOUNT_ID: $CLOUDFLARE_ACCOUNT_ID
    R2_ACCESS_KEY_ID: $R2_ACCESS_KEY_ID
    R2_SECRET_ACCESS_KEY: $R2_SECRET_ACCESS_KEY

CI/CD Considerations:
For programmatic control in CI/CD, consider using @cfkit/r2 directly.

Both relative and absolute paths are supported:
# Relative path
r2put --file ./data.bin --bucket my-bucket
# Absolute path
r2put --file /home/user/data.bin --bucket my-bucket
# Path with spaces (requires quoting)
r2put --file "./my file.bin" --bucket my-bucket
# Using shell expansion
r2put --file ~/Documents/file.pdf --bucket documents

Path Handling Behavior:
Paths are expanded by the shell before r2put sees them (for example, ~ for home directory), so quoting and expansion follow normal shell rules.

Example script for automated backups:
#!/bin/bash
BACKUP_DIR="/var/backups"
BUCKET="production-backups"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

# Create backup archive
tar -czf "$BACKUP_DIR/backup_$TIMESTAMP.tar.gz" /path/to/data

# Upload to R2; clean up the local backup only if the upload succeeded
if r2put --file "$BACKUP_DIR/backup_$TIMESTAMP.tar.gz" \
    --bucket "$BUCKET" \
    --key "backups/backup_$TIMESTAMP.tar.gz"; then
  rm "$BACKUP_DIR/backup_$TIMESTAMP.tar.gz"
  echo "Backup uploaded and local file removed"
fi
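To run the backup on a schedule, the script can be invoked from cron. The path below is hypothetical; adjust it to wherever the script is saved. Note that cron runs with a minimal environment, so the script itself must set or source the R2 credentials:

# Crontab entry: run the backup nightly at 02:00
0 2 * * * /usr/local/bin/r2-backup.sh >> /var/log/r2-backup.log 2>&1

Upload build artifacts to R2: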
#!/bin/bash
# Build the application
npm run build

# Read the version with node rather than grepping package.json, which
# can match dependency names that contain "version"
VERSION=$(node -p "require('./package.json').version")

# Upload build artifacts
r2put --file ./dist/app.zip \
  --bucket production-assets \
  --key "releases/v$VERSION.zip"

# Upload individual assets
for file in ./dist/assets/*; do
  filename=$(basename "$file")
  r2put --file "$file" \
    --bucket production-assets \
    --key "assets/$filename"
done

While r2put handles one file at a time, you can create a batch upload script:
#!/bin/bash
BUCKET="my-bucket"
UPLOAD_DIR="./files-to-upload"

for file in "$UPLOAD_DIR"/*; do
  if [ -f "$file" ]; then
    filename=$(basename "$file")
    echo "Uploading $filename..."
    if ! r2put --file "$file" --bucket "$BUCKET" --key "uploads/$filename"; then
      echo "Failed to upload $filename"
      exit 1
    fi
  fi
done
echo "All files uploaded successfully"