Official Node.js client library for Google Cloud Storage, providing a comprehensive API for managing buckets, files, and metadata with authentication, streaming, and access control.
// Installation
npm install @google-cloud/storage
// Import
import { Storage } from '@google-cloud/storage';
// or
const { Storage } = require('@google-cloud/storage');

// Main Storage class and core types
import {
Storage,
StorageOptions,
IdempotencyStrategy,
PreconditionOptions,
RetryOptions
} from '@google-cloud/storage';
// Bucket class and related types
import {
Bucket,
BucketOptions,
CreateBucketRequest,
CreateBucketResponse,
GetBucketsRequest,
GetBucketsResponse,
UploadOptions,
UploadResponse
} from '@google-cloud/storage';
// File class and related types
import {
File,
FileOptions,
FileMetadata,
DownloadOptions,
DownloadResponse,
PredefinedAcl
} from '@google-cloud/storage';
// Access control classes and types
import {
AclMetadata,
AccessControlObject,
Policy,
PolicyBinding,
Iam
} from '@google-cloud/storage';
// HMAC Key management
import {
HmacKey,
HmacKeyMetadata,
CreateHmacKeyOptions,
CreateHmacKeyResponse
} from '@google-cloud/storage';
// Notifications
import {
Notification,
NotificationMetadata,
Channel
} from '@google-cloud/storage';
// Transfer Manager
import {
TransferManager,
UploadManyFilesOptions,
DownloadManyFilesOptions
} from '@google-cloud/storage';
// Utilities and validation
import {
CRC32C,
HashStreamValidator,
GetSignedUrlResponse
} from '@google-cloud/storage';
// Error handling
import { ApiError } from '@google-cloud/storage';

// Create client with Application Default Credentials
const storage = new Storage();
// Create client with explicit credentials
const storage = new Storage({
projectId: 'your-project-id',
keyFilename: '/path/to/keyfile.json'
});
// Get bucket reference
const bucket = storage.bucket('my-bucket');
// Get file reference
const file = bucket.file('path/to/file.txt');

The library is organized around these core concepts:
// Configuration options
interface StorageOptions {
projectId?: string;
keyFilename?: string;
apiEndpoint?: string;
retryOptions?: RetryOptions;
crc32cGenerator?: CRC32CValidatorGenerator;
}
interface BucketOptions {
userProject?: string;
kmsKeyName?: string;
generation?: number;
preconditionOpts?: PreconditionOptions;
}
interface FileOptions {
generation?: number | string;
encryptionKey?: string | Buffer;
kmsKeyName?: string;
userProject?: string;
}
// Precondition options for conditional operations
interface PreconditionOptions {
ifGenerationMatch?: number | string;
ifGenerationNotMatch?: number | string;
ifMetagenerationMatch?: number | string;
ifMetagenerationNotMatch?: number | string;
}
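Preconditions make writes conditional on the object's current generation; in particular, ifGenerationMatch: 0 succeeds only when no live object exists yet, turning a plain write into a safe "create if absent". A sketch; the helper name is illustrative and the commented usage assumes a configured client:

```javascript
// Build options for a create-only write: ifGenerationMatch: 0 means
// "only proceed if no live generation exists", so an existing object
// makes the request fail with 412 Precondition Failed instead of
// silently overwriting.
function createOnlyPreconditions() {
  return { preconditionOpts: { ifGenerationMatch: 0 } };
}

// Hypothetical usage:
// await bucket.file('config.json').save(data, createOnlyPreconditions());
```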
// Retry configuration
interface RetryOptions {
retryDelayMultiplier?: number;
totalTimeout?: number;
maxRetryDelay?: number;
autoRetry?: boolean;
maxRetries?: number;
retryableErrorFn?: (err: ApiError) => boolean;
idempotencyStrategy?: IdempotencyStrategy;
}
// Idempotency strategy enum
enum IdempotencyStrategy {
RetryAlways,
RetryConditional,
RetryNever
}
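retryableErrorFn lets you override which errors are retried. A sketch of a predicate that retries rate limits and transient server errors; the function name is illustrative, and it assumes err.code carries the HTTP status as on ApiError:

```javascript
// Retry 429 (rate limiting) and any 5xx; treat everything else
// (e.g. 403, 404) as permanent.
function retryTransientErrors(err) {
  return err.code === 429 || (err.code >= 500 && err.code < 600);
}

// Hypothetical wiring:
// const storage = new Storage({
//   retryOptions: { autoRetry: true, retryableErrorFn: retryTransientErrors }
// });
```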
// Common response patterns
type ResourceResponse<T> = [T, unknown]; // [resource, apiResponse]
type ListResponse<T> = [T[], {}, unknown]; // [items, nextQuery, apiResponse]
type OperationResponse = [unknown]; // [apiResponse]

Storage: Core Storage class functionality including bucket management, service account operations, and HMAC key administration.
Key APIs: bucket(), createBucket(), getBuckets(); getServiceAccount(); createHmacKey(), getHmacKeys(), hmacKey(); channel()

Bucket: Bucket class providing file management, metadata operations, lifecycle rules, and bucket-level access control.
Key APIs: file(), getFiles(), upload(), deleteFiles(); getMetadata(), setMetadata(), exists(); makePublic(), makePrivate(); addLifecycleRule(), setStorageClass(); createNotification(), getNotifications()

File: File class for individual object operations including upload/download, encryption, and metadata management.
Key APIs: save(), download(), createReadStream(), createWriteStream(); copy(), move(), delete(), getMetadata(), setMetadata(); makePublic(), makePrivate(), getSignedUrl(); rotateEncryptionKey()

Access control: ACL and IAM functionality for managing permissions on buckets and files.
Key APIs: add(), get(), update(), remove(); owners, readers, writers; getPolicy(), setPolicy(), testIamPermissions()

HMAC keys: HMAC key management and authentication options for programmatic access.
Key APIs: createHmacKey(), getHmacKeys(), hmacKey(); getMetadata(), setMetadata(), delete()

Notifications: Pub/Sub integration for bucket change notifications and channel management.
Key APIs: createNotification(), getNotifications(), delete(); createChannel(), stop()

Transfer Manager: Bulk upload/download operations with parallel processing and progress tracking.
Key APIs: uploadManyFiles(), uploadFileInChunks(); downloadManyFiles(), downloadFileInChunks()

Utilities: Supporting utilities including checksums, signed URLs, and stream validation.
Key APIs: CRC32C, HashStreamValidator; getSignedUrl(), generateSignedPostPolicy(); Channel, stop()

import { ApiError } from '@google-cloud/storage';
try {
const [files] = await bucket.getFiles();
} catch (error) {
if (error instanceof ApiError) {
console.log(`Error ${error.code}: ${error.message}`);
console.log('Details:', error.errors);
}
}
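A common refinement of the pattern above is to treat 404 as an ordinary "absent" result rather than an exception. A sketch; the helper name is illustrative and it only relies on the numeric code property shown above:

```javascript
// True when an error represents "object or bucket not found" (HTTP 404).
function isNotFound(err) {
  return err !== null && typeof err === 'object' && err.code === 404;
}

// Hypothetical usage:
// try {
//   const [metadata] = await file.getMetadata();
// } catch (e) {
//   if (isNotFound(e)) return null; // absent, not an error
//   throw e;                        // anything else is still fatal
// }
```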
// Configure retry behavior
const storage = new Storage({
retryOptions: {
autoRetry: true,
maxRetries: 3,
retryDelayMultiplier: 2,
totalTimeout: 600000, // 10 minutes
idempotencyStrategy: IdempotencyStrategy.RetryConditional
}
});

// List all buckets
const [buckets] = await storage.getBuckets();
buckets.forEach(bucket => console.log(bucket.name));
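The retryOptions shown earlier imply an exponential backoff schedule. As a sketch of the arithmetic only, assuming each attempt waits initial × multiplier^attempt capped at maxRetryDelay (the initial delay here is an illustrative assumption, not necessarily the library default):

```javascript
// Model of exponential backoff: attempt n waits
// initialDelayMs * multiplier ** n, capped at maxDelayMs.
function backoffDelays({ initialDelayMs = 1000, multiplier = 2, maxDelayMs = 64000, retries = 5 } = {}) {
  const delays = [];
  for (let attempt = 0; attempt < retries; attempt++) {
    delays.push(Math.min(initialDelayMs * multiplier ** attempt, maxDelayMs));
  }
  return delays;
}

// backoffDelays({ retries: 4 }) -> [1000, 2000, 4000, 8000]
```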
// Upload a file
await bucket.upload('/local/path/file.txt', {
destination: 'remote/path/file.txt',
metadata: {
contentType: 'text/plain'
}
});
// Download a file
await bucket.file('remote/path/file.txt').download({
destination: '/local/path/downloaded.txt'
});
// Stream operations
const readStream = bucket.file('large-file.zip').createReadStream();
const writeStream = bucket.file('upload.zip').createWriteStream();
readStream.pipe(writeStream);
// Signed URL for temporary access
const [url] = await bucket.file('document.pdf').getSignedUrl({
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000 // 15 minutes
});
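V4 signed URLs cannot be valid for more than 7 days; requesting a longer expiry makes the signing call fail. A small guard, as a sketch (the constant and function names are illustrative):

```javascript
// V4 signed URLs are capped at 7 days of validity.
const MAX_V4_LIFETIME_MS = 7 * 24 * 60 * 60 * 1000;

// Clamp a requested lifetime (milliseconds from now) to the V4 maximum
// and return an absolute timestamp suitable for the `expires` option.
function signedUrlExpiry(lifetimeMs) {
  return Date.now() + Math.min(lifetimeMs, MAX_V4_LIFETIME_MS);
}

// Hypothetical usage: a 30-day request silently becomes 7 days.
// getSignedUrl({ version: 'v4', action: 'read',
//                expires: signedUrlExpiry(30 * 24 * 60 * 60 * 1000) });
```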