Cloudflare R2 + Presigned URLs: Zero Egress Fee File Storage Implementation
Introduction
Cloudflare R2 is revolutionizing object storage by offering S3-compatible storage with zero egress fees. When combined with presigned URLs, you get a powerful, cost-effective solution for file uploads, downloads, and media delivery. This comprehensive guide shows you how to implement Cloudflare R2 with presigned URLs while maintaining those precious zero egress fees.
Unlike traditional cloud storage solutions that charge hefty egress fees, Cloudflare R2 allows you to serve unlimited bandwidth without additional costs. This makes it perfect for applications with high download volumes, media streaming, or file sharing platforms.
💡 Key Benefits Covered:
- ✅ Zero egress fees for file downloads
- ✅ S3-compatible API for easy migration
- ✅ Secure presigned URL implementation
- ✅ Global edge network performance
- ✅ Cost-effective file storage solution
Why Cloudflare R2 + Presigned URLs?
The Cost Problem with Traditional Storage
Traditional cloud storage providers charge for egress (data transfer out). For a file-heavy application, these costs can become astronomical:
Traditional Storage Costs (per GB):
- AWS S3: $0.09 per GB egress
- Google Cloud Storage: $0.12 per GB egress
- Azure Blob Storage: $0.087 per GB egress
Cloudflare R2 Costs:
- Storage: $0.015 per GB/month
- Class A operations: $4.50 per million requests
- Class B operations: $0.36 per million requests
- Egress: $0.00 (FREE!)
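To make the difference concrete, here is a rough back-of-the-envelope comparison for an application that serves 5 TB (5,120 GB) of downloads per month, using the flat list prices above (real AWS pricing is tiered, so treat this as an approximation):
- AWS S3 egress: 5,120 GB × $0.09 ≈ $460.80/month
- Cloudflare R2 egress: 5,120 GB × $0.00 = $0.00/month
- R2 storage for, say, 500 GB of files: 500 GB × $0.015 ≈ $7.50/month
Even after adding Class A/B operation charges, the R2 bill stays in the single digits, while the S3 bill is dominated by egress.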
What Are Presigned URLs?
Presigned URLs are temporary, secure URLs that grant time-limited access to private objects in your storage bucket. They allow you to:
- Upload files directly to storage without exposing credentials
- Download private files without making them public
- Control access duration and permissions
- Reduce server load by enabling direct client-to-storage transfers
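Under the hood, a presigned URL is simply the object's URL with AWS Signature Version 4 query parameters appended. The values below are made up for illustration, but the parameter names are what the AWS SDK (and therefore R2's S3-compatible API) produces:

// Anatomy of a presigned URL (illustrative values only)
https://<account_id>.r2.cloudflarestorage.com/my-app-files/report.pdf
  ?X-Amz-Algorithm=AWS4-HMAC-SHA256
  &X-Amz-Credential=<access_key_id>%2F20250101%2Fauto%2Fs3%2Faws4_request
  &X-Amz-Date=20250101T120000Z
  &X-Amz-Expires=3600
  &X-Amz-SignedHeaders=host
  &X-Amz-Signature=<hex_signature>

Whoever holds the URL can perform exactly the signed operation (GET, PUT, or DELETE) on exactly that key until X-Amz-Expires seconds have elapsed; after that, R2 rejects the request.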
Setting Up Cloudflare R2
1. Create R2 Bucket
First, create your R2 bucket in the Cloudflare dashboard:
# Via Cloudflare Dashboard:
1. Go to Cloudflare Dashboard > R2 Object Storage
2. Click "Create bucket"
3. Enter bucket name (e.g., "my-app-files")
4. Choose location (auto for global distribution)
5. Click "Create bucket"
2. Generate API Credentials
Create API tokens for programmatic access:
# Steps:
1. Go to Cloudflare Dashboard > R2 > Manage R2 API tokens
2. Click "Create API token"
3. Choose "Custom token"
4. Permissions: "Object Read and Write"
5. Account resources: Include your account
6. Zone resources: Include your R2 bucket
7. Copy the Access Key ID and Secret Access Key
3. Environment Configuration
Set up your environment variables:
.env.local
# Cloudflare R2 Configuration
R2_ACCOUNT_ID=your_account_id
R2_ACCESS_KEY_ID=your_access_key_id
R2_SECRET_ACCESS_KEY=your_secret_access_key
R2_BUCKET_NAME=my-app-files
R2_ENDPOINT=https://{account_id}.r2.cloudflarestorage.com
R2_PUBLIC_URL=https://pub-{hash}.r2.dev # Optional: for public access
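Since every later snippet reads these variables, it can save debugging time to fail fast when one is missing. A tiny optional helper (the file and function names are ours, not part of any SDK):

// lib/env-check.js (optional helper)
const required = [
  "R2_ACCOUNT_ID",
  "R2_ACCESS_KEY_ID",
  "R2_SECRET_ACCESS_KEY",
  "R2_BUCKET_NAME",
  "R2_ENDPOINT",
];

export function assertR2Env() {
  const missing = required.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing R2 environment variables: ${missing.join(", ")}`);
  }
}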
Backend Implementation
Installing Dependencies
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
# or
yarn add @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
R2 Client Setup
Create a reusable R2 client configuration:
// lib/r2-client.js
import { S3Client } from "@aws-sdk/client-s3";

const r2Client = new S3Client({
  region: "auto", // R2 ignores the region; "auto" is the conventional value
  endpoint: process.env.R2_ENDPOINT,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
  },
  // Path-style addressing (https://endpoint/bucket/key) works reliably with R2
  forcePathStyle: true,
});

export default r2Client;
Presigned URL Generation Service
Create a service for generating presigned URLs:
// lib/r2-presigned.js
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import {
  GetObjectCommand,
  PutObjectCommand,
  DeleteObjectCommand,
  ListObjectsV2Command,
} from "@aws-sdk/client-s3";
import r2Client from "./r2-client.js";
import crypto from "crypto";

const BUCKET_NAME = process.env.R2_BUCKET_NAME;

// Generate presigned URL for file upload
export async function generateUploadURL(fileName, fileType, expiresIn = 3600) {
  // Generate unique file name to prevent conflicts
  const uniqueFileName = `${Date.now()}-${crypto.randomUUID()}-${fileName}`;

  const command = new PutObjectCommand({
    Bucket: BUCKET_NAME,
    Key: uniqueFileName,
    ContentType: fileType,
    // Optional: Add metadata
    Metadata: {
      originalName: fileName,
      uploadedAt: new Date().toISOString(),
    },
  });

  try {
    const uploadURL = await getSignedUrl(r2Client, command, {
      expiresIn, // URL expires in 1 hour by default
    });
    return {
      uploadURL,
      fileName: uniqueFileName,
      originalName: fileName,
    };
  } catch (error) {
    console.error("Error generating upload URL:", error);
    throw new Error("Failed to generate upload URL");
  }
}

// Generate presigned URL for file download
export async function generateDownloadURL(fileName, expiresIn = 3600) {
  const command = new GetObjectCommand({
    Bucket: BUCKET_NAME,
    Key: fileName,
  });

  try {
    const downloadURL = await getSignedUrl(r2Client, command, { expiresIn });
    return downloadURL;
  } catch (error) {
    console.error("Error generating download URL:", error);
    throw new Error("Failed to generate download URL");
  }
}

// Generate presigned URL for file deletion
export async function generateDeleteURL(fileName, expiresIn = 900) {
  const command = new DeleteObjectCommand({
    Bucket: BUCKET_NAME,
    Key: fileName,
  });

  try {
    const deleteURL = await getSignedUrl(r2Client, command, {
      expiresIn, // Shorter expiration for delete operations
    });
    return deleteURL;
  } catch (error) {
    console.error("Error generating delete URL:", error);
    throw new Error("Failed to generate delete URL");
  }
}

// List files in bucket (for admin purposes)
export async function listFiles(prefix = "", maxKeys = 100) {
  const command = new ListObjectsV2Command({
    Bucket: BUCKET_NAME,
    Prefix: prefix,
    MaxKeys: maxKeys,
  });

  try {
    const response = await r2Client.send(command);
    return response.Contents || [];
  } catch (error) {
    console.error("Error listing files:", error);
    throw new Error("Failed to list files");
  }
}
API Routes Implementation
Next.js API Routes
Create API endpoints for presigned URL generation:
// pages/api/upload/presign.js (Pages Router)
import { generateUploadURL } from "../../../lib/r2-presigned";

export default async function handler(req, res) {
  if (req.method !== "POST") {
    return res.status(405).json({ error: "Method not allowed" });
  }

  try {
    const { fileName, fileType } = req.body;

    // Validate input
    if (!fileName || !fileType) {
      return res.status(400).json({
        error: "fileName and fileType are required",
      });
    }

    // Optional: restrict file types. Note that a presigned PUT URL cannot
    // enforce a maximum size by itself; validate size on the client and/or
    // verify the uploaded object afterwards.
    const maxSize = 10 * 1024 * 1024; // 10MB (advisory)
    const allowedTypes = [
      'image/jpeg', 'image/png', 'image/webp',
      'application/pdf', 'text/plain',
    ];

    if (!allowedTypes.includes(fileType)) {
      return res.status(400).json({
        error: "File type not allowed",
      });
    }

    const result = await generateUploadURL(fileName, fileType);

    res.status(200).json({
      success: true,
      data: result,
    });
  } catch (error) {
    console.error("Upload presign error:", error);
    res.status(500).json({
      error: "Failed to generate upload URL",
    });
  }
}

// Alternative: App Router (app/api/upload/presign/route.js)
// import { generateUploadURL } from "../../../../lib/r2-presigned";
export async function POST(request) {
  try {
    const { fileName, fileType } = await request.json();

    if (!fileName || !fileType) {
      return Response.json({
        error: "fileName and fileType are required",
      }, { status: 400 });
    }

    const result = await generateUploadURL(fileName, fileType);

    return Response.json({
      success: true,
      data: result,
    });
  } catch (error) {
    console.error("Upload presign error:", error);
    return Response.json({
      error: "Failed to generate upload URL",
    }, { status: 500 });
  }
}
// pages/api/download/presign.js
import { generateDownloadURL } from "../../../lib/r2-presigned";

export default async function handler(req, res) {
  if (req.method !== "GET") {
    return res.status(405).json({ error: "Method not allowed" });
  }

  try {
    const { fileName } = req.query;

    if (!fileName) {
      return res.status(400).json({
        error: "fileName is required",
      });
    }

    // Optional: Add authorization check here
    // const isAuthorized = await checkUserAccess(req, fileName);
    // if (!isAuthorized) {
    //   return res.status(403).json({ error: "Access denied" });
    // }

    const downloadURL = await generateDownloadURL(fileName);

    res.status(200).json({
      success: true,
      downloadURL,
    });
  } catch (error) {
    console.error("Download presign error:", error);
    res.status(500).json({
      error: "Failed to generate download URL",
    });
  }
}
Frontend Implementation
File Upload Hook
Create a React hook for handling file uploads:
// hooks/useR2Upload.js
import { useState } from 'react';

export function useR2Upload() {
  const [uploading, setUploading] = useState(false);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [error, setError] = useState(null);

  const uploadFile = async (file) => {
    setUploading(true);
    setError(null);
    setUploadProgress(0);

    try {
      // Step 1: Get presigned URL
      const presignResponse = await fetch('/api/upload/presign', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          fileName: file.name,
          fileType: file.type,
        }),
      });

      if (!presignResponse.ok) {
        throw new Error('Failed to get upload URL');
      }

      const { data } = await presignResponse.json();

      // Step 2: Upload directly to R2 using presigned URL
      const uploadResponse = await fetch(data.uploadURL, {
        method: 'PUT',
        body: file,
        headers: {
          'Content-Type': file.type,
        },
      });

      if (!uploadResponse.ok) {
        throw new Error('Failed to upload file');
      }

      setUploadProgress(100);

      // Return file info for further processing
      return {
        fileName: data.fileName,
        originalName: data.originalName,
        size: file.size,
        type: file.type,
      };
    } catch (err) {
      setError(err.message);
      throw err;
    } finally {
      setUploading(false);
    }
  };

  const uploadWithProgress = async (file) => {
    setUploading(true);
    setError(null);
    setUploadProgress(0);

    try {
      // Get presigned URL
      const presignResponse = await fetch('/api/upload/presign', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          fileName: file.name,
          fileType: file.type,
        }),
      });

      if (!presignResponse.ok) {
        throw new Error('Failed to get upload URL');
      }

      const { data } = await presignResponse.json();

      // Upload with progress tracking
      return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();

        xhr.upload.addEventListener('progress', (event) => {
          if (event.lengthComputable) {
            const percentComplete = (event.loaded / event.total) * 100;
            setUploadProgress(percentComplete);
          }
        });

        xhr.addEventListener('load', () => {
          if (xhr.status === 200) {
            resolve({
              fileName: data.fileName,
              originalName: data.originalName,
              size: file.size,
              type: file.type,
            });
          } else {
            reject(new Error('Upload failed'));
          }
          setUploading(false);
        });

        xhr.addEventListener('error', () => {
          reject(new Error('Upload failed'));
          setUploading(false);
        });

        xhr.open('PUT', data.uploadURL);
        xhr.setRequestHeader('Content-Type', file.type);
        xhr.send(file);
      });
    } catch (err) {
      setError(err.message);
      setUploading(false);
      throw err;
    }
  };

  return {
    uploadFile,
    uploadWithProgress,
    uploading,
    uploadProgress,
    error,
  };
}
File Upload Component
Create a user-friendly file upload component:
// components/FileUpload.jsx
import React, { useCallback, useState } from 'react';
import { useDropzone } from 'react-dropzone';
import { useR2Upload } from '../hooks/useR2Upload';

export default function FileUpload({ onUploadComplete, maxFiles = 5 }) {
  const [uploadedFiles, setUploadedFiles] = useState([]);
  const { uploadWithProgress, uploading, uploadProgress, error } = useR2Upload();

  const onDrop = useCallback(async (acceptedFiles) => {
    for (const file of acceptedFiles) {
      try {
        const result = await uploadWithProgress(file);
        setUploadedFiles(prev => [...prev, {
          ...result,
          url: `/api/file/${result.fileName}`, // Your download endpoint
          uploadedAt: new Date().toISOString(),
        }]);

        if (onUploadComplete) {
          onUploadComplete(result);
        }
      } catch (err) {
        console.error('Upload failed:', err);
      }
    }
  }, [uploadWithProgress, onUploadComplete]);

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    onDrop,
    maxFiles,
    maxSize: 10 * 1024 * 1024, // 10MB
    accept: {
      'image/*': ['.png', '.jpg', '.jpeg', '.webp'],
      'application/pdf': ['.pdf'],
      'text/plain': ['.txt'],
    },
  });

  return (
    <div className="w-full max-w-lg mx-auto">
      <div
        {...getRootProps()}
        className={`
          border-2 border-dashed rounded-lg p-8 text-center cursor-pointer transition-colors
          ${isDragActive
            ? 'border-blue-400 bg-blue-50'
            : 'border-gray-300 hover:border-gray-400'
          }
          ${uploading ? 'pointer-events-none opacity-50' : ''}
        `}
      >
        <input {...getInputProps()} />
        <div className="space-y-4">
          <div className="text-4xl">📁</div>
          {uploading ? (
            <div className="space-y-2">
              <div className="text-sm text-gray-600">
                Uploading... {Math.round(uploadProgress)}%
              </div>
              <div className="w-full bg-gray-200 rounded-full h-2">
                <div
                  className="bg-blue-600 h-2 rounded-full transition-all duration-300"
                  style={{ width: `${uploadProgress}%` }}
                ></div>
              </div>
            </div>
          ) : isDragActive ? (
            <div className="text-blue-600">Drop files here...</div>
          ) : (
            <div className="space-y-2">
              <div className="text-gray-700">
                Drag and drop files here, or click to select
              </div>
              <div className="text-sm text-gray-500">
                Max {maxFiles} files, 10MB each
              </div>
            </div>
          )}
        </div>
      </div>

      {error && (
        <div className="mt-4 p-3 bg-red-50 border border-red-200 rounded-md">
          <div className="text-red-600 text-sm">{error}</div>
        </div>
      )}

      {uploadedFiles.length > 0 && (
        <div className="mt-6 space-y-3">
          <h3 className="font-medium text-gray-900">Uploaded Files</h3>
          {uploadedFiles.map((file, index) => (
            <div key={index} className="flex items-center justify-between p-3 bg-green-50 rounded-lg">
              <div className="flex items-center space-x-3">
                <div className="w-8 h-8 bg-green-100 rounded-full flex items-center justify-center">
                  <span className="text-green-600 text-xs">✓</span>
                </div>
                <div>
                  <div className="text-sm font-medium text-gray-900">
                    {file.originalName}
                  </div>
                  <div className="text-xs text-gray-500">
                    {(file.size / 1024 / 1024).toFixed(2)} MB
                  </div>
                </div>
              </div>
              <a
                href={file.url}
                target="_blank"
                rel="noopener noreferrer"
                className="text-blue-600 hover:text-blue-800 text-sm"
              >
                View
              </a>
            </div>
          ))}
        </div>
      )}
    </div>
  );
}
Download Component
Create a component for secure file downloads:
// components/FileDownload.jsx
import React, { useState } from 'react';

export default function FileDownload({ fileName, originalName, fileSize }) {
  const [downloading, setDownloading] = useState(false);
  const [error, setError] = useState(null);

  const handleDownload = async () => {
    setDownloading(true);
    setError(null);

    try {
      // Get presigned download URL
      const response = await fetch(`/api/download/presign?fileName=${encodeURIComponent(fileName)}`);

      if (!response.ok) {
        throw new Error('Failed to get download URL');
      }

      const { downloadURL } = await response.json();

      // Create temporary link and trigger download
      const link = document.createElement('a');
      link.href = downloadURL;
      link.download = originalName || fileName;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
    } catch (err) {
      setError(err.message);
    } finally {
      setDownloading(false);
    }
  };

  return (
    <div className="flex items-center justify-between p-4 border rounded-lg hover:bg-gray-50">
      <div className="flex items-center space-x-3">
        <div className="w-10 h-10 bg-blue-100 rounded-lg flex items-center justify-center">
          <span className="text-blue-600">📄</span>
        </div>
        <div>
          <div className="font-medium text-gray-900">
            {originalName || fileName}
          </div>
          {fileSize && (
            <div className="text-sm text-gray-500">
              {(fileSize / 1024 / 1024).toFixed(2)} MB
            </div>
          )}
        </div>
      </div>
      <div className="flex items-center space-x-2">
        <button
          onClick={handleDownload}
          disabled={downloading}
          className="px-4 py-2 text-sm font-medium text-blue-600 hover:text-blue-800 disabled:opacity-50"
        >
          {downloading ? 'Downloading...' : 'Download'}
        </button>
      </div>
      {error && (
        <div className="mt-2 text-sm text-red-600">
          Error: {error}
        </div>
      )}
    </div>
  );
}
Advanced Implementation Patterns
1. Organized File Structure
Implement organized storage patterns:
// lib/r2-file-manager.js
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import r2Client from "./r2-client.js";
import crypto from "crypto";

export class R2FileManager {
  constructor(bucket, basePath = '') {
    this.bucket = bucket;
    this.basePath = basePath;
  }

  // Organize files by user and type
  generateFilePath(userId, fileType, fileName) {
    const date = new Date();
    const year = date.getFullYear();
    const month = String(date.getMonth() + 1).padStart(2, '0');
    const fileExtension = fileName.split('.').pop();
    const uniqueId = crypto.randomUUID();

    // Drop an empty basePath so keys never start with a leading "/"
    const parts = [this.basePath, 'users', userId, fileType, year, month, `${uniqueId}.${fileExtension}`];
    return parts.filter(Boolean).join('/');
  }

  // Generate presigned URL with custom path
  async generateUploadURL(userId, fileType, fileName, expiresIn = 3600) {
    const filePath = this.generateFilePath(userId, fileType, fileName);

    const command = new PutObjectCommand({
      Bucket: this.bucket,
      Key: filePath,
      ContentType: this.getContentType(fileName),
      Metadata: {
        userId,
        fileType,
        originalName: fileName,
        uploadedAt: new Date().toISOString(),
      },
    });

    const uploadURL = await getSignedUrl(r2Client, command, { expiresIn });

    return {
      uploadURL,
      filePath,
      fileName: fileName,
    };
  }

  getContentType(fileName) {
    const extension = fileName.split('.').pop().toLowerCase();
    const mimeTypes = {
      'jpg': 'image/jpeg',
      'jpeg': 'image/jpeg',
      'png': 'image/png',
      'webp': 'image/webp',
      'pdf': 'application/pdf',
      'txt': 'text/plain',
      'json': 'application/json',
    };
    return mimeTypes[extension] || 'application/octet-stream';
  }
}
2. File Metadata Management
Track file metadata in your database:
// prisma/schema.prisma (excerpt)
model File {
  id            String   @id @default(cuid())
  fileName      String   // R2 file path
  originalName  String   // Original filename
  fileSize      Int
  contentType   String
  userId        String
  bucket        String
  isPublic      Boolean  @default(false)
  downloadCount Int      @default(0)
  createdAt     DateTime @default(now())
  updatedAt     DateTime @updatedAt
  user          User     @relation(fields: [userId], references: [id])

  @@map("files")
}

// services/fileService.js
import prisma from "../lib/prisma"; // your shared PrismaClient instance

export class FileService {
  static async saveFileMetadata(fileData) {
    return await prisma.file.create({
      data: {
        fileName: fileData.fileName,
        originalName: fileData.originalName,
        fileSize: fileData.fileSize,
        contentType: fileData.contentType,
        userId: fileData.userId,
        bucket: process.env.R2_BUCKET_NAME,
      },
    });
  }

  static async getFileById(fileId, userId) {
    return await prisma.file.findFirst({
      where: {
        id: fileId,
        userId, // Ensure user owns the file
      },
    });
  }

  static async incrementDownloadCount(fileId) {
    return await prisma.file.update({
      where: { id: fileId },
      data: {
        downloadCount: {
          increment: 1,
        },
      },
    });
  }
}
3. Security and Access Control
Implement proper security measures:
// middleware/auth.js
export async function authenticateUser(req) {
  const token = req.headers.authorization?.replace('Bearer ', '');

  if (!token) {
    throw new Error('Authentication required');
  }

  // Verify the JWT (verifyJWT is whatever verification helper your app already uses)
  const user = await verifyJWT(token);
  return user;
}

// pages/api/upload/presign.js - With authentication
import { authenticateUser } from "../../../middleware/auth";
import { R2FileManager } from "../../../lib/r2-file-manager";
import { FileService } from "../../../services/fileService";

export default async function handler(req, res) {
  try {
    const user = await authenticateUser(req);
    const { fileName, fileType, isPublic = false } = req.body;

    // Check user permissions
    if (!canUserUpload(user, fileType)) {
      return res.status(403).json({ error: 'Upload permission denied' });
    }

    // Generate organized file path
    const fileManager = new R2FileManager(process.env.R2_BUCKET_NAME);
    const result = await fileManager.generateUploadURL(
      user.id,
      getFileCategory(fileType),
      fileName
    );

    // Save metadata to database
    const fileRecord = await FileService.saveFileMetadata({
      ...result,
      fileSize: req.body.fileSize,
      contentType: fileType,
      userId: user.id,
      isPublic,
    });

    res.status(200).json({
      success: true,
      data: {
        ...result,
        fileId: fileRecord.id,
      },
    });
  } catch (error) {
    console.error('Upload presign error:', error);
    res.status(500).json({ error: 'Failed to generate upload URL' });
  }
}

function canUserUpload(user, fileType) {
  // Implement your business logic
  const allowedTypes = ['image/jpeg', 'image/png', 'application/pdf'];
  return allowedTypes.includes(fileType) && user.role !== 'banned';
}

function getFileCategory(fileType) {
  if (fileType.startsWith('image/')) return 'images';
  if (fileType === 'application/pdf') return 'documents';
  return 'misc';
}
Preserving Zero Egress Fees
Key Strategies
🚨 Important: Maintaining Zero Egress
- ✅ Use R2-native presigned URLs (not through CF Workers)
- ✅ Direct client-to-R2 transfers via presigned URLs
- ✅ Avoid proxying downloads through your server
- ✅ Use R2 public domains for public content
- ❌ Don't proxy downloads through your own backend or a Cloudflare Worker: egress stays free, but you pay server bandwidth or Workers request/CPU charges and add latency
Public vs Private Content Strategy
// Public content configuration
// Note: R2 does not implement S3 object ACLs, so there is no per-object
// "public-read". Instead, enable public access at the bucket level (the
// r2.dev development URL or, for production, a custom domain connected in
// the dashboard) and build the public URL directly. Public-domain downloads
// are also free of egress fees.
export function getPublicFileURL(fileName) {
  // R2_PUBLIC_URL is the bucket's public domain from .env.local
  return `${process.env.R2_PUBLIC_URL}/${fileName}`;
}

// Private content using presigned URLs
export async function getPrivateFileURL(fileName, expiresIn = 3600) {
  // This preserves zero egress while maintaining access control
  return await generateDownloadURL(fileName, expiresIn);
}
Performance Optimization
1. Caching Strategies
// lib/cache.js - Cache presigned URLs
import { generateDownloadURL } from "./r2-presigned.js";

const urlCache = new Map();
const CACHE_DURATION = 30 * 60 * 1000; // 30 minutes

export function getCachedURL(key) {
  const cached = urlCache.get(key);
  if (cached && Date.now() < cached.expiry) {
    return cached.url;
  }
  return null;
}

export function setCachedURL(key, url, expiresIn) {
  urlCache.set(key, {
    url,
    expiry: Date.now() + (expiresIn * 1000) - (5 * 60 * 1000), // 5 min buffer
  });
}

// Enhanced presigned URL generation with caching
export async function generateCachedDownloadURL(fileName, expiresIn = 3600) {
  const cacheKey = `download-${fileName}`;

  // Check cache first
  const cachedURL = getCachedURL(cacheKey);
  if (cachedURL) {
    return cachedURL;
  }

  // Generate new URL
  const downloadURL = await generateDownloadURL(fileName, expiresIn);

  // Cache for future use
  setCachedURL(cacheKey, downloadURL, expiresIn);

  return downloadURL;
}
2. Batch Operations
// lib/batch-operations.js
import { DeleteObjectsCommand } from "@aws-sdk/client-s3";
import r2Client from "./r2-client.js";
import { generateUploadURL } from "./r2-presigned.js";

export async function generateMultipleUploadURLs(files) {
  const promises = files.map(file =>
    generateUploadURL(file.name, file.type)
  );

  try {
    const results = await Promise.all(promises);
    return results;
  } catch (error) {
    console.error('Batch upload URL generation failed:', error);
    throw error;
  }
}

export async function batchDeleteFiles(fileNames) {
  const command = new DeleteObjectsCommand({
    Bucket: process.env.R2_BUCKET_NAME,
    Delete: {
      Objects: fileNames.map(fileName => ({ Key: fileName })),
    },
  });

  try {
    const result = await r2Client.send(command);
    return result;
  } catch (error) {
    console.error('Batch delete failed:', error);
    throw error;
  }
}
Error Handling and Monitoring
Comprehensive Error Handling
// lib/r2-errors.js
import { generateUploadURL } from "./r2-presigned.js";

export class R2Error extends Error {
  constructor(message, code, statusCode = 500) {
    super(message);
    this.name = 'R2Error';
    this.code = code;
    this.statusCode = statusCode;
  }
}

export function handleR2Error(error) {
  if (error.name === 'NoSuchBucket') {
    return new R2Error('Bucket not found', 'BUCKET_NOT_FOUND', 404);
  }
  if (error.name === 'AccessDenied') {
    return new R2Error('Access denied', 'ACCESS_DENIED', 403);
  }
  if (error.name === 'SignatureDoesNotMatch') {
    return new R2Error('Invalid credentials', 'INVALID_CREDENTIALS', 401);
  }
  return new R2Error('Storage operation failed', 'STORAGE_ERROR', 500);
}

// Enhanced presigned URL generation with error handling
export async function safeGenerateUploadURL(fileName, fileType, expiresIn = 3600) {
  try {
    return await generateUploadURL(fileName, fileType, expiresIn);
  } catch (error) {
    const r2Error = handleR2Error(error);

    // Log for monitoring
    console.error('R2 Upload URL Generation Failed:', {
      fileName,
      fileType,
      error: r2Error.message,
      code: r2Error.code,
      timestamp: new Date().toISOString(),
    });

    throw r2Error;
  }
}
Usage Analytics
// lib/analytics.js
import prisma from "./prisma"; // your shared PrismaClient instance

export class R2Analytics {
  static async trackUpload(fileData, userId) {
    // Track upload metrics
    await this.logEvent('file_uploaded', {
      fileName: fileData.fileName,
      fileSize: fileData.fileSize,
      contentType: fileData.contentType,
      userId,
      timestamp: new Date().toISOString(),
    });
  }

  static async trackDownload(fileName, userId) {
    // Track download metrics
    await this.logEvent('file_downloaded', {
      fileName,
      userId,
      timestamp: new Date().toISOString(),
    });
  }

  static async logEvent(event, data) {
    // Send to your analytics service
    try {
      await fetch('/api/analytics/track', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ event, data }),
      });
    } catch (error) {
      console.error('Analytics tracking failed:', error);
    }
  }

  static async getStorageMetrics(userId) {
    return await prisma.file.aggregate({
      where: { userId },
      _sum: { fileSize: true },
      _count: { id: true },
    });
  }
}
Best Practices and Tips
Security Best Practices
- Short Expiration Times: Use minimal expiration times for presigned URLs
- Content Type Validation: Always validate file types and sizes
- Rate Limiting: Implement rate limiting for URL generation
- User Authentication: Verify user permissions before generating URLs
- Audit Logs: Log all file operations for security monitoring
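As a starting point for the rate-limiting item above, here is a minimal in-memory sketch for the presign endpoint. It is illustrative only (a single-process Map; in production you would typically use Redis or your platform's rate-limiting middleware), and the limit/window values are arbitrary:

// lib/rate-limit.js (illustrative in-memory sketch)
const requests = new Map(); // key -> array of recent request timestamps

export function isRateLimited(key, limit = 20, windowMs = 60 * 1000) {
  const now = Date.now();
  const recent = (requests.get(key) || []).filter(ts => now - ts < windowMs);
  if (recent.length >= limit) {
    requests.set(key, recent);
    return true; // over the limit for this window
  }
  recent.push(now);
  requests.set(key, recent);
  return false;
}

// Usage inside pages/api/upload/presign.js, before generating the URL:
// if (isRateLimited(user.id)) {
//   return res.status(429).json({ error: "Too many requests" });
// }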
Performance Tips
- Caching: Cache presigned URLs when appropriate
- Batch Operations: Use batch operations for multiple files
- Client-Side Validation: Validate files on the client before upload
- Progress Tracking: Implement upload progress for better UX
- Compression: Compress files on the client when possible
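For the client-side validation tip, a small helper that the upload hook (or the dropzone's onDrop) can call before requesting a presigned URL keeps obviously invalid files from ever hitting your API. The limits below simply mirror the 10MB / allowed-type rules used earlier in this guide:

// lib/validate-file.js
const MAX_SIZE = 10 * 1024 * 1024; // 10MB, matching the API route
const ALLOWED_TYPES = [
  'image/jpeg', 'image/png', 'image/webp',
  'application/pdf', 'text/plain',
];

export function validateFile(file) {
  if (!ALLOWED_TYPES.includes(file.type)) {
    return { ok: false, reason: `File type ${file.type} is not allowed` };
  }
  if (file.size > MAX_SIZE) {
    return { ok: false, reason: 'File is larger than 10MB' };
  }
  return { ok: true };
}

// Example: inside uploadFile(file), before fetching /api/upload/presign
// const check = validateFile(file);
// if (!check.ok) throw new Error(check.reason);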
Cost Optimization
- Lifecycle Policies: Implement automatic deletion of old files
- Deduplication: Avoid storing duplicate files
- Public vs Private: Use public URLs for public content
- Monitor Usage: Track storage and operation costs
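For the lifecycle-policy item, R2 supports object lifecycle rules; they can be configured in the dashboard (bucket Settings) or, as sketched below, through the S3-compatible lifecycle API. Treat this as an assumption-laden sketch: check Cloudflare's R2 documentation for the exact subset of lifecycle actions currently supported, and note that the 30-day expiration and "tmp/" prefix are arbitrary example values:

// scripts/set-lifecycle.js (sketch - verify supported actions in the R2 docs)
import { PutBucketLifecycleConfigurationCommand } from "@aws-sdk/client-s3";
import r2Client from "../lib/r2-client.js";

async function setLifecycle() {
  const command = new PutBucketLifecycleConfigurationCommand({
    Bucket: process.env.R2_BUCKET_NAME,
    LifecycleConfiguration: {
      Rules: [
        {
          ID: "expire-temp-uploads",
          Status: "Enabled",
          Filter: { Prefix: "tmp/" }, // only objects under tmp/
          Expiration: { Days: 30 },   // delete 30 days after upload
        },
      ],
    },
  });
  await r2Client.send(command);
  console.log("Lifecycle rule applied");
}

setLifecycle().catch(console.error);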
Testing Your Implementation
Unit Tests
// tests/r2-presigned.test.js
import { generateUploadURL, generateDownloadURL } from '../lib/r2-presigned';

describe('R2 Presigned URLs', () => {
  test('should generate upload URL', async () => {
    const result = await generateUploadURL('test.jpg', 'image/jpeg');
    expect(result).toHaveProperty('uploadURL');
    expect(result).toHaveProperty('fileName');
    expect(result.uploadURL).toContain('X-Amz-Signature');
  });

  test('should generate download URL', async () => {
    const downloadURL = await generateDownloadURL('test-file.jpg');
    expect(downloadURL).toContain('X-Amz-Signature');
    expect(downloadURL).toContain(process.env.R2_BUCKET_NAME);
  });

  test('should handle errors gracefully', async () => {
    await expect(
      generateUploadURL('', 'invalid/type')
    ).rejects.toThrow();
  });
});
Integration Tests
// tests/upload-integration.test.js
describe('File Upload Integration', () => {
  test('complete upload flow', async () => {
    // 1. Generate presigned URL
    const response = await fetch('/api/upload/presign', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        fileName: 'test.jpg',
        fileType: 'image/jpeg',
      }),
    });

    const { data } = await response.json();
    expect(data.uploadURL).toBeDefined();

    // 2. Upload file using presigned URL
    const file = new File(['test content'], 'test.jpg', { type: 'image/jpeg' });
    const uploadResponse = await fetch(data.uploadURL, {
      method: 'PUT',
      body: file,
    });

    expect(uploadResponse.ok).toBe(true);

    // 3. Verify file was uploaded
    const downloadResponse = await fetch(
      `/api/download/presign?fileName=${data.fileName}`
    );
    const { downloadURL } = await downloadResponse.json();
    expect(downloadURL).toBeDefined();
  });
});
Monitoring and Maintenance
Health Checks
// pages/api/health/r2.js
import r2Client from '../../../lib/r2-client';
import { ListObjectsV2Command } from '@aws-sdk/client-s3';

export default async function handler(req, res) {
  try {
    // Test R2 connectivity
    const command = new ListObjectsV2Command({
      Bucket: process.env.R2_BUCKET_NAME,
      MaxKeys: 1,
    });

    const startTime = Date.now();
    await r2Client.send(command);
    const responseTime = Date.now() - startTime;

    res.status(200).json({
      status: 'healthy',
      service: 'cloudflare-r2',
      responseTime: `${responseTime}ms`,
      timestamp: new Date().toISOString(),
    });
  } catch (error) {
    res.status(500).json({
      status: 'unhealthy',
      service: 'cloudflare-r2',
      error: error.message,
      timestamp: new Date().toISOString(),
    });
  }
}
Usage Dashboard
// components/R2Dashboard.jsx
import React, { useState, useEffect } from 'react';

export default function R2Dashboard() {
  const [metrics, setMetrics] = useState(null);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    fetchMetrics();
  }, []);

  const fetchMetrics = async () => {
    try {
      const response = await fetch('/api/analytics/r2-metrics');
      const data = await response.json();
      setMetrics(data);
    } catch (error) {
      console.error('Failed to fetch metrics:', error);
    } finally {
      setLoading(false);
    }
  };

  if (loading) return <div>Loading metrics...</div>;

  return (
    <div className="grid grid-cols-1 md:grid-cols-3 gap-6">
      <div className="bg-white p-6 rounded-lg shadow">
        <h3 className="text-lg font-semibold mb-2">Total Files</h3>
        <div className="text-3xl font-bold text-blue-600">
          {metrics?.totalFiles?.toLocaleString()}
        </div>
      </div>
      <div className="bg-white p-6 rounded-lg shadow">
        <h3 className="text-lg font-semibold mb-2">Storage Used</h3>
        <div className="text-3xl font-bold text-green-600">
          {(metrics?.totalSize / 1024 / 1024 / 1024).toFixed(2)} GB
        </div>
      </div>
      <div className="bg-white p-6 rounded-lg shadow">
        <h3 className="text-lg font-semibold mb-2">Monthly Cost</h3>
        <div className="text-3xl font-bold text-purple-600">
          ${(metrics?.totalSize / 1024 / 1024 / 1024 * 0.015).toFixed(2)}
        </div>
        <div className="text-sm text-gray-500 mt-1">
          (Egress: $0.00)
        </div>
      </div>
    </div>
  );
}
Conclusion
Implementing Cloudflare R2 with presigned URLs offers a powerful combination of security, performance, and cost-effectiveness. The zero egress fee model makes it especially attractive for applications with high download volumes or media-heavy content.
🎯 Key Takeaways:
- ✅ Cost Savings: Eliminate egress fees with R2's free data transfer
- ✅ Security: Presigned URLs provide secure, time-limited access
- ✅ Performance: Direct client-to-storage transfers reduce server load
- ✅ Scalability: Handle millions of files without infrastructure concerns
- ✅ S3 Compatibility: Easy migration from existing S3-based solutions
By following the implementation patterns and best practices outlined in this guide, you'll have a robust, secure, and cost-effective file storage solution that scales with your application's needs.
Remember to monitor your usage, implement proper security measures, and regularly review your file management policies to maintain optimal performance and cost-efficiency.
🚀 Next Steps:
- Set up your Cloudflare R2 bucket and credentials
- Implement the basic presigned URL endpoints
- Add file upload/download components to your frontend
- Implement security measures and rate limiting
- Set up monitoring and analytics
- Test thoroughly and deploy to production
Happy coding, and enjoy your zero egress fees! 🎉