Laravel File Uploads with AWS S3, CDN, and Signed URLs: Complete Guide
Handle file uploads in Laravel the right way — store on AWS S3, serve via CloudFront CDN, generate time-limited signed URLs for private files, and optimise images on the fly. Everything a startup needs for production file handling.
The Problem With Local File Storage in Production
Many Laravel projects start with Storage::disk('local') — files stored on the same server as the application. This works fine locally but creates real problems in production: if you redeploy with zero-downtime (swapping between two app servers), local files are not shared between instances. If your server goes down, files are gone. If you want to serve files via CDN, you need them on object storage anyway.
The right production architecture stores uploaded files on AWS S3 (or a compatible service like Cloudflare R2 or DigitalOcean Spaces), serves public assets through a CDN, and protects private files with time-limited signed URLs. This guide walks through the complete implementation.
AWS S3 Setup
Create an S3 bucket in your AWS console. For security, do not make the bucket public — serve everything through signed URLs or CloudFront. Configure an IAM user with a policy that grants only the permissions your application needs:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::your-bucket-name",
                "arn:aws:s3:::your-bucket-name/*"
            ]
        }
    ]
}
Add the IAM user's access key to .env:
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_DEFAULT_REGION=eu-west-1
AWS_BUCKET=your-bucket-name
Then install the Flysystem S3 adapter, which Laravel's s3 disk requires:
composer require league/flysystem-aws-s3-v3
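For reference, Laravel ships with an s3 disk in config/filesystems.php that reads these environment variables — you normally don't need to change it. An abridged version of the stock configuration:

```php
// config/filesystems.php (abridged — Laravel's default s3 disk)
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'endpoint' => env('AWS_ENDPOINT'),
    'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
],
```

Because the disk is driven entirely by environment variables, switching providers later (S3 to R2, for example) is a .env change, not a code change.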
Uploading Files
Laravel's Storage facade works identically for local and S3 — you just change the disk:
// Simple upload — returns the stored path
$path = $request->file('document')->store('documents', 's3');
// With a custom filename under a team-scoped prefix
$path = Storage::disk('s3')->putFileAs(
    'documents/' . $team->id,
    $request->file('document'),
    Str::uuid() . '.' . $request->file('document')->extension(),
);
// Store the path in the database
Document::create([
    'team_id' => $team->id,
    'path' => $path,
    'name' => $request->file('document')->getClientOriginalName(),
    'size' => $request->file('document')->getSize(),
    'mime_type' => $request->file('document')->getMimeType(),
]);
Always store the S3 path (e.g. documents/team-1/uuid.pdf) in the database — not the full URL. URLs change when you switch regions, CDN providers, or bucket names. The path is stable.
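Before storing anything, validate the upload server-side. A minimal sketch — the allowed extensions and the 20 MB cap are illustrative, adjust them to your product:

```php
// Reject anything that is not a file of an expected type and size.
// 20480 KB = 20 MB; Laravel's max: rule for files is measured in kilobytes.
$request->validate([
    'document' => ['required', 'file', 'mimes:pdf,docx,xlsx,png,jpg', 'max:20480'],
]);
```

Note that the mimes rule checks the file's actual contents, not just the client-supplied extension, which is what you want for user uploads.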
Serving Private Files with Signed URLs
For files that should only be accessible to authenticated users — invoices, private documents, user-uploaded content — never store them in a public bucket. Instead, generate time-limited signed URLs on demand:
// Generate a signed URL valid for 60 minutes
$url = Storage::disk('s3')->temporaryUrl(
    $document->path,
    now()->addMinutes(60)
);
return redirect($url);
// Or return as JSON for frontend consumption
return response()->json(['url' => $url, 'expires_in' => 3600]);
Signed URLs are S3 URLs with an HMAC signature appended as query parameters. After the expiry time, the URL returns a 403 — even if someone shares it. This is the correct pattern for any file that should not be permanently publicly accessible.
A clean pattern is to route all private file access through a Laravel controller that checks authorisation before generating the signed URL:
// routes/web.php
Route::get('/documents/{document}/download', [DocumentController::class, 'download'])
    ->middleware('auth')
    ->name('documents.download');
// DocumentController
public function download(Document $document)
{
    $this->authorize('view', $document);

    $url = Storage::disk('s3')->temporaryUrl($document->path, now()->addMinutes(5));

    return redirect($url);
}
The 5-minute expiry is intentionally short — the user's browser follows the redirect immediately. The short expiry prevents the URL from being shared or bookmarked.
Direct Browser-to-S3 Uploads (Presigned Upload URLs)
For large files, routing the upload through your Laravel server wastes bandwidth and ties up a server process. Instead, generate a presigned S3 upload URL in Laravel and let the browser upload directly to S3:
// Generate a presigned PUT URL that allows the browser to upload directly.
// S3Client is Aws\S3\S3Client from the AWS SDK (aws/aws-sdk-php); resolving it
// via app() assumes you have bound a configured instance in a service provider.
public function generateUploadUrl(Request $request): JsonResponse
{
    $request->validate([
        'filename' => 'required|string',
        'content_type' => 'required|string',
    ]);

    $extension = pathinfo($request->filename, PATHINFO_EXTENSION);
    $path = 'uploads/' . $request->user()->team_id . '/' . Str::uuid() . '.' . $extension;

    $client = app(S3Client::class);

    $command = $client->getCommand('PutObject', [
        'Bucket' => config('filesystems.disks.s3.bucket'),
        'Key' => $path,
        'ContentType' => $request->content_type,
    ]);

    $presignedRequest = $client->createPresignedRequest($command, '+15 minutes');

    return response()->json([
        'upload_url' => (string) $presignedRequest->getUri(),
        'path' => $path,
    ]);
}
// After the upload completes, the frontend POSTs the path back to confirm it
public function confirmUpload(Request $request): JsonResponse
{
    $request->validate([
        'path' => 'required|string',
        'name' => 'required|string',
    ]);

    // In production, also verify the object actually exists on S3 and that the
    // path sits under this user's prefix before trusting the client's value.
    Document::create([
        'team_id' => $request->user()->team_id,
        'path' => $request->path,
        'name' => $request->name,
    ]);

    return response()->json(['success' => true]);
}
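On the browser side, the flow against those two endpoints can be sketched like this. The /uploads/sign and /uploads/confirm routes are assumed names — wire them to the two controller methods above in your own route file:

```javascript
// Direct browser-to-S3 upload: sign, PUT, confirm.
// The two endpoint paths are illustrative assumptions, not Laravel defaults.
async function uploadDirect(file) {
  // 1. Ask Laravel for a presigned PUT URL
  const signResponse = await fetch('/uploads/sign', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename: file.name, content_type: file.type }),
  });
  const { upload_url, path } = await signResponse.json();

  // 2. PUT the file straight to S3 — the Content-Type header must match
  //    the content_type that was signed, or S3 rejects the request
  await fetch(upload_url, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file,
  });

  // 3. Tell Laravel the upload finished so it can record the Document row
  const confirm = await fetch('/uploads/confirm', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ path, name: file.name }),
  });
  return confirm.json();
}
```

In a real app, remember to send Laravel's CSRF token with the two application requests (or expose them as API routes).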
Image Optimisation with Spatie Media Library
For user-uploaded images (avatars, product photos, cover images), you need responsive variants — a full-size original, a thumbnail, and intermediate sizes. Spatie's laravel-medialibrary package handles this automatically:
composer require spatie/laravel-medialibrary
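The package stores file records in its own media table, so publish and run its migration before using it. The publish tag below is taken from the package documentation — verify it against your installed version:

```shell
php artisan vendor:publish --provider="Spatie\MediaLibrary\MediaLibraryServiceProvider" --tag="medialibrary-migrations"
php artisan migrate
```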
// On your model
use Spatie\MediaLibrary\HasMedia;
use Spatie\MediaLibrary\InteractsWithMedia;
use Spatie\MediaLibrary\MediaCollections\Models\Media;
class User extends Authenticatable implements HasMedia
{
    use InteractsWithMedia;

    public function registerMediaConversions(?Media $media = null): void
    {
        $this->addMediaConversion('thumb')
            ->width(80)->height(80)->sharpen(10)->nonQueued();

        $this->addMediaConversion('medium')
            ->width(400)->height(400);
    }

    public function registerMediaCollections(): void
    {
        $this->addMediaCollection('avatar')
            ->singleFile() // Only keep the latest avatar
            ->useDisk('s3');
    }
}
// Upload
$user->addMediaFromRequest('avatar')
    ->toMediaCollection('avatar');

// Retrieve URLs
$user->getFirstMediaUrl('avatar', 'thumb');  // 80x80 thumbnail
$user->getFirstMediaUrl('avatar', 'medium'); // 400x400
Cloudflare R2: The Cost-Effective S3 Alternative
AWS S3 charges for both storage and egress (data transfer out). Cloudflare R2 charges only for storage — egress is free. For startups serving files to users (PDFs, images, exports), R2 can cut the total file-serving bill dramatically, because egress is usually the dominant cost. R2 is S3-compatible, so your Laravel code stays identical — just point the disk at the R2 endpoint:
# .env — Cloudflare R2
AWS_ACCESS_KEY_ID=your-r2-access-key
AWS_SECRET_ACCESS_KEY=your-r2-secret-key
AWS_DEFAULT_REGION=auto
AWS_BUCKET=your-bucket-name
AWS_ENDPOINT=https://your-account-id.r2.cloudflarestorage.com
AWS_USE_PATH_STYLE_ENDPOINT=true
Frequently Asked Questions
How do I set a maximum file upload size?
Set limits in three places: PHP's upload_max_filesize and post_max_size in php.ini (for server-side uploads), your Nginx client_max_body_size directive, and your Laravel validation rule ('file' => 'max:20480' for 20MB). All three must be consistent. For direct-to-S3 uploads, PHP and Nginx limits do not apply — the file goes straight from the browser to S3.
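As a concrete reference, the three limits for a 20 MB cap might look like this. Note that post_max_size must be somewhat larger than upload_max_filesize, because it covers the entire request body, not just the file:

```
; php.ini
upload_max_filesize = 20M
post_max_size = 25M

# nginx — server or location block
client_max_body_size 20m;

// Laravel validation rule (value is in kilobytes)
'document' => 'required|file|max:20480',
```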
Should I scan uploaded files for malware?
Yes, if users can upload documents that other users will download. Use a service like ClamAV (self-hosted, free) or VirusTotal API (managed). The flow: upload to S3, dispatch a queued job to scan, mark the file as clean or quarantine it, then notify the user. Never skip this step for a product where users upload files on behalf of other users.
How do I handle chunked uploads for very large files?
For files over 100MB, use S3 multipart upload. On the client, split the file into chunks with a library such as Uppy (whose AWS S3 plugin supports multipart directly) or FilePond (wired up with a custom server process). On the Laravel side, you initiate the multipart upload, return a presigned URL for each part, and complete the upload once all parts are confirmed.
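A server-side sketch of those three steps using the AWS SDK directly — method names are from aws/aws-sdk-php's S3Client; error handling, route wiring, and the part-tracking storage are omitted:

```php
// 1. Initiate — S3 returns an UploadId the client must echo back per part
$result = $client->createMultipartUpload([
    'Bucket' => $bucket,
    'Key' => $path,
]);
$uploadId = $result['UploadId'];

// 2. Presign a PUT for one part (repeat per chunk; PartNumber starts at 1)
$command = $client->getCommand('UploadPart', [
    'Bucket' => $bucket,
    'Key' => $path,
    'UploadId' => $uploadId,
    'PartNumber' => $partNumber,
]);
$partUrl = (string) $client->createPresignedRequest($command, '+15 minutes')->getUri();

// 3. Complete — needs each part's ETag, which S3 returns in the PUT response
$client->completeMultipartUpload([
    'Bucket' => $bucket,
    'Key' => $path,
    'UploadId' => $uploadId,
    'MultipartUpload' => ['Parts' => $parts], // e.g. [['PartNumber' => 1, 'ETag' => '...'], ...]
]);
```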