Introduction
Managing images efficiently is a key part of many web applications — from profile pictures and product photos to blog thumbnails and documents. Amazon S3 (Simple Storage Service) provides a reliable, secure, and scalable cloud-based solution for storing images and other files. However, storing large image files directly can quickly increase your storage costs and reduce your website’s performance.
In this complete guide, we’ll go step-by-step through how to upload, compress, and delete images in an AWS S3 bucket with Node.js and Express, and how to let clients upload directly using pre-signed URLs. You’ll learn how to build a powerful backend that can:
Upload images to an AWS S3 bucket.
Compress images before uploading to reduce storage size.
Delete images from the bucket.
Generate pre-signed URLs for direct client-side uploads.
This article uses simple, natural language and provides complete code examples, making it easy for you to follow along.
What is AWS S3?
Amazon S3 (Simple Storage Service) is a cloud-based storage solution by Amazon Web Services. It allows you to store, retrieve, and manage data such as images, videos, and documents.
Why developers use S3:
Scalability: Store unlimited files without worrying about space.
Durability: Data is automatically backed up across multiple servers.
Security: AWS handles authentication and encryption.
Global Access: You can retrieve files from anywhere using their URLs.
Think of S3 as an online storage folder where you can programmatically upload, read, or delete files using Node.js.
Prerequisites
Before starting, make sure you have the following ready:
AWS Account – Sign up at AWS if you don’t already have one.
S3 Bucket – Create a new bucket and note its name and region.
IAM User with S3 Access – Create an IAM user with permissions for PutObject, GetObject, and DeleteObject.
Node.js Installed – Ensure you have Node.js and npm installed.
Basic Express Knowledge – Understand how to create simple routes.
Then, initialize your project
mkdir s3-image-handler
cd s3-image-handler
npm init -y
Install dependencies
npm install express aws-sdk multer sharp dotenv
Note: this guide uses the AWS SDK for JavaScript v2 (aws-sdk). It is now in maintenance mode, but the same concepts carry over to v3 (@aws-sdk/client-s3).
Setting Up Environment Variables
Store sensitive AWS credentials securely using environment variables. Create a .env file:
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=us-east-1
S3_BUCKET_NAME=your-bucket-name
Load them in your code using:
require('dotenv').config();
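If any of these variables is missing, the errors only surface later as confusing AWS failures. A minimal sketch of a fail-fast startup check (the requireEnv helper is an illustration, not part of the article's code):

```javascript
// Throws at startup if any required environment variable is missing,
// so misconfiguration is caught before the first AWS call.
function requireEnv(names) {
  const missing = names.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}

// Call once at startup, right after require('dotenv').config():
// requireEnv(['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_REGION', 'S3_BUCKET_NAME']);
```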
Connecting to AWS S3
Create a new file named s3.js:
const AWS = require('aws-sdk');
require('dotenv').config();

const s3 = new AWS.S3({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const BUCKET = process.env.S3_BUCKET_NAME;

module.exports = { s3, BUCKET };
This setup allows your Node.js app to securely interact with AWS S3.
Handling Image Uploads with Multer
Multer helps handle incoming file uploads in Express.
uploadMiddleware.js
const multer = require('multer');

const storage = multer.memoryStorage();

const upload = multer({
  storage,
  limits: { fileSize: 5 * 1024 * 1024 }, // 5MB limit
  fileFilter: (req, file, cb) => {
    if (!file.mimetype.startsWith('image/')) {
      return cb(new Error('Only image files are allowed!'), false);
    }
    cb(null, true);
  }
});

module.exports = upload;
This middleware ensures only images are uploaded and temporarily stores them in memory.
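The filter's core check — accepting only image/* MIME types — can also be written as a small pure function and exercised without spinning up Express. A sketch (the isImageMimeType helper is an illustration, not part of the middleware above):

```javascript
// Returns true only for image MIME types such as image/png or image/jpeg;
// this is the same predicate the fileFilter above applies.
function isImageMimeType(mimetype) {
  return typeof mimetype === 'string' && mimetype.startsWith('image/');
}
```

Extracting the predicate like this makes the filtering rule easy to unit-test independently of Multer.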
Compressing Images with Sharp
Compressing images helps reduce storage and bandwidth usage. Sharp makes this easy.
compress.js
const sharp = require('sharp');

// Resize to a maximum width of 800px and re-encode as JPEG at quality 80.
async function compressImage(buffer) {
  return sharp(buffer)
    .resize(800)
    .jpeg({ quality: 80 })
    .toBuffer();
}

module.exports = compressImage;
This function resizes images to a maximum width of 800 pixels and re-encodes them as JPEG at 80% quality, regardless of the input format.
Uploading Images to S3
Now, combine everything in app.js:
const express = require('express');
const upload = require('./uploadMiddleware');
const { s3, BUCKET } = require('./s3');
const compressImage = require('./compress');

const app = express();
app.use(express.json());

app.post('/upload', upload.single('image'), async (req, res) => {
  try {
    if (!req.file) return res.status(400).json({ error: 'No file uploaded' });

    const compressedBuffer = await compressImage(req.file.buffer);
    const key = `images/${Date.now()}_${req.file.originalname}`;

    const params = {
      Bucket: BUCKET,
      Key: key,
      Body: compressedBuffer,
      ContentType: 'image/jpeg',
      ACL: 'public-read' // requires the bucket to allow public ACLs; omit otherwise
    };

    const result = await s3.upload(params).promise();

    res.json({
      message: 'Image uploaded successfully',
      url: result.Location,
      key: result.Key
    });
  } catch (error) {
    console.error(error);
    res.status(500).json({ error: 'Image upload failed' });
  }
});

const PORT = process.env.PORT || 3000;
app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
This route compresses each image before saving it to S3.
Deleting Images from S3
Add a route to delete files
app.delete('/delete', async (req, res) => {
  try {
    const { key } = req.body;
    if (!key) return res.status(400).json({ error: 'Image key is required' });

    await s3.deleteObject({ Bucket: BUCKET, Key: key }).promise();

    res.json({ message: 'Image deleted successfully' });
  } catch (error) {
    console.error(error);
    res.status(500).json({ error: 'Image deletion failed' });
  }
});
This route deletes an image from the bucket using its unique key.
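Because the client supplies the key, it is worth restricting deletions to the prefixes your app actually writes to, so a malicious request cannot remove arbitrary objects. A minimal sketch (the prefix list is an assumption based on the keys used in this guide):

```javascript
// Allow deletion only for keys under the prefixes this app creates.
const DELETABLE_PREFIXES = ['images/', 'uploads/'];

function isDeletableKey(key) {
  return (
    typeof key === 'string' &&
    !key.includes('..') && // reject path-traversal-style keys
    DELETABLE_PREFIXES.some((prefix) => key.startsWith(prefix))
  );
}
```

In the route above, you could check isDeletableKey(key) before calling deleteObject and respond with 403 otherwise.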
Generating Pre-signed URLs for Direct Uploads
A pre-signed URL is a temporary link that lets the client upload a file directly to S3 without routing it through your server. This keeps your AWS credentials private and frees your server from handling file data.
Step 1. Create the Pre-signed URL Route
app.get('/generate-presigned-url', async (req, res) => {
  try {
    const fileName = req.query.fileName;
    const fileType = req.query.fileType;

    if (!fileName || !fileType) {
      return res.status(400).json({ error: 'File name and type are required' });
    }

    const params = {
      Bucket: BUCKET,
      Key: `uploads/${Date.now()}_${fileName}`,
      ContentType: fileType,
      Expires: 300 // URL is valid for 5 minutes
    };

    const uploadURL = await s3.getSignedUrlPromise('putObject', params);

    res.json({
      message: 'Pre-signed URL generated successfully',
      uploadURL,
      key: params.Key
    });
  } catch (error) {
    console.error('Error generating pre-signed URL:', error);
    res.status(500).json({ error: 'Failed to generate pre-signed URL' });
  }
});
Step 2. Upload from the Frontend
Example (JavaScript)
async function uploadImage(file) {
  // Ask the server for a temporary upload URL (encode the query values,
  // since file names can contain spaces or special characters).
  const response = await fetch(
    `/generate-presigned-url?fileName=${encodeURIComponent(file.name)}&fileType=${encodeURIComponent(file.type)}`
  );
  const data = await response.json();

  // Upload the file straight to S3 using the pre-signed URL.
  await fetch(data.uploadURL, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file
  });

  console.log('File uploaded successfully to:', data.key);
}
This uploads the image directly from the browser to S3 using the pre-signed URL.
Benefits of Pre-signed URLs
Improved Security: AWS credentials remain hidden.
Better Performance: The server doesn’t handle file data.
Scalability: Easily handle large uploads directly from users.
Flexibility: Works with web and mobile apps.
Best Practices
Always validate file type and size before uploading.
Use unique keys for each image to prevent overwriting.
Limit the validity time of pre-signed URLs.
Enable S3 versioning or backups for critical files.
Consider compressing images on the frontend before upload.
Summary
In this complete tutorial, we built a full Node.js + Express backend that can upload, compress, delete, and manage images in an AWS S3 bucket. We also implemented pre-signed URLs to allow users to upload files directly from the frontend without exposing AWS credentials. This solution improves performance, reduces storage costs, and makes your web applications faster and more scalable. With AWS S3 and Node.js, you can now efficiently handle any image management task from end to end.