When you start building full-stack applications, sooner or later, you run into the question of how to make them easy to run anywhere. Maybe your React frontend works great locally, and your Node.js backend runs fine on your machine, but how do you hand that project to another developer or deploy it to production without the familiar “it works on my computer” problem?
That is where Docker comes in. Docker lets you package your application into small units called containers. These containers include the code, runtime, libraries, and settings needed to run your app. With Docker, you do not need to worry about differences between machines or operating systems.
In this guide, we will walk through the steps to Dockerize a simple React + Node.js application. By the end, you will have a clear idea of how to run both the frontend and backend inside containers and make them talk to each other.
Why Docker?
Before diving in, let’s answer the basic “why.” Why should you bother learning Docker at all?
Consistency: No matter where you deploy your app, the container will run the same.
Isolation: Each service (React frontend, Node backend, database) runs in its own container without interfering with the others.
Easy sharing: You can give your teammate a single docker-compose.yml file, and they can spin up the whole stack without manual setup.
Deployment ready: Most cloud platforms support Docker, so once you Dockerize your app, deployment becomes much smoother.
Think of Docker like shipping your app inside a box. Whoever receives it just needs Docker installed to run it.
Setting Up a Simple React + Node.js App
To keep things simple, we will create a minimal full-stack app. The backend will be Node.js with Express, and the frontend will be a React app built with Create React App.
1. Create the backend
mkdir docker-demo
cd docker-demo
mkdir backend
cd backend
npm init -y
npm install express cors
Create an index.js file inside the backend folder:
const express = require("express");
const cors = require("cors");

const app = express();
const PORT = 5000;

// Allow the React frontend (served from a different origin) to call this API
app.use(cors());

app.get("/api", (req, res) => {
  res.json({ message: "Hello from the backend!" });
});

app.listen(PORT, () => {
  console.log(`Server running on http://localhost:${PORT}`);
});
This backend serves a simple JSON response.
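With the server running (node index.js), you can check the endpoint from another terminal, for example with curl:
curl http://localhost:5000/api
# {"message":"Hello from the backend!"}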
2. Create the frontend
Go back to the root folder and create a React app:
cd ..
npx create-react-app frontend
Open frontend/src/App.js and update it:
import { useEffect, useState } from "react";

function App() {
  const [data, setData] = useState("");

  useEffect(() => {
    fetch("http://localhost:5000/api")
      .then((res) => res.json())
      .then((result) => setData(result.message));
  }, []);

  return (
    <div>
      <h1>React + Node.js with Docker</h1>
      <p>{data}</p>
    </div>
  );
}

export default App;
At this point, if you run npm start inside the frontend and node index.js inside the backend, they will work together.
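Concretely, that means two terminals:
# terminal 1
cd backend
node index.js

# terminal 2
cd frontend
npm start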
Now let’s containerize them.
Writing Dockerfiles
Each service needs its own Dockerfile that explains how to build the image.
1. Dockerfile for the backend
Create a Dockerfile inside the backend folder:
# Use official Node.js image
FROM node:18
# Set working directory
WORKDIR /app
# Copy package.json and install dependencies
COPY package*.json ./
RUN npm install
# Copy the rest of the code
COPY . .
# Expose port 5000
EXPOSE 5000
# Run the app
CMD ["node", "index.js"]
This tells Docker to:
Start from the official Node.js 18 image
Set /app as the working directory
Copy the package files and install dependencies
Copy the rest of the code
Document that the app listens on port 5000
Run node index.js when the container starts
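If you want to try the backend image on its own before wiring up Compose, you can build and run it directly from the project root; the tag demo-backend is just a placeholder name:
docker build -t demo-backend ./backend
docker run -p 5000:5000 demo-backend
http://localhost:5000/api should respond exactly as before.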
2. Dockerfile for the frontend
Create a Dockerfile inside the frontend folder:
# Build stage
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
# Serve stage
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Here we use a multi-stage build. First, Node builds the React app. Then Nginx serves the static files.
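You can try this image by itself in the same way; again, the tag is only an example (the page will load, though the backend message appears only when the backend is reachable):
docker build -t demo-frontend ./frontend
docker run -p 3000:80 demo-frontend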
Using Docker Compose
Running each container separately is possible, but when you have multiple services, docker-compose makes life easier.
Create a docker-compose.yml file in the root folder:
version: "3"
services:
backend:
build: ./backend
ports:
- "5000:5000"
frontend:
build: ./frontend
ports:
- "3000:80"
depends_on:
- backend
This setup tells Docker to:
Build the backend image and map it to port 5000
Build the frontend image and serve it on port 3000 (Nginx default port is 80, mapped to 3000 on the host)
Start the frontend container after the backend container (depends_on controls start order only; it does not wait for the backend to be ready)
Fixing CORS and API URLs
One small adjustment is needed. In production, the frontend should not call http://localhost:5000. Instead, it should refer to the backend container by service name.
Update App.js in React:
useEffect(() => {
  fetch("/api")
    .then((res) => res.json())
    .then((result) => setData(result.message));
}, []);
Then, add a proxy setting in frontend/package.json:
"proxy": "http://backend:5000"
The proxy field tells the Create React App development server to forward /api requests to the backend; the hostname backend resolves when the dev server itself runs inside a container on the same Compose network. Keep in mind that this proxy only exists in development (npm start). The production build served by Nginx in our frontend image needs an equivalent rule in the Nginx configuration, as sketched below.
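Here is a minimal sketch of that rule, assuming a file named nginx.conf next to the frontend Dockerfile; the file name and the COPY destination follow the official nginx image's default layout, but treat them as assumptions to adapt:
# frontend/nginx.conf (hypothetical example)
server {
    listen 80;

    # Serve the built React app
    location / {
        root /usr/share/nginx/html;
        index index.html;
        try_files $uri /index.html;
    }

    # Forward API calls to the backend service on the Compose network
    location /api {
        proxy_pass http://backend:5000;
    }
}
To use it, add COPY nginx.conf /etc/nginx/conf.d/default.conf to the serve stage of the frontend Dockerfile, right after the FROM nginx:alpine line.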
Running the App with Docker
Now you are ready to spin up everything:
docker-compose up --build
Docker will:
Build the backend image
Build the frontend image
Start both containers
You can now open http://localhost:3000 in your browser. The React app will load, and the message from the backend should appear.
Common Issues and Fixes
Port conflicts: If you already have something running on port 3000 or 5000, stop it or change the mapping in docker-compose.yml.
Caching problems: Sometimes Docker caches dependencies. Use --no-cache with docker-compose build to force a rebuild.
File changes not updating: By default, Docker images are static. During development, you might mount volumes so that changes are reflected immediately. Example for the backend:
volumes:
  - ./backend:/app
This mounts your local code inside the container.
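Put in context, the backend service in docker-compose.yml might look like this sketch; the extra anonymous volume for node_modules is an optional, commonly used addition (not part of the original setup) that keeps the host folder from hiding the dependencies installed in the image:
  backend:
    build: ./backend
    ports:
      - "5000:5000"
    volumes:
      - ./backend:/app
      - /app/node_modules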
Best Practices
Use .dockerignore: Just like .gitignore, it helps keep your image clean by excluding node_modules and other unnecessary files (see the sample after this list).
Multi-stage builds: Especially for React, multi-stage builds reduce image size.
Environment variables: Instead of hardcoding API URLs or secrets, use .env files with Docker Compose.
Keep images small: Base images like node:alpine help reduce size.
Log everything: Containers should log to stdout and stderr, not local files.
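As a concrete example of the first point, a minimal .dockerignore next to each Dockerfile could look something like this; adjust the entries to your project:
node_modules
build
.git
npm-debug.log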
Wrapping Up
Congratulations! You have just Dockerized a simple React + Node.js app. The steps you followed here are the foundation for more complex setups, whether that means adding a database, authentication, or scaling up to multiple services.
The beauty of Docker is that it makes your app portable. You can share your project with a teammate, and they only need to run docker-compose up to get the same result. When it is time to deploy, most platforms like AWS, Azure, and Google Cloud can run your containers without much change.
If you are just starting with Docker, the key is to practice with small projects like this one. Once you get comfortable, you will find that Docker becomes a standard part of your workflow.