Node.js: The Complete Guide for 2026

February 11, 2026 · 25 min read

Node.js changed backend development by bringing JavaScript to the server. Built on Chrome's V8 engine, it uses an event-driven, non-blocking I/O model that makes it exceptionally well-suited for data-intensive real-time applications. Whether you are building REST APIs, microservices, real-time chat applications, or CLI tools, Node.js provides the runtime, the ecosystem, and the performance to get the job done.

This guide covers everything from initial setup to production deployment. Every section includes working code examples you can copy and adapt for your own projects. By the end, you will understand the event loop, module systems, Express.js, streams, async patterns, testing, security, and deployment strategies that power Node.js applications in production today.

⚙ Related tools: Test your API endpoints with the JSON Formatter, generate UUIDs for your database records with the UUID Generator, and keep our JavaScript Cheat Sheet open as a quick reference.

1. Installation and Setup

The recommended way to install Node.js is through a version manager. This allows you to switch between Node.js versions per project and avoids permission issues with global packages.

Using nvm (Node Version Manager)

# Install nvm
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash

# Restart your terminal, then install Node.js LTS
nvm install --lts
nvm use --lts

# Verify installation
node --version    # v22.x.x
npm --version     # 10.x.x

# Install a specific version
nvm install 20
nvm use 20

# Set a default version
nvm alias default 22
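Teams commonly pin a project's Node.js version with a .nvmrc file so everyone runs the same release. A small sketch ("22" here is just an example version):

```shell
# Pin the project's Node.js version in a .nvmrc file
echo "22" > .nvmrc

# Anyone in the project directory can then switch with:
# nvm use    (reads .nvmrc and activates the pinned version)
```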

Project Initialization

Every Node.js project starts with a package.json file. Create one interactively or accept all defaults:

# Create project directory
mkdir my-api && cd my-api

# Interactive setup
npm init

# Or accept all defaults
npm init -y

The generated package.json looks like this:

{
  "name": "my-api",
  "version": "1.0.0",
  "description": "My Node.js API",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "start": "node src/index.js",
    "dev": "node --watch src/index.js",
    "test": "jest"
  },
  "keywords": [],
  "license": "MIT"
}

Note the "type": "module" field — this enables ES Module syntax throughout your project. The --watch flag (available since Node.js 18) automatically restarts your application when files change, eliminating the need for nodemon in many cases.

2. Understanding the Event Loop

The event loop is the heart of Node.js. It is what allows a single-threaded runtime to handle thousands of concurrent connections. Understanding it is essential for writing performant applications and debugging unexpected behavior.

How It Works

When Node.js starts, it initializes the event loop, runs the input script, and then begins iterating the loop. Each iteration is called a tick, and it passes through these phases in order:

  1. Timers — executes callbacks scheduled by setTimeout() and setInterval()
  2. Pending callbacks — executes I/O callbacks deferred to the next iteration
  3. Idle, prepare — internal use only
  4. Poll — retrieves new I/O events; executes I/O callbacks; blocks here when appropriate
  5. Check — executes setImmediate() callbacks
  6. Close callbacks — executes close event callbacks (e.g., socket.on('close'))

After each callback completes (and between phases), Node.js drains the microtask queues — process.nextTick() callbacks first, then resolved promise callbacks — before the event loop continues.

// Demonstrating event loop execution order
console.log('1: Script start');

setTimeout(() => console.log('2: setTimeout'), 0);

setImmediate(() => console.log('3: setImmediate'));

Promise.resolve().then(() => console.log('4: Promise microtask'));

process.nextTick(() => console.log('5: nextTick'));

console.log('6: Script end');

// Output (the setTimeout/setImmediate order may vary when run from the main module):
// 1: Script start
// 6: Script end
// 5: nextTick
// 4: Promise microtask
// 2: setTimeout
// 3: setImmediate

The synchronous code runs first (1, 6). Then process.nextTick() fires before promise microtasks. Finally, the timer and check phases run. Note that when setTimeout(fn, 0) and setImmediate() are both scheduled from the main module, their relative order is not guaranteed; inside an I/O callback, setImmediate() always fires first. This ordering matters when you need precise control over execution timing.

Avoiding Event Loop Blocking

Because the event loop is single-threaded, CPU-intensive synchronous operations block all other work. Never do heavy computation on the main thread:

// BAD: blocks the event loop
function fibonacci(n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
app.get('/fib/:n', (req, res) => {
    res.json({ result: fibonacci(parseInt(req.params.n)) });
});

// GOOD: offload to a worker thread
import { Worker } from 'node:worker_threads';

app.get('/fib/:n', (req, res) => {
    const worker = new Worker('./fib-worker.js', {
        workerData: { n: parseInt(req.params.n) }
    });
    worker.on('message', (result) => res.json({ result }));
    worker.on('error', (err) => res.status(500).json({ error: err.message }));
});
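The route above spawns ./fib-worker.js but the guide does not show that file; here is a minimal sketch of what it could look like:

```javascript
// fib-worker.js — a minimal sketch of the worker file referenced above
import { parentPort, workerData } from 'node:worker_threads';

function fibonacci(n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}

// parentPort is null when this file is loaded on the main thread,
// so guard before posting the result back to the parent
if (parentPort) {
    parentPort.postMessage(fibonacci(workerData.n));
}
```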

3. Modules: CommonJS vs ES Modules

Node.js supports two module systems. Understanding both is necessary because the ecosystem is in transition and you will encounter both in production codebases.

CommonJS (CJS)

// math.js (CommonJS)
function add(a, b) {
    return a + b;
}

function multiply(a, b) {
    return a * b;
}

module.exports = { add, multiply };

// app.js
const { add, multiply } = require('./math');
console.log(add(2, 3));       // 5
console.log(multiply(4, 5));  // 20

ES Modules (ESM)

// math.js (ES Modules)
export function add(a, b) {
    return a + b;
}

export function multiply(a, b) {
    return a * b;
}

export default class Calculator {
    evaluate(expr) { /* ... */ }
}

// app.js
import Calculator, { add, multiply } from './math.js';
console.log(add(2, 3));       // 5
console.log(multiply(4, 5));  // 20

Key Differences

Feature             CommonJS                     ES Modules
Syntax              require() / module.exports   import / export
Loading             Synchronous                  Asynchronous
Top-level await     No                           Yes
Tree-shaking        No                           Yes
File extension      .cjs (or default)            .mjs or "type": "module"

For new projects, use ES Modules. Set "type": "module" in your package.json and use import/export throughout.

4. npm and package.json

npm is the default package manager for Node.js and hosts over two million packages. Mastering it is essential for productive development.

# Install a production dependency
npm install express

# Install a dev dependency
npm install --save-dev jest

# Install a specific version
npm install express@4.21.0

# Install all dependencies from package.json
npm install

# Production install (skip devDependencies)
npm ci --omit=dev

# Update packages to latest within semver range
npm update

# Check for outdated packages
npm outdated

# Audit for security vulnerabilities
npm audit
npm audit fix

Understanding Semantic Versioning

Package versions follow the MAJOR.MINOR.PATCH format. In package.json, the prefix matters:

  ^4.21.0 — allows minor and patch updates (>=4.21.0 <5.0.0)
  ~4.21.0 — allows patch updates only (>=4.21.0 <4.22.0)
  4.21.0 — pins the exact version, no updates

Useful package.json Scripts

{
  "scripts": {
    "start": "node src/index.js",
    "dev": "node --watch src/index.js",
    "test": "jest --coverage",
    "test:watch": "jest --watchAll",
    "lint": "eslint src/",
    "lint:fix": "eslint src/ --fix",
    "build": "tsc",
    "db:migrate": "node scripts/migrate.js",
    "docker:build": "docker build -t my-api .",
    "docker:run": "docker run -p 3000:3000 my-api"
  }
}
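npm also runs lifecycle hooks automatically: for any script name, a matching pre<name> script runs before it and a post<name> script after. A small sketch (the script contents are examples, not from the original):

```json
{
  "scripts": {
    "pretest": "eslint src/",
    "test": "jest",
    "posttest": "echo \"tests finished\""
  }
}
```

Running npm test here executes pretest, test, then posttest in order.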

5. Express.js Fundamentals

Express.js is the most widely used Node.js web framework. It provides routing, middleware support, template engines, and a minimal but extensible structure for building web applications and APIs.

import express from 'express';

const app = express();
const PORT = process.env.PORT || 3000;

// Built-in middleware for parsing JSON and URL-encoded bodies
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

// Basic routes
app.get('/', (req, res) => {
    res.json({ message: 'Welcome to the API' });
});

app.get('/api/users', (req, res) => {
    const { page = 1, limit = 10 } = req.query;
    // Fetch users with pagination...
    res.json({ page: Number(page), limit: Number(limit), users: [] });
});

app.get('/api/users/:id', (req, res) => {
    const { id } = req.params;
    // Fetch user by ID...
    res.json({ id, name: 'Jane Doe' });
});

app.post('/api/users', (req, res) => {
    const { name, email } = req.body;
    if (!name || !email) {
        return res.status(400).json({ error: 'Name and email are required' });
    }
    // Create user...
    res.status(201).json({ id: 1, name, email });
});

app.put('/api/users/:id', (req, res) => {
    const { id } = req.params;
    const { name, email } = req.body;
    // Update user...
    res.json({ id, name, email });
});

app.delete('/api/users/:id', (req, res) => {
    const { id } = req.params;
    // Delete user...
    res.status(204).send();
});

app.listen(PORT, () => {
    console.log(`Server running on http://localhost:${PORT}`);
});

Route Organization with Router

As your application grows, organize routes into separate files using express.Router():

// routes/users.js
import { Router } from 'express';
const router = Router();

router.get('/', getAllUsers);
router.get('/:id', getUserById);
router.post('/', createUser);
router.put('/:id', updateUser);
router.delete('/:id', deleteUser);

export default router;

// index.js
import userRoutes from './routes/users.js';
import productRoutes from './routes/products.js';

app.use('/api/users', userRoutes);
app.use('/api/products', productRoutes);
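The router above references handlers like getAllUsers without defining them. A minimal hypothetical sketch of such a controller, with an in-memory Map standing in for a real database:

```javascript
// controllers/users.js — hypothetical sketch of the handlers the router imports
const users = new Map([[1, { id: 1, name: 'Jane Doe' }]]);

function getAllUsers(req, res) {
    res.json([...users.values()]);
}

function getUserById(req, res) {
    const user = users.get(Number(req.params.id));
    if (!user) {
        return res.status(404).json({ error: 'User not found' });
    }
    res.json(user);
}

// In a real module these would be exported:
// export { getAllUsers, getUserById };
```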

6. Middleware Patterns

Middleware functions have access to the request object, the response object, and the next() function. They can execute code, modify req/res, end the request-response cycle, or call the next middleware.

// Request logging middleware
function requestLogger(req, res, next) {
    const start = Date.now();
    res.on('finish', () => {
        const duration = Date.now() - start;
        console.log(`${req.method} ${req.url} ${res.statusCode} ${duration}ms`);
    });
    next();
}

// Authentication middleware
function authenticate(req, res, next) {
    const token = req.headers.authorization?.split(' ')[1];
    if (!token) {
        return res.status(401).json({ error: 'Authentication required' });
    }
    try {
        const payload = jwt.verify(token, process.env.JWT_SECRET);
        req.user = payload;
        next();
    } catch (err) {
        res.status(401).json({ error: 'Invalid or expired token' });
    }
}

// Rate limiting middleware (in-memory demo: entries are never evicted,
// so use a TTL-backed store like Redis in production)
const requestCounts = new Map();

function rateLimit({ windowMs = 60000, max = 100 } = {}) {
    return (req, res, next) => {
        const key = req.ip;
        const now = Date.now();
        const windowStart = now - windowMs;

        if (!requestCounts.has(key)) {
            requestCounts.set(key, []);
        }

        const timestamps = requestCounts.get(key).filter(t => t > windowStart);
        timestamps.push(now);
        requestCounts.set(key, timestamps);

        if (timestamps.length > max) {
            return res.status(429).json({ error: 'Too many requests' });
        }

        res.set('X-RateLimit-Limit', max);
        res.set('X-RateLimit-Remaining', max - timestamps.length);
        next();
    };
}

// Apply middleware
app.use(requestLogger);
app.use(rateLimit({ windowMs: 60000, max: 100 }));

// Protected routes
app.use('/api/admin', authenticate, adminRoutes);

Error-Handling Middleware

Error-handling middleware is defined with four parameters. It must be registered after all other middleware and routes:

// Must have exactly 4 parameters
function errorHandler(err, req, res, next) {
    console.error(`Error: ${err.message}`, {
        stack: err.stack,
        url: req.url,
        method: req.method
    });

    const statusCode = err.statusCode || 500;
    const message = statusCode === 500
        ? 'Internal server error'
        : err.message;

    res.status(statusCode).json({
        error: message,
        ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
    });
}

app.use(errorHandler);

7. File System Operations

The node:fs module provides methods for interacting with the file system. Always use the promise-based API (fs/promises) in modern code:

import { readFile, writeFile, mkdir, readdir, stat, unlink } from 'node:fs/promises';
import { join, dirname } from 'node:path';
import { fileURLToPath } from 'node:url';

// ESM equivalent of __dirname
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// Read a file
async function readConfig() {
    try {
        const data = await readFile(join(__dirname, 'config.json'), 'utf-8');
        return JSON.parse(data);
    } catch (err) {
        if (err.code === 'ENOENT') {
            console.log('Config file not found, using defaults');
            return { port: 3000, debug: false };
        }
        throw err;
    }
}

// Write a file (creates directories if needed)
async function saveData(filename, data) {
    const filepath = join(__dirname, 'data', filename);
    await mkdir(dirname(filepath), { recursive: true });
    await writeFile(filepath, JSON.stringify(data, null, 2), 'utf-8');
}

// List directory contents
async function listFiles(dir) {
    const entries = await readdir(dir, { withFileTypes: true });
    for (const entry of entries) {
        const fullPath = join(dir, entry.name);
        if (entry.isDirectory()) {
            console.log(`[DIR]  ${entry.name}`);
            await listFiles(fullPath); // recurse
        } else {
            const info = await stat(fullPath);
            console.log(`[FILE] ${entry.name} (${info.size} bytes)`);
        }
    }
}

// Watch for file changes
import { watch } from 'node:fs';

watch('./src', { recursive: true }, (eventType, filename) => {
    console.log(`${eventType}: ${filename}`);
});

8. Streams and Piping

Streams process data in chunks instead of loading everything into memory. This is critical for handling large files, HTTP bodies, and real-time data. Node.js has four types of streams: Readable, Writable, Transform, and Duplex.

import { createReadStream, createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';
import { createGzip, createGunzip } from 'node:zlib';
import { Transform } from 'node:stream';

// Read a large file efficiently
const readStream = createReadStream('large-log.txt', { encoding: 'utf-8' });
let lineCount = 0;

readStream.on('data', (chunk) => {
    lineCount += chunk.split('\n').length - 1;
});

readStream.on('end', () => {
    console.log(`Total lines: ${lineCount}`);
});

// Compress a file using pipeline (recommended)
async function compressFile(input, output) {
    await pipeline(
        createReadStream(input),
        createGzip(),
        createWriteStream(output)
    );
    console.log(`Compressed ${input} to ${output}`);
}

// Custom transform stream
class CSVToJSON extends Transform {
    constructor(options) {
        super({ ...options, objectMode: true });
        this.headers = null;
        this.buffer = '';
    }

    _transform(chunk, encoding, callback) {
        this.buffer += chunk.toString();
        const lines = this.buffer.split('\n');
        this.buffer = lines.pop(); // keep incomplete last line

        for (const line of lines) {
            if (!line.trim()) continue;
            const values = line.split(',').map(v => v.trim());

            if (!this.headers) {
                this.headers = values;
                continue;
            }

            const obj = {};
            this.headers.forEach((header, i) => {
                obj[header] = values[i];
            });
            this.push(JSON.stringify(obj) + '\n');
        }
        callback();
    }

    _flush(callback) {
        if (this.buffer.trim() && this.headers) {
            const values = this.buffer.split(',').map(v => v.trim());
            const obj = {};
            this.headers.forEach((header, i) => {
                obj[header] = values[i];
            });
            this.push(JSON.stringify(obj) + '\n');
        }
        callback();
    }
}

// Use the transform stream
await pipeline(
    createReadStream('data.csv'),
    new CSVToJSON(),
    createWriteStream('data.jsonl')
);
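Beyond the Transform class, streams interoperate with async iterables: Readable.from() turns any (async) iterable into a Readable, and pipeline() accepts async generator functions as transforms and an async function as the destination. A small sketch (the numbers generator is illustrative):

```javascript
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

// An async generator acting as the data source
async function* numbers() {
    for (let i = 1; i <= 3; i++) yield `${i}\n`;
}

const collected = [];
await pipeline(
    Readable.from(numbers()),
    async function* (source) {          // transform: prefix each chunk
        for await (const chunk of source) yield `line ${chunk}`;
    },
    async (source) => {                 // destination: collect the output
        for await (const chunk of source) collected.push(String(chunk));
    }
);
```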

9. Building an HTTP Server from Scratch

Understanding the built-in node:http module gives you insight into what Express abstracts away. Here is a minimal HTTP server built without any framework:

import { createServer } from 'node:http';

const routes = new Map();

function route(method, path, handler) {
    routes.set(`${method}:${path}`, handler);
}

function parseBody(req) {
    return new Promise((resolve, reject) => {
        const chunks = [];
        req.on('data', (chunk) => chunks.push(chunk));
        req.on('end', () => {
            const body = Buffer.concat(chunks).toString();
            try {
                resolve(body ? JSON.parse(body) : {});
            } catch {
                reject(new Error('Invalid JSON'));
            }
        });
        req.on('error', reject);
    });
}

// Define routes
route('GET', '/api/health', (req, res) => {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok', uptime: process.uptime() }));
});

route('POST', '/api/echo', async (req, res) => {
    const body = await parseBody(req);
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ received: body }));
});

// Create server
const server = createServer(async (req, res) => {
    const handler = routes.get(`${req.method}:${req.url}`);

    if (!handler) {
        res.writeHead(404, { 'Content-Type': 'application/json' });
        return res.end(JSON.stringify({ error: 'Not found' }));
    }

    try {
        await handler(req, res);
    } catch (err) {
        res.writeHead(500, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ error: 'Internal server error' }));
    }
});

server.listen(3000, () => console.log('Server listening on port 3000'));

10. Async/Await Patterns

Async/await is built on promises and makes asynchronous code read like synchronous code. Here are the patterns you will use most often:

// Sequential execution
async function processOrderSequentially(orderId) {
    const order = await fetchOrder(orderId);
    const user = await fetchUser(order.userId);
    const payment = await processPayment(order);
    const confirmation = await sendConfirmation(user.email, payment);
    return confirmation;
}

// Parallel execution with Promise.all
async function getDashboardData(userId) {
    const [profile, orders, notifications] = await Promise.all([
        fetchProfile(userId),
        fetchOrders(userId),
        fetchNotifications(userId)
    ]);
    return { profile, orders, notifications };
}

// Parallel with error tolerance using Promise.allSettled
async function fetchAllFeeds(urls) {
    const results = await Promise.allSettled(
        urls.map(url => fetch(url).then(r => r.json()))
    );

    const successful = results
        .filter(r => r.status === 'fulfilled')
        .map(r => r.value);

    const failed = results
        .filter(r => r.status === 'rejected')
        .map(r => r.reason.message);

    return { successful, failed };
}

// Retry pattern with exponential backoff
async function fetchWithRetry(url, options = {}, maxRetries = 3) {
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
        try {
            const response = await fetch(url, options);
            if (!response.ok) throw new Error(`HTTP ${response.status}`);
            return await response.json();
        } catch (err) {
            if (attempt === maxRetries) throw err;
            const delay = Math.min(1000 * Math.pow(2, attempt), 10000);
            console.log(`Attempt ${attempt + 1} failed, retrying in ${delay}ms`);
            await new Promise(resolve => setTimeout(resolve, delay));
        }
    }
}

// Async iteration over streams
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function processLargeFile(filepath) {
    const rl = createInterface({
        input: createReadStream(filepath),
        crlfDelay: Infinity
    });

    let lineNumber = 0;
    for await (const line of rl) {
        lineNumber++;
        if (line.includes('ERROR')) {
            console.log(`Error on line ${lineNumber}: ${line}`);
        }
    }
}
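One more pattern in the same family, added here as a sketch rather than taken from the original: bounding any promise with a deadline via Promise.race (for fetch specifically, AbortSignal.timeout() is a newer alternative):

```javascript
// Timeout pattern: reject if the promise takes longer than ms milliseconds
function withTimeout(promise, ms) {
    let timer;
    const timeout = new Promise((_, reject) => {
        timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
    });
    // Whichever settles first wins; always clear the timer afterwards
    return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch: const data = await withTimeout(fetch(url), 5000);
```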

11. Error Handling

Robust error handling separates amateur code from production-ready applications. Node.js has multiple error propagation mechanisms, and you need to handle all of them.

// Custom error classes
class AppError extends Error {
    constructor(message, statusCode = 500, code = 'INTERNAL_ERROR') {
        super(message);
        this.name = 'AppError';
        this.statusCode = statusCode;
        this.code = code;
        Error.captureStackTrace(this, this.constructor);
    }
}

class NotFoundError extends AppError {
    constructor(resource = 'Resource') {
        super(`${resource} not found`, 404, 'NOT_FOUND');
    }
}

class ValidationError extends AppError {
    constructor(message) {
        super(message, 400, 'VALIDATION_ERROR');
    }
}

// Usage in route handlers
app.get('/api/users/:id', async (req, res, next) => {
    try {
        const user = await db.users.findById(req.params.id);
        if (!user) throw new NotFoundError('User');
        res.json(user);
    } catch (err) {
        next(err); // pass to error-handling middleware
    }
});

// Global safety nets (for truly unexpected errors)
process.on('uncaughtException', (err) => {
    console.error('Uncaught exception:', err);
    // Log to monitoring service
    process.exit(1); // exit and let PM2 restart
});

process.on('unhandledRejection', (reason, promise) => {
    console.error('Unhandled rejection at:', promise, 'reason:', reason);
    // Log to monitoring service
    process.exit(1);
});
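The try/catch-and-next(err) pattern in the route handler above can be factored into a tiny wrapper (the wrapper itself is an addition, not part of the guide). It forwards rejections from async handlers to Express's error middleware, which Express 4 does not do on its own (Express 5 does this automatically):

```javascript
// Forward rejected promises from async route handlers to next()
const asyncHandler = (fn) => (req, res, next) =>
    Promise.resolve(fn(req, res, next)).catch(next);

// Usage sketch:
// app.get('/api/users/:id', asyncHandler(async (req, res) => {
//     const user = await db.users.findById(req.params.id);
//     if (!user) throw new NotFoundError('User');
//     res.json(user);
// }));
```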

Graceful Shutdown

function gracefulShutdown(server) {
    const shutdown = async (signal) => {
        console.log(`${signal} received. Starting graceful shutdown...`);

        // Stop accepting new connections and wait for in-flight requests to finish
        await new Promise((resolve) => server.close(resolve));
        console.log('HTTP server closed');

        // Close database connections
        await db.disconnect();
        console.log('Database connections closed');

        // Close Redis, message queues, etc.
        await cache.quit();
        console.log('Cache connections closed');

        process.exit(0);
    };

    process.on('SIGTERM', () => shutdown('SIGTERM'));
    process.on('SIGINT', () => shutdown('SIGINT'));
}

const server = app.listen(3000);
gracefulShutdown(server);

12. Environment Variables and Configuration

Never hard-code configuration values. Use environment variables for anything that changes between environments (development, staging, production) or contains secrets.

// config.js - centralized configuration
import { readFileSync, existsSync } from 'node:fs';

// Load .env file in development (Node.js 20.6+ can load .env natively via --env-file)
// For older versions: npm install dotenv

const config = {
    port: parseInt(process.env.PORT, 10) || 3000,
    nodeEnv: process.env.NODE_ENV || 'development',
    db: {
        host: process.env.DB_HOST || 'localhost',
        port: parseInt(process.env.DB_PORT, 10) || 5432,
        name: process.env.DB_NAME || 'myapp_dev',
        user: process.env.DB_USER || 'postgres',
        password: process.env.DB_PASSWORD,
        ssl: process.env.DB_SSL === 'true'
    },
    jwt: {
        secret: process.env.JWT_SECRET,
        expiresIn: process.env.JWT_EXPIRES_IN || '24h'
    },
    redis: {
        url: process.env.REDIS_URL || 'redis://localhost:6379'
    },
    cors: {
        origins: process.env.CORS_ORIGINS?.split(',') || ['http://localhost:5173']
    }
};

// Validate required variables
const required = ['DB_PASSWORD', 'JWT_SECRET'];
const missing = required.filter(key => !process.env[key]);

if (missing.length > 0 && config.nodeEnv === 'production') {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
}

export default config;

Your .env file (never commit this to version control):

# .env
NODE_ENV=development
PORT=3000
DB_HOST=localhost
DB_PORT=5432
DB_NAME=myapp_dev
DB_USER=postgres
DB_PASSWORD=local_dev_password
JWT_SECRET=dev-only-secret-change-in-prod
REDIS_URL=redis://localhost:6379

Starting with Node.js 20.6, you can load .env files natively with the --env-file flag, without any third-party package:

node --env-file=.env src/index.js

13. Clustering and Worker Threads

A single Node.js process runs on one CPU core. To utilize all available cores, use the cluster module for networking or worker threads for CPU-intensive tasks.

Cluster Module

import cluster from 'node:cluster';
import { availableParallelism } from 'node:os';
import process from 'node:process';

const numCPUs = availableParallelism();

if (cluster.isPrimary) {
    console.log(`Primary ${process.pid} is running`);
    console.log(`Forking ${numCPUs} workers...`);

    for (let i = 0; i < numCPUs; i++) {
        cluster.fork();
    }

    cluster.on('exit', (worker, code, signal) => {
        console.log(`Worker ${worker.process.pid} died (${signal || code})`);
        console.log('Forking a replacement worker...');
        cluster.fork();
    });
} else {
    // Workers share the same server port
    const app = createApp(); // your Express app
    app.listen(3000, () => {
        console.log(`Worker ${process.pid} started`);
    });
}

Worker Threads

// heavy-task.js (worker)
import { parentPort, workerData } from 'node:worker_threads';

function heavyComputation(data) {
    // Simulate CPU-intensive work
    let result = 0;
    for (let i = 0; i < data.iterations; i++) {
        result += Math.sqrt(i) * Math.sin(i);
    }
    return result;
}

const result = heavyComputation(workerData);
parentPort.postMessage(result);

// main.js
import { Worker } from 'node:worker_threads';

function runWorker(data) {
    return new Promise((resolve, reject) => {
        const worker = new Worker('./heavy-task.js', { workerData: data });
        worker.on('message', resolve);
        worker.on('error', reject);
        worker.on('exit', (code) => {
            if (code !== 0) {
                reject(new Error(`Worker exited with code ${code}`));
            }
        });
    });
}

// Run computation off the main thread
app.get('/api/compute', async (req, res) => {
    const result = await runWorker({ iterations: 100_000_000 });
    res.json({ result });
});

14. Testing with Jest

Automated tests are non-negotiable in production codebases. Jest is the most popular testing framework for Node.js. It includes a test runner, assertion library, and mocking support out of the box. Note that Jest's support for ES Modules currently requires running it with NODE_OPTIONS=--experimental-vm-modules.

# Install Jest
npm install --save-dev jest @jest/globals

// userService.js
export class UserService {
    constructor(db, emailService) {
        this.db = db;
        this.emailService = emailService;
    }

    async createUser(name, email) {
        if (!name || !email) {
            throw new Error('Name and email are required');
        }
        if (!email.includes('@')) {
            throw new Error('Invalid email format');
        }

        const existingUser = await this.db.findByEmail(email);
        if (existingUser) {
            throw new Error('Email already in use');
        }

        const user = await this.db.create({ name, email });
        await this.emailService.sendWelcome(user.email, user.name);
        return user;
    }

    async getUser(id) {
        const user = await this.db.findById(id);
        if (!user) throw new Error('User not found');
        return user;
    }
}

// userService.test.js
import { jest } from '@jest/globals';
import { UserService } from './userService.js';

describe('UserService', () => {
    let service;
    let mockDb;
    let mockEmail;

    beforeEach(() => {
        mockDb = {
            findByEmail: jest.fn(),
            findById: jest.fn(),
            create: jest.fn()
        };
        mockEmail = {
            sendWelcome: jest.fn()
        };
        service = new UserService(mockDb, mockEmail);
    });

    describe('createUser', () => {
        it('should create a user and send welcome email', async () => {
            mockDb.findByEmail.mockResolvedValue(null);
            mockDb.create.mockResolvedValue({ id: 1, name: 'Jane', email: 'jane@test.com' });
            mockEmail.sendWelcome.mockResolvedValue(true);

            const user = await service.createUser('Jane', 'jane@test.com');

            expect(user).toEqual({ id: 1, name: 'Jane', email: 'jane@test.com' });
            expect(mockDb.create).toHaveBeenCalledWith({
                name: 'Jane', email: 'jane@test.com'
            });
            expect(mockEmail.sendWelcome).toHaveBeenCalledWith('jane@test.com', 'Jane');
        });

        it('should throw when email is already in use', async () => {
            mockDb.findByEmail.mockResolvedValue({ id: 2, email: 'jane@test.com' });

            await expect(service.createUser('Jane', 'jane@test.com'))
                .rejects.toThrow('Email already in use');
        });

        it('should throw when name or email is missing', async () => {
            await expect(service.createUser('', 'jane@test.com'))
                .rejects.toThrow('Name and email are required');
        });

        it('should throw on invalid email format', async () => {
            await expect(service.createUser('Jane', 'not-an-email'))
                .rejects.toThrow('Invalid email format');
        });
    });

    describe('getUser', () => {
        it('should return a user by ID', async () => {
            mockDb.findById.mockResolvedValue({ id: 1, name: 'Jane' });
            const user = await service.getUser(1);
            expect(user).toEqual({ id: 1, name: 'Jane' });
        });

        it('should throw when user is not found', async () => {
            mockDb.findById.mockResolvedValue(null);
            await expect(service.getUser(999)).rejects.toThrow('User not found');
        });
    });
});

Testing HTTP Endpoints

// Install supertest: npm install --save-dev supertest
import request from 'supertest';
import { createApp } from './app.js';

describe('GET /api/health', () => {
    const app = createApp();

    it('should return 200 with status ok', async () => {
        const res = await request(app)
            .get('/api/health')
            .expect(200);

        expect(res.body.status).toBe('ok');
        expect(res.body).toHaveProperty('uptime');
    });
});

describe('POST /api/users', () => {
    const app = createApp();

    it('should create a user with valid data', async () => {
        const res = await request(app)
            .post('/api/users')
            .send({ name: 'Jane', email: 'jane@test.com' })
            .expect(201);

        expect(res.body.name).toBe('Jane');
        expect(res.body).toHaveProperty('id');
    });

    it('should return 400 when email is missing', async () => {
        await request(app)
            .post('/api/users')
            .send({ name: 'Jane' })
            .expect(400);
    });
});

15. Security Best Practices

Security vulnerabilities can be catastrophic. Follow these practices to harden your Node.js application:

Input Validation and Sanitization

// Use a validation library like Zod
import { z } from 'zod';

const CreateUserSchema = z.object({
    name: z.string().min(1).max(100).trim(),
    email: z.string().email().toLowerCase(),
    age: z.number().int().min(13).max(150).optional(),
    role: z.enum(['user', 'admin']).default('user')
});

app.post('/api/users', (req, res, next) => {
    try {
        const validData = CreateUserSchema.parse(req.body);
        // validData is typed and sanitized
        // Proceed with creating user...
        res.status(201).json(validData);
    } catch (err) {
        if (err instanceof z.ZodError) {
            return res.status(400).json({
                error: 'Validation failed',
                details: err.errors
            });
        }
        next(err);
    }
});

Security Headers and CORS

import helmet from 'helmet';
import cors from 'cors';

// Helmet sets secure HTTP headers
app.use(helmet());

// Configure CORS properly
app.use(cors({
    origin: ['https://yourdomain.com', 'https://app.yourdomain.com'],
    methods: ['GET', 'POST', 'PUT', 'DELETE'],
    allowedHeaders: ['Content-Type', 'Authorization'],
    credentials: true,
    maxAge: 86400 // cache preflight for 24 hours
}));

// Prevent parameter pollution
app.use((req, res, next) => {
    for (const key of Object.keys(req.query)) {
        if (Array.isArray(req.query[key])) {
            req.query[key] = req.query[key][req.query[key].length - 1];
        }
    }
    next();
});

Additional Security Measures

Beyond validation and headers: rate-limit requests to blunt brute-force and denial-of-service attempts, run npm audit regularly and keep dependencies patched, store secrets in environment variables rather than in source control, hash passwords with bcrypt or argon2, and serve all traffic over HTTPS.
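As one illustration, a rate limiter can be written as Express-style middleware. This is a minimal in-memory sketch for a single process; in production you would typically use a maintained package such as express-rate-limit backed by a shared store like Redis:

```javascript
// Minimal in-memory rate limiter (sketch; single-process only).
// The window and limit values below are illustrative.
const hits = new Map(); // ip -> { count, windowStart }
const WINDOW_MS = 15 * 60 * 1000; // 15-minute window
const MAX_REQUESTS = 100;         // max requests per IP per window

function rateLimiter(req, res, next) {
    const now = Date.now();
    const entry = hits.get(req.ip);

    // First request from this IP, or the previous window has expired
    if (!entry || now - entry.windowStart > WINDOW_MS) {
        hits.set(req.ip, { count: 1, windowStart: now });
        return next();
    }

    entry.count += 1;
    if (entry.count > MAX_REQUESTS) {
        return res.status(429).json({ error: 'Too many requests' });
    }
    next();
}

// Usage: app.use(rateLimiter);
```

Because the map lives in process memory, cluster mode or multiple containers each keep their own counters; a shared store fixes that.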

16. Deployment: PM2 and Docker

PM2 Process Manager

PM2 keeps your application running, automatically restarts it on crashes, and provides monitoring. It is the standard process manager for Node.js in production:

# Install PM2 globally
npm install -g pm2

# Start your application
pm2 start src/index.js --name my-api

# Cluster mode (uses all CPU cores)
pm2 start src/index.js --name my-api -i max

# View running processes
pm2 list

# Monitor in real-time
pm2 monit

# View logs
pm2 logs my-api

# Restart / stop / delete
pm2 restart my-api
pm2 stop my-api
pm2 delete my-api

# Save process list for auto-start on reboot
pm2 save
pm2 startup

PM2 Ecosystem File

// ecosystem.config.cjs
module.exports = {
    apps: [{
        name: 'my-api',
        script: 'src/index.js',
        instances: 'max',
        exec_mode: 'cluster',
        env: {
            NODE_ENV: 'development',
            PORT: 3000
        },
        env_production: {
            NODE_ENV: 'production',
            PORT: 8080
        },
        max_memory_restart: '500M',
        log_date_format: 'YYYY-MM-DD HH:mm:ss',
        error_file: '/var/log/my-api/error.log',
        out_file: '/var/log/my-api/out.log',
        merge_logs: true
    }]
};

Docker Deployment

Containers provide consistent environments and simple scaling. Here is a production-optimized Dockerfile:

# Stage 1: Install dependencies
FROM node:22-alpine AS deps
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --omit=dev

# Stage 2: Production image
FROM node:22-alpine
WORKDIR /app

# Create non-root user
RUN addgroup -S appgroup && adduser -S appuser -G appgroup

# Copy dependencies and source
COPY --from=deps /app/node_modules ./node_modules
COPY . .

# Set ownership and switch to non-root user
RUN chown -R appuser:appgroup /app
USER appuser

# Expose port and set environment
EXPOSE 3000
ENV NODE_ENV=production

# Health check
HEALTHCHECK --interval=30s --timeout=5s --start-period=10s --retries=3 \
    CMD wget --no-verbose --tries=1 --spider http://localhost:3000/api/health || exit 1

CMD ["node", "src/index.js"]

# docker-compose.yml
services:
  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - DB_HOST=db
      - DB_PORT=5432
      - DB_NAME=myapp
      - DB_USER=postgres
      - DB_PASSWORD_FILE=/run/secrets/db_password
      - JWT_SECRET_FILE=/run/secrets/jwt_secret
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped

  db:
    image: postgres:16-alpine
    volumes:
      - pgdata:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=myapp
      - POSTGRES_PASSWORD_FILE=/run/secrets/db_password
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  pgdata:

Build and run:

# Build the image
docker build -t my-api:latest .

# Run with docker compose
docker compose up -d

# View logs
docker compose logs -f api

# Scale the API service
docker compose up -d --scale api=3

17. Performance Monitoring

Monitoring is essential in production. Track response times, memory usage, event loop lag, and error rates to catch issues before they affect users.

// Simple response time tracking middleware
function performanceMiddleware(req, res, next) {
    const start = process.hrtime.bigint();

    res.on('finish', () => {
        const duration = Number(process.hrtime.bigint() - start) / 1e6;
        const logData = {
            method: req.method,
            url: req.originalUrl,
            status: res.statusCode,
            duration: `${duration.toFixed(2)}ms`,
            contentLength: res.get('Content-Length') || 0
        };

        if (duration > 1000) {
            console.warn('SLOW REQUEST:', logData);
        }
    });

    next();
}

// Memory and event loop monitoring
function startHealthMonitor(intervalMs = 30000) {
    setInterval(() => {
        const mem = process.memoryUsage();
        const stats = {
            rss: `${(mem.rss / 1024 / 1024).toFixed(1)}MB`,
            heapUsed: `${(mem.heapUsed / 1024 / 1024).toFixed(1)}MB`,
            heapTotal: `${(mem.heapTotal / 1024 / 1024).toFixed(1)}MB`,
            external: `${(mem.external / 1024 / 1024).toFixed(1)}MB`,
            uptime: `${(process.uptime() / 3600).toFixed(1)}h`
        };

        console.log('Health:', stats);

        // Alert if memory is too high
        if (mem.heapUsed / mem.heapTotal > 0.9) {
            console.error('MEMORY WARNING: Heap usage above 90%');
        }
    }, intervalMs);
}

// Event loop lag detection
function monitorEventLoopLag(thresholdMs = 100) {
    let lastCheck = Date.now();

    setInterval(() => {
        const now = Date.now();
        const lag = now - lastCheck - 1000; // expected interval is 1000ms
        lastCheck = now;

        if (lag > thresholdMs) {
            console.warn(`Event loop lag: ${lag}ms`);
        }
    }, 1000).unref(); // unref so it doesn't keep the process alive
}

// Health check endpoint
app.get('/api/health', (req, res) => {
    const mem = process.memoryUsage();
    res.json({
        status: 'ok',
        uptime: process.uptime(),
        timestamp: new Date().toISOString(),
        memory: {
            rss: mem.rss,
            heapUsed: mem.heapUsed,
            heapTotal: mem.heapTotal
        },
        nodeVersion: process.version
    });
});

Structured Logging

Use structured JSON logging in production for easy parsing by log aggregation tools:

// Simple structured logger
const LOG_LEVELS = { error: 0, warn: 1, info: 2, debug: 3 };
const currentLevel = LOG_LEVELS[process.env.LOG_LEVEL || 'info'];

function log(level, message, meta = {}) {
    if (LOG_LEVELS[level] > currentLevel) return;

    const entry = {
        timestamp: new Date().toISOString(),
        level,
        message,
        ...meta,
        pid: process.pid
    };

    const output = level === 'error' ? process.stderr : process.stdout;
    output.write(JSON.stringify(entry) + '\n');
}

// Usage
log('info', 'Server started', { port: 3000 });
log('error', 'Database connection failed', { host: 'db.example.com', code: 'ECONNREFUSED' });
log('debug', 'Request received', { method: 'GET', url: '/api/users' });

Frequently Asked Questions

What is the Node.js event loop and how does it work?

The Node.js event loop is the mechanism that allows Node.js to perform non-blocking I/O operations despite JavaScript being single-threaded. It works by offloading operations to the system kernel whenever possible. When an asynchronous operation completes, its callback is placed into a queue. The event loop continuously cycles through phases: timers (setTimeout/setInterval callbacks), pending callbacks (I/O callbacks deferred to the next iteration), idle/prepare (internal use), poll (retrieve new I/O events and execute their callbacks), check (setImmediate callbacks), and close callbacks. This architecture allows a single Node.js process to handle thousands of concurrent connections without the overhead of thread management.
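The ordering described above can be observed directly. This minimal script (no imports needed) records which queue each callback came from; note that the relative order of setTimeout(0) and setImmediate is not guaranteed when scheduled from the main script:

```javascript
// Demonstrates callback ordering across event loop phases.
const order = [];

setTimeout(() => order.push('timeout'), 0);          // timers phase
setImmediate(() => order.push('immediate'));         // check phase
process.nextTick(() => order.push('nextTick'));      // drained before any phase
Promise.resolve().then(() => order.push('promise')); // microtask queue

order.push('sync'); // synchronous code always runs to completion first

setTimeout(() => {
    // 'sync' then 'nextTick' then 'promise' always come first;
    // 'timeout' vs 'immediate' order can vary from the main script.
    console.log(order.join(' -> '));
}, 10);
```

The nextTick queue drains before the promise microtask queue, which is why `nextTick` reliably precedes `promise` here.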

What is the difference between CommonJS and ES Modules in Node.js?

CommonJS uses require() and module.exports syntax and loads modules synchronously at runtime. ES Modules (ESM) use import/export syntax and support static analysis, tree-shaking, and top-level await. To use ESM in Node.js, either set "type": "module" in package.json or use the .mjs file extension. CommonJS files can use the .cjs extension when the project is set to ESM mode. ESM is the modern standard and is recommended for new projects, but CommonJS remains widely used in the ecosystem. You can import CommonJS modules from ESM files, but not the other way around without using dynamic import().
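The interop rules can be sketched in a single ES module. `createRequire` (from `node:module`) gives an ESM file a CommonJS-style `require()`, and dynamic `import()` works from either module system:

```javascript
// interop.mjs — an ES module that consumes CommonJS-style modules
import { createRequire } from 'node:module';

// createRequire builds a require() scoped to this file's location
const require = createRequire(import.meta.url);
const path = require('node:path'); // synchronous, CommonJS-style load

// Dynamic import() is the ESM-native mechanism and also accepts CJS targets.
// Top-level await like this is an ESM-only feature.
const os = await import('node:os');

console.log(path.basename('/tmp/app.js'), os.platform());
```

Going the other direction, a `.cjs` file cannot use static `import`; it must call `await import('./module.mjs')` inside an async function.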

How do I handle errors properly in Node.js?

Node.js error handling depends on the pattern being used. For synchronous code, use try/catch blocks. For promises and async/await, use try/catch around awaited calls or .catch() on promise chains. For event emitters, listen for the 'error' event. For Express.js, use error-handling middleware with four parameters (err, req, res, next). Always handle uncaught exceptions with process.on('uncaughtException') and unhandled rejections with process.on('unhandledRejection') as safety nets, but never rely on them for normal flow. In production, log the error and exit the process gracefully, letting a process manager like PM2 restart it. Create custom error classes that extend Error for domain-specific errors with meaningful status codes and messages.
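A minimal sketch of these patterns, using a hypothetical HttpError class alongside the safety-net handlers described above:

```javascript
// Custom error class carrying an HTTP status code (illustrative name)
class HttpError extends Error {
    constructor(status, message) {
        super(message);
        this.name = 'HttpError';
        this.status = status;
    }
}

// Safety nets only: log, then exit and let PM2 (or similar) restart us
process.on('uncaughtException', (err) => {
    console.error('Uncaught exception:', err);
    process.exit(1);
});
process.on('unhandledRejection', (reason) => {
    console.error('Unhandled rejection:', reason);
    process.exit(1);
});

// Normal flow: throw typed errors, catch around awaited calls
async function findUser(id) {
    if (id !== 1) throw new HttpError(404, 'User not found');
    return { id, name: 'Jane' };
}

try {
    await findUser(2);
} catch (err) {
    if (err instanceof HttpError) {
        console.error(`${err.status}: ${err.message}`); // 404: User not found
    } else {
        throw err; // unexpected errors bubble up to the safety nets
    }
}
```

In an Express app the catch block's logic would live in a four-argument error-handling middleware instead.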

When should I use streams in Node.js?

Use streams when working with large amounts of data that would be impractical to load entirely into memory. Common use cases include reading or writing large files, processing HTTP request and response bodies, transforming data on the fly (compression, encryption, parsing), piping data between sources and destinations, and real-time data processing. For example, reading a 2GB log file with fs.readFile() would consume 2GB of RAM, but using fs.createReadStream() processes it in small chunks (typically 64KB). Streams are also essential for building efficient HTTP servers that can start sending responses before reading the entire source, reducing time-to-first-byte. The pipeline() function from the stream module is the recommended way to pipe streams, as it automatically handles error propagation and cleanup.
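A minimal sketch of the recommended pipeline() approach, gzipping a sample file created in the system temp directory (the file paths and contents are illustrative):

```javascript
// Compress a file chunk-by-chunk without loading it all into memory.
import { pipeline } from 'node:stream/promises';
import { createReadStream, createWriteStream, writeFileSync, statSync } from 'node:fs';
import { createGzip } from 'node:zlib';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

const src = join(tmpdir(), 'app.log');          // demo input file
writeFileSync(src, 'log line\n'.repeat(1000));  // sample data for the demo

// pipeline() connects the streams and handles error propagation and cleanup
await pipeline(
    createReadStream(src),          // reads in ~64KB chunks by default
    createGzip(),                   // transform stream: compresses each chunk
    createWriteStream(src + '.gz')  // writes compressed output as it arrives
);

console.log(`wrote ${statSync(src + '.gz').size} bytes`);
```

Swapping the source for an HTTP request body or the destination for a response object works the same way, since both are streams.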

How do I deploy a Node.js application to production?

For production deployment, use a process manager like PM2 to keep your application running, handle automatic restarts on crashes, and manage log rotation. Set NODE_ENV=production to enable optimizations. Use a reverse proxy like Nginx or Traefik to handle SSL termination, static files, load balancing, and request buffering. For containerized deployments, create a multi-stage Dockerfile: use node:lts-alpine as the base, copy package.json first and run npm ci --omit=dev for efficient layer caching, then copy source code and set the CMD. Use health checks, configure proper logging (structured JSON to stdout), set memory limits, and implement graceful shutdown by listening for SIGTERM and closing servers and database connections before exiting. Use environment variables for all configuration and never commit secrets to version control.
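The graceful-shutdown part can be sketched in a few lines; the cleanup steps and timeout are placeholders to adapt to your own setup:

```javascript
// Graceful shutdown: stop accepting new connections, drain, then exit.
import http from 'node:http';

const server = http.createServer((req, res) => res.end('ok'));
server.listen(0); // 0 picks a free port for this demo; bind your real PORT in production

function shutdown(signal) {
    console.log(`${signal} received, shutting down...`);
    server.close(() => {
        // close database pools / queue connections here, then exit cleanly
        process.exit(0);
    });
    // Safety valve: force-exit if connections refuse to drain
    setTimeout(() => process.exit(1), 10_000).unref();
}

process.on('SIGTERM', () => shutdown('SIGTERM')); // sent by Docker, PM2, Kubernetes
process.on('SIGINT', () => shutdown('SIGINT'));   // Ctrl+C during development
```

server.close() stops accepting new connections but lets in-flight requests finish, which is what keeps deploys from dropping traffic.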

Related Resources

REST API Design Guide
Design robust RESTful APIs with proper patterns and conventions
Docker Complete Guide
Containerize your Node.js applications for production deployment
JavaScript ES6+ Features
Master modern JavaScript syntax used throughout Node.js
TypeScript Tips and Tricks
Add type safety to your Node.js projects with TypeScript
JSON Formatter
Format and validate JSON data from your API responses
JavaScript Cheat Sheet
Quick reference for JavaScript syntax, methods, and patterns