Node.js
Q. What is Node.js?
Node.js is a free, open source tool that lets you run JavaScript outside the web browser.
With Node.js, you can build fast and scalable applications like web servers, APIs, tools, and more.
Q. Why is Node.js Fast?
Node.js uses an event-driven, non-blocking model.
It can handle many connections at once without waiting for one to finish before starting another.
This makes it great for real-time apps and high-traffic websites.
Q. What is npm?
npm is the package manager for Node.js.
It helps you install and manage third-party packages (libraries) to add more features to your apps.
Node.js vs Browser
| Feature | Node.js | Browser |
|---|---|---|
| File System Access | Yes | No |
| Networking (TCP/UDP) | Yes | No |
| DOM Access | No | Yes |
| Global Object | global | window / self |
| Modules | CommonJS / ESM | ESM / Scripts |
| Environment Variables | Yes (process.env) | No |
| Security | Full OS access | Sandboxed |
| Package Management | npm / yarn | CDN / Bundler |
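Several rows of the table can be checked directly from a Node.js script — a minimal sketch:

```javascript
// In Node.js there is no window or document object,
// but global and process.env are available
console.log(typeof window);       // 'undefined'
console.log(typeof global);       // 'object'
console.log(typeof process.env);  // 'object'
```

Running the same three checks in a browser console flips the first two results.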
Node Cmd Line
Node.js provides a powerful command line interface (CLI) that allows you to run JavaScript files, manage packages, debug applications, and more.
This guide covers the essential commands and techniques every Node.js developer should know.
Basic Node.js Commands
```shell
# Run a JavaScript file
node app.js

# Run with additional arguments
node app.js arg1 arg2

# Run in watch mode (restarts on file changes)
node --watch app.js
```

Debugging Node.js Applications

```shell
# Start with inspector (listens on default port 9229)
node --inspect app.js
```

Common CLI Tools

```shell
# Install and use different Node.js versions
nvm install 18.16.0   # Install specific version
nvm use 18.16.0       # Switch to version
nvm ls                # List installed versions

# Common npm commands
npm init      # Initialize a new project
npm install   # Install dependencies
npm update    # Update packages
npm audit     # Check for vulnerabilities
```

Common Command Line Flags
Basic Flags
```shell
# Show Node.js version
node --version   # or -v

# Show available V8 options
node --v8-options

# Show command-line help
node --help
```

Runtime Behavior

```shell
# Check syntax without executing
node --check app.js

# Show stack traces for warnings
node --trace-warnings app.js

# Set max old-space memory (in MB)
node --max-old-space-size=4096 app.js

# Preload a module before execution
node --require dotenv/config app.js
```

Node Architecture
Node.js uses a single-threaded, event-driven architecture that is designed to handle many connections at once, efficiently and without blocking the main thread.
Node.js Architecture Diagram
- Client Request Phase
Clients send requests to the Node.js server. Each request is added to the Event Queue.
- Event Loop Phase
The Event Loop continuously checks the Event Queue and picks up requests one by one.
- Request Processing
Simple (non-blocking) tasks are handled immediately by the main thread. Complex/blocking tasks are offloaded to the Thread Pool.
- Response Phase
When blocking tasks complete, their callbacks are placed in the Callback Queue. The Event Loop processes these callbacks and sends the responses.
Node Async Programming
Q. What is Asynchronous Programming?
In Node.js, asynchronous operations let your program do other work while waiting for tasks like file I/O or network requests to complete.
This non-blocking approach enables Node.js to handle thousands of concurrent connections efficiently.
Q. Why Use Asynchronous Code?
Asynchronous code lets Node.js handle many requests at once, without waiting for slow operations like file or database access.
This makes Node.js great for servers and real-time apps.
A. Synchronous
a. Blocks execution until complete
b. Simple to understand
c. Can cause delays
d. Uses functions like readFileSync
```javascript
const fs = require('fs');

console.log('1. Starting sync read...');
const data = fs.readFileSync('myfile.txt', 'utf8');
console.log('2. File contents:', data);
console.log('3. Done reading file');
```

Output: 1 → 2 → 3

B. Asynchronous
a. Non-blocking execution
b. Better performance
c. More complex to handle
d. Uses callbacks, promises, or async/await
```javascript
const fs = require('fs');

console.log('1. Starting async read...');
fs.readFile('myfile.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log('2. File contents:', data);
});
console.log('3. Done starting read operation');
```

Output: 1 → 3 → 2 (doesn't wait for the file read to complete)

Node Promises
Promises in Node.js provide a cleaner way to handle asynchronous operations compared to traditional callbacks.
Promises represent the completion (or failure) of an asynchronous operation and its result.
Promise States
- Pending: Initial state, operation not completed
- Fulfilled: Operation completed successfully
- Rejected: Operation failed
Benefits of Using Promises
with Callback
```javascript
getUser(id, (err, user) => {
  if (err) return handleError(err);
  getOrders(user.id, (err, orders) => {
    if (err) return handleError(err);
    // Process orders...
  });
});
```

with Promises

```javascript
getUser(id)
  .then(user => getOrders(user.id))
  .then(orders => processOrders(orders))
  .catch(handleError);
```

Key Advantages:
- Flatter code structure (avoids callback hell)
- Better error handling with a single .catch()
- Easier to compose and chain operations
- Built-in support for parallel operations
Utility Methods
- Promise.all([p1, p2, …])
“All or Nothing”
Success: Resolves only when ALL input promises resolve. It returns an array of the results.
Failure: Rejects immediately if ANY ONE of the promises rejects (“fail-fast”).
It is used to run multiple promises in parallel, and wait for ALL of them to complete. It fails fast if any promise rejects.
```javascript
const p1 = Promise.resolve(10);
const p2 = Promise.resolve(20);
const p3 = Promise.reject("Error!");

// Success scenario
Promise.all([p1, p2]).then(console.log); // [10, 20]

// Failure scenario
Promise.all([p1, p2, p3])
  .then(console.log)
  .catch(err => console.log("Failed because:", err));
// Logs: "Failed because: Error!" (ignores p1 and p2 results)
```

- Promise.allSettled([p1, p2, …])
“Wait for everyone, no matter what”
Behavior: Waits for ALL promises to finish, regardless of whether they succeeded or failed.
Result: Returns an array of objects describing the outcome of each promise.
Use Case: When you want to fetch data from multiple independent sources and you want to display whatever data you got, even if some failed.
```javascript
const p1 = Promise.resolve("Good data");
const p2 = Promise.reject("Bad connection");

Promise.allSettled([p1, p2]).then(results => console.log(results));

/* Output:
[
  { status: 'fulfilled', value: 'Good data' },
  { status: 'rejected', reason: 'Bad connection' }
]
*/
```

- Promise.race([p1, p2, …])
“First one to finish (win or lose)”
Behavior: Settles as soon as the first promise settles.
Result: It takes the result (value or error) of the very first promise to finish.
Use Case: Timeouts. You race a fetch request against a 5-second timer. If the timer finishes first, the race rejects with the timeout (note: the underlying request is not automatically cancelled).
```javascript
const slowFetch = new Promise(resolve => setTimeout(() => resolve("Data"), 5000));
const timeout = new Promise((_, reject) => setTimeout(() => reject("Timed out"), 1000));

Promise.race([slowFetch, timeout])
  .then(console.log)
  .catch(console.error);
// Logs: "Timed out" (because 1s < 5s)
```

Creation Methods
- Promise.resolve(value)
Creates a Promise that is immediately resolved (successful) with the given value.
Use case: When you need to return a Promise to keep an API consistent, even if you already have the value synchronously.
```javascript
const happyPromise = Promise.resolve("Success!");

happyPromise.then(data => console.log(data)); // Logs: "Success!"
```

- Promise.reject(reason)
Creates a Promise that is immediately rejected (failed) with the given reason (usually an Error object).
Use case: Useful for testing error handling or early-exiting an async function with an error.
```javascript
const sadPromise = Promise.reject(new Error("Something went wrong"));

sadPromise.catch(err => console.error(err.message)); // Logs: "Something went wrong"
```

Node Async/await
Async/await is a modern way to handle asynchronous operations in Node.js, building on top of Promises to create even more readable code.
Introduced in Node.js 7.6 and standardized in ES2017, async/await allows you to write asynchronous code that looks and behaves more like synchronous code.
Async/await is basically Promises with a more readable syntax. This makes your code cleaner and more maintainable.
The syntax consists of two keywords:
async: Used to declare an asynchronous function that returns a Promise
await: Used to pause execution until a Promise is resolved, can only be used inside async functions
```javascript
async function getData() {
  console.log('Starting...');
  const result = await someAsyncOperation();
  console.log(`Result: ${result}`);
  return result;
}
```

Async/Await vs Promises vs Callbacks
with callbacks
```javascript
function getUser(userId, callback) {
  setTimeout(() => {
    callback(null, { id: userId, name: 'John' });
  }, 1000);
}

function getUserPosts(user, callback) {
  setTimeout(() => {
    callback(null, ['Post 1', 'Post 2']);
  }, 1000);
}

// Using callbacks
getUser(1, (error, user) => {
  if (error) {
    console.error(error);
    return;
  }
  console.log('User:', user);

  getUserPosts(user, (error, posts) => {
    if (error) {
      console.error(error);
      return;
    }
    console.log('Posts:', posts);
  });
});
```

with promises
```javascript
function getUserPromise(userId) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve({ id: userId, name: 'John' });
    }, 1000);
  });
}

function getUserPostsPromise(user) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve(['Post 1', 'Post 2']);
    }, 1000);
  });
}

// Using promises
getUserPromise(1)
  .then(user => {
    console.log('User:', user);
    return getUserPostsPromise(user);
  })
  .then(posts => {
    console.log('Posts:', posts);
  })
  .catch(error => {
    console.error(error);
  });
```

With Async/Await

```javascript
// Using async/await
async function getUserAndPosts() {
  try {
    const user = await getUserPromise(1);
    console.log('User:', user);

    const posts = await getUserPostsPromise(user);
    console.log('Posts:', posts);
  } catch (error) {
    console.error(error);
  }
}

getUserAndPosts();
```

| Pattern | Pros | Cons |
|---|---|---|
| Callbacks | - Simple to understand - Widely supported | - Callback hell - Error handling is complex - Hard to reason about |
| Promises | - Chaining with .then()- Better error handling - Composable | - Still requires nesting for complex flows - Not as readable as async/await |
| Async/Await | - Clean, synchronous-like code - Easy error handling with try/catch - Easier debugging | - Requires understanding of Promises - Easy to accidentally block execution |
Node.js Error Handling
Q. Why Handle Errors?
Errors are inevitable in any program, but how you handle them makes all the difference. In Node.js, proper error handling is crucial because:
- It prevents applications from crashing unexpectedly
- It provides meaningful feedback to users
- It makes debugging easier with proper error context
- It helps maintain application stability in production
- It ensures resources are properly cleaned up
Common Error Types in Node.js
- Standard JavaScript Errors
```javascript
// SyntaxError
JSON.parse('{invalid json}');

// TypeError
null.someProperty;

// ReferenceError
unknownVariable;
```

- System Errors

```javascript
// ENOENT: No such file or directory
const fs = require('fs');
fs.readFile('nonexistent.txt', (err) => {
  console.error(err.code); // 'ENOENT'
});

// ECONNREFUSED: Connection refused
const http = require('http');
const req = http.get('http://nonexistent-site.com', (res) => {});
req.on('error', (err) => {
  console.error(err.code); // 'ECONNREFUSED' or 'ENOTFOUND'
});
```

Basic Error Handling
Error-First Callbacks
The most common pattern in Node.js core modules where the first argument to a callback is an error object (if any occurred).
```javascript
const fs = require('fs');

function readConfigFile(filename, callback) {
  fs.readFile(filename, 'utf8', (err, data) => {
    if (err) {
      // Handle specific error types
      if (err.code === 'ENOENT') {
        return callback(new Error(`Config file ${filename} not found`));
      } else if (err.code === 'EACCES') {
        return callback(new Error(`No permission to read ${filename}`));
      }
      // For all other errors
      return callback(err);
    }

    // Process data if no error
    try {
      const config = JSON.parse(data);
      callback(null, config);
    } catch (parseError) {
      callback(new Error(`Invalid JSON in ${filename}`));
    }
  });
}

// Usage
readConfigFile('config.json', (err, config) => {
  if (err) {
    console.error('Failed to read config:', err.message);
    // Handle the error (e.g., use default config)
    return;
  }
  console.log('Config loaded successfully:', config);
});
```

Modern Error Handling
Using try…catch with Async/Await
With async/await, you can use try/catch blocks for both synchronous and asynchronous code:
```javascript
const fs = require('fs').promises;

async function loadUserData(userId) {
  try {
    const data = await fs.readFile(`users/${userId}.json`, 'utf8');
    const user = JSON.parse(data);

    if (!user.email) {
      throw new Error('Invalid user data: missing email');
    }

    return user;
  } catch (error) {
    // Handle different error types
    if (error.code === 'ENOENT') {
      throw new Error(`User ${userId} not found`);
    } else if (error instanceof SyntaxError) {
      throw new Error('Invalid user data format');
    }
    // Re-throw other errors
    throw error;
  } finally {
    // Cleanup code that runs whether successful or not
    console.log(`Finished processing user ${userId}`);
  }
}

// Usage
(async () => {
  try {
    const user = await loadUserData(123);
    console.log('User loaded:', user);
  } catch (error) {
    console.error('Failed to load user:', error.message);
    // Handle error (e.g., show to user, retry, etc.)
  }
})();
```

Global Error Handling
For unexpected errors, you can listen for uncaughtException to perform cleanup before exiting:
```javascript
// Note: `server` below is assumed to be your running HTTP server instance

// Handle uncaught exceptions (synchronous errors)
process.on('uncaughtException', (error) => {
  console.error('UNCAUGHT EXCEPTION! Shutting down...');
  console.error(error.name, error.message);

  // Perform cleanup (close database connections, etc.)
  server.close(() => {
    console.log('Process terminated due to uncaught exception');
    process.exit(1); // Exit with failure
  });
});

// Handle unhandled promise rejections
process.on('unhandledRejection', (reason, promise) => {
  console.error('UNHANDLED REJECTION! Shutting down...');
  console.error('Unhandled Rejection at:', promise, 'Reason:', reason);

  // Close server and exit
  server.close(() => {
    process.exit(1);
  });
});

// Example of an unhandled promise rejection
Promise.reject(new Error('Something went wrong'));

// Example of an uncaught exception
setTimeout(() => {
  throw new Error('Uncaught exception after timeout');
}, 1000);
```

Custom Error Types
```javascript
class ValidationError extends Error {
  constructor(message, field) {
    super(message);
    this.name = 'ValidationError';
    this.field = field;
    this.statusCode = 400;
  }
}

class NotFoundError extends Error {
  constructor(resource) {
    super(`${resource} not found`);
    this.name = 'NotFoundError';
    this.statusCode = 404;
  }
}

// Usage
function getUser(id) {
  if (!id) {
    throw new ValidationError('User ID is required', 'id');
  }
  // ...
}
```

Module Basics
Modules are the building blocks of Node.js applications, allowing you to organize code into logical, reusable components. They help in:
- Organizing code into manageable files
- Encapsulating functionality
- Preventing global namespace pollution
- Improving code maintainability and reusability
Creating and Exporting Modules
In Node.js, any file with a .js extension is a module. You can export functionality from a module in several ways:
- Exporting Multiple Items
```javascript
// Exporting multiple functions
const getCurrentDate = () => new Date().toISOString();

const formatCurrency = (amount, currency = 'USD') => {
  return new Intl.NumberFormat('en-US', {
    style: 'currency',
    currency: currency
  }).format(amount);
};

// Method 1: Exporting multiple items
exports.getCurrentDate = getCurrentDate;
exports.formatCurrency = formatCurrency;

// Method 2: Exporting an object with multiple properties
// module.exports = { getCurrentDate, formatCurrency };
```

- Exporting a Single Item
To export a single item (function, object, etc.), assign it to module.exports:
```javascript
class Logger {
  constructor(name) {
    this.name = name;
  }

  log(message) {
    console.log(`[${this.name}] ${message}`);
  }

  error(error) {
    console.error(`[${this.name}] ERROR:`, error.message);
  }
}

// Exporting a single class
module.exports = Logger;
```

- Using Your Modules
Import and use your custom modules using require() with a relative or absolute path:
```javascript
const http = require('http');
const path = require('path');

// Importing custom modules
const { getCurrentDate, formatCurrency } = require('./utils');
const Logger = require('./logger');

// Create a logger instance
const logger = new Logger('App');

// Create server
const server = http.createServer((req, res) => {
  try {
    logger.log(`Request received for ${req.url}`);

    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.write(`<h1>Welcome to our app!</h1>`);
    res.write(`<p>Current date: ${getCurrentDate()}</p>`);
    res.write(`<p>Formatted amount: ${formatCurrency(99.99)}</p>`);
    res.end();
  } catch (error) {
    logger.error(error);
    res.writeHead(500, { 'Content-Type': 'text/plain' });
    res.end('Internal Server Error');
  }
});

// Start server
const PORT = process.env.PORT || 3000;
server.listen(PORT, () => {
  logger.log(`Server running at http://localhost:${PORT}`);
});
```

- Module Loading and Caching
Node.js caches modules after the first time they are loaded. This means that subsequent require() calls return the cached version.
Module Resolution
When you require a module, Node.js looks for it in this order:
- Core Node.js modules (like fs, http)
- Local files, when the specifier starts with ./, ../ or /
- node_modules folders, searched from the current directory upward
Node ES Modules
ES Modules (ESM) is the official standard format for packaging JavaScript code for reuse.
It was introduced in ES6 (ES2015) and is now supported in Node.js.
Prior to ES Modules, Node.js exclusively used the CommonJS module format (require/exports).
Now developers can choose between CommonJS and ES Modules based on their project needs.
| Feature | CommonJS | ES Modules |
|---|---|---|
| File Extension | .js (default) | .mjs (or .js with proper config) |
| Import Syntax | require() | import |
| Export Syntax | module.exports / exports | export / export default |
| Import Timing | Dynamic (runtime) | Static (parsed before execution) |
| Top-level Await | Not supported | Supported |
| File Extensions | Optional | Required for relative imports |
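The top-level await row is easy to verify from the command line. A sketch using --input-type=module, which makes Node.js parse the evaluated string as an ES Module:

```shell
node --input-type=module --eval "const v = await Promise.resolve(42); console.log(v)"
```

The same string fails under the default CommonJS evaluation, because await is only legal at the top level of a module.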
CommonJS Module
```javascript
// math.js (CommonJS)
function add(a, b) {
  return a + b;
}

function subtract(a, b) {
  return a - b;
}

module.exports = {
  add,
  subtract
};

// app.js (CommonJS)
const math = require('./math');
console.log(math.add(5, 3)); // 8
```

ES Module

```javascript
// math.mjs (ES Module)
export function add(a, b) {
  return a + b;
}

export function subtract(a, b) {
  return a - b;
}

// app.mjs (ES Module)
import { add, subtract } from './math.mjs';
console.log(add(5, 3)); // 8
```

Enabling ES Modules
- Using the .mjs File Extension
The simplest way is to use the .mjs extension for your files.
Node.js will automatically treat these files as ES Modules.
- Setting “type”: “module” in package.json
To use ES Modules with regular .js files, add the following to your package.json:
```json
{
  "name": "my-package",
  "version": "1.0.0",
  "type": "module"
}
```

With this setting, all .js files in your project will be treated as ES Modules.
- Using the --input-type=module Flag

The --input-type=module flag tells Node.js to treat string input (supplied via --eval or STDIN) as an ES Module; it does not apply to files on disk:

```shell
node --input-type=module --eval "import { version } from 'node:process'; console.log(version)"
```

Node NPM
Q. What is NPM?
NPM is a package manager for Node.js packages, or modules if you like.
www.npmjs.com hosts thousands of free packages to download and use.
The NPM program is installed on your computer when you install Node.js.
Q. What is a Package?
A package in Node.js contains all the files you need for a module.
Modules are JavaScript libraries you can include in your project.
Global Packages
Packages can be installed globally, making them available as command-line tools anywhere on your system.
Global packages are typically used for CLI tools and utilities.
- Install a package globally

```shell
npm install -g package-name
```

- Update a global package

```shell
npm update -g package-name
```

- Uninstall a global package

```shell
npm uninstall -g package-name
```

Node Package.json
Q. What is package.json?
package.json is a special file that describes your Node.js project.
It contains information about your app, such as its name, version, dependencies, scripts, and more.
This file is essential for managing and sharing Node.js projects, especially when using npm (Node Package Manager).
- Creating package.json
You can create a package.json file by running the following command in your project folder:
```shell
npm init
```

For a quick setup with default values, use:

```shell
npm init -y
```

Example:

```json
{
  "name": "my-node-app",
  "version": "1.0.0",
  "description": "A simple Node.js app",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "author": "Your Name",
  "license": "ISC"
}
```

Dependencies
```json
"dependencies": {
  "express": "^4.18.2",
  "mongoose": "~7.0.0",
  "lodash": "4.17.21"
}
```

Dev Dependencies

```json
"devDependencies": {
  "nodemon": "^2.0.22",
  "jest": "^29.5.0",
  "eslint": "^8.38.0"
}
```

Version Ranges
^4.17.21 - Compatible with 4.x.x (up to but not including 5.0.0)
~4.17.21 - Patch updates only (4.17.x)
4.17.21 - Exact version
latest - Latest stable version
- Adding Dependencies

```shell
# Install and save to dependencies
npm install package-name

# Install and save to devDependencies
npm install --save-dev package-name

# Install exact version
npm install package-name@1.2.3
```

- Running Scripts

```shell
# Run a script defined in package.json
npm run script-name

# Run start script (can be called with just 'npm start')
npm start

# Run test script (can be called with just 'npm test')
npm test
```

Node Publish Packages
Q. What Does it Mean to Publish a Package?
Publishing a package means making your Node.js module or project available for others to install and use via the npm registry.
This is how open-source libraries and tools are shared with the Node.js community.
When you publish a package, it becomes available for anyone to install using npm install your-package-name.
Preparing Your Package
- Initialize Package
Create a new directory and initialize your package:
```shell
mkdir my-package
cd my-package
npm init -y
```

- Essential Files
A package should include these key files:
package.json - Metadata about your package
README.md - Documentation (supports Markdown)
index.js - Main entry point (or specify in package.json)
LICENSE - Terms of use (MIT, ISC, etc.)
.gitignore - To exclude node_modules, logs, etc.
.npmignore - Optional, to exclude files from the published package

- Package.json Essentials
Ensure your package.json has these minimum fields:
```json
{
  "name": "your-package-name",
  "version": "1.0.0",
  "description": "A brief description of your package",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": ["keyword1", "keyword2"],
  "author": "Your Name <your.email@example.com>",
  "license": "MIT"
}
```

Creating an npm Account
a. Sign Up: Create an account at npmjs.com/signup if you don't have one.
b. Verify Your Email: Check your email and verify your account before publishing.
c. Login via CLI: Open your terminal and run:
```shell
npm login
```

You'll be prompted for:

- Username
- Password
- Email (must match your npm account)
- One-time password (if you have 2FA enabled)
- Check Login Status
```shell
npm whoami
```

Publishing Your Package
a. Check Name Availability
```shell
npm view <package-name>
```

If a package with that name does not already exist, you can use that name.
If it does, you’ll need to choose a different name in your package.json.
b. Test Package Locally: Before publishing, test your package locally:
```shell
# In your package directory
npm link

# In another project directory
npm link <package-name>
```

- Publish to npm Registry

```shell
# First, make sure you're in the right directory
cd path/to/your/package

# Publish to the public npm registry
npm publish
```

- Publish with a Specific Tag

```shell
npm publish --tag beta
```

- Publish a Scoped Package Publicly (scoped packages are private by default)

```shell
npm publish --access public
```

Updating Your Package
- Update the Version Number

Use semantic versioning (SemVer) to update your package version:

```shell
# For a patch release (bug fixes)
npm version patch

# For a minor release (backward-compatible features)
npm version minor

# For a major release (breaking changes)
npm version major
```

- Update Changelog
Update your CHANGELOG.md to document the changes in this version.
- Publish the Update
```shell
npm publish
```

- Tag the Release (Optional)
If you’re using Git, create a tag for the release:
```shell
git tag -a v1.0.0 -m "Initial release"
git push origin v1.0.0
```

Managing Published Packages

```shell
# Unpublish a specific version
npm unpublish <package-name>@<version>

# Unpublish the entire package (only works within 72 hours of publishing)
npm unpublish <package-name> --force
```

Transferring Ownership
To transfer a package to another user or organization:
```shell
npm owner add <username> <package-name>
```

Core Modules
HTTP Module
Node.js includes a powerful built-in HTTP module that enables you to create HTTP servers and make HTTP requests.
This module is essential for building web applications and APIs in Node.js.
Key Features
- Create HTTP servers to handle requests and send responses
- Make HTTP requests to other servers
- Handle different HTTP methods (GET, POST, PUT, DELETE, etc.)
- Work with request and response headers
- Handle streaming data for large payloads
Creating an HTTP Server
The HTTP module’s createServer() method creates an HTTP server that listens for requests on a specified port and executes a callback function for each request.
```javascript
// Import the HTTP module
const http = require('http');

// Create a server object
const server = http.createServer((req, res) => {
  // Set the response HTTP header with HTTP status and Content type
  res.writeHead(200, { 'Content-Type': 'text/plain' });

  // Send the response body as 'Hello, World!'
  res.end('Hello, World!\n');
});

// Define the port to listen on
const PORT = 3000;

// Start the server and listen on the specified port
server.listen(PORT, 'localhost', () => {
  console.log(`Server running at http://localhost:${PORT}/`);
});
```

Running the Server

Save the code in a file named server.js.
Run the server using Node.js:

```shell
node server.js
```

Common Response Headers
Content-Type: Specifies the media type of the content (e.g., text/html, application/json)
Content-Length: The length of the response body in bytes
Location: Used in redirects (with 3xx status codes)
Set-Cookie: Sets HTTP cookies on the client
Cache-Control: Directives for caching mechanisms
Access-Control-Allow-Origin: For CORS support

HTTPS Module
The HTTPS module is a core Node.js module that provides an implementation of the HTTPS protocol, which is essentially HTTP over TLS/SSL.
It’s a secure version of the HTTP module, providing encrypted communication between clients and servers.
Q. Why Use HTTPS?
HTTPS is crucial for modern web applications because it:
Encrypts Data: Protects sensitive information like passwords, credit card numbers, and personal data from eavesdropping
Authenticates Servers: Verifies that clients are communicating with the intended server
Ensures Data Integrity: Prevents data from being modified or corrupted during transfer
Builds Trust: Visual indicators (like the padlock icon) increase user confidence
Improves SEO: Search engines prioritize HTTPS websites in search results

Enables Modern Features: Many web APIs (like Geolocation, Service Workers) require HTTPS
How HTTPS Works
- Client initiates a secure connection to the server
- Server presents its SSL/TLS certificate to the client
- Client verifies the certificate with a trusted Certificate Authority (CA)
- Encrypted session is established using asymmetric encryption
- Symmetric encryption is used for the actual data transfer
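For local development you can generate the key.pem and cert.pem files used in the server examples below with OpenSSL. This produces a self-signed certificate that browsers will warn about, suitable only for testing:

```shell
# Generate a self-signed certificate and private key valid for 365 days
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem \
  -days 365 -subj "/CN=localhost"
```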
Creating an HTTPS Server
```javascript
const https = require('https');
const fs = require('fs');
const path = require('path');

// Path to your SSL/TLS certificate and key
const sslOptions = {
  key: fs.readFileSync(path.join(__dirname, 'key.pem')),
  cert: fs.readFileSync(path.join(__dirname, 'cert.pem')),
  // Enable all security features
  minVersion: 'TLSv1.2',
  // Recommended security settings
  secureOptions: require('constants').SSL_OP_NO_SSLv3 |
    require('constants').SSL_OP_NO_TLSv1 |
    require('constants').SSL_OP_NO_TLSv1_1
};

// Create the HTTPS server
const server = https.createServer(sslOptions, (req, res) => {
  // Security headers
  res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
  res.setHeader('X-Content-Type-Options', 'nosniff');
  res.setHeader('X-Frame-Options', 'SAMEORIGIN');
  res.setHeader('X-XSS-Protection', '1; mode=block');
  res.setHeader('Referrer-Policy', 'strict-origin-when-cross-origin');

  // Handle different routes
  if (req.url === '/') {
    res.writeHead(200, { 'Content-Type': 'text/html; charset=utf-8' });
    res.end('<h1>Welcome to the Secure Server</h1><p>Your connection is encrypted!</p>');
  } else if (req.url === '/api/status') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok', time: new Date().toISOString() }));
  } else {
    res.writeHead(404, { 'Content-Type': 'text/plain' });
    res.end('404 Not Found');
  }
});

// Handle server errors
server.on('error', (error) => {
  console.error('Server error:', error);
});

// Start the server on port 3000 (HTTPS default is 443 but requires root)
const PORT = process.env.PORT || 3000;
server.listen(PORT, '0.0.0.0', () => {
  console.log(`Server running at https://localhost:${PORT}`);
  console.log('Press Ctrl+C to stop the server');
});
```

HTTPS Server with Express.js
While you can use the core HTTPS module directly, most Node.js applications use a web framework like Express.js to handle HTTP/HTTPS requests.
```javascript
const express = require('express');
const https = require('https');
const fs = require('fs');
const path = require('path');
const helmet = require('helmet'); // Security middleware

// Create Express app
const app = express();

// Security middleware
app.use(helmet());

// Parse JSON and URL-encoded bodies
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

// Serve static files from 'public' directory
app.use(express.static(path.join(__dirname, 'public'), {
  dotfiles: 'ignore',
  etag: true,
  extensions: ['html', 'htm'],
  index: 'index.html',
  maxAge: '1d',
  redirect: true
}));

// Routes
app.get('/', (req, res) => {
  res.send('<h1>Welcome to Secure Express Server</h1>');
});

app.get('/api/status', (req, res) => {
  res.json({
    status: 'operational',
    timestamp: new Date().toISOString(),
    environment: process.env.NODE_ENV || 'development',
    nodeVersion: process.version
  });
});

// Error handling middleware
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).json({ error: 'Something went wrong!' });
});

// 404 handler
app.use((req, res) => {
  res.status(404).json({ error: 'Not Found' });
});

// SSL/TLS options
const sslOptions = {
  key: fs.readFileSync(path.join(__dirname, 'key.pem')),
  cert: fs.readFileSync(path.join(__dirname, 'cert.pem')),
  // Enable HTTP/2 if available
  allowHTTP1: true,
  // Recommended security options
  minVersion: 'TLSv1.2',
  ciphers: [
    'TLS_AES_256_GCM_SHA384',
    'TLS_CHACHA20_POLY1305_SHA256',
    'TLS_AES_128_GCM_SHA256',
    'ECDHE-RSA-AES128-GCM-SHA256',
    '!DSS', '!aNULL', '!eNULL', '!EXPORT',
    '!DES', '!RC4', '!3DES', '!MD5', '!PSK'
  ].join(':'),
  honorCipherOrder: true
};

// Create HTTPS server
const PORT = process.env.PORT || 3000;
const server = https.createServer(sslOptions, app);

// Handle unhandled promise rejections
process.on('unhandledRejection', (reason, promise) => {
  console.error('Unhandled Rejection at:', promise, 'reason:', reason);
});

// Handle uncaught exceptions
process.on('uncaughtException', (error) => {
  console.error('Uncaught Exception:', error);
  // Perform cleanup and exit if needed
  process.exit(1);
});

// Graceful shutdown
const gracefulShutdown = (signal) => {
  console.log(`\nReceived ${signal}. Shutting down gracefully...`);

  server.close(() => {
    console.log('HTTP server closed.');
    // Close database connections, etc.
    process.exit(0);
  });

  // Force close server after 10 seconds
  setTimeout(() => {
    console.error('Forcing shutdown...');
    process.exit(1);
  }, 10000);
};

// Listen for shutdown signals
process.on('SIGTERM', gracefulShutdown);
process.on('SIGINT', gracefulShutdown);

// Start the server
const HOST = process.env.HOST || '0.0.0.0';
server.listen(PORT, HOST, () => {
  console.log(`Express server running at https://${HOST}:${PORT}`);
  console.log('Environment:', process.env.NODE_ENV || 'development');
  console.log('Press Ctrl+C to stop the server');
});
```

Using Environment Variables
It’s a best practice to use environment variables for configuration. Create a .env file:
```
NODE_ENV=development
PORT=3000
HOST=0.0.0.0
SSL_KEY_PATH=./key.pem
SSL_CERT_PATH=./cert.pem
```

Then use the dotenv package to load them:
```js
require('dotenv').config();

// Access environment variables
const PORT = process.env.PORT || 3000;
const HOST = process.env.HOST || '0.0.0.0';
```

| Feature | HTTP | HTTPS |
|---|---|---|
| Data Encryption | No (plain text) | Yes (encrypted) |
| Server Authentication | No | Yes (via certificates) |
| Data Integrity | No protection | Protected (tampering detected) |
| Default Port | 80 | 443 |
| Performance | Faster | Slight overhead (but optimized with HTTP/2) |
| SEO Ranking | Lower | Higher (Google prefers HTTPS) |
| Setup Complexity | Simpler | More complex (requires certificates) |
File System (fs)
The Node.js File System module (fs) provides a comprehensive set of methods for working with the file system on your computer.
It allows you to perform file I/O operations in both synchronous and asynchronous ways.
- Reading Files
Node.js provides several methods to read files, including both callback-based and promise-based approaches.
The most common method is fs.readFile().
with callbacks
```js
const fs = require('fs');

// Read file asynchronously with callback
fs.readFile('myfile.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  console.log('File content:', data);
});

// For binary data (like images), omit the encoding
fs.readFile('image.png', (err, data) => {
  if (err) throw err;
  // data is a Buffer containing the file content
  console.log('Image size:', data.length, 'bytes');
});
```

with Promises
```js
// Using fs.promises (Node.js 10.0.0+)
const fs = require('fs').promises;

async function readFileExample() {
  try {
    const data = await fs.readFile('myfile.txt', 'utf8');
    console.log('File content:', data);
  } catch (err) {
    console.error('Error reading file:', err);
  }
}

readFileExample();

// Or with util.promisify (Node.js 8.0.0+)
const { promisify } = require('util');
const readFileAsync = promisify(require('fs').readFile);

async function readWithPromisify() {
  try {
    const data = await readFileAsync('myfile.txt', 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readWithPromisify();
```

Reading Files Synchronously
For simple scripts, you can use synchronous methods, but avoid them in production servers as they block the event loop:
```js
const fs = require('fs');

try {
  // Read file synchronously
  const data = fs.readFileSync('myfile.txt', 'utf8');
  console.log('File content:', data);
} catch (err) {
  console.error('Error reading file:', err);
}
```

- Creating and Writing Files
Node.js provides several methods for creating and writing to files.
a. Using fs.writeFile()
Creates a new file or overwrites an existing file with the specified content:
```js
const fs = require('fs').promises;

async function writeFileExample() {
  try {
    // Write text to a file
    await fs.writeFile('myfile.txt', 'Hello, World!', 'utf8');

    // Write JSON data
    const data = { name: 'John', age: 30, city: 'New York' };
    await fs.writeFile('data.json', JSON.stringify(data, null, 2), 'utf8');

    console.log('Files created successfully');
  } catch (err) {
    console.error('Error writing files:', err);
  }
}

writeFileExample();
```

b. Using fs.appendFile()
Appends content to a file, creating the file if it doesn’t exist:
```js
const fs = require('fs').promises;

async function appendToFile() {
  try {
    // Append a timestamped log entry
    const logEntry = `${new Date().toISOString()}: Application started\n`;
    await fs.appendFile('app.log', logEntry, 'utf8');

    console.log('Log entry added');
  } catch (err) {
    console.error('Error appending to file:', err);
  }
}

appendToFile();
```

c. Using File Handles
For more control over file operations, you can use file handles:
```js
const fs = require('fs').promises;

async function writeWithFileHandle() {
  let fileHandle;

  try {
    // Open a file for writing (creates if it doesn't exist)
    fileHandle = await fs.open('output.txt', 'w');

    // Write content to the file
    await fileHandle.write('First line\n');
    await fileHandle.write('Second line\n');
    await fileHandle.write('Third line\n');

    console.log('Content written successfully');
  } catch (err) {
    console.error('Error writing to file:', err);
  } finally {
    // Always close the file handle
    if (fileHandle) {
      await fileHandle.close();
    }
  }
}

writeWithFileHandle();
```

d. Using Streams for Large Files
For writing large amounts of data, use streams to avoid high memory usage:
```js
const fs = require('fs');
const { pipeline } = require('stream/promises');
const { Readable } = require('stream');

async function writeLargeFile() {
  // Create a readable stream (could be from an HTTP request, etc.)
  const data = Array(1000).fill().map((_, i) => `Line ${i + 1}: ${'x'.repeat(100)}\n`);
  const readable = Readable.from(data);

  // Create a writable stream to a file
  const writable = fs.createWriteStream('large-file.txt');

  try {
    // Pipe the data from readable to writable
    await pipeline(readable, writable);
    console.log('Large file written successfully');
  } catch (err) {
    console.error('Error writing file:', err);
  }
}

writeLargeFile();
```

File Flags: When opening files, you can specify different modes:

- 'w' - Open for writing (file is created or truncated)
- 'wx' - Like 'w' but fails if the path exists
- 'w+' - Open for reading and writing (file is created or truncated)
- 'a' - Open for appending (file is created if it doesn't exist)
- 'ax' - Like 'a' but fails if the path exists
- 'r+' - Open for reading and writing (file must exist)

- Deleting Files and Directories
Node.js provides several methods to delete files and directories.
a. Deleting a Single File
Use fs.unlink() to delete a file:
```js
const fs = require('fs').promises;

async function deleteFile() {
  const filePath = 'file-to-delete.txt';

  try {
    // Check if file exists before deleting
    await fs.access(filePath);

    // Delete the file
    await fs.unlink(filePath);
    console.log('File deleted successfully');
  } catch (err) {
    if (err.code === 'ENOENT') {
      console.log('File does not exist');
    } else {
      console.error('Error deleting file:', err);
    }
  }
}

deleteFile();
```

b. Deleting Multiple Files
To delete multiple files, you can use Promise.all() with fs.unlink():
```js
const fs = require('fs').promises;

async function deleteFiles() {
  const filesToDelete = ['temp1.txt', 'temp2.txt', 'temp3.txt'];

  try {
    // Delete all files in parallel
    await Promise.all(
      filesToDelete.map(file =>
        fs.unlink(file).catch(err => {
          if (err.code !== 'ENOENT') {
            console.error(`Error deleting ${file}:`, err);
          }
        })
      )
    );

    console.log('Files deleted successfully');
  } catch (err) {
    console.error('Error during file deletion:', err);
  }
}

deleteFiles();
```

c. Deleting Directories
To delete directories, you have several options depending on your needs:
```js
const fs = require('fs').promises;

async function deleteDirectory(dirPath) {
  try {
    // Check if the directory exists
    const stats = await fs.stat(dirPath);

    if (!stats.isDirectory()) {
      console.log('Path is not a directory');
      return;
    }

    // For Node.js 14.14.0+ (recommended)
    await fs.rm(dirPath, { recursive: true, force: true });

    // For older Node.js versions (deprecated but still works)
    // await fs.rmdir(dirPath, { recursive: true });

    console.log('Directory deleted successfully');
  } catch (err) {
    if (err.code === 'ENOENT') {
      console.log('Directory does not exist');
    } else {
      console.error('Error deleting directory:', err);
    }
  }
}

// Usage
deleteDirectory('directory-to-delete');
```

- Renaming and Moving Files
The fs.rename() method can be used for both renaming and moving files.
It’s a versatile method for file system operations that involve changing file paths.
a. Basic File Renaming
To rename a file in the same directory:
```js
const fs = require('fs').promises;

async function renameFile() {
  const oldPath = 'old-name.txt';
  const newPath = 'new-name.txt';

  try {
    // Check if source file exists
    await fs.access(oldPath);

    // Check if destination file already exists
    try {
      await fs.access(newPath);
      console.log('Destination file already exists');
      return;
    } catch (err) {
      // Destination doesn't exist, safe to proceed
    }

    // Perform the rename
    await fs.rename(oldPath, newPath);
    console.log('File renamed successfully');
  } catch (err) {
    if (err.code === 'ENOENT') {
      console.log('Source file does not exist');
    } else {
      console.error('Error renaming file:', err);
    }
  }
}

// Usage
renameFile();
```

b. Moving Files Between Directories
You can use fs.rename() to move files between directories:
```js
const fs = require('fs').promises;
const path = require('path');

async function moveFile() {
  const sourceFile = 'source/file.txt';
  const targetDir = 'destination';
  const targetFile = path.join(targetDir, 'file.txt');

  try {
    // Ensure source file exists
    await fs.access(sourceFile);

    // Create target directory if it doesn't exist
    await fs.mkdir(targetDir, { recursive: true });

    // Move the file
    await fs.rename(sourceFile, targetFile);

    console.log('File moved successfully');
  } catch (err) {
    if (err.code === 'ENOENT') {
      console.log('Source file does not exist');
    } else if (err.code === 'EXDEV') {
      console.log('Cross-device move detected, using copy+delete fallback');
      await moveAcrossDevices(sourceFile, targetFile);
    } else {
      console.error('Error moving file:', err);
    }
  }
}

// Helper function for cross-device moves
async function moveAcrossDevices(source, target) {
  try {
    // Copy the file
    await fs.copyFile(source, target);

    // Delete the original
    await fs.unlink(source);

    console.log('File moved across devices successfully');
  } catch (err) {
    // Clean up if something went wrong
    try { await fs.unlink(target); } catch (e) {}
    throw err;
  }
}

// Usage
moveFile();
```

Path Module
Q. What is the Path Module?
The Path module is a built-in Node.js module that provides tools for handling and transforming file paths across different operating systems.
Since Windows uses backslashes (\) and POSIX systems (Linux, macOS) use forward slashes (/), the Path module helps you write cross-platform code that works correctly on any system.
Path Module Methods
- path.basename()
Returns the last portion of a path, similar to the Unix basename command.
```js
const path = require('path');

// Get filename from a path
const filename = path.basename('/users/docs/file.txt');
console.log(filename); // 'file.txt'

// Get filename without extension
const filenameWithoutExt = path.basename('/users/docs/file.txt', '.txt');
console.log(filenameWithoutExt); // 'file'
```

__dirname and __filename
In Node.js, __dirname and __filename are special variables available in CommonJS modules that provide the directory name and file name of the current module.
CommonJS:

```js
// CommonJS module (e.g., app.js)
const path = require('path');

// Get the directory name of the current module
console.log('Directory name:', __dirname);

// Get the file name of the current module
console.log('File name:', __filename);

// Building paths relative to the current module
const configPath = path.join(__dirname, 'config', 'app-config.json');
console.log('Config file path:', configPath);

// Getting the directory name using path.dirname()
console.log('Directory using path.dirname():', path.dirname(__filename));
```

ES Modules:
```js
// ES Module (e.g., app.mjs or "type": "module" in package.json)
import { fileURLToPath } from 'url';
import { dirname } from 'path';

// Get the current module's URL
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

console.log('ES Module file path:', __filename);
console.log('ES Module directory:', __dirname);

// Example with dynamic imports
async function loadConfig() {
  const configPath = new URL('../config/app-config.json', import.meta.url);
  const config = await import(configPath, { with: { type: 'json' } });
  return config;
}
```

- path.extname()
Returns the extension of a path, from the last occurrence of the . character to the end of the string.
```js
const path = require('path');

const extension = path.extname('file.txt');
console.log(extension); // '.txt'

console.log(path.extname('index.html'));      // '.html'
console.log(path.extname('index.coffee.md')); // '.md'
console.log(path.extname('index.'));          // '.'
console.log(path.extname('index'));           // ''
console.log(path.extname('.index'));          // ''
```

- path.join()
Joins all given path segments together using the platform-specific separator as a delimiter, then normalizes the resulting path.
```js
const path = require('path');

// Join path segments
const fullPath = path.join('/users', 'docs', 'file.txt');
console.log(fullPath); // '/users/docs/file.txt' (separator depends on OS)

// Handle relative paths and navigation
console.log(path.join('/users', '../system', './logs', 'file.txt')); // '/system/logs/file.txt' on POSIX

// Handle multiple slashes
console.log(path.join('users', '//docs', 'file.txt')); // Normalizes slashes
```

- path.resolve()
Resolves a sequence of paths or path segments into an absolute path, processing from right to left until an absolute path is constructed.
```js
const path = require('path');

// 1. Resolve relative to the current working directory
console.log(path.resolve('file.txt'));

// 2. Resolve with multiple segments
console.log(path.resolve('/users', 'docs', 'file.txt')); // '/users/docs/file.txt'

// 3. Right-to-left processing
console.log(path.resolve('/first', '/second', 'third')); // '/second/third'

// 4. Using __dirname for module-relative paths
console.log(path.resolve(__dirname, 'config', 'app.json'));
```

- path.parse()
Returns an object whose properties represent significant elements of the path.
```js
const path = require('path');

// Parse a file path
const pathInfo = path.parse('/users/docs/file.txt');
console.log(pathInfo);
/* Output on Unix/macOS:
{
  root: '/',
  dir: '/users/docs',
  base: 'file.txt',
  ext: '.txt',
  name: 'file'
}
*/

// Accessing parsed components
console.log('Directory:', pathInfo.dir);  // /users/docs
console.log('Filename:', pathInfo.base);  // file.txt
console.log('Name only:', pathInfo.name); // file
console.log('Extension:', pathInfo.ext);  // .txt
```

OS Module
The OS module in Node.js provides a powerful set of utilities for interacting with the underlying operating system.
It offers a cross-platform way to access system-related information and perform common operating system tasks.
Key Features:

- Retrieve system information (CPU, memory, platform, etc.)
- Access user and network information
- Work with file paths and directories in a cross-platform way
- Monitor system resources and performance
- Handle operating system signals and errors
Methods
- os.platform() & os.type()
Identifies the operating system the code is running on.
```js
const os = require('os');

console.log(os.platform());
// Output: 'win32' (Windows), 'darwin' (macOS), or 'linux'

console.log(os.type());
// Output: 'Windows_NT', 'Darwin', or 'Linux'
```

- os.homedir()
Returns the path to the current user’s home directory. This is safer than hardcoding paths like C:\Users\Name.
```js
console.log(os.homedir());
// Output Windows: 'C:\Users\Alice'
// Output Mac/Linux: '/home/alice'
```

- os.cpus()
Returns an array of objects containing information about each logical CPU core (model, speed, times).
```js
const cpus = os.cpus();
console.log(cpus.length); // Output: 8 (if you have an 8-core CPU)
```

- os.totalmem() & os.freemem()
Returns system memory in bytes. Divide by powers of 1024 to convert to KB, MB, or GB.
```js
const totalGB = os.totalmem() / (1024 * 1024 * 1024);
const freeGB = os.freemem() / (1024 * 1024 * 1024);

console.log(`Total RAM: ${totalGB.toFixed(2)} GB`);
console.log(`Free RAM: ${freeGB.toFixed(2)} GB`);
```

- os.uptime()
Returns the system uptime in seconds (how long the computer/server has been running).
```js
const uptimeSeconds = os.uptime();
console.log(`System has been up for ${uptimeSeconds} seconds`);
```

Useful Constant: os.EOL
Stands for End Of Line. It returns the specific line break character for the operating system.
- Windows uses \r\n
- POSIX (Mac/Linux) uses \n
Use Case: When writing files (logs, text files) that need to be readable on the specific OS running the code.
```js
const os = require('os');
const fs = require('fs');

// Writes a new line correctly regardless of OS
fs.appendFileSync('log.txt', 'Log entry' + os.EOL);
```

URL Module
The URL module provides utilities for URL resolution and parsing.
It can be used to split up a web address into readable parts, construct URLs, and handle different URL components.
- new URL(string) (Parsing)
Parses a URL string into an object, breaking it down into its components.
```js
const myUrl = new URL('https://user:pass@sub.example.com:8080/p/a/t/h?query=string#hash');
```

- Accessing Components
Once parsed, you can access specific parts of the URL directly.
```js
console.log(myUrl.hostname); // 'sub.example.com'
console.log(myUrl.pathname); // '/p/a/t/h'
console.log(myUrl.protocol); // 'https:'
console.log(myUrl.port);     // '8080'
```

- searchParams (Handling Query Strings)
This is the most powerful feature. It allows you to read and modify the ?key=value part of a URL without doing string manipulation.
```js
const site = new URL('https://example.com/search?q=nodejs&sort=asc');

// Read params
console.log(site.searchParams.get('q')); // 'nodejs'

// Add/Modify params
site.searchParams.append('page', '2');
console.log(site.href);
// Output: 'https://example.com/search?q=nodejs&sort=asc&page=2'
```

- Modifying URLs
You can change specific properties, and the full URL updates automatically.
```js
const myUrl = new URL('https://example.com/home');

myUrl.pathname = '/about';
myUrl.port = '3000';

console.log(myUrl.href);
// Output: 'https://example.com:3000/about'
```

- url.format(object) / .toString()
Converts a URL object back into a string.
```js
const myUrl = new URL('https://example.com');
console.log(myUrl.toString());
// Output: 'https://example.com/'
```

Useful Helper: fileURLToPath
If you are using ES Modules (where you use import instead of require), you often need to convert a file URL to a standard path string to use with the fs or path modules.
```js
const { fileURLToPath } = require('url');

const __filename = fileURLToPath('file:///C:/path/to/file.js');
console.log(__filename);
// Output on Windows: 'C:\path\to\file.js'
```

Events Module
Every action on a computer is an event, like when a connection is made or a file is opened.
Objects in Node.js can fire events, like the readStream object fires events when opening and closing a file:
```js
// Import the events module
const EventEmitter = require('events');

// Create an event emitter instance
const myEmitter = new EventEmitter();

// Register an event listener
myEmitter.on('greet', () => {
  console.log('Hello there!');
});

// Emit the event
myEmitter.emit('greet'); // Outputs: Hello there!
```

- EventEmitter Class
The EventEmitter class is fundamental to Node.js’s event-driven architecture.
It provides the ability to create and handle custom events.
```js
let events = require('events');
let eventEmitter = new events.EventEmitter();
```

- EventEmitter Object
You can assign event handlers to your own events with the EventEmitter object.
In the example below we have created a function that will be executed when a “scream” event is fired.
To fire an event, use the emit() method.
```js
let events = require('events');
let eventEmitter = new events.EventEmitter();

// Create an event handler:
let myEventHandler = function () {
  console.log('I hear a scream!');
};

// Assign the event handler to an event:
eventEmitter.on('scream', myEventHandler);

// Fire the 'scream' event:
eventEmitter.emit('scream');
```

Common EventEmitter Patterns
- Passing Arguments to Event Handlers
```js
const EventEmitter = require('events');
const emitter = new EventEmitter();

// Listen for an event that carries arguments
emitter.on('userJoined', (username, userId) => {
  console.log(`${username} (${userId}) has joined the chat`);
});

// Emit the event with arguments
emitter.emit('userJoined', 'JohnDoe', 42);
// Outputs: JohnDoe (42) has joined the chat
```

- Handling Events Only Once
```js
const EventEmitter = require('events');
const emitter = new EventEmitter();

// This listener will be called only once
emitter.once('connection', () => {
  console.log('First connection established');
});

emitter.emit('connection'); // This will trigger the listener
emitter.emit('connection'); // This won't trigger the listener again
```

- Error Handling
```js
const EventEmitter = require('events');
const emitter = new EventEmitter();

// Always handle 'error' events
emitter.on('error', (err) => {
  console.error('An error occurred:', err.message);
});

// This will trigger the error handler
emitter.emit('error', new Error('Something went wrong'));
```

- Clean Up Listeners
```js
const EventEmitter = require('events');
const myEmitter = new EventEmitter();

// Add a listener
const listener = () => console.log('Event occurred');
myEmitter.on('event', listener);

// Later, remove the listener when no longer needed
myEmitter.off('event', listener);
```

Stream Module
The Node.js stream module handles streaming data. This is fundamental for Node.js’s ability to handle large amounts of data efficiently.
Instead of reading a 1GB file into memory all at once (which would crash your app), streams read it in small chunks (pieces), process them, and send them out.
Q. Why use Streams?
Memory Efficiency:
Without streams, reading a 4 GB file requires roughly 4 GB of RAM. With streams, Node.js reads small chunks (64 KB by default for file streams), so memory usage stays tiny regardless of file size.
The 4 Types of Streams
a. Readable: Source of data (e.g., reading a file).
b. Writable: Destination for data (e.g., writing to a file).
c. Duplex: Both readable and writable (e.g., a network socket).
d. Transform: Modifies data as it passes through (e.g., file compression).
- pipe(destination)
The most important method. It takes a Readable stream and connects it to a Writable stream. It manages the speed automatically so the destination doesn’t get overwhelmed.
```js
const fs = require('fs');

const readStream = fs.createReadStream('./input.txt');
const writeStream = fs.createWriteStream('./output.txt');

// Read from input, write to output automatically
readStream.pipe(writeStream);
```

- Event: 'data' (Reading Chunks)
If you don’t use pipe, you can listen for data manually. This fires every time a “chunk” of data is ready.
```js
const readable = fs.createReadStream('./bigfile.txt', { encoding: 'utf8' });

readable.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});
```

- Event: 'end' (Finished Reading)
Fired when there is no more data to consume from a Readable stream.
```js
readable.on('end', () => {
  console.log('Finished reading the file.');
});
```

- .write(chunk) (Writing Data)
Used to write data to a Writable stream manually.
```js
const writable = fs.createWriteStream('./log.txt');

writable.write('First line\n');
writable.write('Second line\n');
writable.end(); // Marks the end of writing
```

- stream.pipeline()
This is the modern, safer version of .pipe(). It handles errors properly (cleaning up streams if one fails), which .pipe() does not always do well.
```js
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib'); // Compression module

pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(), // Transform stream (compression)
  fs.createWriteStream('input.txt.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);
```

Buffer Module
The Node.js Buffer Module is used to handle binary data. While JavaScript is great with Unicode strings, it struggles with raw binary data (like images, video streams, or compressed files).
A Buffer is essentially a fixed-size chunk of memory (outside the V8 JavaScript engine) that stores integers between 0 and 255 (bytes).
It is a global object, so you usually do not need to import it.
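One consequence worth seeing directly: buffers measure bytes, not characters, so multi-byte UTF-8 characters make the byte length differ from the string length.

```js
// 'é' takes 2 bytes in UTF-8, so byte length differs from string length
const str = 'héllo';
console.log(str.length);             // 5 characters
console.log(Buffer.byteLength(str)); // 6 bytes

// Each element of a Buffer is a single byte (an integer from 0 to 255)
const buf = Buffer.from('A');
console.log(buf[0]); // 65 (the UTF-8/ASCII code for 'A')
```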
- Creating Buffers
There are several ways to create buffers in Node.js:
a. Buffer.alloc()
Creates a new Buffer of the specified size, initialized with zeros.
This is the safest way to create a new buffer as it ensures no old data is present.
```js
// Create a buffer of 10 bytes filled with zeros
const buffer1 = Buffer.alloc(10);
console.log(buffer1); // <Buffer 00 00 00 00 00 00 00 00 00 00>
```

b. Buffer.allocUnsafe()
Creates a new Buffer of the specified size, but doesn’t initialize the memory.
This is faster than Buffer.alloc() but may contain old or sensitive data.
Always fill the buffer before use if security is a concern.
```js
// Create an uninitialized buffer of 10 bytes
const buffer2 = Buffer.allocUnsafe(10);
console.log(buffer2); // Contents are unpredictable

// Fill the buffer with zeros for security
buffer2.fill(0);
console.log(buffer2);
```

c. Buffer.from()
Creates a new Buffer from various sources like strings, arrays, or ArrayBuffer. This is the most flexible way to create buffers from existing data.
```js
// Create a buffer from a string
const buffer3 = Buffer.from('Hello, World!');
console.log(buffer3);

console.log(buffer3.toString()); // 'Hello, World!'

// Create a buffer from an array of integers
const buffer4 = Buffer.from([65, 66, 67, 68, 69]);
console.log(buffer4);

console.log(buffer4.toString()); // 'ABCDE'

// Create a buffer from another buffer
const buffer5 = Buffer.from(buffer4);
console.log(buffer5);
```

- Using Buffers
a. Writing to Buffers
```js
// Create an empty buffer
const buffer = Buffer.alloc(10);

// Write a string to the buffer
buffer.write('Hello');
console.log(buffer);

console.log(buffer.toString()); // 'Hello' followed by null bytes

// Write bytes at specific positions
buffer[5] = 44; // ASCII for ','
buffer[6] = 32; // ASCII for space
buffer.write('Nod', 7); // only 3 bytes remain in the 10-byte buffer
console.log(buffer.toString()); // 'Hello, Nod'
```

b. Reading from Buffers
```js
// Create a buffer from a string
const buffer = Buffer.from('Hello, Node.js!');

// Read the entire buffer as a string
console.log(buffer.toString()); // 'Hello, Node.js!'

// Read a portion of the buffer (start at position 7, end before position 11)
console.log(buffer.toString('utf8', 7, 11)); // 'Node'

// Read a single byte
console.log(buffer[0]); // 72

// Convert the ASCII code to a character
console.log(String.fromCharCode(buffer[0])); // 'H'
```

c. Iterating Through Buffers
```js
// Create a buffer from a string
const buffer = Buffer.from('Hello');

// Iterate using for...of loop
for (const byte of buffer) {
  console.log(byte);
}

// Iterate using forEach
buffer.forEach((byte, index) => {
  console.log(`Byte at position ${index}: ${byte}`);
});
```

Buffer Methods
- Buffer.compare()
Compares two buffers and returns a number indicating whether the first one comes before, after, or is the same as the second one in sort order:
```js
const buffer1 = Buffer.from('ABC');
const buffer2 = Buffer.from('BCD');
const buffer3 = Buffer.from('ABC');

console.log(Buffer.compare(buffer1, buffer2)); // -1 (buffer1 sorts before buffer2)
console.log(Buffer.compare(buffer2, buffer1)); // 1
console.log(Buffer.compare(buffer1, buffer3)); // 0 (identical content)
```

- buffer.copy()
Copies data from one buffer to another:
```js
// Create source and target buffers
const source = Buffer.from('Hello, World!');
const target = Buffer.alloc(source.length);

// Copy from source to target
source.copy(target);

console.log(target.toString()); // 'Hello, World!'

// Create a target buffer for partial copy
const partialTarget = Buffer.alloc(5);

// Copy only part of the source (starting at index 7)
source.copy(partialTarget, 0, 7);

console.log(partialTarget.toString()); // 'World'
```

- buffer.slice()
Creates a new buffer that references the same memory as the original, offset and cropped by the given start and end indices. (In modern Node.js, buffer.subarray() is the preferred equivalent.)
```js
const buffer = Buffer.from('Hello, World!');

// Create a slice from position 7 to the end
const slice = buffer.slice(7);
console.log(slice.toString()); // 'World!'

// Create a slice from position 0 to 5
const slice2 = buffer.slice(0, 5);
console.log(slice2.toString()); // 'Hello'

// Important: slices share memory with the original buffer
slice[0] = 119; // ASCII for 'w' (lowercase)
console.log(slice.toString());  // 'world!'
console.log(buffer.toString()); // 'Hello, world!'
```

- buffer.toString()
Decodes a buffer to a string using a specified encoding:
```js
const buffer = Buffer.from('Hello, World!');

// Default encoding is UTF-8
console.log(buffer.toString()); // 'Hello, World!'

// Specify encoding
console.log(buffer.toString('utf8'));

// Decode only a portion of the buffer
console.log(buffer.toString('utf8', 0, 5)); // 'Hello'

// Using different encodings
const hexBuffer = Buffer.from('48656c6c6f', 'hex');
console.log(hexBuffer.toString()); // 'Hello'

const base64Buffer = Buffer.from('SGVsbG8=', 'base64');
console.log(base64Buffer.toString()); // 'Hello'
```

- buffer.equals()
Compares two buffers for content equality:
```js
const buffer1 = Buffer.from('Hello');
const buffer2 = Buffer.from('Hello');
const buffer3 = Buffer.from('World');

console.log(buffer1.equals(buffer2)); // true (same content)

console.log(buffer1.equals(buffer3)); // false

console.log(buffer1 === buffer2); // false (different objects)
```

Crypto Module
The Crypto module is a built-in Node.js module that provides cryptographic functionality including:
a. Hash functions (SHA-256, SHA-512, etc.)
b. HMAC (Hash-based Message Authentication Code)
c. Symmetric encryption (AES, DES, etc.)
d. Asymmetric encryption (RSA, ECDSA, etc.)
e. Digital signatures and verification
f. Secure random number generation
The Crypto module is essential for applications that need to handle sensitive information securely.
The Crypto module wraps the OpenSSL library, providing access to well-established and tested cryptographic algorithms.
This module is often used to handle sensitive data, such as:
- User authentication and password storage
- Secure data transmission
- File encryption and decryption
- Secure communication channels
Hash Functions
Hashing is a one-way transformation of data into a fixed-length string of characters.
Hash functions have several important properties:
- Deterministic: Same input always produces the same output
- Fixed Length: Output is always the same size regardless of input size
- One-Way: Extremely difficult to reverse the process
- Avalanche Effect: Small changes in input produce significant changes in output
Creating a Hash
```js
const crypto = require('crypto');

// Create a hash object
const hash = crypto.createHash('sha256');

// Update the hash with data
hash.update('Hello, World!');

// Get the digest in hexadecimal format
const digest = hash.digest('hex');
console.log(digest);
```

Common Hash Algorithms
```js
const crypto = require('crypto');
const data = 'Hello, World!';

// MD5 (not recommended for security-critical applications)
const md5 = crypto.createHash('md5').update(data).digest('hex');
console.log('MD5:', md5);

// SHA-1 (not recommended for security-critical applications)
const sha1 = crypto.createHash('sha1').update(data).digest('hex');
console.log('SHA-1:', sha1);

// SHA-256
const sha256 = crypto.createHash('sha256').update(data).digest('hex');
console.log('SHA-256:', sha256);

// SHA-512
const sha512 = crypto.createHash('sha512').update(data).digest('hex');
console.log('SHA-512:', sha512);
```