Node.js  

What are Streams and Buffers in Node.js?

🔍 Introduction

In Node.js, handling large files or continuous data (such as video, audio, or network requests) can be slow and memory-intensive if everything is loaded into memory at once. Streams and buffers solve this by letting data be processed in small chunks as it arrives, which is faster and uses far less memory.

📦 What is a Buffer?

A buffer is like a temporary box in memory that holds binary data while it moves from one part of your application to another.

In simple words:

  • A buffer is used to store raw binary data (not just text).
  • It is very helpful when working with files, network requests, or any process that deals with binary information.
  • Data comes in small parts (chunks), and the buffer holds it before it's processed.

Example:

// Creating and using a buffer
const buffer = Buffer.from('Hello Node.js');
console.log(buffer);            // <Buffer 48 65 6c 6c 6f 20 4e 6f 64 65 2e 6a 73>
console.log(buffer.toString()); // Hello Node.js

Buffer.from() converts a string into raw binary data, and toString() turns that binary data back into readable text.
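
Buffers also come into play when data arrives in pieces. As a minimal sketch (assuming the chunks are already available as an array), Buffer.concat() can join several smaller buffers into one before processing:

// Joining chunks of binary data into a single buffer
const chunks = [
  Buffer.from('Hello '),
  Buffer.from('Node'),
  Buffer.from('.js')
];

const combined = Buffer.concat(chunks);
console.log(combined.toString()); // Hello Node.js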

🌊 What is a Stream?

A stream is like a pipeline that lets you read or write data in a continuous flow, without waiting for all of the data to be available.

Types of Streams in Node.js:

  • Readable Streams: For reading data (e.g., fs.createReadStream).
  • Writable Streams: For writing data (e.g., fs.createWriteStream).
  • Duplex Streams: Can both read and write data (e.g., TCP sockets).
  • Transform Streams: Can modify data while reading/writing (e.g., compression).

Example:

// Reading a file with a stream
const fs = require('fs');

const readableStream = fs.createReadStream('largefile.txt', { encoding: 'utf8' });

// 'data' fires once for every chunk that is read
readableStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

// 'end' fires after the last chunk has been delivered
readableStream.on('end', () => {
  console.log('Finished reading file');
});

// Handle 'error' so a missing or unreadable file does not crash the process
readableStream.on('error', (err) => {
  console.error('Read failed:', err);
});

Instead of loading the entire file at once, the file is read in smaller parts (chunks) and processed immediately.
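
The Readable example above covers the most common case. The Transform type listed earlier can be created directly with the built-in stream module; the sketch below (the uppercase logic is purely illustrative) modifies text as it flows from standard input to standard output:

// A Transform stream that uppercases text as it passes through
const { Transform } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer; convert it, transform it, and pass it on
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(upperCase).pipe(process.stdout);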

⚡ How Streams and Buffers Work Together

  • When reading a file or getting data from the internet, Node.js puts incoming chunks of data into a buffer.
  • The stream then processes each chunk one after another.
  • This combination allows handling large data smoothly without slowing down your app.
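
As a small illustration of that buffering (largefile.txt is just a placeholder name), when no encoding is set the 'data' event delivers Buffer objects, and the optional highWaterMark option controls roughly how much data each buffered chunk holds:

// Without an encoding, each chunk arrives as a Buffer
const fs = require('fs');

const stream = fs.createReadStream('largefile.txt', { highWaterMark: 16 * 1024 });

stream.on('data', (chunk) => {
  console.log(Buffer.isBuffer(chunk), chunk.length); // true, up to 16384 bytes
});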

Example:

// Copying a file using streams
const fs = require('fs');

const readable = fs.createReadStream('video.mp4');
const writable = fs.createWriteStream('copy.mp4');

readable.pipe(writable);

The pipe() method connects the readable stream (the source) to the writable stream (the destination); Node.js buffers the data and manages backpressure automatically, pausing the source whenever the destination cannot keep up.
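
pipe() is great for simple cases, but it does not forward errors from one stream to the other. The built-in stream.pipeline() helper, sketched below with the same example file names, wires the streams together and reports errors in a single callback:

// Copying a file with pipeline() to get error handling in one place
const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('video.mp4'),
  fs.createWriteStream('copy.mp4'),
  (err) => {
    if (err) {
      console.error('Copy failed:', err);
    } else {
      console.log('Copy finished');
    }
  }
);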

📊 Benefits of Using Streams and Buffers

  • Memory Efficiency: Only small parts of data are loaded at a time.
  • Faster Processing: You can start processing data as soon as some part of it arrives.
  • Scalable Applications: Suitable for working with very large files or real-time data without performance issues.
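
To make the memory-efficiency point concrete, here is a rough comparison (largefile.txt is a placeholder): fs.readFile() loads the entire file into a single buffer before your code runs, while a stream only ever holds one chunk at a time:

const fs = require('fs');

// Loads the whole file into memory before the callback runs
fs.readFile('largefile.txt', (err, data) => {
  if (err) throw err;
  console.log('Whole file in memory:', data.length, 'bytes');
});

// Keeps only one chunk in memory at a time
fs.createReadStream('largefile.txt').on('data', (chunk) => {
  console.log('Chunk in memory:', chunk.length, 'bytes');
});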

📝 Summary

In Node.js, buffers store binary data temporarily, and streams process that data in small chunks without waiting for all of it to arrive. This makes applications faster, more memory-efficient, and scalable. Together, they are essential for handling big files, live video/audio, or network communication efficiently.