Streams in Node.js

📘 Node.js · Nov 05, 2025

Streams are a way to handle data piece by piece instead of loading everything into memory at once.
They are ideal for working with large files, network data, and real-time processing.


1. Why Streams Are Needed

Traditional file handling (for example, fs.readFile):

  • Loads the entire file into memory before processing

  • Slower to start and memory-intensive for large files

Stream-based handling:

  • Processes data in small chunks

  • Faster and memory-efficient

  • Supports real-time data flow


2. Types of Streams

Node.js provides four main types of streams:

2.1 Readable Streams

Used to read data.

Examples:

  • fs.createReadStream()

  • HTTP request stream

const fs = require('fs');

const readStream = fs.createReadStream('input.txt');

readStream.on('data', chunk => {
  console.log(chunk.toString());
});

2.2 Writable Streams

Used to write data.

Examples:

  • fs.createWriteStream()

  • HTTP response stream

const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

writeStream.write('Hello Streams');
writeStream.end();

2.3 Duplex Streams

Can read and write data.

Example:

  • TCP sockets

// Example: socket connection

2.4 Transform Streams

Modify data while streaming.

Example:

  • Compression

  • Encryption

const { Transform } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

3. Stream Events

Common events:

Event     Description
data      A chunk of data is available to read
end       No more data to read
error     An error occurred
finish    All data has been written (writable side)

4. Piping Streams

pipe() connects streams together.

const fs = require('fs');

fs.createReadStream('a.txt')
  .pipe(fs.createWriteStream('b.txt'));

✔ Simplest way to transfer data
✔ Handles backpressure automatically


5. Backpressure

Backpressure occurs when:

  • Data is produced faster than the consumer can process it

pipe() handles this automatically: when the writable stream's internal buffer fills up, it pauses the readable source and resumes it once the buffer drains.


6. Real-World Use Cases

  • File uploads/downloads

  • Video streaming

  • Data compression

  • Reading large logs


7. Advantages of Streams

  • Low memory usage

  • High performance

  • Scalable

  • Ideal for large data


8. Streams vs Buffers

Streams                        Buffers
Process data in chunks         Store the entire data in memory
Memory-efficient               Memory-heavy
Suitable for large data        Suitable for small data

9. Best Practices

  • Use pipe() whenever possible

  • Handle error events

  • Close streams properly


10. Summary

  • Streams process data chunk-by-chunk

  • Node.js supports Readable, Writable, Duplex, and Transform streams

  • Streams improve performance and memory usage

  • Widely used in real-world Node.js applications

