Comprehensive Guide to Node.js Streams (node:stream API)

The node:stream module is a core component of Node.js, providing an abstraction for working with streaming data. Streams enable efficient handling of I/O operations, especially when dealing with large files, network communication, or real‑time data processing. They are built on top of EventEmitter and come in four primary types: Readable, Writable, Duplex, and Transform.


1. Introduction


The node:stream module provides a powerful interface for processing data incrementally. Instead of loading entire datasets into memory, streams allow data to be consumed or produced piece by piece, improving performance and scalability.


2. Stream Types


  • Readable: Data sources (e.g., fs.createReadStream(), HTTP responses).
  • Writable: Data destinations (e.g., fs.createWriteStream(), HTTP requests).
  • Duplex: Both readable and writable (e.g., TCP sockets).
  • Transform: Duplex streams that modify data (e.g., compression, encryption).

3. Accessing the Stream Module


const stream = require('node:stream');
// or
import stream from 'node:stream';

4. Streams Promises API


Available via require('node:stream/promises') or stream.promises.


4.1 pipeline()


Asynchronously pipes streams with proper error handling and cleanup.


import { pipeline } from 'node:stream/promises';
import fs from 'node:fs';
import zlib from 'node:zlib';

await pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('input.txt.gz')
);

4.2 finished()


Resolves once a stream is no longer readable or writable (it has ended, finished, or errored).


import { finished } from 'node:stream/promises';
await finished(readableStream);

5. Object Mode


Streams normally handle Buffers or strings. In object mode, they can handle arbitrary JavaScript values (except null).


const readable = new stream.Readable({
  objectMode: true,
  read() {
    this.push({ value: 42 }); // any JavaScript value is allowed in object mode
    this.push(null);          // null still signals end-of-stream
  }
});

6. Buffering & Backpressure


  • Each stream maintains an internal buffer.
  • highWaterMark controls buffer size and backpressure behavior.
  • write() returns false when the buffer is full → wait for 'drain' (see the sketch below).
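
A minimal sketch of honoring backpressure when writing many chunks; the helper writeAll and its arguments are illustrative, and writable can be any Writable stream (e.g., from fs.createWriteStream()):

async function writeAll(writable, chunks) {
  for (const chunk of chunks) {
    // write() returns false once the internal buffer exceeds highWaterMark
    if (!writable.write(chunk)) {
      // pause until the buffer drains before writing more
      await new Promise((resolve) => writable.once('drain', resolve));
    }
  }
  writable.end();
}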

7. API for Consumers


7.1 Writable Streams


  • write(chunk[, encoding][, callback])
  • end([chunk][, encoding][, callback])
  • Events: 'drain', 'finish', 'error', 'pipe' (a short example follows).
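
A brief sketch of the consumer-side Writable API, using a hypothetical output file:

import fs from 'node:fs';

const out = fs.createWriteStream('example.txt');
out.on('finish', () => console.log('all data flushed')); // fires after end()
out.write('first line\n');
out.end('last line\n'); // end() accepts an optional final chunk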

7.2 Readable Streams


Two operating modes (both illustrated below):

  • Paused mode: Data must be pulled explicitly with read(), typically from a 'readable' handler.
  • Flowing mode: Data is pushed automatically via 'data' events.
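
Sketches of each mode, assuming readable is any byte-oriented Readable stream; pick one style per stream, since mixing them changes behavior. Flowing mode (attaching a 'data' listener starts the flow):

readable.on('data', (chunk) => {
  console.log('received', chunk.length, 'bytes');
});

Paused mode (pull chunks with read() whenever 'readable' fires):

readable.on('readable', () => {
  let chunk;
  while ((chunk = readable.read()) !== null) {
    console.log('read', chunk.length, 'bytes');
  }
});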

7.3 Async Iteration


for await (const chunk of readable) {
  console.log(chunk);
}

8. Duplex & Transform Streams


  • Duplex: Independent readable and writable sides (a sketch follows this list).
  • Transform: Output is derived from input (e.g., uppercase, compression).
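
A minimal Duplex sketch in which the two sides do unrelated work; the constructor-options form shown here is equivalent to subclassing:

import { Duplex } from 'node:stream';

const duplex = new Duplex({
  // writable side: count incoming bytes
  write(chunk, encoding, callback) {
    this.bytesSeen = (this.bytesSeen ?? 0) + chunk.length;
    callback();
  },
  // readable side: emit one message, then end
  read() {
    this.push('hello from the readable side\n');
    this.push(null);
  }
});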

9. Implementing Custom Streams


9.1 Custom Writable


class MyWritable extends stream.Writable {
  _write(chunk, encoding, callback) {
    // Process chunk, then signal completion
    // (pass an Error to callback() to emit 'error')
    callback();
  }
}

9.2 Custom Readable


class Counter extends stream.Readable {
  count = 0;
  _read() {
    this.count += 1;
    // emit '1', '2', '3', then signal end-of-stream with null
    this.push(this.count <= 3 ? `${this.count}\n` : null);
  }
}
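
Piping the counter to standard output prints the numbers 1 through 3:

new Counter().pipe(process.stdout);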

9.3 Custom Transform


class Uppercase extends stream.Transform {
  _transform(chunk, encoding, callback) {
    // callback(error, data): passing null as the error signals success
    callback(null, chunk.toString().toUpperCase());
  }
}
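
For example, wiring the transform between stdin and stdout with the promises pipeline():

import { pipeline } from 'node:stream/promises';

await pipeline(process.stdin, new Uppercase(), process.stdout);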

10. PassThrough Stream


A Transform stream that passes its input through unchanged; useful for tapping or measuring a pipeline (see the sketch below).
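
A sketch of tapping a pipeline to count bytes without modifying them; the file paths are hypothetical:

import fs from 'node:fs';
import { PassThrough } from 'node:stream';
import { pipeline } from 'node:stream/promises';

const meter = new PassThrough();
let bytes = 0;
meter.on('data', (chunk) => { bytes += chunk.length; }); // observe, don't modify

await pipeline(
  fs.createReadStream('input.txt'),  // hypothetical source file
  meter,
  fs.createWriteStream('copy.txt')   // hypothetical destination
);
console.log(`${bytes} bytes passed through`);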


11. Utility Functions


  • stream.duplexPair(): Creates a connected Duplex pair.
  • stream.addAbortSignal(signal, stream): Links an AbortSignal to a stream.
  • Readable.from(iterable): Creates a Readable from an iterable or async generator (sketched after this list).
  • Web Streams interop: toWeb() / fromWeb().
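
A quick sketch of Readable.from() with an async generator:

import { Readable } from 'node:stream';

async function* generate() {
  yield 'hello ';
  yield 'world';
}

// consume with async iteration (see section 7.3)
for await (const chunk of Readable.from(generate())) {
  console.log(chunk);
}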

12. Best Practices


  • Use pipeline() for chaining streams safely.
  • Handle backpressure correctly.
  • Avoid object mode unless necessary.
  • Use async iteration for clean consumption.

Conclusion


Streams are the backbone of Node.js I/O performance. Mastering them enables developers to build scalable, memory‑efficient applications capable of handling files, network traffic, and real‑time data transformations.


Written & researched by Dr. Shahin Siami