
The Basics of Node.js Streams

Feb 20, 2025, 10:07 AM


Node.js, being asynchronous and event-driven, excels at I/O-bound operations. Leveraging Node.js streams significantly simplifies these tasks by efficiently processing data in smaller chunks. Let's delve into the world of streams and see how they streamline I/O.

Key Concepts:

  • Node.js streams, asynchronous and event-driven, optimize I/O by handling data in manageable portions.
  • Streams are classified as Readable, Writable, or Duplex (both readable and writable). Readable streams fetch data from a source; writable streams send data to a destination.
  • The pipe() function is invaluable, facilitating seamless data transfer between source and destination without manual flow management.
  • Methods like Readable.pause(), Readable.resume(), and readable.unpipe() offer granular control over data flow, enhancing stream functionality.

Understanding Streams:

Streams are analogous to Unix pipes, enabling effortless data transfer from source to destination. Essentially, a stream is an EventEmitter with specialized methods. The implemented methods determine whether a stream is Readable, Writable, or Duplex. Readable streams provide data input; writable streams handle data output.

You've likely encountered streams in Node.js already. In an HTTP server, the request is a readable stream, and the response is a writable stream. The fs module provides both readable and writable file stream capabilities.

This article focuses on readable and writable streams; duplex streams are beyond its scope.

Readable Streams:

A readable stream reads data from a source (a file, in-memory buffer, or another stream). Being EventEmitters, they trigger various events. We utilize these events to interact with the streams.

Reading from Streams:

The most common approach is to listen for the data event and attach a callback. When data is available, the data event fires, executing the callback.

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
readableStream.on('data', (chunk) => { data += chunk; });
readableStream.on('end', () => { console.log(data); });

fs.createReadStream() creates a readable stream. The stream starts out paused; attaching a data event listener switches it into flowing mode, and data chunks are then passed to the callback. The size and frequency of the chunks depend on the stream implementation: an HTTP request stream might emit a chunk every few KB as data arrives over the network, while a file stream reads chunks up to its highWaterMark (64 KB by default).

The end event signals the end of data.

Alternatively, repeatedly call read() on the stream instance until all data is read:

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
let chunk;
readableStream.on('readable', () => {
  while ((chunk = readableStream.read()) !== null) {
    data += chunk;
  }
});
readableStream.on('end', () => { console.log(data); });

read() retrieves data from the internal buffer. It returns null when no data remains. The readable event indicates data availability.

Setting Encoding:

Data is typically a Buffer object. For strings, use Readable.setEncoding():

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
readableStream.setEncoding('utf8');
readableStream.on('data', (chunk) => { data += chunk; });
readableStream.on('end', () => { console.log(data); });

This interprets data as UTF-8, passing it as a string to the callback.

Piping:

Piping simplifies data transfer between source and destination:

const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');
readableStream.pipe(writableStream);

pipe() handles data flow automatically.

Chaining:

Streams can be chained:

const fs = require('fs');
const zlib = require('zlib');
fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('output.txt'));

This decompresses input.txt.gz and writes the result to output.txt.

Additional Readable Stream Methods:

  • Readable.pause(): Pauses the stream.
  • Readable.resume(): Resumes a paused stream.
  • Readable.unpipe(): Removes destination streams from the pipe.

Writable Streams:

Writable streams send data to a destination. Like readable streams, they are EventEmitters.

Writing to Streams:

Use write() to send data:

const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');
readableStream.setEncoding('utf8');
readableStream.on('data', (chunk) => { writableStream.write(chunk); });

write() returns a boolean: false means the internal buffer has filled to its highWaterMark, so the stream is temporarily full; wait for the drain event before writing more.

End of Data:

Call end() to signal the end of data. The finish event is emitted after all data is flushed. You cannot write after calling end().

Important Writable Stream Events:

  • error: Indicates an error.
  • pipe: Emitted when a readable stream is piped.
  • unpipe: Emitted when unpipe() is called on the readable stream.

Conclusion:

Streams are a powerful feature in Node.js, enhancing I/O efficiency. Understanding streams, piping, and chaining enables writing clean, performant code.

Node.js Streams FAQ:

  • What are Node.js streams? They are objects that allow for efficient, incremental processing of data, avoiding loading entire datasets into memory.

  • Main types of Node.js streams? Readable, Writable, Duplex, and Transform.

  • Creating a Readable stream? Use stream.Readable and implement the _read method.

  • Common use cases for Readable streams? Reading large files, processing data from HTTP requests, real-time data handling.

  • Creating a Writable stream? Use stream.Writable and implement the _write method.

  • Common uses of Writable streams? Saving data to files, sending data to services.

  • Duplex stream? Combines Readable and Writable functionality.

  • Transform streams? Modify data as it passes through (e.g., compression, encryption).

  • Piping data between streams? Use the .pipe() method.

  • Best practices for working with Node.js streams? Use them for large datasets, handle errors and backpressure, and prefer stream.pipeline (or the promise-based stream/promises API) over manual piping.

