Understanding Streams in Node.js
For most developers with back-end experience, the Stream object is a familiar and natural thing, but for front-end developers streams are not nearly as self-evident. There is even an article on GitHub with more than 9,000 stars devoted entirely to explaining what a stream is: stream-handbook (https://github.com/substack/stream-handbook). To get a better grip on streams, this article gives a brief summary based on that handbook.
What is a Stream
A stream is a very common and important concept in Unix systems. In terms of terminology, a stream is an abstraction over input and output devices.
ls | grep *.js
We often come across commands like this when writing shell scripts. The | connects two commands, passing the output of the previous command to the next one as its input, so the data flows through the commands like water through a pipe. Each command acts like a processor that does some work on the data, which is why | is called the "pipe symbol".
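The same idea maps directly onto Node. As a minimal sketch (the file name pipe.js is just an example), standard input and standard output are themselves streams, so the shell pipe can hand data straight to a Node program:

// pipe.js - the same pipeline idea in Node: stdin flows through the program to stdout
// try e.g.:  ls | node pipe.js
process.stdin.pipe(process.stdout);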
Several types of Stream in NodeJS
From a program's point of view, a stream is data with a direction. According to the direction of flow, streams can be divided into three types:
- Device to program: readable
- Program to device: writable
- Bidirectional: duplex, transform
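For reference, these types correspond to the base classes exported by the built-in stream module; a minimal sketch of the mapping (the instanceof check at the end is only for illustration):

const { Readable, Writable, Duplex, Transform } = require('stream');

// Readable  - data flows out of it           (device -> program)
// Writable  - data flows into it             (program -> device)
// Duplex    - readable and writable at once  (bidirectional)
// Transform - a Duplex whose output is computed from its input

console.log(new Transform() instanceof Duplex); // true: a transform is a kind of duplex stream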
NodeJS's stream operations are encapsulated in the Stream module, which many core modules in turn rely on. Following the Unix philosophy of "everything is a file", most file handling in NodeJS is done with streams, covering:
- Ordinary files
- Device files (stdin, stdout)
- Network files (http, net)
One point that is easily overlooked: all streams in NodeJS are instances of EventEmitter.
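A quick way to convince yourself of this (a throwaway sketch; ./some-file.txt is just a placeholder path):

const fs = require('fs');
const EventEmitter = require('events');

const rs = fs.createReadStream('./some-file.txt'); // placeholder path

console.log(rs instanceof EventEmitter); // true: every stream inherits from EventEmitter

// ...which is why the stream API is event based, e.g. problems arrive as an 'error' event
rs.on('error', (err) => console.error('stream error:', err.message));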
Small example
Suppose that while writing a program we suddenly need to read a certain configuration file, config.json. A quick analysis:
- Data: the contents of config.json
- Direction: from a device (the physical disk file) to the NodeJS program
So we should use a readable stream for this.
const fs = require('fs');
const FILEPATH = '...';
const rs = fs.createReadStream(FILEPATH);
Through the createReadStream() method provided by the fs module, we easily create a readable stream, and the contents of config.json flow from the device into the program. We do not use the Stream module directly because fs already references it internally and wraps it for us.
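Because the stream is an EventEmitter, we could already watch the contents arrive without piping them anywhere; a small sketch of consuming the readable stream through its 'data' and 'end' events (the path is still a placeholder standing in for the '...' above):

const fs = require('fs');
const FILEPATH = './config.json'; // placeholder path
const rs = fs.createReadStream(FILEPATH);

rs.on('data', (chunk) => {
  // the file does not arrive all at once but chunk by chunk, as a Buffer by default
  console.log('received', chunk.length, 'bytes');
});
rs.on('end', () => {
  console.log('config.json has been fully read');
});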
Once we have the data, we need to process it, for example write it to some path DEST. For that we need a writable stream, so that the data can flow from the program to the device.
const ws = fs.createWriteStream(DEST);
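On its own, a writable stream can also be fed by hand with write() and end(); a minimal sketch (the DEST value here is a hypothetical destination path):

const fs = require('fs');
const DEST = './config-copy.json'; // hypothetical destination path
const ws = fs.createWriteStream(DEST);

ws.write('hello from the program\n'); // data flows from the program to the device
ws.end();                             // signal that nothing more will be written
ws.on('finish', () => console.log('all data has been flushed to disk'));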
Now that we have two streams, that is, two data processors, how do we connect them the way the Unix pipe symbol | does? In NodeJS, the pipe symbol is the pipe() method.
const fs = require('fs');
const FILEPATH = '...';
const rs = fs.createReadStream(FILEPATH);
const ws = fs.createWriteStream(DEST);
rs.pipe(ws);
In this way we have used streams to implement a simple file copy. How pipe() works internally will be covered later, but one thing is worth noting: data must always be piped from upstream to downstream, that is, from a readable stream into a writable stream.
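One practical detail worth adding (not in the original snippet, so treat it as a hedged sketch): pipe() does not forward errors from the readable side to the writable side, so real-world copy code usually listens on both ends and waits for the 'finish' event:

const fs = require('fs');
const FILEPATH = './config.json';   // placeholder source path
const DEST = './config-copy.json';  // placeholder destination path

const rs = fs.createReadStream(FILEPATH);
const ws = fs.createWriteStream(DEST);

rs.on('error', (err) => console.error('read failed:', err.message));
ws.on('error', (err) => console.error('write failed:', err.message));
ws.on('finish', () => console.log('copy complete'));

rs.pipe(ws);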
Process the data
We called the readable and writable streams above "processors", which is not really accurate, because so far we are not processing anything: we merely read the data and then store it.
Suppose we now have a requirement: convert all the letters in the local package.json file to lowercase and save the result as package-lower.json in the same directory.
For this we need a bidirectional stream. Assume we have a stream lower that specializes in converting characters to lowercase; the code would then look roughly like this:
const fs = require('fs');
const rs = fs.createReadStream('./package.json');
const ws = fs.createWriteStream('./package-lower.json');
rs.pipe(lower).pipe(ws);
Now we can see why a stream connected with pipe() deserves to be called a processor. As stated above, a pipe must go from a readable stream to a writable stream:
- rs -> lower: lower is downstream, so lower needs to be a writable stream
- lower -> ws: Relatively speaking, lower is upstream, so lower needs to be a readable stream
Reasoning it through, the lower stream we need must be bidirectional. The specific use of duplex and transform streams will be covered later; a possible sketch is shown below.
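For completeness, here is one way the hypothetical lower stream could look when sketched as a Transform stream (the implementation is my assumption, not the original author's code):

const { Transform } = require('stream');

// Hypothetical sketch of "lower": a Transform stream that lowercases each chunk
const lower = new Transform({
  transform(chunk, encoding, callback) {
    // chunk is a Buffer; convert it to a string, lowercase it, and push it downstream
    callback(null, chunk.toString().toLowerCase());
  }
});

With such a stream defined, rs.pipe(lower).pipe(ws) behaves exactly as described above.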
Of course, if we have additional processing steps, for example converting letters into their ASCII codes, and assuming there is an ascii stream for that, our code might become:
rs.pipe(lower).pipe(ascii).pipe(ws);
Likewise, ascii must also be a bidirectional stream. The processing logic here is very clear, so apart from tidy code, what other benefits do streams bring?
Why use Streams
Consider a scenario where a user wants to watch a video online and we return the movie content through an HTTP response. The code might be written like this:
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.readFile(moviePath, (err, data) => {
    res.end(data);
  });
}).listen(8080);
Code like this has two obvious problems:
- The movie file must be read in full before anything is returned to the client, so the wait is very long.
- The movie file has to be loaded into memory in one piece; with many similar requests, memory cannot keep up.
With a stream, the movie file can be loaded into memory piece by piece and returned to the client piece by piece (making use of the Transfer-Encoding: chunked feature of the HTTP protocol). The user experience improves and the memory overhead drops noticeably at the same time.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.createReadStream(moviePath).pipe(res);
}).listen(8080);
Besides the benefits above, the code becomes much more elegant and easier to extend. For example, if the video content needs to be compressed, we can introduce a stream dedicated to that task; it does not need to care what the other parts do, as long as it is plugged into the pipeline.
const http = require('http');
const fs = require('fs');
const oppressor = require('oppressor');

http.createServer((req, res) => {
  fs.createReadStream(moviePath)
    .pipe(oppressor)
    .pipe(res);
}).listen(8080);
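oppressor is the third-party compression stream used in stream-handbook; if you would rather stay with built-in modules, a similar pipeline can be sketched with zlib (this variant is my assumption and simply forces gzip, without negotiating the client's Accept-Encoding):

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');

http.createServer((req, res) => {
  res.setHeader('Content-Encoding', 'gzip'); // tell the client the body is gzip-compressed
  fs.createReadStream(moviePath)             // moviePath: placeholder, as in the examples above
    .pipe(zlib.createGzip())                 // compress chunk by chunk as data flows through
    .pipe(res);
}).listen(8080);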
As you can see, after adopting streams our code logic becomes relatively independent and maintainability also improves. The concrete usage of the various stream types will be covered in a follow-up article.