Long ago I studied IOStream, the input/output streams in the C++ standard library. Look at this code:

#include <iostream>
using namespace std;

int x = 42;
cout << x << endl;

My eyes welled up; these are university memories. Back in college we slaved over C++, completely stumped by freeing pointers and by pointers to pointers. And those memories hold more than code; my youth is in them too. Never mind, this brick-layer's back is aching again, so let's return to reality and look at streams in Node.js.

So what is a stream?

A stream is exactly what the name suggests: the English word also means a brook. When binary data is sent continuously from one place to another, flowing like water, that is a stream.

A stream is an abstract interface implemented by various objects in
Node.js. For example a request to an HTTP server is a stream, as is
process.stdout. Streams are readable, writable, or both. All streams
are instances of EventEmitter.

Readers willing to chew through documentation can read the stream docs.

Stream examples

Because Node.js is very good at handling data (the data may be a page served by the server, JSON sent back, or something else entirely), let's look at a few examples showing how important streams are to a server. Many classes in Node.js inherit the stream interface.

Building an echo server

Echo means just that: shout at a mountain and you hear your own voice come back. Here we make a server do this rather idle job.

var http = require('http');
http.createServer(function(request, response) {
    response.writeHead(200);
    request.pipe(response);
}).listen(8080);

After it runs, call curl -d 'hello' http://localhost:8080. You may find it hard to believe the server is written that simply, but that is the charm of Node.js.
The pipe above is a pipe in the same sense as | on the Linux command line; anyone familiar with shell pipes already has the concept. You should also know that

gulp

is built on streams.

Uploading a file

Let's look at an example of uploading a file.

var http = require('http');
var fs = require('fs');
http.createServer(function(request, response) {
    var newFile = fs.createWriteStream("copy" + Date.now() + ".md"); // Date.now() avoids spaces and colons in the file name
    var fileBytes = request.headers['content-length'];
    var uploadedBytes = 0;

    response.write("server receive request\n");
    request.pipe(newFile);

    request.on('readable', function() {
        var chunk = null;
        response.write("progress: start\n");
        while (null !== (chunk = request.read())) {
            uploadedBytes += chunk.length;
            var progress = (uploadedBytes / fileBytes) * 100;
            response.write("progress: " + parseInt(progress, 10) + "%\n");
        }
    });


    request.on('end', function() {
        response.end('uploaded!\n');
    });

}).listen(8080);
//curl --upload-file uploadFiles.js http://localhost:8080

(Screenshot: the upload example in action)

The points worth noting here are:

  1. How progress is reported back, via request.on('readable', function() {: notice the appeal of this asynchronous I/O style.
  2. How the file is saved, via request.pipe(newFile);: hard to get more convenient than that.

Implementing streams

We have seen above how simple streams are to use; now let's see how Node.js streams are designed.

To implement any sort of stream, the pattern is the same:

  1. Extend the appropriate parent class in your own subclass.
     (The util.inherits() method is particularly helpful for this.)
  2. Call the appropriate parent class constructor in your constructor,
     to be sure that the internal mechanisms are set up properly.
  3. Implement one or more specific methods, as detailed below.

The class to extend and the method(s) to implement depend on the sort of
stream class you are writing:

(Table from the Node.js docs: use-case, class to extend, and methods to implement)

To translate, implementing a stream goes like this:

  1. Extend the appropriate class.
  2. Don't forget to call the base-class constructor.
  3. Override the base-class method(s).

A readable stream that counts

One example makes it clear. The program below just counts, from 1 to 1,000,000.

const Readable = require('stream').Readable;
const util = require('util');
util.inherits(Counter, Readable);

function Counter(opt) {
    Readable.call(this, opt);
    this._max = 1000000;
    this._index = 1;
}

Counter.prototype._read = function() {
    var i = this._index++;
    if (i > this._max)
        this.push(null);
    else {
        var str = '' + i;
        var buf = Buffer.from(str, 'ascii'); // new Buffer() is deprecated
        this.push(buf);
    }
};

///////////////////////////////////////////////////////////
//test 
var fs = require('fs');
var newFile = fs.createWriteStream("test_counter.txt");
var myCounter = new Counter();
myCounter.pipe(newFile);

The Counter above completes the three steps, and the test code pipes this counter into a file. If we want to implement a stream of our own, that is all it takes. If this example is too simple, let's look at something more complex, such as transform.

What is a transform stream?

Transform streams are Duplex streams where the output is in some way
computed from the input. They implement both the Readable and Writable
interfaces.

Examples of Transform streams include:

  zlib streams

  crypto streams

In other words, a transform stream changes the input in some way and then outputs it: compression and encryption, for example.

const zlib = require('zlib'); // was missing from the original snippet
const fs = require('fs');
const gzip = zlib.createGzip();
const inp = fs.createReadStream('input.txt');
const out = fs.createWriteStream('input.txt.gz');

inp.pipe(gzip).pipe(out);

Implementing a transform stream

This example parses some data and produces a readable stream, one that has already been transformed:

  1. The format being parsed: a data stream containing two consecutive newline characters; everything before them is the header, everything after is the body.
  2. While parsing, a header event is emitted, exposing the header information.
  3. At the end the header is stripped and only the body is kept.

Now let's look at the code.

const util = require('util');
const Transform = require('stream').Transform;
util.inherits(SimpleProtocol, Transform);

function SimpleProtocol(options) {
  if (!(this instanceof SimpleProtocol))
    return new SimpleProtocol(options);

  Transform.call(this, options);
  this._inBody = false;
  this._sawFirstCr = false;
  this._rawHeader = [];
  this.header = null;
}

SimpleProtocol.prototype._transform = function(chunk, encoding, done) {
  if (!this._inBody) {
    // check if the chunk has a \n\n
    var split = -1;
    for (var i = 0; i < chunk.length; i++) {
      if (chunk[i] === 10) { // '\n'
        if (this._sawFirstCr) {
          split = i;
          break;
        } else {
          this._sawFirstCr = true;
        }
      } else {
        this._sawFirstCr = false;
      }
    }

    if (split === -1) {
      // still waiting for the \n\n
      // stash the chunk, and try again.
      this._rawHeader.push(chunk);
    } else {
      this._inBody = true;
      var h = chunk.slice(0, split);
      this._rawHeader.push(h);
      var header = Buffer.concat(this._rawHeader).toString();
      try {
        this.header = JSON.parse(header);
      } catch (er) {
        this.emit('error', new Error('invalid simple protocol data'));
        return;
      }
      // and let them know that we are done parsing the header.
      this.emit('header', this.header);

      // now, because we got some extra data, emit this first.
      this.push(chunk.slice(split));
    }
  } else {
    // from there on, just provide the data to our consumer as-is.
    this.push(chunk);
  }
  done();
};

// Usage:
var fs = require('fs');
const source = fs.createReadStream('input.txt');
const out = fs.createWriteStream('output.txt');

var parser = new SimpleProtocol();

// Now parser is a readable stream that will emit 'header'
// with the parsed header data.
source.pipe(parser).pipe(out);
parser.on('header',function(header){
  console.log(header);
});

The code is a bit long, but it is commented, so I won't walk through it; just note how it is used at the end. Let's look at the result of running it.

(Screenshot: run result)

That's it for streams. If you still want more, have a look at the Node source, node in
github,
or at the stream documentation.
