Merge pull request #1 from andrewrk/maxChunkSize
add maxChunkSize option
thejoshwolfe authored Jun 3, 2018
2 parents dbca15b + 4bbbf2a commit bcd2887
Showing 2 changed files with 37 additions and 7 deletions.
README.md — 12 additions, 1 deletion

````diff
@@ -96,14 +96,19 @@ to use `createWriteStream` make sure you open it for writing.
 `false`. `ref()` and `unref()` can be used to increase or decrease the
 reference count, respectively.
 
-### fdSlicer.createFromBuffer(buffer)
+### fdSlicer.createFromBuffer(buffer, [options])
 
 ```js
 var fdSlicer = require('fd-slicer');
 var slicer = fdSlicer.createFromBuffer(someBuffer);
 // ...
 ```
 
+`options` is an optional object which can contain:
+
+* `maxChunkSize` - A `Number` of bytes. see `createReadStream()`.
+  If falsey, defaults to unlimited.
+
 #### Properties
 
 ##### fd
````
```diff
@@ -132,6 +137,12 @@ The ReadableStream that this returns has these additional methods:
 will be emitted in order to cause the streaming to stop. Defaults to
 `new Error("stream destroyed")`.
 
+If `maxChunkSize` was specified (see `createFromBuffer()`), the read stream
+will provide chunks of at most that size. Normally, the read stream provides
+the entire range requested in a single chunk, but this can cause performance
+problems in some circumstances.
+See [thejoshwolfe/yauzl#87](https://github.com/thejoshwolfe/yauzl/issues/87).
+
 ##### createWriteStream(options)
 
 Available `options`:
```
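The "chunks of at most that size" contract documented in this README change can be made concrete with a standalone sketch (it does not import `fd-slicer`; the helper `chunkLengths` and the example sizes are invented for illustration):

```javascript
// Compute the sizes of the chunks a maxChunkSize-limited read stream
// would deliver for a requested range of rangeLength bytes.
function chunkLengths(rangeLength, maxChunkSize) {
  var lengths = [];
  for (var offset = 0; offset < rangeLength; offset += maxChunkSize) {
    lengths.push(Math.min(maxChunkSize, rangeLength - offset));
  }
  return lengths;
}

console.log(chunkLengths(10, 4)); // [ 4, 4, 2 ]
console.log(chunkLengths(8, 4));  // [ 4, 4 ]
console.log(chunkLengths(3, Number.MAX_SAFE_INTEGER)); // [ 3 ]
```

Every chunk except possibly the last is exactly `maxChunkSize` bytes; the last carries the remainder, which is why the README says "at most" that size.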
index.js — 25 additions, 6 deletions

```diff
@@ -184,11 +184,13 @@ WriteStream.prototype.destroy = function() {
 };
 
 util.inherits(BufferSlicer, EventEmitter);
-function BufferSlicer(buffer) {
+function BufferSlicer(buffer, options) {
   EventEmitter.call(this);
 
+  options = options || {};
   this.refCount = 0;
   this.buffer = buffer;
+  this.maxChunkSize = options.maxChunkSize || Number.MAX_SAFE_INTEGER;
 }
 
 BufferSlicer.prototype.read = function(buffer, offset, length, position, callback) {
@@ -211,11 +213,28 @@ BufferSlicer.prototype.write = function(buffer, offset, length, position, callback) {
 BufferSlicer.prototype.createReadStream = function(options) {
   options = options || {};
   var readStream = new PassThrough(options);
+  readStream.destroyed = false;
   readStream.start = options.start || 0;
   readStream.endOffset = options.end;
-  readStream.pos = readStream.endOffset || this.buffer.length; // yep, we're already done
-  readStream.destroyed = false;
-  readStream.write(this.buffer.slice(readStream.start, readStream.pos));
+  // by the time this function returns, we'll be done.
+  readStream.pos = readStream.endOffset || this.buffer.length;
+
+  // respect the maxChunkSize option to slice up the chunk into smaller pieces.
+  var entireSlice = this.buffer.slice(readStream.start, readStream.pos);
+  var offset = 0;
+  while (true) {
+    var nextOffset = offset + this.maxChunkSize;
+    if (nextOffset >= entireSlice.length) {
+      // last chunk
+      if (offset < entireSlice.length) {
+        readStream.write(entireSlice.slice(offset, entireSlice.length));
+      }
+      break;
+    }
+    readStream.write(entireSlice.slice(offset, nextOffset));
+    offset = nextOffset;
+  }
+
   readStream.end();
   readStream.destroy = function() {
     readStream.destroyed = true;
@@ -268,8 +287,8 @@ BufferSlicer.prototype.unref = function() {
   }
 };
 
-function createFromBuffer(buffer) {
-  return new BufferSlicer(buffer);
+function createFromBuffer(buffer, options) {
+  return new BufferSlicer(buffer, options);
 }
 
 function createFromFd(fd, options) {
```