Update changelog for pg-query-stream
Document the conversion to TypeScript as a semver major change. Closes #2412.
brianc committed Nov 30, 2020
1 parent fa4549a commit 54b8752
Showing 2 changed files with 7 additions and 7 deletions.
CHANGELOG.md: 4 changes (4 additions & 0 deletions)
@@ -4,6 +4,10 @@ For richer information consult the commit log on github with referenced pull requests

We do not include break-fix version releases in this file.

### pg-query-stream@4.0.0

- Library has been [converted](https://github.com/brianc/node-postgres/pull/2376) to TypeScript. The behavior is identical, but there could be subtle breaking changes due to class names changing or other small inconsistencies introduced by the conversion.
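
A minimal sketch of what "identical behavior" means in practice, assuming the CommonJS entry point is unchanged after the conversion (the snippet is illustrative, not part of this release):

```js
// Existing CommonJS usage is expected to keep working on 4.0.0,
// assuming the entry point is unchanged after the TypeScript conversion.
const QueryStream = require('pg-query-stream')

// Constructed and submitted to a pg client exactly as before,
// e.g. const stream = client.query(query)
const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000000])
```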

### pg@8.5.0

- Fix bug forwarding [ssl key](https://github.com/brianc/node-postgres/pull/2394).
packages/pg-query-stream/README.md: 10 changes (3 additions & 7 deletions)
@@ -1,10 +1,7 @@
# pg-query-stream

[![Build Status](https://travis-ci.org/brianc/node-pg-query-stream.svg)](https://travis-ci.org/brianc/node-pg-query-stream)

Receive result rows from [pg](https://github.com/brianc/node-postgres) as a readable (object) stream.


## installation

```bash
@@ -14,7 +11,6 @@ $ npm install pg-query-stream --save

_requires pg>=2.8.1_


## use

```js
@@ -24,7 +20,7 @@ const JSONStream = require('JSONStream')

//pipe 1,000,000 rows to stdout without blowing up your memory usage
pg.connect((err, client, done) => {
if (err) throw err;
if (err) throw err
const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000000])
const stream = client.query(query)
//release the client when the stream is finished
@@ -35,13 +31,13 @@ pg.connect((err, client, done) => {

The stream uses a cursor on the server so it efficiently keeps only a low number of rows in memory.

This is especially useful when doing [ETL](http://en.wikipedia.org/wiki/Extract,_transform,_load) on a huge table. Using manual `limit` and `offset` queries to fake out async iteration through your data is cumbersome, and _way way way_ slower than using a cursor.

_note: this module only works with the JavaScript client, and does not work with the native bindings. libpq doesn't expose the protocol at a level where a cursor can be manipulated directly_
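
For reference, here is a sketch of the same streaming pattern written against the `Pool` API from `pg` and `stream.pipeline` rather than the older `pg.connect` callback; the `QueryStream` construction and `client.query(query)` call mirror the example above, while the pool setup and `JSONStream` piping are illustrative:

```js
const { Pool } = require('pg')
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')
const { pipeline } = require('stream')

const pool = new Pool()

pool.connect((err, client, release) => {
  if (err) throw err
  // The cursor-backed stream keeps only a small window of rows in memory
  const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000000])
  const stream = client.query(query)
  pipeline(stream, JSONStream.stringify(), process.stdout, (err) => {
    // Release the client back to the pool once the stream ends or errors
    release()
    if (err) console.error(err)
  })
})
```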

## contribution

I'm very open to contribution! Open a pull request with your code or idea and we'll talk about it. If it's not way insane we'll merge it in too: isn't open source awesome?

## license
