From 6f0279ff439a7849851adca506d09bba80510c62 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Kat=20March=C3=A1n?=

Fancy Install (Unix)

If you want a more fancy pants install (a different version, customized
paths, etc.) then read on. There's a pretty robust install script at
https://www.npmjs.com/install.sh. You can download that and run it. Here's
an example using curl:

curl -L https://www.npmjs.com/install.sh | sh

Other Sorts of Unices

Run make install. npm will be installed with node.
SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-access.html b/deps/npm/html/doc/cli/npm-access.html
index c010b55a792ead..0904466dd0aa42 100644
--- a/deps/npm/html/doc/cli/npm-access.html
+++ b/deps/npm/html/doc/cli/npm-access.html
@@ -17,6 +17,9 @@ SYNOPSIS
npm access grant <read-only|read-write> <scope:team> [<package>]
npm access revoke <scope:team> [<package>]
+npm access 2fa-required [<package>]
+npm access 2fa-not-required [<package>]
+
npm access ls-packages [<user>|<scope>|<scope:team>]
npm access ls-collaborators [<package> [<user>]]
npm access edit [<package>]
DESCRIPTION
@@ -32,6 +35,10 @@ SYNOPSIS
Add or remove the ability of users and teams to have read-only or read-write
access to a package.
+2fa-required / 2fa-not-required:
+Configure whether a package requires that anyone publishing it have two-factor
+authentication enabled on their account.
ls-packages:
Show all of the packages a user or a team is able to access, along with the
access level, except for read-only public packages (it won't print the whole
registry listing)
@@ -68,6 +75,7 @@
Management of teams and team memberships is done with the npm team
command.
libnpmaccess
npm will not remove data by itself: the cache will grow as new packages are installed.
-The npm cache is strictly a cache: it should not be relied upon as a
persistent and reliable data store for package data. npm makes no guarantee
that a previously-cached piece of data will be available later, and will
automatically
@@ -88,5 +88,5 @@
ls: Show all of the dist-tags for a package, defaulting to the package in the current prefix.
+This is the default action if none is specified.
A tag can be used when installing packages as a reference to a version instead
of using a specific version number:
@@ -85,5 +86,5 @@
npm ping
npm config get registry
), and if you're using a
private registry that doesn't support the /whoami
endpoint supported by the
primary registry, this check may fail.
-npm -v
npm -v
While Node.js may come bundled with a particular version of npm, it's the
policy of the CLI team that we recommend all users run npm@latest
if they
can. As the CLI is maintained by a small team of contributors, there are only
@@ -49,7 +49,7 @@
npm -v
node -v
node -v
For most users, in most circumstances, the best version of Node will be the
latest long-term support (LTS) release. Those of you who want access to new
ECMAscript features or bleeding-edge changes to Node's standard library may be
@@ -102,4 +102,4 @@
See npm-folders(5) for a more detailed description of the specific folder structures that npm creates.
-npm will refuse to install any package with an identical name to the
current package. This can be overridden with the --force
flag, but in
most cases can simply be addressed by changing the local package name.
npm ls promzard
in npm's source tree will show:
-npm@6.5.0 /path/to/npm
+npm@6.7.0 /path/to/npm
└─┬ init-package-json@0.0.4
└── promzard@0.1.5
It will print out extraneous, missing, and invalid packages.
If a project specifies git urls for dependencies these are shown
@@ -60,13 +60,13 @@
depth
Type: Int
Max display depth of the dependency tree.
-prod / production
+prod / production
- Type: Boolean
- Default: false
Display only the dependency tree for packages in dependencies
.
-dev / development
+dev / development
- Type: Boolean
- Default: false
@@ -108,5 +108,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-org.html b/deps/npm/html/doc/cli/npm-org.html
new file mode 100644
index 00000000000000..2132e290adefc3
--- /dev/null
+++ b/deps/npm/html/doc/cli/npm-org.html
@@ -0,0 +1,43 @@
+
+
+ npm-org
+
+
+
+
+
+
+
+
+npm-org
Manage orgs
+SYNOPSIS
+npm org set <orgname> <username> [developer | admin | owner]
+npm org rm <orgname> <username>
+npm org ls <orgname> [<username>]
EXAMPLE
+Add a new developer to an org:
+$ npm org set my-org @mx-smith
Add a new admin to an org (or change a developer to an admin):
+$ npm org set my-org @mx-santos admin
Remove a user from an org:
+$ npm org rm my-org mx-santos
List all users in an org:
+$ npm org ls my-org
List all users in JSON format:
+$ npm org ls my-org --json
See what role a user has in an org:
+$ npm org ls my-org @mx-santos
DESCRIPTION
+You can use the npm org
commands to manage and view users of an organization.
+It supports adding and removing users, changing their roles, listing them, and
+finding specific ones and their roles.
+SEE ALSO
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/deps/npm/html/doc/cli/npm-outdated.html b/deps/npm/html/doc/cli/npm-outdated.html
index 560afe03379ce0..b39aaeac54de2f 100644
--- a/deps/npm/html/doc/cli/npm-outdated.html
+++ b/deps/npm/html/doc/cli/npm-outdated.html
@@ -116,5 +116,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-owner.html b/deps/npm/html/doc/cli/npm-owner.html
index 07500f648bf8de..42a1f852edfff8 100644
--- a/deps/npm/html/doc/cli/npm-owner.html
+++ b/deps/npm/html/doc/cli/npm-owner.html
@@ -53,5 +53,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-pack.html b/deps/npm/html/doc/cli/npm-pack.html
index 312d856657f5ed..c6494d46c0c7a8 100644
--- a/deps/npm/html/doc/cli/npm-pack.html
+++ b/deps/npm/html/doc/cli/npm-pack.html
@@ -42,5 +42,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-ping.html b/deps/npm/html/doc/cli/npm-ping.html
index 70d992572facdd..e320419a0ef4f1 100644
--- a/deps/npm/html/doc/cli/npm-ping.html
+++ b/deps/npm/html/doc/cli/npm-ping.html
@@ -33,5 +33,5 @@ SYNOPSIS
-
+
diff --git a/deps/npm/html/doc/cli/npm-prefix.html b/deps/npm/html/doc/cli/npm-prefix.html
index 1dcd48214280d7..af9969c0746d17 100644
--- a/deps/npm/html/doc/cli/npm-prefix.html
+++ b/deps/npm/html/doc/cli/npm-prefix.html
@@ -13,7 +13,8 @@ npm-prefix
Display prefix
SYNOPSIS
npm prefix [-g]
DESCRIPTION
Print the local prefix to standard out. This is the closest parent directory
-to contain a package.json file unless -g
is also specified.
+to contain a package.json
file or node_modules
directory, unless -g
is
+also specified.
If -g
is specified, this will be the value of the global prefix. See
npm-config(7)
for more detail.
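The prefix rule described in the changed paragraph — the closest parent
directory containing a package.json file or a node_modules directory — can be
sketched as a small walk-up loop. This is an illustration of the rule, not
npm's actual implementation:

```python
import os
import tempfile

def find_prefix(start):
    """Walk upward from `start` and return the closest directory that
    contains a package.json file or a node_modules directory.
    Falls back to `start` itself if nothing is found."""
    d = os.path.abspath(start)
    while True:
        if (os.path.exists(os.path.join(d, "package.json"))
                or os.path.isdir(os.path.join(d, "node_modules"))):
            return d
        parent = os.path.dirname(d)
        if parent == d:  # reached the filesystem root
            return os.path.abspath(start)
        d = parent
```

Running it from anywhere inside a project returns the project root, mirroring
what `npm prefix` prints.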
SEE ALSO
@@ -37,5 +38,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-profile.html b/deps/npm/html/doc/cli/npm-profile.html
index 2d2c030ae7702b..e9e81d60f1b373 100644
--- a/deps/npm/html/doc/cli/npm-profile.html
+++ b/deps/npm/html/doc/cli/npm-profile.html
@@ -88,4 +88,4 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-prune.html b/deps/npm/html/doc/cli/npm-prune.html
index f9a791e28e6d8a..aa4a4a9233fae0 100644
--- a/deps/npm/html/doc/cli/npm-prune.html
+++ b/deps/npm/html/doc/cli/npm-prune.html
@@ -47,5 +47,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-publish.html b/deps/npm/html/doc/cli/npm-publish.html
index 7de9cf6cbacaa5..622389b2e1763e 100644
--- a/deps/npm/html/doc/cli/npm-publish.html
+++ b/deps/npm/html/doc/cli/npm-publish.html
@@ -87,5 +87,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-rebuild.html b/deps/npm/html/doc/cli/npm-rebuild.html
index c201f50e16a2fb..793bbad2487420 100644
--- a/deps/npm/html/doc/cli/npm-rebuild.html
+++ b/deps/npm/html/doc/cli/npm-rebuild.html
@@ -34,5 +34,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-repo.html b/deps/npm/html/doc/cli/npm-repo.html
index fa2f1e04a06bbb..a62eccf9c0a192 100644
--- a/deps/npm/html/doc/cli/npm-repo.html
+++ b/deps/npm/html/doc/cli/npm-repo.html
@@ -40,5 +40,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-restart.html b/deps/npm/html/doc/cli/npm-restart.html
index f095d31c4298ba..14c430e012f049 100644
--- a/deps/npm/html/doc/cli/npm-restart.html
+++ b/deps/npm/html/doc/cli/npm-restart.html
@@ -52,5 +52,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-root.html b/deps/npm/html/doc/cli/npm-root.html
index 583d7394b42465..13ec0912a8d125 100644
--- a/deps/npm/html/doc/cli/npm-root.html
+++ b/deps/npm/html/doc/cli/npm-root.html
@@ -34,5 +34,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-run-script.html b/deps/npm/html/doc/cli/npm-run-script.html
index 5ecad32712a66b..862a9fb324fd45 100644
--- a/deps/npm/html/doc/cli/npm-run-script.html
+++ b/deps/npm/html/doc/cli/npm-run-script.html
@@ -79,5 +79,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-search.html b/deps/npm/html/doc/cli/npm-search.html
index 8a346ced27133c..6818a07dd9df03 100644
--- a/deps/npm/html/doc/cli/npm-search.html
+++ b/deps/npm/html/doc/cli/npm-search.html
@@ -32,7 +32,7 @@ SYNOPSIS
quoted in most shells.)
A Note on caching
CONFIGURATION
-description
+description
- Default: true
- Type: Boolean
@@ -108,5 +108,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-shrinkwrap.html b/deps/npm/html/doc/cli/npm-shrinkwrap.html
index f4706141d5caab..2b07d44400901e 100644
--- a/deps/npm/html/doc/cli/npm-shrinkwrap.html
+++ b/deps/npm/html/doc/cli/npm-shrinkwrap.html
@@ -40,5 +40,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-star.html b/deps/npm/html/doc/cli/npm-star.html
index a8786726c8dd31..282fe6058e9aa8 100644
--- a/deps/npm/html/doc/cli/npm-star.html
+++ b/deps/npm/html/doc/cli/npm-star.html
@@ -35,5 +35,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-stars.html b/deps/npm/html/doc/cli/npm-stars.html
index 61e9564eea5c3d..48bbc461a81d6b 100644
--- a/deps/npm/html/doc/cli/npm-stars.html
+++ b/deps/npm/html/doc/cli/npm-stars.html
@@ -35,5 +35,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-start.html b/deps/npm/html/doc/cli/npm-start.html
index 23ee465c885454..063eed365633ae 100644
--- a/deps/npm/html/doc/cli/npm-start.html
+++ b/deps/npm/html/doc/cli/npm-start.html
@@ -38,5 +38,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-stop.html b/deps/npm/html/doc/cli/npm-stop.html
index ad5dc240459114..989210046d0d38 100644
--- a/deps/npm/html/doc/cli/npm-stop.html
+++ b/deps/npm/html/doc/cli/npm-stop.html
@@ -33,5 +33,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-team.html b/deps/npm/html/doc/cli/npm-team.html
index 26706b7fcc9d37..5368ed6ed41df0 100644
--- a/deps/npm/html/doc/cli/npm-team.html
+++ b/deps/npm/html/doc/cli/npm-team.html
@@ -69,5 +69,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-test.html b/deps/npm/html/doc/cli/npm-test.html
index 37a699f381f14c..0c5839262983e9 100644
--- a/deps/npm/html/doc/cli/npm-test.html
+++ b/deps/npm/html/doc/cli/npm-test.html
@@ -35,5 +35,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-token.html b/deps/npm/html/doc/cli/npm-token.html
index eba3d9e90b1b11..33d11852c8a4a9 100644
--- a/deps/npm/html/doc/cli/npm-token.html
+++ b/deps/npm/html/doc/cli/npm-token.html
@@ -14,28 +14,39 @@ SYNOPSIS
npm token list [--json|--parseable]
npm token create [--read-only] [--cidr=1.1.1.1/24,2.2.2.2/16]
npm token revoke <id|token>
DESCRIPTION
-This list you list, create and revoke authentication tokens.
+This lets you list, create and revoke authentication tokens.
npm token list
:
Shows a table of all active authentication tokens. You can request this as
-JSON with --json
or tab-separated values with --parseable
.
-+--------+---------+------------+----------+----------------+
-| id | token | created | read-only | CIDR whitelist |
-+--------+---------+------------+----------+----------------+
-| 7f3134 | 1fa9ba… | 2017-10-02 | yes | |
-+--------+---------+------------+----------+----------------+
-| c03241 | af7aef… | 2017-10-02 | no | 192.168.0.1/24 |
-+--------+---------+------------+----------+----------------+
-| e0cf92 | 3a436a… | 2017-10-02 | no | |
-+--------+---------+------------+----------+----------------+
-| 63eb9d | 74ef35… | 2017-09-28 | no | |
-+--------+---------+------------+----------+----------------+
-| 2daaa8 | cbad5f… | 2017-09-26 | no | |
-+--------+---------+------------+----------+----------------+
-| 68c2fe | 127e51… | 2017-09-23 | no | |
-+--------+---------+------------+----------+----------------+
-| 6334e1 | 1dadd1… | 2017-09-23 | no | |
-+--------+---------+------------+----------+----------------+
+JSON with --json
or tab-separated values with --parseable
.
+```
++--------+---------+------------+-----------+----------------+
+| id     | token   | created    | read-only | CIDR whitelist |
++--------+---------+------------+-----------+----------------+
+| 7f3134 | 1fa9ba… | 2017-10-02 | yes       |                |
++--------+---------+------------+-----------+----------------+
+| c03241 | af7aef… | 2017-10-02 | no        | 192.168.0.1/24 |
++--------+---------+------------+-----------+----------------+
+| e0cf92 | 3a436a… | 2017-10-02 | no        |                |
++--------+---------+------------+-----------+----------------+
+| 63eb9d | 74ef35… | 2017-09-28 | no        |                |
++--------+---------+------------+-----------+----------------+
+| 2daaa8 | cbad5f… | 2017-09-26 | no        |                |
++--------+---------+------------+-----------+----------------+
+| 68c2fe | 127e51… | 2017-09-23 | no        |                |
++--------+---------+------------+-----------+----------------+
+| 6334e1 | 1dadd1… | 2017-09-23 | no        |                |
++--------+---------+------------+-----------+----------------+
+```
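Scripts that consume the --parseable form can split each line on tabs. The
sketch below is hypothetical: the field order is assumed from the table above,
not taken from npm's source, and the sample data is invented for illustration:

```python
import csv
import io

# Hypothetical tab-separated output, one token per line.
sample = (
    "7f3134\t1fa9ba\t2017-10-02\tyes\t\n"
    "c03241\taf7aef\t2017-10-02\tno\t192.168.0.1/24\n"
)

def parse_tokens(text):
    """Parse tab-separated token rows into dicts (assumed field order)."""
    fields = ["id", "token", "created", "readonly", "cidr_whitelist"]
    reader = csv.reader(io.StringIO(text), delimiter="\t")
    return [dict(zip(fields, row)) for row in reader]
```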
npm token create [--read-only] [--cidr=<cidr-ranges>]
:
Create a new authentication token. It can be --read-only
or accept a list of
CIDR ranges to
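The idea behind a CIDR whitelist — a request is honored only if the client IP
falls inside one of the listed ranges — can be sketched with Python's stdlib
ipaddress module. This models the concept only; it is not npm's registry-side
implementation:

```python
import ipaddress

def ip_allowed(ip, cidr_whitelist):
    """Return True if `ip` is inside any range in the whitelist.
    An empty whitelist means the token is not IP-restricted."""
    if not cidr_whitelist:
        return True
    addr = ipaddress.ip_address(ip)
    # strict=False accepts ranges written with host bits set,
    # e.g. the 192.168.0.1/24 shown in the table above.
    return any(addr in ipaddress.ip_network(c, strict=False)
               for c in cidr_whitelist)
```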
@@ -70,4 +81,4 @@
1.2.2
, this version does not satisfy ~1.1.1
, which is equivalent
to >=1.1.1 <1.2.0
. So the highest-sorting version that satisfies ~1.1.1
is used,
which is 1.1.2
.
-Suppose app
has a caret dependency on a version below 1.0.0
, for example:
"dependencies": {
"dep1": "^0.2.0"
@@ -100,5 +100,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-version.html b/deps/npm/html/doc/cli/npm-version.html
index 90d3d14c6e39b3..31bf550d7baee8 100644
--- a/deps/npm/html/doc/cli/npm-version.html
+++ b/deps/npm/html/doc/cli/npm-version.html
@@ -116,5 +116,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-view.html b/deps/npm/html/doc/cli/npm-view.html
index 34edabba5ac40a..9ec17116957985 100644
--- a/deps/npm/html/doc/cli/npm-view.html
+++ b/deps/npm/html/doc/cli/npm-view.html
@@ -75,5 +75,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm-whoami.html b/deps/npm/html/doc/cli/npm-whoami.html
index b6838b1b7a627d..df62dacb1a4c51 100644
--- a/deps/npm/html/doc/cli/npm-whoami.html
+++ b/deps/npm/html/doc/cli/npm-whoami.html
@@ -32,5 +32,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/cli/npm.html b/deps/npm/html/doc/cli/npm.html
index 6f4eacb9b3b623..6864d44a472a76 100644
--- a/deps/npm/html/doc/cli/npm.html
+++ b/deps/npm/html/doc/cli/npm.html
@@ -12,7 +12,7 @@
npm
javascript package manager
SYNOPSIS
npm <command> [args]
VERSION
-6.5.0
+6.7.0
DESCRIPTION
npm is the package manager for the Node JavaScript platform. It puts
modules in place so that node can find them, and manages dependency
@@ -130,7 +130,7 @@
AUTHOR
Isaac Z. Schlueter ::
isaacs ::
@izs ::
-i@izs.me
+i@izs.me
SEE ALSO
- npm-help(1)
@@ -154,5 +154,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npm-folders.html b/deps/npm/html/doc/files/npm-folders.html
index 70e06061e914c2..5ad8e306134f0e 100644
--- a/deps/npm/html/doc/files/npm-folders.html
+++ b/deps/npm/html/doc/files/npm-folders.html
@@ -13,7 +13,7 @@ npm-folders
Folder Structure
DESCRIPTION
npm puts various things on your computer. That's its job.
This document will tell you what it puts where.
-tl;dr
+tl;dr
- Local install (default): puts stuff in
./node_modules
of the current
package root.
@@ -179,5 +179,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npm-global.html b/deps/npm/html/doc/files/npm-global.html
index 70e06061e914c2..5ad8e306134f0e 100644
--- a/deps/npm/html/doc/files/npm-global.html
+++ b/deps/npm/html/doc/files/npm-global.html
@@ -13,7 +13,7 @@ npm-folders
Folder Structure
DESCRIPTION
npm puts various things on your computer. That's its job.
This document will tell you what it puts where.
-tl;dr
+tl;dr
- Local install (default): puts stuff in
./node_modules
of the current
package root.
@@ -179,5 +179,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npm-json.html b/deps/npm/html/doc/files/npm-json.html
index bff086cf76420d..751bbebde906d6 100644
--- a/deps/npm/html/doc/files/npm-json.html
+++ b/deps/npm/html/doc/files/npm-json.html
@@ -54,7 +54,7 @@ version
node-semver, which is bundled
with npm as a dependency. (npm install semver
to use it yourself.)
More on version numbers and ranges at semver(7).
-description
+description
Put a description in it. It's a string. This helps people discover your
package, as it's listed in npm search
.
keywords
@@ -230,25 +230,25 @@ directories
object. If you look at npm's package.json,
you'll see that it has directories for doc, lib, and man.
In the future, this information may be used in other creative ways.
-directories.lib
+directories.lib
Tell people where the bulk of your library is. Nothing special is done
with the lib folder in any way, but it's useful meta info.
-directories.bin
+directories.bin
If you specify a bin
directory in directories.bin
, all the files in
that folder will be added.
Because of the way the bin
directive works, specifying both a
bin
path and setting directories.bin
is an error. If you want to
specify individual files, use bin
, and for all the files in an
existing bin
directory, use directories.bin
.
-directories.man
+directories.man
A folder that is full of man pages. Sugar to generate a "man" array by
walking the folder.
-directories.doc
+directories.doc
Put markdown files in here. Eventually, these will be displayed nicely,
maybe, someday.
-directories.example
+directories.example
Put example scripts in here. Someday, it might be exposed in some clever way.
-directories.test
+directories.test
Put your tests in here. It is currently not exposed, but it might be in the
future.
repository
@@ -574,5 +574,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npm-package-locks.html b/deps/npm/html/doc/files/npm-package-locks.html
index 7e0f6e31d19bdb..6e273ed342d717 100644
--- a/deps/npm/html/doc/files/npm-package-locks.html
+++ b/deps/npm/html/doc/files/npm-package-locks.html
@@ -154,4 +154,4 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npm-shrinkwrap.json.html b/deps/npm/html/doc/files/npm-shrinkwrap.json.html
index 0399e5a1252721..70316f153aa794 100644
--- a/deps/npm/html/doc/files/npm-shrinkwrap.json.html
+++ b/deps/npm/html/doc/files/npm-shrinkwrap.json.html
@@ -42,4 +42,4 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/npmrc.html b/deps/npm/html/doc/files/npmrc.html
index a02b8ace99e16c..9b34b7a13270c5 100644
--- a/deps/npm/html/doc/files/npmrc.html
+++ b/deps/npm/html/doc/files/npmrc.html
@@ -82,5 +82,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/package-lock.json.html b/deps/npm/html/doc/files/package-lock.json.html
index 0426de30cfb5ef..1d6399445e7c39 100644
--- a/deps/npm/html/doc/files/package-lock.json.html
+++ b/deps/npm/html/doc/files/package-lock.json.html
@@ -57,7 +57,7 @@ preserveSymlinks
dependencies
A mapping of package name to dependency object. Dependency objects have the
following properties:
-version
+version
This is a specifier that uniquely identifies this package and should be
usable in fetching a new copy of it.
@@ -108,7 +108,7 @@ requires
this module requires, regardless of where it will be installed. The version
should match via normal matching rules a dependency either in our
dependencies
or in a level higher than us.
-dependencies
+dependencies
The dependencies of this dependency, exactly as at the top level.
SEE ALSO
@@ -130,4 +130,4 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/files/package.json.html b/deps/npm/html/doc/files/package.json.html
index bff086cf76420d..751bbebde906d6 100644
--- a/deps/npm/html/doc/files/package.json.html
+++ b/deps/npm/html/doc/files/package.json.html
@@ -54,7 +54,7 @@ version
node-semver, which is bundled
with npm as a dependency. (npm install semver
to use it yourself.)
More on version numbers and ranges at semver(7).
-description
+description
Put a description in it. It's a string. This helps people discover your
package, as it's listed in npm search
.
keywords
@@ -230,25 +230,25 @@ directories
object. If you look at npm's package.json,
you'll see that it has directories for doc, lib, and man.
In the future, this information may be used in other creative ways.
-directories.lib
+directories.lib
Tell people where the bulk of your library is. Nothing special is done
with the lib folder in any way, but it's useful meta info.
-directories.bin
+directories.bin
If you specify a bin
directory in directories.bin
, all the files in
that folder will be added.
Because of the way the bin
directive works, specifying both a
bin
path and setting directories.bin
is an error. If you want to
specify individual files, use bin
, and for all the files in an
existing bin
directory, use directories.bin
.
-directories.man
+directories.man
A folder that is full of man pages. Sugar to generate a "man" array by
walking the folder.
-directories.doc
+directories.doc
Put markdown files in here. Eventually, these will be displayed nicely,
maybe, someday.
-directories.example
+directories.example
Put example scripts in here. Someday, it might be exposed in some clever way.
-directories.test
+directories.test
Put your tests in here. It is currently not exposed, but it might be in the
future.
repository
@@ -574,5 +574,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/index.html b/deps/npm/html/doc/index.html
index 3a64ccc8c58529..b372a629232be9 100644
--- a/deps/npm/html/doc/index.html
+++ b/deps/npm/html/doc/index.html
@@ -10,163 +10,165 @@
npm-index
Index of all npm documentation
-README
+README
a JavaScript package manager
Command Line Documentation
Using npm on the command line
-npm(1)
+npm(1)
javascript package manager
-npm-access(1)
+npm-access(1)
Set access level on published packages
-npm-adduser(1)
+npm-adduser(1)
Add a registry user account
-npm-audit(1)
+npm-audit(1)
Run a security audit
-npm-bin(1)
+npm-bin(1)
Display npm bin folder
-npm-bugs(1)
+npm-bugs(1)
Bugs for a package in a web browser maybe
-npm-build(1)
+npm-build(1)
Build a package
-npm-bundle(1)
+npm-bundle(1)
REMOVED
-npm-cache(1)
+npm-cache(1)
Manipulates packages cache
-npm-ci(1)
+npm-ci(1)
Install a project with a clean slate
-npm-completion(1)
+npm-completion(1)
Tab Completion for npm
-npm-config(1)
+npm-config(1)
Manage the npm configuration files
-npm-dedupe(1)
+npm-dedupe(1)
Reduce duplication
-npm-deprecate(1)
+npm-deprecate(1)
Deprecate a version of a package
-npm-dist-tag(1)
+npm-dist-tag(1)
Modify package distribution tags
-npm-docs(1)
+npm-docs(1)
Docs for a package in a web browser maybe
-npm-doctor(1)
+npm-doctor(1)
Check your environments
-npm-edit(1)
+npm-edit(1)
Edit an installed package
-npm-explore(1)
+npm-explore(1)
Browse an installed package
-npm-help-search(1)
+npm-help-search(1)
Search npm help documentation
-npm-help(1)
+npm-help(1)
Get help on npm
-npm-hook(1)
+npm-hook(1)
Manage registry hooks
-npm-init(1)
+npm-init(1)
create a package.json file
-npm-install-ci-test(1)
+npm-install-ci-test(1)
Install a project with a clean slate and run tests
-npm-install-test(1)
+npm-install-test(1)
Install package(s) and run tests
-npm-install(1)
+npm-install(1)
Install a package
-npm-link(1)
+npm-link(1)
Symlink a package folder
-npm-logout(1)
+npm-logout(1)
Log out of the registry
-npm-ls(1)
+npm-ls(1)
List installed packages
-npm-outdated(1)
+npm-org(1)
+Manage orgs
+npm-outdated(1)
Check for outdated packages
-npm-owner(1)
+npm-owner(1)
Manage package owners
-npm-pack(1)
+npm-pack(1)
Create a tarball from a package
-npm-ping(1)
+npm-ping(1)
Ping npm registry
-npm-prefix(1)
+npm-prefix(1)
Display prefix
-npm-profile(1)
+npm-profile(1)
Change settings on your registry profile
-npm-prune(1)
+npm-prune(1)
Remove extraneous packages
-npm-publish(1)
+npm-publish(1)
Publish a package
-npm-rebuild(1)
+npm-rebuild(1)
Rebuild a package
-npm-repo(1)
+npm-repo(1)
Open package repository page in the browser
-npm-restart(1)
+npm-restart(1)
Restart a package
-npm-root(1)
+npm-root(1)
Display npm root
-npm-run-script(1)
+npm-run-script(1)
Run arbitrary package scripts
-npm-search(1)
+npm-search(1)
Search for packages
-npm-shrinkwrap(1)
+npm-shrinkwrap(1)
Lock down dependency versions for publication
-npm-star(1)
+npm-star(1)
Mark your favorite packages
-npm-stars(1)
+npm-stars(1)
View packages marked as favorites
-npm-start(1)
+npm-start(1)
Start a package
-npm-stop(1)
+npm-stop(1)
Stop a package
-npm-team(1)
+npm-team(1)
Manage organization teams and team memberships
-npm-test(1)
+npm-test(1)
Test a package
-npm-token(1)
+npm-token(1)
Manage your authentication tokens
-npm-uninstall(1)
+npm-uninstall(1)
Remove a package
-npm-unpublish(1)
+npm-unpublish(1)
Remove a package from the registry
-npm-update(1)
+npm-update(1)
Update a package
-npm-version(1)
+npm-version(1)
Bump a package version
-npm-view(1)
+npm-view(1)
View registry info
-npm-whoami(1)
+npm-whoami(1)
Display npm username
API Documentation
Using npm in your Node programs
Files
File system structures npm uses
-npm-folders(5)
+npm-folders(5)
Folder Structures Used by npm
-npm-package-locks(5)
+npm-package-locks(5)
An explanation of npm lockfiles
-npm-shrinkwrap.json(5)
+npm-shrinkwrap.json(5)
A publishable lockfile
-npmrc(5)
+npmrc(5)
The npm config files
-package-lock.json(5)
+package-lock.json(5)
A manifestation of the manifest
-package.json(5)
+package.json(5)
Specifics of npm's package.json handling
Misc
Various other bits and bobs
-npm-coding-style(7)
+npm-coding-style(7)
npm's "funny" coding style
-npm-config(7)
+npm-config(7)
More than you probably want to know about npm configuration
-npm-developers(7)
+npm-developers(7)
Developer Guide
-npm-disputes(7)
+npm-disputes(7)
Handling Module Name Disputes
-npm-index(7)
+npm-index(7)
Index of all npm documentation
-npm-orgs(7)
+npm-orgs(7)
Working with Teams & Orgs
-npm-registry(7)
+npm-registry(7)
The JavaScript Package Registry
-npm-scope(7)
+npm-scope(7)
Scoped packages
-npm-scripts(7)
+npm-scripts(7)
How npm handles the "scripts" field
-removing-npm(7)
+removing-npm(7)
Cleaning the Slate
-semver(7)
+semver(7)
The semantic versioner for npm
@@ -180,5 +182,5 @@ semver(7)
-
+
diff --git a/deps/npm/html/doc/misc/npm-coding-style.html b/deps/npm/html/doc/misc/npm-coding-style.html
index 6005929dc42899..3e4632d6789390 100644
--- a/deps/npm/html/doc/misc/npm-coding-style.html
+++ b/deps/npm/html/doc/misc/npm-coding-style.html
@@ -89,7 +89,7 @@ Comma First
lines. Don't use more spaces than are helpful.
Functions
Use named functions. They make stack traces a lot easier to read.
-Callbacks, Sync/async Style
+Callbacks, Sync/async Style
Use the asynchronous/non-blocking versions of things as much as possible.
It might make more sense for npm to use the synchronous fs APIs, but this
way, the fs and http and child process stuff all uses the same callback-passing
@@ -110,7 +110,7 @@
Logging
occurs.
Use appropriate log levels. See npm-config(7)
and search for
"loglevel".
-Case, naming, etc.
+Case, naming, etc.
Use lowerCamelCase
for multiword identifiers when they refer to objects,
functions, methods, properties, or anything not specified in this section.
Use UpperCamelCase
for class names (things that you'd pass to "new").
@@ -145,5 +145,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/misc/npm-config.html b/deps/npm/html/doc/misc/npm-config.html
index 91af998a0471ac..d09706648f6df1 100644
--- a/deps/npm/html/doc/misc/npm-config.html
+++ b/deps/npm/html/doc/misc/npm-config.html
@@ -259,7 +259,7 @@ depth
since that gives more useful information. To show the outdated status
of all packages and dependents, use a large integer value,
e.g., npm outdated --depth 9999
-description
+description
- Default: true
- Type: Boolean
@@ -1064,5 +1064,5 @@ SEE ALSO
-
+
diff --git a/deps/npm/html/doc/misc/npm-developers.html b/deps/npm/html/doc/misc/npm-developers.html
index 04b069a5cdba6d..04c4430a51f0a2 100644
--- a/deps/npm/html/doc/misc/npm-developers.html
+++ b/deps/npm/html/doc/misc/npm-developers.html
@@ -41,7 +41,7 @@ What is a package
git+http://user@hostname/project/blah.git#commit-ish
git+https://user@hostname/project/blah.git#commit-ish
The commit-ish
can be any tag, sha, or branch which can be supplied as
an argument to git checkout
. The default is master
.
You need to have a package.json
file in the root of your project to do
much of anything with npm. That is basically the whole interface.
See package.json(5)
for details about what goes in that file. At the very
@@ -198,5 +198,5 @@
Handling Module Name Disputes
This document is a clarification of the acceptable behavior outlined in the
npm Code of Conduct, and nothing in this document should be interpreted to
contradict any aspect of the npm Code of Conduct.
-npm owner ls <pkgname>
Don't squat on package names. Publish code or move out of the way.
@@ -58,13 +58,13 @@
Alice emails Yusuf, explaining the situation as respectfully as possible,
and what she would like to do with the module name. She adds the npm support
-staff support@npmjs.com to the CC list of the email. Mention in the email
+staff support@npmjs.com to the CC list of the email. Mention in the email
that Yusuf can run npm owner add alice foo
to add Alice as an owner of the
foo package.
After a reasonable amount of time, if Yusuf has not responded, or if Yusuf
and Alice can't come to any sort of resolution, email support
-support@npmjs.com and we'll sort it out. ("Reasonable" is usually at least
+support@npmjs.com and we'll sort it out. ("Reasonable" is usually at least
4 weeks.)
-If you see bad behavior like this, please report it to abuse@npmjs.com right
+If you see bad behavior like this, please report it to abuse@npmjs.com right
away. You are never expected to resolve abusive behavior on your own. We are
here to help.
If you think another npm publisher is infringing your trademark, such as by
-using a confusingly similar package name, email abuse@npmjs.com with a link to
+using a confusingly similar package name, email abuse@npmjs.com with a link to
the package or user account on https://www.npmjs.com/. Attach a copy of your
trademark registration certificate.
If we see that the package's publisher is intentionally misleading others by
@@ -139,5 +139,5 @@
Index of all npm documentation
-a JavaScript package manager
Using npm on the command line
-javascript package manager
-Set access level on published packages
-Add a registry user account
-Run a security audit
-Display npm bin folder
-Bugs for a package in a web browser maybe
-Build a package
-REMOVED
-Manipulates packages cache
-Install a project with a clean slate
-Tab Completion for npm
-Manage the npm configuration files
-Reduce duplication
-Deprecate a version of a package
-Modify package distribution tags
-Docs for a package in a web browser maybe
-Check your environments
-Edit an installed package
-Browse an installed package
-Search npm help documentation
-Get help on npm
-Manage registry hooks
-create a package.json file
-Install a project with a clean slate and run tests
-Install package(s) and run tests
-Install a package
-Symlink a package folder
-Log out of the registry
-List installed packages
-Manage orgs
+Check for outdated packages
-Manage package owners
-Create a tarball from a package
-Ping npm registry
-Display prefix
-Change settings on your registry profile
-Remove extraneous packages
-Publish a package
-Rebuild a package
-Open package repository page in the browser
-Restart a package
-Display npm root
-Run arbitrary package scripts
-Search for packages
-Lock down dependency versions for publication
-Mark your favorite packages
-View packages marked as favorites
-Start a package
-Stop a package
-Manage organization teams and team memberships
-Test a package
-Manage your authentication tokens
-Remove a package
-Remove a package from the registry
-Update a package
-Bump a package version
-View registry info
-Display npm username
Using npm in your Node programs
File system structures npm uses
-Folder Structures Used by npm
-An explanation of npm lockfiles
-A publishable lockfile
-The npm config files
-A manifestation of the manifest
-Specifics of npm's package.json handling
Various other bits and bobs
-npm's "funny" coding style
-More than you probably want to know about npm configuration
-Developer Guide
-Handling Module Name Disputes
-Index of all npm documentation
-Working with Teams & Orgs
-The JavaScript Package Registry
-Scoped packages
-How npm handles the "scripts" field
-Cleaning the Slate
-The semantic versioner for npm
npm-scope(7)
). If no scope is specified, the default registry is used, which is
supplied by the registry
config parameter. See npm-config(1)
,
npmrc(5)
, and npm-config(7)
for more on managing npm's configuration.
-Yes.
When making requests of the registry npm adds two headers with information about your environment:
@@ -51,7 +51,7 @@
The npm registry does not try to correlate the information in these headers with any authenticated accounts that may be used in the same requests.
-Yes!
The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs.
@@ -61,20 +61,20 @@
If you then want to publish a package for the whole world to see, you can
simply override the --registry
option for that publish
command.
Set "private": true
in your package.json to prevent it from being
published at all, or
"publishConfig":{"registry":"http://my-internal-registry.local"}
to force it to be published only to your internal registry.
See package.json(5)
for more info on what goes in the package.json file.
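A minimal package.json illustrating the second option above (the package name here is hypothetical; the registry URL is the placeholder from the text):

```json
{
  "name": "my-internal-tool",
  "version": "1.0.0",
  "publishConfig": {
    "registry": "http://my-internal-registry.local"
  }
}
```

Setting `"private": true` at the top level instead prevents the package from being published anywhere at all.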
No. If you want things to be public, then publish them into the public registry using npm. What little security there is would be for nought otherwise.
-No, but it's way easier. Basically, yes, you do, or you have to effectively implement the entire CouchDB API anyway.
-Yes, head over to https://www.npmjs.com/
then you could run npm start
to execute the bar
script, which is
exported into the node_modules/.bin
directory on npm install
.
The package.json fields are tacked onto the npm_package_
prefix. So,
for instance, if you had {"name":"foo", "version":"1.2.5"}
in your
package.json file, then your package scripts would have the
@@ -139,7 +139,7 @@
Configuration parameters are put in the environment with the
npm_config_
prefix. For instance, you can view the effective root
config by checking the npm_config_root
environment variable.
The package.json "config" keys are overwritten in the environment if
there is a config param of <name>[@<version>]:<key>
. For example,
if the package.json has this:
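The example block that followed this sentence appears to have been lost in extraction. A representative reconstruction (following the `<name>[@<version>]:<key>` pattern just described; not necessarily the original block):

```javascript
// Reconstructed illustration: a package.json with a "config" section.
const pkg = {
  name: 'foo',
  config: { port: '8080' },
  scripts: { start: 'node server.js' }
}
// npm exposes each "config" key to package scripts as an environment
// variable named npm_package_config_<key>:
const envName = key => `npm_package_config_${key}`
// server.js could then read process.env.npm_package_config_port, and a
// user-level config param such as `npm config set foo:port 80` would
// overwrite the package.json value in the environment.
console.log(envName('port')) // npm_package_config_port
```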
The semantic versioner for npm
npm install --save semver
-`
+npm install --save semver
As a node module:
const semver = require('semver')
@@ -28,8 +27,6 @@ Usage
As a command-line utility:
$ semver -h
-SemVer 5.3.0
-
A JavaScript implementation of the http://semver.org/ specification
Copyright Isaac Z. Schlueter
@@ -53,6 +50,9 @@ Usage
-l --loose
Interpret versions and ranges loosely
+-p --include-prerelease
+ Always include prerelease versions in range matching
+
-c --coerce
Coerce a string into SemVer if possible
(does not imply --loose)
@@ -133,7 +133,7 @@ Advanced Range Syntax
deterministic ways.
Advanced ranges may be combined in the same way as primitive
comparators using white space or ||
.
-Hyphen Ranges X.Y.Z - A.B.C
+Hyphen Ranges X.Y.Z - A.B.C
Specifies an inclusive set.
1.2.3 - 2.3.4
:= >=1.2.3 <=2.3.4
@@ -151,7 +151,7 @@ Hyphen Ranges X.Y.Z - A.B.C
1.2.3 - 2.3
:= >=1.2.3 <2.4.0
1.2.3 - 2
:= >=1.2.3 <3.0.0
-X-Ranges 1.2.x
1.X
1.2.*
*
+X-Ranges 1.2.x
1.X
1.2.*
*
Any of X
, x
, or *
may be used to "stand in" for one of the
numeric values in the [major, minor, patch]
tuple.
@@ -166,7 +166,7 @@ X-Ranges 1.2.x
1.X
1
:= 1.x.x
:= >=1.0.0 <2.0.0
1.2
:= 1.2.x
:= >=1.2.0 <1.3.0
-Tilde Ranges ~1.2.3
~1.2
~1
+Tilde Ranges ~1.2.3
~1.2
~1
Allows patch-level changes if a minor version is specified on the
comparator. Allows minor-level changes if not.
@@ -182,7 +182,7 @@ Tilde Ranges ~1.2.3
~1.21.2.4-beta.2
would not, because it is a prerelease of a
different [major, minor, patch]
tuple.
-Caret Ranges ^1.2.3
^0.2.5
^0.0.4
+Caret Ranges ^1.2.3
^0.2.5
^0.0.4
Allows changes that do not modify the left-most non-zero digit in the
[major, minor, patch]
tuple. In other words, this allows patch and
minor updates for versions 1.0.0
and above, patch updates for
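The caret rule above ("changes that do not modify the left-most non-zero digit") can be sketched as a tiny helper computing the exclusive upper bound of a caret range (a simplified illustration that ignores prerelease tags; not semver's actual implementation):

```javascript
// Compute the exclusive upper bound of ^maj.min.pat by bumping the
// left-most non-zero component (prerelease tags ignored).
function caretUpperBound (version) {
  const [maj, min, pat] = version.split('.').map(Number)
  if (maj > 0) return `${maj + 1}.0.0` // ^1.2.3 -> >=1.2.3 <2.0.0
  if (min > 0) return `0.${min + 1}.0` // ^0.2.5 -> >=0.2.5 <0.3.0
  return `0.0.${pat + 1}`              // ^0.0.4 -> >=0.0.4 <0.0.5
}
console.log(caretUpperBound('1.2.3')) // 2.0.0
```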
@@ -242,9 +242,20 @@
Range Grammar
parts ::= part ( '.' part ) *
part ::= nr | [-0-9A-Za-z]+
Functions
-All methods and classes take a final loose
boolean argument that, if
-true, will be more forgiving about not-quite-valid semver strings.
-The resulting output will always be 100% strict, of course.
+All methods and classes take a final options
object argument. All
+options in this object are false
by default. The options supported
+are:
+
+loose
Be more forgiving about not-quite-valid semver strings.
+(Any resulting output will always be 100% strict compliant, of
+course.) For backwards compatibility reasons, if the options
+argument is a boolean value instead of an object, it is interpreted
+to be the loose
param.
+includePrerelease
Set to suppress the default
+behavior of
+excluding prerelease tagged versions from ranges unless they are
+explicitly opted into.
+
Strict-mode Comparators and Ranges will be strict about the SemVer
strings that they parse.
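The backwards-compatibility behavior described above (a boolean final argument interpreted as the `loose` param) corresponds to a normalization step like this sketch (illustrative; not semver's exact internal code):

```javascript
// Normalize the final argument: a boolean means {loose: <bool>}, an
// object is taken as-is, and a missing argument means all-false options.
function parseOptions (options) {
  if (!options) return {}
  if (typeof options !== 'object') return { loose: !!options }
  return options
}
console.log(parseOptions(true)) // { loose: true }
```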
@@ -295,7 +306,7 @@ Comparators
intersects(comparator)
: Return true if the comparators intersect
-Ranges
+Ranges
validRange(range)
: Return the valid range or null if it's not valid
satisfies(version, range)
: Return true if the version satisfies the
@@ -350,5 +361,5 @@ Coercion
-
+
diff --git a/deps/npm/lib/access.js b/deps/npm/lib/access.js
index 164ea3b7d741a1..4bb93fda1d0ee2 100644
--- a/deps/npm/lib/access.js
+++ b/deps/npm/lib/access.js
@@ -1,28 +1,50 @@
'use strict'
/* eslint-disable standard/no-callback-literal */
-var resolve = require('path').resolve
+const BB = require('bluebird')
-var readPackageJson = require('read-package-json')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var npm = require('./npm.js')
-var output = require('./utils/output.js')
-
-var whoami = require('./whoami')
+const figgyPudding = require('figgy-pudding')
+const libaccess = require('libnpm/access')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
+const path = require('path')
+const prefix = require('./npm.js').prefix
+const readPackageJson = BB.promisify(require('read-package-json'))
+const usage = require('./utils/usage.js')
+const whoami = require('./whoami.js')
module.exports = access
-access.usage =
+access.usage = usage(
+ 'npm access',
'npm access public []\n' +
'npm access restricted []\n' +
'npm access grant []\n' +
'npm access revoke []\n' +
+ 'npm access 2fa-required []\n' +
+ 'npm access 2fa-not-required []\n' +
'npm access ls-packages [||]\n' +
'npm access ls-collaborators [ []]\n' +
'npm access edit []'
+)
+
+access.subcommands = [
+ 'public', 'restricted', 'grant', 'revoke',
+ 'ls-packages', 'ls-collaborators', 'edit',
+ '2fa-required', '2fa-not-required'
+]
+
+const AccessConfig = figgyPudding({
+ json: {}
+})
-access.subcommands = ['public', 'restricted', 'grant', 'revoke',
- 'ls-packages', 'ls-collaborators', 'edit']
+function UsageError (msg = '') {
+ throw Object.assign(new Error(
+ (msg ? `\nUsage: ${msg}\n\n` : '') +
+ access.usage
+ ), {code: 'EUSAGE'})
+}
access.completion = function (opts, cb) {
var argv = opts.conf.argv.remain
@@ -42,6 +64,8 @@ access.completion = function (opts, cb) {
case 'ls-packages':
case 'ls-collaborators':
case 'edit':
+ case '2fa-required':
+ case '2fa-not-required':
return cb(null, [])
case 'revoke':
return cb(null, [])
@@ -50,81 +74,125 @@ access.completion = function (opts, cb) {
}
}
-function access (args, cb) {
- var cmd = args.shift()
- var params
- return parseParams(cmd, args, function (err, p) {
- if (err) { return cb(err) }
- params = p
- return mapToRegistry(params.package, npm.config, invokeCmd)
- })
+function access ([cmd, ...args], cb) {
+ return BB.try(() => {
+ const fn = access.subcommands.includes(cmd) && access[cmd]
+ if (!cmd) { UsageError('Subcommand is required.') }
+ if (!fn) { UsageError(`${cmd} is not a recognized subcommand.`) }
- function invokeCmd (err, uri, auth, base) {
- if (err) { return cb(err) }
- params.auth = auth
- try {
- return npm.registry.access(cmd, uri, params, function (err, data) {
- if (!err && data) {
- output(JSON.stringify(data, undefined, 2))
- }
- cb(err, data)
- })
- } catch (e) {
- cb(e.message + '\n\nUsage:\n' + access.usage)
- }
- }
+ return fn(args, AccessConfig(npmConfig()))
+ }).then(
+ x => cb(null, x),
+ err => err.code === 'EUSAGE' ? cb(err.message) : cb(err)
+ )
}
-function parseParams (cmd, args, cb) {
- // mapToRegistry will complain if package is undefined,
- // but it's not needed for ls-packages
- var params = { 'package': '' }
- if (cmd === 'grant') {
- params.permissions = args.shift()
- }
- if (['grant', 'revoke', 'ls-packages'].indexOf(cmd) !== -1) {
- var entity = (args.shift() || '').split(':')
- params.scope = entity[0]
- params.team = entity[1]
- }
+access.public = ([pkg], opts) => {
+ return modifyPackage(pkg, opts, libaccess.public)
+}
- if (cmd === 'ls-packages') {
- if (!params.scope) {
- whoami([], true, function (err, scope) {
- params.scope = scope
- cb(err, params)
- })
- } else {
- cb(null, params)
+access.restricted = ([pkg], opts) => {
+ return modifyPackage(pkg, opts, libaccess.restricted)
+}
+
+access.grant = ([perms, scopeteam, pkg], opts) => {
+ return BB.try(() => {
+ if (!perms || (perms !== 'read-only' && perms !== 'read-write')) {
+ UsageError('First argument must be either `read-only` or `read-write.`')
}
- } else {
- getPackage(args.shift(), function (err, pkg) {
- if (err) return cb(err)
- params.package = pkg
+ if (!scopeteam) {
+ UsageError('`` argument is required.')
+ }
+ const [, scope, team] = scopeteam.match(/^@?([^:]+):(.*)$/) || []
+ if (!scope && !team) {
+ UsageError(
+ 'Second argument used incorrect format.\n' +
+ 'Example: @example:developers'
+ )
+ }
+ return modifyPackage(pkg, opts, (pkgName, opts) => {
+ return libaccess.grant(pkgName, scopeteam, perms, opts)
+ })
+ })
+}
- if (cmd === 'ls-collaborators') params.user = args.shift()
- cb(null, params)
+access.revoke = ([scopeteam, pkg], opts) => {
+ return BB.try(() => {
+ if (!scopeteam) {
+ UsageError('`` argument is required.')
+ }
+ const [, scope, team] = scopeteam.match(/^@?([^:]+):(.*)$/) || []
+ if (!scope || !team) {
+ UsageError(
+ 'First argument used incorrect format.\n' +
+ 'Example: @example:developers'
+ )
+ }
+ return modifyPackage(pkg, opts, (pkgName, opts) => {
+ return libaccess.revoke(pkgName, scopeteam, opts)
})
- }
+ })
+}
+
+access['2fa-required'] = access.tfaRequired = ([pkg], opts) => {
+ return modifyPackage(pkg, opts, libaccess.tfaRequired, false)
+}
+
+access['2fa-not-required'] = access.tfaNotRequired = ([pkg], opts) => {
+ return modifyPackage(pkg, opts, libaccess.tfaNotRequired, false)
+}
+
+access['ls-packages'] = access.lsPackages = ([owner], opts) => {
+ return (
+ owner ? BB.resolve(owner) : BB.fromNode(cb => whoami([], true, cb))
+ ).then(owner => {
+ return libaccess.lsPackages(owner, opts)
+ }).then(pkgs => {
+ // TODO - print these out nicely (breaking change)
+ output(JSON.stringify(pkgs, null, 2))
+ })
+}
+
+access['ls-collaborators'] = access.lsCollaborators = ([pkg, usr], opts) => {
+ return getPackage(pkg).then(pkgName =>
+ libaccess.lsCollaborators(pkgName, usr, opts)
+ ).then(collabs => {
+ // TODO - print these out nicely (breaking change)
+ output(JSON.stringify(collabs, null, 2))
+ })
}
-function getPackage (name, cb) {
- if (name && name.trim()) {
- cb(null, name.trim())
- } else {
- readPackageJson(
- resolve(npm.prefix, 'package.json'),
- function (err, data) {
- if (err) {
+access['edit'] = () => BB.reject(new Error('edit subcommand is not implemented yet'))
+
+function modifyPackage (pkg, opts, fn, requireScope = true) {
+ return getPackage(pkg, requireScope).then(pkgName =>
+ otplease(opts, opts => fn(pkgName, opts))
+ )
+}
+
+function getPackage (name, requireScope = true) {
+ return BB.try(() => {
+ if (name && name.trim()) {
+ return name.trim()
+ } else {
+ return readPackageJson(
+ path.resolve(prefix, 'package.json')
+ ).then(
+ data => data.name,
+ err => {
if (err.code === 'ENOENT') {
- cb(new Error('no package name passed to command and no package.json found'))
+ throw new Error('no package name passed to command and no package.json found')
} else {
- cb(err)
+ throw err
}
- } else {
- cb(null, data.name)
}
- }
- )
- }
+ )
+ }
+ }).then(name => {
+ if (requireScope && !name.match(/^@[^/]+\/.*$/)) {
+ UsageError('This command is only available for scoped packages.')
+ } else {
+ return name
+ }
+ })
}
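The `grant`/`revoke` rewrite above parses its scope-and-team argument with the regex `/^@?([^:]+):(.*)$/`. That parsing step in isolation (a standalone sketch; the helper name is not from the patch):

```javascript
// Parse an (optionally @-prefixed) scope:team pair, as in the access
// subcommands above. Returns null when the format doesn't match.
function parseScopeTeam (scopeteam) {
  const [, scope, team] = scopeteam.match(/^@?([^:]+):(.*)$/) || []
  if (!scope || !team) return null
  return { scope, team }
}
console.log(parseScopeTeam('@example:developers')) // { scope: 'example', team: 'developers' }
```

Note that in the hunk above, `grant` rejects only when both pieces are missing (`!scope && !team`) while `revoke` uses `!scope || !team`; the sketch follows the stricter check.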
diff --git a/deps/npm/lib/audit.js b/deps/npm/lib/audit.js
index 06852610e64663..2cabef9d27d0d3 100644
--- a/deps/npm/lib/audit.js
+++ b/deps/npm/lib/audit.js
@@ -3,17 +3,37 @@
const Bluebird = require('bluebird')
const audit = require('./install/audit.js')
+const figgyPudding = require('figgy-pudding')
const fs = require('graceful-fs')
const Installer = require('./install.js').Installer
const lockVerify = require('lock-verify')
const log = require('npmlog')
-const npa = require('npm-package-arg')
+const npa = require('libnpm/parse-arg')
const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
const output = require('./utils/output.js')
const parseJson = require('json-parse-better-errors')
const readFile = Bluebird.promisify(fs.readFile)
+const AuditConfig = figgyPudding({
+ also: {},
+ 'audit-level': {},
+ deepArgs: 'deep-args',
+ 'deep-args': {},
+ dev: {},
+ force: {},
+ 'dry-run': {},
+ global: {},
+ json: {},
+ only: {},
+ parseable: {},
+ prod: {},
+ production: {},
+ registry: {},
+ runId: {}
+})
+
module.exports = auditCmd
const usage = require('./utils/usage')
@@ -110,12 +130,12 @@ function maybeReadFile (name) {
})
}
-function filterEnv (action) {
- const includeDev = npm.config.get('dev') ||
- (!/^prod(uction)?$/.test(npm.config.get('only')) && !npm.config.get('production')) ||
- /^dev(elopment)?$/.test(npm.config.get('only')) ||
- /^dev(elopment)?$/.test(npm.config.get('also'))
- const includeProd = !/^dev(elopment)?$/.test(npm.config.get('only'))
+function filterEnv (action, opts) {
+ const includeDev = opts.dev ||
+ (!/^prod(uction)?$/.test(opts.only) && !opts.production) ||
+ /^dev(elopment)?$/.test(opts.only) ||
+ /^dev(elopment)?$/.test(opts.also)
+ const includeProd = !/^dev(elopment)?$/.test(opts.only)
const resolves = action.resolves.filter(({dev}) => {
return (dev && includeDev) || (!dev && includeProd)
})
@@ -125,7 +145,8 @@ function filterEnv (action) {
}
function auditCmd (args, cb) {
- if (npm.config.get('global')) {
+ const opts = AuditConfig(npmConfig())
+ if (opts.global) {
const err = new Error('`npm audit` does not support testing globals')
err.code = 'EAUDITGLOBAL'
throw err
@@ -168,8 +189,16 @@ function auditCmd (args, cb) {
}).then((auditReport) => {
return audit.submitForFullReport(auditReport)
}).catch((err) => {
- if (err.statusCode === 404 || err.statusCode >= 500) {
- const ne = new Error(`Your configured registry (${npm.config.get('registry')}) does not support audit requests.`)
+ if (err.statusCode >= 400) {
+ let msg
+ if (err.statusCode === 401) {
+ msg = `Either your login credentials are invalid or your registry (${opts.registry}) does not support audit.`
+ } else if (err.statusCode === 404) {
+ msg = `Your configured registry (${opts.registry}) does not support audit requests.`
+ } else {
+ msg = `Your configured registry (${opts.registry}) does not support audit requests, or the audit endpoint is temporarily unavailable.`
+ }
+ const ne = new Error(msg)
ne.code = 'ENOAUDIT'
ne.wrapped = err
throw ne
@@ -178,7 +207,7 @@ function auditCmd (args, cb) {
}).then((auditResult) => {
if (args[0] === 'fix') {
const actions = (auditResult.actions || []).reduce((acc, action) => {
- action = filterEnv(action)
+ action = filterEnv(action, opts)
if (!action) { return acc }
if (action.isMajor) {
acc.major.add(`${action.module}@${action.target}`)
@@ -215,7 +244,7 @@ function auditCmd (args, cb) {
review: new Set()
})
return Bluebird.try(() => {
- const installMajor = npm.config.get('force')
+ const installMajor = opts.force
const installCount = actions.install.size + (installMajor ? actions.major.size : 0) + actions.update.size
const vulnFixCount = new Set([...actions.installFixes, ...actions.updateFixes, ...(installMajor ? actions.majorFixes : [])]).size
const metavuln = auditResult.metadata.vulnerabilities
@@ -230,16 +259,16 @@ function auditCmd (args, cb) {
return Bluebird.fromNode(cb => {
new Auditor(
npm.prefix,
- !!npm.config.get('dry-run'),
+ !!opts['dry-run'],
[...actions.install, ...(installMajor ? actions.major : [])],
- {
+ opts.concat({
runId: auditResult.runId,
deepArgs: [...actions.update].map(u => u.split('>'))
- }
+ }).toJSON()
).run(cb)
}).then(() => {
const numScanned = auditResult.metadata.totalDependencies
- if (!npm.config.get('json') && !npm.config.get('parseable')) {
+ if (!opts.json && !opts.parseable) {
output(`fixed ${vulnFixCount} of ${total} vulnerabilit${total === 1 ? 'y' : 'ies'} in ${numScanned} scanned package${numScanned === 1 ? '' : 's'}`)
if (actions.review.size) {
output(` ${actions.review.size} vulnerabilit${actions.review.size === 1 ? 'y' : 'ies'} required manual review and could not be updated`)
@@ -258,12 +287,12 @@ function auditCmd (args, cb) {
})
} else {
const levels = ['low', 'moderate', 'high', 'critical']
- const minLevel = levels.indexOf(npm.config.get('audit-level'))
+ const minLevel = levels.indexOf(opts['audit-level'])
const vulns = levels.reduce((count, level, i) => {
return i < minLevel ? count : count + (auditResult.metadata.vulnerabilities[level] || 0)
}, 0)
if (vulns > 0) process.exitCode = 1
- if (npm.config.get('parseable')) {
+ if (opts.parseable) {
return audit.printParseableReport(auditResult)
} else {
return audit.printFullReport(auditResult)
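The reworked `filterEnv` above decides dev/prod inclusion from the passed-in config rather than global `npm.config`. The predicate logic can be isolated as follows (a sketch mirroring the hunk, assuming the same option names):

```javascript
// Mirror of the include-dev/include-prod logic from filterEnv above.
function auditEnvFlags (opts) {
  const includeDev = opts.dev ||
    (!/^prod(uction)?$/.test(opts.only) && !opts.production) ||
    /^dev(elopment)?$/.test(opts.only) ||
    /^dev(elopment)?$/.test(opts.also)
  const includeProd = !/^dev(elopment)?$/.test(opts.only)
  return { includeDev: !!includeDev, includeProd }
}
console.log(auditEnvFlags({ only: 'prod' })) // { includeDev: false, includeProd: true }
```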
diff --git a/deps/npm/lib/auth/legacy.js b/deps/npm/lib/auth/legacy.js
index 8c25df0288e677..7ad678be5e5c18 100644
--- a/deps/npm/lib/auth/legacy.js
+++ b/deps/npm/lib/auth/legacy.js
@@ -1,11 +1,11 @@
'use strict'
+
const read = require('../utils/read-user-info.js')
-const profile = require('npm-profile')
+const profile = require('libnpm/profile')
const log = require('npmlog')
-const npm = require('../npm.js')
+const figgyPudding = require('figgy-pudding')
+const npmConfig = require('../config/figgy-config.js')
const output = require('../utils/output.js')
-const pacoteOpts = require('../config/pacote')
-const fetchOpts = require('../config/fetch-opts')
const openUrl = require('../utils/open-url')
const openerPromise = (url) => new Promise((resolve, reject) => {
@@ -26,54 +26,54 @@ const loginPrompter = (creds) => {
})
}
-module.exports.login = (creds, registry, scope, cb) => {
- const conf = {
- log: log,
- creds: creds,
- registry: registry,
- auth: {
- otp: npm.config.get('otp')
- },
- scope: scope,
- opts: fetchOpts.fromPacote(pacoteOpts())
- }
- login(conf).then((newCreds) => cb(null, newCreds)).catch(cb)
+const LoginOpts = figgyPudding({
+ 'always-auth': {},
+ creds: {},
+ log: {default: () => log},
+ registry: {},
+ scope: {}
+})
+
+module.exports.login = (creds = {}, registry, scope, cb) => {
+ const opts = LoginOpts(npmConfig()).concat({scope, registry, creds})
+ login(opts).then((newCreds) => cb(null, newCreds)).catch(cb)
}
-function login (conf) {
- return profile.login(openerPromise, loginPrompter, conf)
+function login (opts) {
+ return profile.login(openerPromise, loginPrompter, opts)
.catch((err) => {
if (err.code === 'EOTP') throw err
- const u = conf.creds.username
- const p = conf.creds.password
- const e = conf.creds.email
+ const u = opts.creds.username
+ const p = opts.creds.password
+ const e = opts.creds.email
if (!(u && p && e)) throw err
- return profile.adduserCouch(u, e, p, conf)
+ return profile.adduserCouch(u, e, p, opts)
})
.catch((err) => {
if (err.code !== 'EOTP') throw err
- return read.otp('Enter one-time password from your authenticator app: ').then((otp) => {
- conf.auth.otp = otp
- const u = conf.creds.username
- const p = conf.creds.password
- return profile.loginCouch(u, p, conf)
+ return read.otp(
+ 'Enter one-time password from your authenticator app: '
+ ).then(otp => {
+ const u = opts.creds.username
+ const p = opts.creds.password
+ return profile.loginCouch(u, p, opts.concat({otp}))
})
}).then((result) => {
const newCreds = {}
if (result && result.token) {
newCreds.token = result.token
} else {
- newCreds.username = conf.creds.username
- newCreds.password = conf.creds.password
- newCreds.email = conf.creds.email
- newCreds.alwaysAuth = npm.config.get('always-auth')
+ newCreds.username = opts.creds.username
+ newCreds.password = opts.creds.password
+ newCreds.email = opts.creds.email
+ newCreds.alwaysAuth = opts['always-auth']
}
- const usermsg = conf.creds.username ? ' user ' + conf.creds.username : ''
- conf.log.info('login', 'Authorized' + usermsg)
- const scopeMessage = conf.scope ? ' to scope ' + conf.scope : ''
- const userout = conf.creds.username ? ' as ' + conf.creds.username : ''
- output('Logged in%s%s on %s.', userout, scopeMessage, conf.registry)
+ const usermsg = opts.creds.username ? ' user ' + opts.creds.username : ''
+ opts.log.info('login', 'Authorized' + usermsg)
+ const scopeMessage = opts.scope ? ' to scope ' + opts.scope : ''
+ const userout = opts.creds.username ? ' as ' + opts.creds.username : ''
+ output('Logged in%s%s on %s.', userout, scopeMessage, opts.registry)
return newCreds
})
}
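The rewritten `login()` above falls back through three strategies: the primary (web) login, a couch `adduser`, and an OTP retry keyed on the `EOTP` error code. The shape of that catch-chain, with stand-in async steps (function names are hypothetical, and the real code has extra credential checks this sketch omits):

```javascript
// Shape of the fallback chain in login() above: try the primary
// strategy, fall back on specific error codes, rethrow otherwise.
function loginWithFallback (primary, adduser, withOtp) {
  return primary()
    .catch(err => {
      if (err.code === 'EOTP') throw err // OTP needed: skip adduser
      return adduser()
    })
    .catch(err => {
      if (err.code !== 'EOTP') throw err // only retry for OTP errors
      return withOtp()
    })
}
```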
diff --git a/deps/npm/lib/auth/sso.js b/deps/npm/lib/auth/sso.js
index 519ca8496c74c2..099e764e3ab40b 100644
--- a/deps/npm/lib/auth/sso.js
+++ b/deps/npm/lib/auth/sso.js
@@ -1,56 +1,73 @@
-var log = require('npmlog')
-var npm = require('../npm.js')
-var output = require('../utils/output')
-var openUrl = require('../utils/open-url')
+'use strict'
+
+const BB = require('bluebird')
+
+const figgyPudding = require('figgy-pudding')
+const log = require('npmlog')
+const npmConfig = require('../config/figgy-config.js')
+const npmFetch = require('npm-registry-fetch')
+const output = require('../utils/output.js')
+const openUrl = BB.promisify(require('../utils/open-url.js'))
+const otplease = require('../utils/otplease.js')
+const profile = require('libnpm/profile')
+
+const SsoOpts = figgyPudding({
+ ssoType: 'sso-type',
+ 'sso-type': {},
+ ssoPollFrequency: 'sso-poll-frequency',
+ 'sso-poll-frequency': {}
+})
module.exports.login = function login (creds, registry, scope, cb) {
- var ssoType = npm.config.get('sso-type')
+ const opts = SsoOpts(npmConfig()).concat({creds, registry, scope})
+ const ssoType = opts.ssoType
if (!ssoType) { return cb(new Error('Missing option: sso-type')) }
- var params = {
- // We're reusing the legacy login endpoint, so we need some dummy
- // stuff here to pass validation. They're never used.
- auth: {
- username: 'npm_' + ssoType + '_auth_dummy_user',
- password: 'placeholder',
- email: 'support@npmjs.com',
- authType: ssoType
- }
+ // We're reusing the legacy login endpoint, so we need some dummy
+ // stuff here to pass validation. They're never used.
+ const auth = {
+ username: 'npm_' + ssoType + '_auth_dummy_user',
+ password: 'placeholder',
+ email: 'support@npmjs.com',
+ authType: ssoType
}
- npm.registry.adduser(registry, params, function (er, doc) {
- if (er) return cb(er)
- if (!doc || !doc.token) return cb(new Error('no SSO token returned'))
- if (!doc.sso) return cb(new Error('no SSO URL returned by services'))
-
- openUrl(doc.sso, 'to complete your login please visit', function () {
- pollForSession(registry, doc.token, function (err, username) {
- if (err) return cb(err)
- log.info('adduser', 'Authorized user %s', username)
- var scopeMessage = scope ? ' to scope ' + scope : ''
- output('Logged in as %s%s on %s.', username, scopeMessage, registry)
-
- cb(null, { token: doc.token })
- })
+ otplease(opts,
+ opts => profile.loginCouch(auth.username, auth.password, opts)
+ ).then(({token, sso}) => {
+ if (!token) { throw new Error('no SSO token returned') }
+ if (!sso) { throw new Error('no SSO URL returned by services') }
+ return openUrl(sso, 'to complete your login please visit').then(() => {
+ return pollForSession(registry, token, opts)
+ }).then(username => {
+ log.info('adduser', 'Authorized user %s', username)
+ var scopeMessage = scope ? ' to scope ' + scope : ''
+ output('Logged in as %s%s on %s.', username, scopeMessage, registry)
+ return {token}
})
- })
+ }).nodeify(cb)
}
-function pollForSession (registry, token, cb) {
+function pollForSession (registry, token, opts) {
log.info('adduser', 'Polling for validated SSO session')
- npm.registry.whoami(registry, {
- auth: {
- token: token
- }
- }, function (er, username) {
- if (er && er.statusCode !== 401) {
- cb(er)
- } else if (!username) {
- setTimeout(function () {
- pollForSession(registry, token, cb)
- }, npm.config.get('sso-poll-frequency'))
- } else {
- cb(null, username)
+ return npmFetch.json(
+ '/-/whoami', opts.concat({registry, forceAuth: {token}})
+ ).then(
+ ({username}) => username,
+ err => {
+ if (err.code === 'E401') {
+ return sleep(opts['sso-poll-frequency']).then(() => {
+ return pollForSession(registry, token, opts)
+ })
+ } else {
+ throw err
+ }
}
+ )
+}
+
+function sleep (time) {
+ return new BB((resolve) => {
+ setTimeout(resolve, time)
})
}
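The SSO rewrite above replaces callback-style `setTimeout` recursion with a promise-returning `sleep` helper and a recursive poll. The pattern in isolation (a generic sketch, not npm's actual whoami request):

```javascript
// Promise-based delay, as in the sleep() helper above.
function sleep (ms) {
  return new Promise(resolve => setTimeout(resolve, ms))
}

// Generic retry-until-truthy poller in the same style as pollForSession:
// re-run the check after each sleep until it yields a value.
function pollUntil (check, intervalMs) {
  return Promise.resolve(check()).then(result => {
    if (result) return result
    return sleep(intervalMs).then(() => pollUntil(check, intervalMs))
  })
}
```

In the real code, an `E401` from the registry plays the role of "not ready yet" and any other error aborts the loop.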
diff --git a/deps/npm/lib/cache.js b/deps/npm/lib/cache.js
index 169f192cad5f2c..00abd8c746ab73 100644
--- a/deps/npm/lib/cache.js
+++ b/deps/npm/lib/cache.js
@@ -9,9 +9,9 @@ const finished = BB.promisify(require('mississippi').finished)
const log = require('npmlog')
const npa = require('npm-package-arg')
const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
const output = require('./utils/output.js')
const pacote = require('pacote')
-const pacoteOpts = require('./config/pacote')
const path = require('path')
const rm = BB.promisify(require('./utils/gently-rm.js'))
const unbuild = BB.promisify(npm.commands.unbuild)
@@ -107,7 +107,7 @@ function add (args, where) {
log.verbose('cache add', 'spec', spec)
if (!spec) return BB.reject(new Error(usage))
log.silly('cache add', 'parsed spec', spec)
- return finished(pacote.tarball.stream(spec, pacoteOpts({where})).resume())
+ return finished(pacote.tarball.stream(spec, npmConfig({where})).resume())
}
cache.verify = verify
@@ -131,7 +131,7 @@ function verify () {
cache.unpack = unpack
function unpack (pkg, ver, unpackTarget, dmode, fmode, uid, gid) {
return unbuild([unpackTarget], true).then(() => {
- const opts = pacoteOpts({dmode, fmode, uid, gid, offline: true})
+ const opts = npmConfig({dmode, fmode, uid, gid, offline: true})
return pacote.extract(npa.resolve(pkg, ver), unpackTarget, opts)
})
}
diff --git a/deps/npm/lib/ci.js b/deps/npm/lib/ci.js
index 03822b9528d1d4..1fbb28b570f6fa 100644
--- a/deps/npm/lib/ci.js
+++ b/deps/npm/lib/ci.js
@@ -1,40 +1,19 @@
'use strict'
const Installer = require('libcipm')
-const lifecycleOpts = require('./config/lifecycle.js')
-const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
const npmlog = require('npmlog')
-const pacoteOpts = require('./config/pacote.js')
ci.usage = 'npm ci'
ci.completion = (cb) => cb(null, [])
-Installer.CipmConfig.impl(npm.config, {
- get: npm.config.get,
- set: npm.config.set,
- toLifecycle (moreOpts) {
- return lifecycleOpts(moreOpts)
- },
- toPacote (moreOpts) {
- return pacoteOpts(moreOpts)
- }
-})
-
module.exports = ci
function ci (args, cb) {
- return new Installer({
- config: npm.config,
- log: npmlog
- })
- .run()
- .then(
- (details) => {
- npmlog.disableProgress()
- console.log(`added ${details.pkgCount} packages in ${
- details.runTime / 1000
- }s`)
- }
- )
- .then(() => cb(), cb)
+ return new Installer(npmConfig({ log: npmlog })).run().then(details => {
+ npmlog.disableProgress()
+ console.log(`added ${details.pkgCount} packages in ${
+ details.runTime / 1000
+ }s`)
+ }).then(() => cb(), cb)
}
diff --git a/deps/npm/lib/config/cmd-list.js b/deps/npm/lib/config/cmd-list.js
index a453082adc1bc8..fa4390fcdcba77 100644
--- a/deps/npm/lib/config/cmd-list.js
+++ b/deps/npm/lib/config/cmd-list.js
@@ -50,7 +50,9 @@ var affordances = {
'rm': 'uninstall',
'r': 'uninstall',
'rum': 'run-script',
- 'sit': 'cit'
+ 'sit': 'cit',
+ 'urn': 'run-script',
+ 'ogr': 'org'
}
// these are filenames in .
@@ -89,6 +91,7 @@ var cmdList = [
'token',
'profile',
'audit',
+ 'org',
'help',
'help-search',
diff --git a/deps/npm/lib/config/defaults.js b/deps/npm/lib/config/defaults.js
index 991a2129f68944..25926595391207 100644
--- a/deps/npm/lib/config/defaults.js
+++ b/deps/npm/lib/config/defaults.js
@@ -239,7 +239,7 @@ Object.defineProperty(exports, 'defaults', {get: function () {
process.getuid() !== 0,
'update-notifier': true,
usage: false,
- user: process.platform === 'win32' ? 0 : 'nobody',
+ user: (process.platform === 'win32' || os.type() === 'OS400') ? 0 : 'nobody',
userconfig: path.resolve(home, '.npmrc'),
umask: process.umask ? process.umask() : umask.fromString('022'),
version: false,
diff --git a/deps/npm/lib/config/figgy-config.js b/deps/npm/lib/config/figgy-config.js
new file mode 100644
index 00000000000000..9e9ca0ba561efb
--- /dev/null
+++ b/deps/npm/lib/config/figgy-config.js
@@ -0,0 +1,87 @@
+'use strict'
+
+const BB = require('bluebird')
+
+const crypto = require('crypto')
+const figgyPudding = require('figgy-pudding')
+const log = require('npmlog')
+const npm = require('../npm.js')
+const pack = require('../pack.js')
+const path = require('path')
+
+const npmSession = crypto.randomBytes(8).toString('hex')
+log.verbose('npm-session', npmSession)
+
+const SCOPE_REGISTRY_REGEX = /@.*:registry$/gi
+const NpmConfig = figgyPudding({}, {
+ other (key) {
+ return key.match(SCOPE_REGISTRY_REGEX)
+ }
+})
+
+let baseConfig
+
+module.exports = mkConfig
+function mkConfig (...providers) {
+ if (!baseConfig) {
+ baseConfig = NpmConfig(npm.config, {
+ // Add some non-npm-config opts by hand.
+ cache: path.join(npm.config.get('cache'), '_cacache'),
+ // NOTE: npm has some magic logic around color distinct from the config
+ // value, so we have to override it here
+ color: !!npm.color,
+ dirPacker: pack.packGitDep,
+ hashAlgorithm: 'sha1',
+ includeDeprecated: false,
+ log,
+ 'npm-session': npmSession,
+ 'project-scope': npm.projectScope,
+ refer: npm.referer,
+ dmode: npm.modes.exec,
+ fmode: npm.modes.file,
+ umask: npm.modes.umask,
+ npmVersion: npm.version,
+ tmp: npm.tmp,
+ Promise: BB
+ })
+ const ownerStats = calculateOwner()
+ if (ownerStats.uid != null || ownerStats.gid != null) {
+ baseConfig = baseConfig.concat(ownerStats)
+ }
+ }
+ let conf = baseConfig.concat(...providers)
+ // Adapt some other configs if missing
+ if (npm.config.get('prefer-online') === undefined) {
+ conf = conf.concat({
+ 'prefer-online': npm.config.get('cache-max') <= 0
+ })
+ }
+ if (npm.config.get('prefer-offline') === undefined) {
+ conf = conf.concat({
+ 'prefer-offline': npm.config.get('cache-min') >= 9999
+ })
+ }
+ return conf
+}
+
+let effectiveOwner
+function calculateOwner () {
+ if (!effectiveOwner) {
+ effectiveOwner = { uid: 0, gid: 0 }
+
+ // Pretty much only on windows
+ if (!process.getuid) {
+ return effectiveOwner
+ }
+
+ effectiveOwner.uid = +process.getuid()
+ effectiveOwner.gid = +process.getgid()
+
+ if (effectiveOwner.uid === 0) {
+ if (process.env.SUDO_UID) effectiveOwner.uid = +process.env.SUDO_UID
+ if (process.env.SUDO_GID) effectiveOwner.gid = +process.env.SUDO_GID
+ }
+ }
+
+ return effectiveOwner
+}
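The new figgy-config module derives the newer preference flags from the legacy cache settings when they are unset. The mapping carried over from the deleted `config/pacote.js` can be sketched as (default values here are assumptions for illustration):

```javascript
// Mapping from legacy cache-min/cache-max settings to the newer
// prefer-offline/prefer-online flags (as in the deleted pacote.js).
function cachePreferences ({ cacheMin = 10, cacheMax = Infinity } = {}) {
  return {
    preferOffline: cacheMin > 9999, // very high cache-min: avoid network
    preferOnline: cacheMax <= 0     // cache-max of 0: always revalidate
  }
}
console.log(cachePreferences({ cacheMax: 0 })) // { preferOffline: false, preferOnline: true }
```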
diff --git a/deps/npm/lib/config/pacote.js b/deps/npm/lib/config/pacote.js
deleted file mode 100644
index 505b69da375a44..00000000000000
--- a/deps/npm/lib/config/pacote.js
+++ /dev/null
@@ -1,141 +0,0 @@
-'use strict'
-
-const Buffer = require('safe-buffer').Buffer
-
-const crypto = require('crypto')
-const npm = require('../npm')
-const log = require('npmlog')
-let pack
-const path = require('path')
-
-let effectiveOwner
-
-const npmSession = crypto.randomBytes(8).toString('hex')
-log.verbose('npm-session', npmSession)
-
-module.exports = pacoteOpts
-function pacoteOpts (moreOpts) {
- if (!pack) {
- pack = require('../pack.js')
- }
- const ownerStats = calculateOwner()
- const opts = {
- cache: path.join(npm.config.get('cache'), '_cacache'),
- ca: npm.config.get('ca'),
- cert: npm.config.get('cert'),
- defaultTag: npm.config.get('tag'),
- dirPacker: pack.packGitDep,
- hashAlgorithm: 'sha1',
- includeDeprecated: false,
- key: npm.config.get('key'),
- localAddress: npm.config.get('local-address'),
- log: log,
- maxAge: npm.config.get('cache-min'),
- maxSockets: npm.config.get('maxsockets'),
- npmSession: npmSession,
- offline: npm.config.get('offline'),
- preferOffline: npm.config.get('prefer-offline') || npm.config.get('cache-min') > 9999,
- preferOnline: npm.config.get('prefer-online') || npm.config.get('cache-max') <= 0,
- projectScope: npm.projectScope,
- proxy: npm.config.get('https-proxy') || npm.config.get('proxy'),
- noProxy: npm.config.get('noproxy'),
- refer: npm.registry.refer,
- registry: npm.config.get('registry'),
- retry: {
- retries: npm.config.get('fetch-retries'),
- factor: npm.config.get('fetch-retry-factor'),
- minTimeout: npm.config.get('fetch-retry-mintimeout'),
- maxTimeout: npm.config.get('fetch-retry-maxtimeout')
- },
- scope: npm.config.get('scope'),
- strictSSL: npm.config.get('strict-ssl'),
- userAgent: npm.config.get('user-agent'),
-
- dmode: npm.modes.exec,
- fmode: npm.modes.file,
- umask: npm.modes.umask
- }
-
- if (ownerStats.uid != null || ownerStats.gid != null) {
- Object.assign(opts, ownerStats)
- }
-
- npm.config.keys.forEach(function (k) {
- const authMatchGlobal = k.match(
- /^(_authToken|username|_password|password|email|always-auth|_auth)$/
- )
- const authMatchScoped = k[0] === '/' && k.match(
- /(.*):(_authToken|username|_password|password|email|always-auth|_auth)$/
- )
-
- // if it matches scoped it will also match global
- if (authMatchGlobal || authMatchScoped) {
- let nerfDart = null
- let key = null
- let val = null
-
- if (!opts.auth) { opts.auth = {} }
-
- if (authMatchScoped) {
- nerfDart = authMatchScoped[1]
- key = authMatchScoped[2]
- val = npm.config.get(k)
- if (!opts.auth[nerfDart]) {
- opts.auth[nerfDart] = {
- alwaysAuth: !!npm.config.get('always-auth')
- }
- }
- } else {
- key = authMatchGlobal[1]
- val = npm.config.get(k)
- opts.auth.alwaysAuth = !!npm.config.get('always-auth')
- }
-
- const auth = authMatchScoped ? opts.auth[nerfDart] : opts.auth
- if (key === '_authToken') {
- auth.token = val
- } else if (key.match(/password$/i)) {
- auth.password =
- // the config file stores password auth already-encoded. pacote expects
- // the actual username/password pair.
- Buffer.from(val, 'base64').toString('utf8')
- } else if (key === 'always-auth') {
- auth.alwaysAuth = val === 'false' ? false : !!val
- } else {
- auth[key] = val
- }
- }
-
- if (k[0] === '@') {
- if (!opts.scopeTargets) { opts.scopeTargets = {} }
- opts.scopeTargets[k.replace(/:registry$/, '')] = npm.config.get(k)
- }
- })
-
- Object.keys(moreOpts || {}).forEach((k) => {
- opts[k] = moreOpts[k]
- })
-
- return opts
-}
-
-function calculateOwner () {
- if (!effectiveOwner) {
- effectiveOwner = { uid: 0, gid: 0 }
-
- // Pretty much only on windows
- if (!process.getuid) {
- return effectiveOwner
- }
-
- effectiveOwner.uid = +process.getuid()
- effectiveOwner.gid = +process.getgid()
-
- if (effectiveOwner.uid === 0) {
- if (process.env.SUDO_UID) effectiveOwner.uid = +process.env.SUDO_UID
- if (process.env.SUDO_GID) effectiveOwner.gid = +process.env.SUDO_GID
- }
- }
-
- return effectiveOwner
-}
diff --git a/deps/npm/lib/config/reg-client.js b/deps/npm/lib/config/reg-client.js
deleted file mode 100644
index d4e2417097fa09..00000000000000
--- a/deps/npm/lib/config/reg-client.js
+++ /dev/null
@@ -1,29 +0,0 @@
-'use strict'
-
-module.exports = regClientConfig
-function regClientConfig (npm, log, config) {
- return {
- proxy: {
- http: config.get('proxy'),
- https: config.get('https-proxy'),
- localAddress: config.get('local-address')
- },
- ssl: {
- certificate: config.get('cert'),
- key: config.get('key'),
- ca: config.get('ca'),
- strict: config.get('strict-ssl')
- },
- retry: {
- retries: config.get('fetch-retries'),
- factor: config.get('fetch-retry-factor'),
- minTimeout: config.get('fetch-retry-mintimeout'),
- maxTimeout: config.get('fetch-retry-maxtimeout')
- },
- userAgent: config.get('user-agent'),
- log: log,
- defaultTag: config.get('tag'),
- maxSockets: config.get('maxsockets'),
- scope: npm.projectScope
- }
-}
diff --git a/deps/npm/lib/deprecate.js b/deps/npm/lib/deprecate.js
index 9b71d1de494ad7..7fe2fbed4ba554 100644
--- a/deps/npm/lib/deprecate.js
+++ b/deps/npm/lib/deprecate.js
@@ -1,55 +1,72 @@
-/* eslint-disable standard/no-callback-literal */
-var npm = require('./npm.js')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var npa = require('npm-package-arg')
+'use strict'
+
+const BB = require('bluebird')
+
+const npmConfig = require('./config/figgy-config.js')
+const fetch = require('libnpm/fetch')
+const figgyPudding = require('figgy-pudding')
+const otplease = require('./utils/otplease.js')
+const npa = require('libnpm/parse-arg')
+const semver = require('semver')
+const whoami = require('./whoami.js')
+
+const DeprecateConfig = figgyPudding({})
module.exports = deprecate
 deprecate.usage = 'npm deprecate <pkg>[@<version>] <message>'
deprecate.completion = function (opts, cb) {
- // first, get a list of remote packages this user owns.
- // once we have a user account, then don't complete anything.
- if (opts.conf.argv.remain.length > 2) return cb()
- // get the list of packages by user
- var path = '/-/by-user/'
- mapToRegistry(path, npm.config, function (er, uri, c) {
- if (er) return cb(er)
-
- if (!(c && c.username)) return cb()
-
- var params = {
- timeout: 60000,
- auth: c
- }
- npm.registry.get(uri + c.username, params, function (er, list) {
- if (er) return cb()
- console.error(list)
- return cb(null, list[c.username])
+ return BB.try(() => {
+ if (opts.conf.argv.remain.length > 2) { return }
+ return whoami([], true, () => {}).then(username => {
+ if (username) {
+ // first, get a list of remote packages this user owns.
+ // once we have a user account, then don't complete anything.
+ // get the list of packages by user
+ return fetch(
+ `/-/by-user/${encodeURIComponent(username)}`,
+ DeprecateConfig()
+ ).then(list => list[username])
+ }
})
- })
+ }).nodeify(cb)
}
-function deprecate (args, cb) {
- var pkg = args[0]
- var msg = args[1]
- if (msg === undefined) return cb('Usage: ' + deprecate.usage)
+function deprecate ([pkg, msg], opts, cb) {
+ if (typeof cb !== 'function') {
+ cb = opts
+ opts = null
+ }
+ opts = DeprecateConfig(opts || npmConfig())
+ return BB.try(() => {
+ if (msg == null) throw new Error(`Usage: ${deprecate.usage}`)
+ // fetch the data and make sure it exists.
+ const p = npa(pkg)
- // fetch the data and make sure it exists.
- var p = npa(pkg)
+ // npa makes the default spec "latest", but for deprecation
+ // "*" is the appropriate default.
+ const spec = p.rawSpec === '' ? '*' : p.fetchSpec
- // npa makes the default spec "latest", but for deprecation
- // "*" is the appropriate default.
- var spec = p.rawSpec === '' ? '*' : p.fetchSpec
-
- mapToRegistry(p.name, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- var params = {
- version: spec,
- message: msg,
- auth: auth
+ if (semver.validRange(spec, true) === null) {
+ throw new Error('invalid version range: ' + spec)
}
- npm.registry.deprecate(uri, params, cb)
- })
+
+ const uri = '/' + p.escapedName
+ return fetch.json(uri, opts.concat({
+ spec: p,
+ query: {write: true}
+ })).then(packument => {
+ // filter all the versions that match
+ Object.keys(packument.versions)
+ .filter(v => semver.satisfies(v, spec))
+ .forEach(v => { packument.versions[v].deprecated = msg })
+ return otplease(opts, opts => fetch(uri, opts.concat({
+ spec: p,
+ method: 'PUT',
+ body: packument,
+ ignoreBody: true
+ })))
+ })
+ }).nodeify(cb)
}
diff --git a/deps/npm/lib/dist-tag.js b/deps/npm/lib/dist-tag.js
index bd0c5ae8a27a7d..176e61221eef0e 100644
--- a/deps/npm/lib/dist-tag.js
+++ b/deps/npm/lib/dist-tag.js
@@ -1,15 +1,22 @@
/* eslint-disable standard/no-callback-literal */
module.exports = distTag
-var log = require('npmlog')
-var npa = require('npm-package-arg')
-var semver = require('semver')
-
-var npm = require('./npm.js')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var readLocalPkg = require('./utils/read-local-package.js')
-var usage = require('./utils/usage')
-var output = require('./utils/output.js')
+const BB = require('bluebird')
+
+const figgyPudding = require('figgy-pudding')
+const log = require('npmlog')
+const npa = require('libnpm/parse-arg')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
+const readLocalPkg = BB.promisify(require('./utils/read-local-package.js'))
+const regFetch = require('libnpm/fetch')
+const semver = require('semver')
+const usage = require('./utils/usage')
+
+const DistTagOpts = figgyPudding({
+ tag: {}
+})
distTag.usage = usage(
'dist-tag',
@@ -30,130 +37,127 @@ distTag.completion = function (opts, cb) {
}
}
-function distTag (args, cb) {
- var cmd = args.shift()
- switch (cmd) {
- case 'add': case 'a': case 'set': case 's':
- return add(args[0], args[1], cb)
- case 'rm': case 'r': case 'del': case 'd': case 'remove':
- return remove(args[1], args[0], cb)
- case 'ls': case 'l': case 'sl': case 'list':
- return list(args[0], cb)
- default:
- return cb('Usage:\n' + distTag.usage)
- }
+function UsageError () {
+ throw Object.assign(new Error('Usage:\n' + distTag.usage), {
+ code: 'EUSAGE'
+ })
}
-function add (spec, tag, cb) {
- var thing = npa(spec || '')
- var pkg = thing.name
- var version = thing.rawSpec
- var t = (tag || npm.config.get('tag')).trim()
+function distTag ([cmd, pkg, tag], cb) {
+ const opts = DistTagOpts(npmConfig())
+ return BB.try(() => {
+ switch (cmd) {
+ case 'add': case 'a': case 'set': case 's':
+ return add(pkg, tag, opts)
+ case 'rm': case 'r': case 'del': case 'd': case 'remove':
+ return remove(pkg, tag, opts)
+ case 'ls': case 'l': case 'sl': case 'list':
+ return list(pkg, opts)
+ default:
+ if (!pkg) {
+ return list(cmd, opts)
+ } else {
+ UsageError()
+ }
+ }
+ }).then(
+ x => cb(null, x),
+ err => {
+ if (err.code === 'EUSAGE') {
+ cb(err.message)
+ } else {
+ cb(err)
+ }
+ }
+ )
+}
- log.verbose('dist-tag add', t, 'to', pkg + '@' + version)
+function add (spec, tag, opts) {
+ spec = npa(spec || '')
+ const version = spec.rawSpec
+ const t = (tag || opts.tag).trim()
- if (!pkg || !version || !t) return cb('Usage:\n' + distTag.usage)
+ log.verbose('dist-tag add', t, 'to', spec.name + '@' + version)
+
+ if (!spec || !version || !t) UsageError()
if (semver.validRange(t)) {
- var er = new Error('Tag name must not be a valid SemVer range: ' + t)
- return cb(er)
+ throw new Error('Tag name must not be a valid SemVer range: ' + t)
}
- fetchTags(pkg, function (er, tags) {
- if (er) return cb(er)
-
+ return fetchTags(spec, opts).then(tags => {
if (tags[t] === version) {
log.warn('dist-tag add', t, 'is already set to version', version)
- return cb()
+ return
}
tags[t] = version
-
- mapToRegistry(pkg, npm.config, function (er, uri, auth, base) {
- var params = {
- 'package': pkg,
- distTag: t,
- version: version,
- auth: auth
- }
-
- npm.registry.distTags.add(base, params, function (er) {
- if (er) return cb(er)
-
- output('+' + t + ': ' + pkg + '@' + version)
- cb()
- })
+ const url = `/-/package/${spec.escapedName}/dist-tags/${encodeURIComponent(t)}`
+ const reqOpts = opts.concat({
+ method: 'PUT',
+ body: JSON.stringify(version),
+ headers: {
+ 'content-type': 'application/json'
+ },
+ spec
+ })
+ return otplease(reqOpts, reqOpts => regFetch(url, reqOpts)).then(() => {
+ output(`+${t}: ${spec.name}@${version}`)
})
})
}
-function remove (tag, pkg, cb) {
- log.verbose('dist-tag del', tag, 'from', pkg)
-
- fetchTags(pkg, function (er, tags) {
- if (er) return cb(er)
+function remove (spec, tag, opts) {
+ spec = npa(spec || '')
+ log.verbose('dist-tag del', tag, 'from', spec.name)
+ return fetchTags(spec, opts).then(tags => {
if (!tags[tag]) {
- log.info('dist-tag del', tag, 'is not a dist-tag on', pkg)
- return cb(new Error(tag + ' is not a dist-tag on ' + pkg))
+ log.info('dist-tag del', tag, 'is not a dist-tag on', spec.name)
+ throw new Error(tag + ' is not a dist-tag on ' + spec.name)
}
-
- var version = tags[tag]
+ const version = tags[tag]
delete tags[tag]
-
- mapToRegistry(pkg, npm.config, function (er, uri, auth, base) {
- var params = {
- 'package': pkg,
- distTag: tag,
- auth: auth
- }
-
- npm.registry.distTags.rm(base, params, function (er) {
- if (er) return cb(er)
-
- output('-' + tag + ': ' + pkg + '@' + version)
- cb()
- })
+ const url = `/-/package/${spec.escapedName}/dist-tags/${encodeURIComponent(tag)}`
+ const reqOpts = opts.concat({
+ method: 'DELETE'
+ })
+ return otplease(reqOpts, reqOpts => regFetch(url, reqOpts)).then(() => {
+ output(`-${tag}: ${spec.name}@${version}`)
})
})
}
-function list (pkg, cb) {
- if (!pkg) {
- return readLocalPkg(function (er, pkg) {
- if (er) return cb(er)
- if (!pkg) return cb(distTag.usage)
- list(pkg, cb)
+function list (spec, opts) {
+ if (!spec) {
+ return readLocalPkg().then(pkg => {
+ if (!pkg) { UsageError() }
+ return list(pkg, opts)
})
}
+ spec = npa(spec)
- fetchTags(pkg, function (er, tags) {
- if (er) {
- log.error('dist-tag ls', "Couldn't get dist-tag data for", pkg)
- return cb(er)
- }
- var msg = Object.keys(tags).map(function (k) {
- return k + ': ' + tags[k]
- }).sort().join('\n')
+ return fetchTags(spec, opts).then(tags => {
+ var msg = Object.keys(tags).map(k => `${k}: ${tags[k]}`).sort().join('\n')
output(msg)
- cb(er, tags)
+ return tags
+ }, err => {
+ log.error('dist-tag ls', "Couldn't get dist-tag data for", spec)
+ throw err
})
}
-function fetchTags (pkg, cb) {
- mapToRegistry(pkg, npm.config, function (er, uri, auth, base) {
- if (er) return cb(er)
-
- var params = {
- 'package': pkg,
- auth: auth
- }
- npm.registry.distTags.fetch(base, params, function (er, tags) {
- if (er) return cb(er)
- if (!tags || !Object.keys(tags).length) {
- return cb(new Error('No dist-tags found for ' + pkg))
- }
-
- cb(null, tags)
+function fetchTags (spec, opts) {
+ return regFetch.json(
+ `/-/package/${spec.escapedName}/dist-tags`,
+ opts.concat({
+ 'prefer-online': true,
+ spec
})
+ ).then(data => {
+ if (data && typeof data === 'object') delete data._etag
+ if (!data || !Object.keys(data).length) {
+ throw new Error('No dist-tags found for ' + spec.name)
+ }
+ return data
})
}
diff --git a/deps/npm/lib/doctor/check-ping.js b/deps/npm/lib/doctor/check-ping.js
index e7e82902a7165c..70db255480c371 100644
--- a/deps/npm/lib/doctor/check-ping.js
+++ b/deps/npm/lib/doctor/check-ping.js
@@ -4,8 +4,12 @@ var ping = require('../ping.js')
function checkPing (cb) {
var tracker = log.newItem('checkPing', 1)
tracker.info('checkPing', 'Pinging registry')
- ping({}, true, (_err, pong, data, res) => {
- cb(null, [res.statusCode, res.statusMessage])
+ ping({}, true, (err, pong) => {
+ if (err && err.code && err.code.match(/^E\d{3}$/)) {
+ return cb(null, [err.code.substr(1)])
+ } else {
+ cb(null, [200, 'OK'])
+ }
})
}
diff --git a/deps/npm/lib/fetch-package-metadata.js b/deps/npm/lib/fetch-package-metadata.js
index cca6dc64f4168e..78eed42bdf0002 100644
--- a/deps/npm/lib/fetch-package-metadata.js
+++ b/deps/npm/lib/fetch-package-metadata.js
@@ -8,11 +8,11 @@ const rimraf = require('rimraf')
const validate = require('aproba')
const npa = require('npm-package-arg')
const npm = require('./npm')
+let npmConfig
const npmlog = require('npmlog')
const limit = require('call-limit')
const tempFilename = require('./utils/temp-filename')
const pacote = require('pacote')
-let pacoteOpts
const isWindows = require('./utils/is-windows.js')
function andLogAndFinish (spec, tracker, done) {
@@ -52,10 +52,10 @@ function fetchPackageMetadata (spec, where, opts, done) {
err.code = 'EWINDOWSPATH'
return logAndFinish(err)
}
- if (!pacoteOpts) {
- pacoteOpts = require('./config/pacote')
+ if (!npmConfig) {
+ npmConfig = require('./config/figgy-config.js')
}
- pacote.manifest(dep, pacoteOpts({
+ pacote.manifest(dep, npmConfig({
annotate: true,
fullMetadata: opts.fullMetadata,
log: tracker || npmlog,
@@ -85,9 +85,6 @@ function fetchPackageMetadata (spec, where, opts, done) {
module.exports.addBundled = addBundled
function addBundled (pkg, next) {
validate('OF', arguments)
- if (!pacoteOpts) {
- pacoteOpts = require('./config/pacote')
- }
if (pkg._bundled !== undefined) return next(null, pkg)
if (!pkg.bundleDependencies && pkg._requested.type !== 'directory') return next(null, pkg)
@@ -101,7 +98,10 @@ function addBundled (pkg, next) {
}
pkg._bundled = null
const target = tempFilename('unpack')
- const opts = pacoteOpts({integrity: pkg._integrity})
+ if (!npmConfig) {
+ npmConfig = require('./config/figgy-config.js')
+ }
+ const opts = npmConfig({integrity: pkg._integrity})
pacote.extract(pkg._resolved || pkg._requested || npa.resolve(pkg.name, pkg.version), target, opts).then(() => {
log.silly('addBundled', 'read tarball')
readPackageTree(target, (err, tree) => {
diff --git a/deps/npm/lib/hook.js b/deps/npm/lib/hook.js
index b0552c74740ea3..54aea9f1e9d207 100644
--- a/deps/npm/lib/hook.js
+++ b/deps/npm/lib/hook.js
@@ -2,129 +2,146 @@
const BB = require('bluebird')
-const crypto = require('crypto')
-const hookApi = require('libnpmhook')
-const log = require('npmlog')
-const npm = require('./npm.js')
+const hookApi = require('libnpm/hook')
+const npmConfig = require('./config/figgy-config.js')
const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
const pudding = require('figgy-pudding')
const relativeDate = require('tiny-relative-date')
const Table = require('cli-table3')
-const usage = require('./utils/usage.js')
const validate = require('aproba')
-hook.usage = usage([
+hook.usage = [
   'npm hook add <pkg> <url> <secret> [--type=<type>]',
   'npm hook ls [pkg]',
   'npm hook rm <id>',
   'npm hook update <id> <url> <secret>'
-])
+].join('\n')
hook.completion = (opts, cb) => {
validate('OF', [opts, cb])
return cb(null, []) // fill in this array with completion values
}
-const npmSession = crypto.randomBytes(8).toString('hex')
-const hookConfig = pudding()
-function config () {
- return hookConfig({
- refer: npm.refer,
- projectScope: npm.projectScope,
- log,
- npmSession
- }, npm.config)
+const HookConfig = pudding({
+ json: {},
+ loglevel: {},
+ parseable: {},
+ silent: {},
+ unicode: {}
+})
+
+function UsageError () {
+ throw Object.assign(new Error(hook.usage), {code: 'EUSAGE'})
}
-module.exports = (args, cb) => BB.try(() => hook(args)).nodeify(cb)
+module.exports = (args, cb) => BB.try(() => hook(args)).then(
+ val => cb(null, val),
+ err => err.code === 'EUSAGE' ? cb(err.message) : cb(err)
+)
function hook (args) {
- switch (args[0]) {
- case 'add':
- return add(args[1], args[2], args[3])
- case 'ls':
- return ls(args[1])
- case 'rm':
- return rm(args[1])
- case 'update':
- case 'up':
- return update(args[1], args[2], args[3])
- }
+ return otplease(npmConfig(), opts => {
+ opts = HookConfig(opts)
+ switch (args[0]) {
+ case 'add':
+ return add(args[1], args[2], args[3], opts)
+ case 'ls':
+ return ls(args[1], opts)
+ case 'rm':
+ return rm(args[1], opts)
+ case 'update':
+ case 'up':
+ return update(args[1], args[2], args[3], opts)
+ default:
+ UsageError()
+ }
+ })
}
-function add (pkg, uri, secret) {
- return hookApi.add(pkg, uri, secret, config())
- .then((hook) => {
- if (npm.config.get('json')) {
- output(JSON.stringify(hook, null, 2))
- } else {
- output(`+ ${hookName(hook)} ${
- npm.config.get('unicode') ? ' ➜ ' : ' -> '
- } ${hook.endpoint}`)
- }
- })
+function add (pkg, uri, secret, opts) {
+ return hookApi.add(pkg, uri, secret, opts).then(hook => {
+ if (opts.json) {
+ output(JSON.stringify(hook, null, 2))
+ } else if (opts.parseable) {
+ output(Object.keys(hook).join('\t'))
+ output(Object.keys(hook).map(k => hook[k]).join('\t'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`+ ${hookName(hook)} ${
+ opts.unicode ? ' ➜ ' : ' -> '
+ } ${hook.endpoint}`)
+ }
+ })
}
-function ls (pkg) {
- return hookApi.ls(pkg, config())
- .then((hooks) => {
- if (npm.config.get('json')) {
- output(JSON.stringify(hooks, null, 2))
- } else if (!hooks.length) {
- output("You don't have any hooks configured yet.")
+function ls (pkg, opts) {
+ return hookApi.ls(opts.concat({package: pkg})).then(hooks => {
+ if (opts.json) {
+ output(JSON.stringify(hooks, null, 2))
+ } else if (opts.parseable) {
+ output(Object.keys(hooks[0]).join('\t'))
+ hooks.forEach(hook => {
+ output(Object.keys(hook).map(k => hook[k]).join('\t'))
+ })
+ } else if (!hooks.length) {
+ output("You don't have any hooks configured yet.")
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ if (hooks.length === 1) {
+ output('You have one hook configured.')
} else {
- if (hooks.length === 1) {
- output('You have one hook configured.')
- } else {
- output(`You have ${hooks.length} hooks configured.`)
- }
- const table = new Table({head: ['id', 'target', 'endpoint']})
- hooks.forEach((hook) => {
+ output(`You have ${hooks.length} hooks configured.`)
+ }
+ const table = new Table({head: ['id', 'target', 'endpoint']})
+ hooks.forEach((hook) => {
+ table.push([
+ {rowSpan: 2, content: hook.id},
+ hookName(hook),
+ hook.endpoint
+ ])
+ if (hook.last_delivery) {
table.push([
- {rowSpan: 2, content: hook.id},
- hookName(hook),
- hook.endpoint
+ {
+ colSpan: 1,
+ content: `triggered ${relativeDate(hook.last_delivery)}`
+ },
+ hook.response_code
])
- if (hook.last_delivery) {
- table.push([
- {
- colSpan: 1,
- content: `triggered ${relativeDate(hook.last_delivery)}`
- },
- hook.response_code
- ])
- } else {
- table.push([{colSpan: 2, content: 'never triggered'}])
- }
- })
- output(table.toString())
- }
- })
+ } else {
+ table.push([{colSpan: 2, content: 'never triggered'}])
+ }
+ })
+ output(table.toString())
+ }
+ })
}
-function rm (id) {
- return hookApi.rm(id, config())
- .then((hook) => {
- if (npm.config.get('json')) {
- output(JSON.stringify(hook, null, 2))
- } else {
- output(`- ${hookName(hook)} ${
- npm.config.get('unicode') ? ' ✘ ' : ' X '
- } ${hook.endpoint}`)
- }
- })
+function rm (id, opts) {
+ return hookApi.rm(id, opts).then(hook => {
+ if (opts.json) {
+ output(JSON.stringify(hook, null, 2))
+ } else if (opts.parseable) {
+ output(Object.keys(hook).join('\t'))
+ output(Object.keys(hook).map(k => hook[k]).join('\t'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`- ${hookName(hook)} ${
+ opts.unicode ? ' ✘ ' : ' X '
+ } ${hook.endpoint}`)
+ }
+ })
}
-function update (id, uri, secret) {
- return hookApi.update(id, uri, secret, config())
- .then((hook) => {
- if (npm.config.get('json')) {
- output(JSON.stringify(hook, null, 2))
- } else {
- output(`+ ${hookName(hook)} ${
- npm.config.get('unicode') ? ' ➜ ' : ' -> '
- } ${hook.endpoint}`)
- }
- })
+function update (id, uri, secret, opts) {
+ return hookApi.update(id, uri, secret, opts).then(hook => {
+ if (opts.json) {
+ output(JSON.stringify(hook, null, 2))
+ } else if (opts.parseable) {
+ output(Object.keys(hook).join('\t'))
+ output(Object.keys(hook).map(k => hook[k]).join('\t'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`+ ${hookName(hook)} ${
+ opts.unicode ? ' ➜ ' : ' -> '
+ } ${hook.endpoint}`)
+ }
+ })
}
function hookName (hook) {
diff --git a/deps/npm/lib/install/action/extract-worker.js b/deps/npm/lib/install/action/extract-worker.js
index 2b082b4a574c25..225e5b4aeab668 100644
--- a/deps/npm/lib/install/action/extract-worker.js
+++ b/deps/npm/lib/install/action/extract-worker.js
@@ -3,16 +3,16 @@
const BB = require('bluebird')
const extract = require('pacote/extract')
-const npmlog = require('npmlog')
+// const npmlog = require('npmlog')
module.exports = (args, cb) => {
const parsed = typeof args === 'string' ? JSON.parse(args) : args
const spec = parsed[0]
const extractTo = parsed[1]
const opts = parsed[2]
- if (!opts.log) {
- opts.log = npmlog
- }
- opts.log.level = opts.loglevel || opts.log.level
+ // if (!opts.log) {
+ // opts.log = npmlog
+ // }
+ // opts.log.level = opts.loglevel || opts.log.level
BB.resolve(extract(spec, extractTo, opts)).nodeify(cb)
}
diff --git a/deps/npm/lib/install/action/extract.js b/deps/npm/lib/install/action/extract.js
index e8d7a6c4f6d1f0..c1c17cdf6c4f35 100644
--- a/deps/npm/lib/install/action/extract.js
+++ b/deps/npm/lib/install/action/extract.js
@@ -2,6 +2,7 @@
const BB = require('bluebird')
+const figgyPudding = require('figgy-pudding')
const stat = BB.promisify(require('graceful-fs').stat)
const gentlyRm = BB.promisify(require('../../utils/gently-rm.js'))
const mkdirp = BB.promisify(require('mkdirp'))
@@ -9,8 +10,8 @@ const moduleStagingPath = require('../module-staging-path.js')
const move = require('../../utils/move.js')
const npa = require('npm-package-arg')
const npm = require('../../npm.js')
+let npmConfig
const packageId = require('../../utils/package-id.js')
-let pacoteOpts
const path = require('path')
const localWorker = require('./extract-worker.js')
const workerFarm = require('worker-farm')
@@ -19,19 +20,12 @@ const isRegistry = require('../../utils/is-registry.js')
const WORKER_PATH = require.resolve('./extract-worker.js')
let workers
-// NOTE: temporarily disabled on non-OSX due to ongoing issues:
-//
-// * Seems to make Windows antivirus issues much more common
-// * Messes with Docker (I think)
-//
-// There are other issues that should be fixed that affect OSX too:
-//
-// * Logging is messed up right now because pacote does its own thing
-// * Global deduplication in pacote breaks due to multiple procs
-//
-// As these get fixed, we can start experimenting with re-enabling it
-// at least on some platforms.
-const ENABLE_WORKERS = process.platform === 'darwin'
+const ExtractOpts = figgyPudding({
+ log: {}
+}, { other () { return true } })
+
+// Disabled for now. Re-enable someday. Just not today.
+const ENABLE_WORKERS = false
extract.init = () => {
if (ENABLE_WORKERS) {
@@ -53,10 +47,10 @@ module.exports = extract
function extract (staging, pkg, log) {
log.silly('extract', packageId(pkg))
const extractTo = moduleStagingPath(staging, pkg)
- if (!pacoteOpts) {
- pacoteOpts = require('../../config/pacote')
+ if (!npmConfig) {
+ npmConfig = require('../../config/figgy-config.js')
}
- const opts = pacoteOpts({
+ let opts = ExtractOpts(npmConfig()).concat({
integrity: pkg.package._integrity,
resolved: pkg.package._resolved
})
@@ -72,9 +66,18 @@ function extract (staging, pkg, log) {
args[0] = spec.raw
if (ENABLE_WORKERS && (isRegistry(spec) || spec.type === 'remote')) {
// We can't serialize these options
- opts.loglevel = opts.log.level
- opts.log = null
- opts.dirPacker = null
+ opts = opts.concat({
+ loglevel: opts.log.level,
+ log: null,
+ dirPacker: null,
+ Promise: null,
+ _events: null,
+ _eventsCount: null,
+ list: null,
+ sources: null,
+ _maxListeners: null,
+ root: null
+ })
// workers will run things in parallel!
launcher = workers
try {
diff --git a/deps/npm/lib/install/action/fetch.js b/deps/npm/lib/install/action/fetch.js
index 5ad34e29dd27ef..346194e51607e1 100644
--- a/deps/npm/lib/install/action/fetch.js
+++ b/deps/npm/lib/install/action/fetch.js
@@ -3,14 +3,14 @@
const BB = require('bluebird')
const finished = BB.promisify(require('mississippi').finished)
+const npmConfig = require('../../config/figgy-config.js')
const packageId = require('../../utils/package-id.js')
const pacote = require('pacote')
-const pacoteOpts = require('../../config/pacote')
module.exports = fetch
function fetch (staging, pkg, log, next) {
log.silly('fetch', packageId(pkg))
- const opts = pacoteOpts({integrity: pkg.package._integrity})
+ const opts = npmConfig({integrity: pkg.package._integrity})
return finished(pacote.tarball.stream(pkg.package._requested, opts))
.then(() => next(), next)
}
diff --git a/deps/npm/lib/install/audit.js b/deps/npm/lib/install/audit.js
index f372b425a6fd4e..f5bc5ae1a92d65 100644
--- a/deps/npm/lib/install/audit.js
+++ b/deps/npm/lib/install/audit.js
@@ -7,118 +7,115 @@ exports.printInstallReport = printInstallReport
exports.printParseableReport = printParseableReport
exports.printFullReport = printFullReport
-const Bluebird = require('bluebird')
const auditReport = require('npm-audit-report')
+const npmConfig = require('../config/figgy-config.js')
+const figgyPudding = require('figgy-pudding')
const treeToShrinkwrap = require('../shrinkwrap.js').treeToShrinkwrap
const packageId = require('../utils/package-id.js')
const output = require('../utils/output.js')
const npm = require('../npm.js')
const qw = require('qw')
-const registryFetch = require('npm-registry-fetch')
-const zlib = require('zlib')
-const gzip = Bluebird.promisify(zlib.gzip)
-const log = require('npmlog')
+const regFetch = require('npm-registry-fetch')
const perf = require('../utils/perf.js')
-const url = require('url')
const npa = require('npm-package-arg')
const uuid = require('uuid')
const ssri = require('ssri')
const cloneDeep = require('lodash.clonedeep')
-const pacoteOpts = require('../config/pacote.js')
// used when scrubbing module names/specifiers
const runId = uuid.v4()
+const InstallAuditConfig = figgyPudding({
+ color: {},
+ json: {},
+ unicode: {}
+}, {
+ other (key) {
+ return /:registry$/.test(key)
+ }
+})
+
function submitForInstallReport (auditData) {
- const cfg = npm.config // avoid the no-dynamic-lookups test
- const scopedRegistries = cfg.keys.filter(_ => /:registry$/.test(_)).map(_ => cfg.get(_))
- perf.emit('time', 'audit compress')
- // TODO: registryFetch will be adding native support for `Content-Encoding: gzip` at which point
- // we'll pass in something like `gzip: true` and not need to JSON stringify, gzip or headers.
- return gzip(JSON.stringify(auditData)).then(body => {
- perf.emit('timeEnd', 'audit compress')
- log.info('audit', 'Submitting payload of ' + body.length + 'bytes')
- scopedRegistries.forEach(reg => {
- // we don't care about the response so destroy the stream if we can, or leave it flowing
- // so it can eventually finish and clean up after itself
- fetchAudit(url.resolve(reg, '/-/npm/v1/security/audits/quick'))
- .then(_ => {
- _.body.on('error', () => {})
- if (_.body.destroy) {
- _.body.destroy()
- } else {
- _.body.resume()
- }
- }, _ => {})
- })
- perf.emit('time', 'audit submit')
- return fetchAudit('/-/npm/v1/security/audits/quick', body).then(response => {
- perf.emit('timeEnd', 'audit submit')
- perf.emit('time', 'audit body')
- return response.json()
- }).then(result => {
- perf.emit('timeEnd', 'audit body')
- return result
- })
+ const opts = InstallAuditConfig(npmConfig())
+ const scopedRegistries = [...opts.keys()].filter(
+ k => /:registry$/.test(k)
+ ).map(k => opts[k])
+ scopedRegistries.forEach(registry => {
+ // we don't care about the response so destroy the stream if we can, or leave it flowing
+ // so it can eventually finish and clean up after itself
+ regFetch('/-/npm/v1/security/audits/quick', opts.concat({
+ method: 'POST',
+ registry,
+ gzip: true,
+ body: auditData
+ })).then(_ => {
+ _.body.on('error', () => {})
+ if (_.body.destroy) {
+ _.body.destroy()
+ } else {
+ _.body.resume()
+ }
+ }, _ => {})
})
-}
-
-function submitForFullReport (auditData) {
- perf.emit('time', 'audit compress')
- // TODO: registryFetch will be adding native support for `Content-Encoding: gzip` at which point
- // we'll pass in something like `gzip: true` and not need to JSON stringify, gzip or headers.
- return gzip(JSON.stringify(auditData)).then(body => {
- perf.emit('timeEnd', 'audit compress')
- log.info('audit', 'Submitting payload of ' + body.length + ' bytes')
- perf.emit('time', 'audit submit')
- return fetchAudit('/-/npm/v1/security/audits', body).then(response => {
- perf.emit('timeEnd', 'audit submit')
- perf.emit('time', 'audit body')
- return response.json()
- }).then(result => {
- perf.emit('timeEnd', 'audit body')
- result.runId = runId
- return result
- })
+ perf.emit('time', 'audit submit')
+ return regFetch('/-/npm/v1/security/audits/quick', opts.concat({
+ method: 'POST',
+ gzip: true,
+ body: auditData
+ })).then(response => {
+ perf.emit('timeEnd', 'audit submit')
+ perf.emit('time', 'audit body')
+ return response.json()
+ }).then(result => {
+ perf.emit('timeEnd', 'audit body')
+ return result
})
}
-function fetchAudit (href, body) {
- const opts = pacoteOpts()
- return registryFetch(href, {
+function submitForFullReport (auditData) {
+ perf.emit('time', 'audit submit')
+ const opts = InstallAuditConfig(npmConfig())
+ return regFetch('/-/npm/v1/security/audits', opts.concat({
method: 'POST',
- headers: { 'content-encoding': 'gzip', 'content-type': 'application/json' },
- config: npm.config,
- npmSession: opts.npmSession,
- projectScope: npm.projectScope,
- log: log,
- body: body
+ gzip: true,
+ body: auditData
+ })).then(response => {
+ perf.emit('timeEnd', 'audit submit')
+ perf.emit('time', 'audit body')
+ return response.json()
+ }).then(result => {
+ perf.emit('timeEnd', 'audit body')
+ result.runId = runId
+ return result
})
}
function printInstallReport (auditResult) {
+ const opts = InstallAuditConfig(npmConfig())
return auditReport(auditResult, {
reporter: 'install',
- withColor: npm.color,
- withUnicode: npm.config.get('unicode')
+ withColor: opts.color,
+ withUnicode: opts.unicode
}).then(result => output(result.report))
}
function printFullReport (auditResult) {
+ const opts = InstallAuditConfig(npmConfig())
return auditReport(auditResult, {
log: output,
- reporter: npm.config.get('json') ? 'json' : 'detail',
- withColor: npm.color,
- withUnicode: npm.config.get('unicode')
+ reporter: opts.json ? 'json' : 'detail',
+ withColor: opts.color,
+ withUnicode: opts.unicode
}).then(result => output(result.report))
}
function printParseableReport (auditResult) {
+ const opts = InstallAuditConfig(npmConfig())
return auditReport(auditResult, {
log: output,
reporter: 'parseable',
- withColor: npm.color,
- withUnicode: npm.config.get('unicode')
+ withColor: opts.color,
+ withUnicode: opts.unicode
}).then(result => output(result.report))
}
diff --git a/deps/npm/lib/install/is-only-dev.js b/deps/npm/lib/install/is-only-dev.js
index ef41e8ad1a2659..2877c61a227d09 100644
--- a/deps/npm/lib/install/is-only-dev.js
+++ b/deps/npm/lib/install/is-only-dev.js
@@ -28,6 +28,7 @@ function andIsOnlyDev (name, seen) {
return isDev && !isProd
} else {
if (seen.has(req)) return true
+ seen = new Set(seen)
seen.add(req)
return isOnlyDev(req, seen)
}
diff --git a/deps/npm/lib/install/is-only-optional.js b/deps/npm/lib/install/is-only-optional.js
index 72d6f065e6745b..f1b731578d9422 100644
--- a/deps/npm/lib/install/is-only-optional.js
+++ b/deps/npm/lib/install/is-only-optional.js
@@ -10,6 +10,7 @@ function isOptional (node, seen) {
if (seen.has(node) || node.requiredBy.length === 0) {
return false
}
+ seen = new Set(seen)
seen.add(node)
const swOptional = node.fromShrinkwrap && node.package._optional
return node.requiredBy.every(function (req) {
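Both `is-only-dev.js` and `is-only-optional.js` get the same one-line fix: `seen = new Set(seen)` before `seen.add(...)`, so each recursive branch works on its own copy of the cycle-detection set. A simplified sketch of why that matters (the graph shape and `dev` flag are hypothetical; only the copy-on-write traversal pattern is the point):

```javascript
'use strict'

// Simplified dependency-walk: a node is "only dev" if every dependent
// path leads to dev-only leaves. `seen` guards against cycles.
function isOnlyDev (node, seen = new Set()) {
  if (node.requiredBy.length === 0) return node.dev === true
  return node.requiredBy.every(req => {
    if (seen.has(req)) return true // cycle: treat as already satisfied
    seen = new Set(seen) // copy-on-write: don't leak marks into sibling branches
    seen.add(req)
    return isOnlyDev(req, seen)
  })
}

// Diamond with a cycle: a <- b, a <- c, and b/c require each other.
const a = { dev: true, requiredBy: [] }
const b = { dev: true, requiredBy: [] }
const c = { dev: true, requiredBy: [] }
a.requiredBy = [b, c]
b.requiredBy = [c]
c.requiredBy = [b]
console.log(isOnlyDev(a)) // → true (both branches terminate despite the cycle)
```

Sharing one mutable set across branches would let marks from the `b` branch suppress legitimate exploration in the `c` branch; the copy keeps each path's history independent while still breaking cycles.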
diff --git a/deps/npm/lib/logout.js b/deps/npm/lib/logout.js
index a3287d42d16592..411f547210b8f1 100644
--- a/deps/npm/lib/logout.js
+++ b/deps/npm/lib/logout.js
@@ -1,43 +1,44 @@
-module.exports = logout
+'use strict'
-var dezalgo = require('dezalgo')
-var log = require('npmlog')
+const BB = require('bluebird')
-var npm = require('./npm.js')
-var mapToRegistry = require('./utils/map-to-registry.js')
+const eu = encodeURIComponent
+const getAuth = require('npm-registry-fetch/auth.js')
+const log = require('npmlog')
+const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
+const npmFetch = require('libnpm/fetch')
logout.usage = 'npm logout [--registry=] [--scope=<@scope>]'
-function afterLogout (normalized, cb) {
+function afterLogout (normalized) {
var scope = npm.config.get('scope')
if (scope) npm.config.del(scope + ':registry')
npm.config.clearCredentialsByURI(normalized)
- npm.config.save('user', cb)
+ return BB.fromNode(cb => npm.config.save('user', cb))
}
+module.exports = logout
function logout (args, cb) {
- cb = dezalgo(cb)
-
- mapToRegistry('/', npm.config, function (err, uri, auth, normalized) {
- if (err) return cb(err)
-
+ const opts = npmConfig()
+ BB.try(() => {
+ const reg = npmFetch.pickRegistry('foo', opts)
+ const auth = getAuth(reg, opts)
if (auth.token) {
- log.verbose('logout', 'clearing session token for', normalized)
- npm.registry.logout(normalized, { auth: auth }, function (err) {
- if (err) return cb(err)
-
- afterLogout(normalized, cb)
- })
+ log.verbose('logout', 'clearing session token for', reg)
+ return npmFetch(`/-/user/token/${eu(auth.token)}`, opts.concat({
+ method: 'DELETE',
+ ignoreBody: true
+ })).then(() => afterLogout(reg))
} else if (auth.username || auth.password) {
- log.verbose('logout', 'clearing user credentials for', normalized)
-
- afterLogout(normalized, cb)
+ log.verbose('logout', 'clearing user credentials for', reg)
+ return afterLogout(reg)
} else {
- cb(new Error(
- 'Not logged in to', normalized + ',', "so can't log out."
- ))
+      throw new Error(
+        'Not logged in to ' + reg + ", so can't log out."
+      )
}
- })
+ }).nodeify(cb)
}
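The logout rewrite wraps the callback-style `npm.config.save` with `BB.fromNode` and ends the chain with `.nodeify(cb)`. A minimal sketch of that adapter using plain Promises (the `saveConfig` function is a hypothetical stand-in for `npm.config.save`):

```javascript
'use strict'

// Minimal stand-in for Bluebird's BB.fromNode: call a function that
// expects a node-style callback, and settle a Promise from its result.
function fromNode (fn) {
  return new Promise((resolve, reject) => {
    fn((err, value) => err ? reject(err) : resolve(value))
  })
}

// Hypothetical callback-style API, standing in for npm.config.save:
function saveConfig (cb) {
  setImmediate(() => cb(null, 'saved'))
}

fromNode(cb => saveConfig(cb)).then(result => {
  console.log(result) // → 'saved'
})
```

`.nodeify(cb)` is the inverse adapter: it reports the settled Promise back through a node-style callback, which is how these rewritten commands keep their original `(args, cb)` signatures.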
diff --git a/deps/npm/lib/npm.js b/deps/npm/lib/npm.js
index da5a3636021223..2ee9a991264c7a 100644
--- a/deps/npm/lib/npm.js
+++ b/deps/npm/lib/npm.js
@@ -40,9 +40,7 @@
var which = require('which')
var glob = require('glob')
var rimraf = require('rimraf')
- var lazyProperty = require('lazy-property')
var parseJSON = require('./utils/parse-json.js')
- var clientConfig = require('./config/reg-client.js')
var aliases = require('./config/cmd-list').aliases
var cmdList = require('./config/cmd-list').cmdList
var plumbing = require('./config/cmd-list').plumbing
@@ -106,7 +104,6 @@
})
var registryRefer
- var registryLoaded
Object.keys(abbrevs).concat(plumbing).forEach(function addCommand (c) {
Object.defineProperty(npm.commands, c, { get: function () {
@@ -153,7 +150,7 @@
}).filter(function (arg) {
return arg && arg.match
}).join(' ')
- if (registryLoaded) npm.registry.refer = registryRefer
+ npm.referer = registryRefer
}
cmd.apply(npm, args)
@@ -357,17 +354,6 @@
npm.projectScope = config.get('scope') ||
scopeifyScope(getProjectScope(npm.prefix))
- // at this point the configs are all set.
- // go ahead and spin up the registry client.
- lazyProperty(npm, 'registry', function () {
- registryLoaded = true
- var RegClient = require('npm-registry-client')
- var registry = new RegClient(clientConfig(npm, log, npm.config))
- registry.version = npm.version
- registry.refer = registryRefer
- return registry
- })
-
startMetrics()
return cb(null, npm)
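The removed `lazyProperty(npm, 'registry', ...)` block deferred constructing the registry client until the first time `npm.registry` was read, then cached it. A sketch of that lazy-property pattern with a plain getter (the `npmLike` object and factory are illustrative, not the real client):

```javascript
'use strict'

// Lazy property: the factory runs once, on first access, and the
// result replaces the getter so later reads are plain property lookups.
function lazyProperty (obj, name, factory) {
  Object.defineProperty(obj, name, {
    configurable: true,
    get () {
      const value = factory()
      Object.defineProperty(obj, name, { value, configurable: true })
      return value
    }
  })
}

let builds = 0
const npmLike = {}
lazyProperty(npmLike, 'registry', () => { builds++; return { version: '1.0.0' } })

npmLike.registry // first access runs the factory
npmLike.registry // cached: factory does not run again
console.log(builds) // → 1
```

The patch can delete this machinery because the figgy-pudding config objects are built per-command, so there is no longer a process-wide client to construct lazily.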
diff --git a/deps/npm/lib/org.js b/deps/npm/lib/org.js
new file mode 100644
index 00000000000000..d8f857e3dfdd94
--- /dev/null
+++ b/deps/npm/lib/org.js
@@ -0,0 +1,151 @@
+'use strict'
+
+const figgyPudding = require('figgy-pudding')
+const liborg = require('libnpm/org')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
+const Table = require('cli-table3')
+
+module.exports = org
+
+org.subcommands = ['set', 'rm', 'ls']
+
+org.usage =
+ 'npm org set orgname username [developer | admin | owner]\n' +
+ 'npm org rm orgname username\n' +
+  'npm org ls orgname [<username>]'
+
+const OrgConfig = figgyPudding({
+ json: {},
+ loglevel: {},
+ parseable: {},
+ silent: {}
+})
+
+org.completion = function (opts, cb) {
+ var argv = opts.conf.argv.remain
+ if (argv.length === 2) {
+ return cb(null, org.subcommands)
+ }
+ switch (argv[2]) {
+ case 'ls':
+ case 'add':
+ case 'rm':
+ case 'set':
+ return cb(null, [])
+ default:
+ return cb(new Error(argv[2] + ' not recognized'))
+ }
+}
+
+function UsageError () {
+ throw Object.assign(new Error(org.usage), {code: 'EUSAGE'})
+}
+
+function org ([cmd, orgname, username, role], cb) {
+ otplease(npmConfig(), opts => {
+ opts = OrgConfig(opts)
+ switch (cmd) {
+ case 'add':
+ case 'set':
+ return orgSet(orgname, username, role, opts)
+ case 'rm':
+ return orgRm(orgname, username, opts)
+ case 'ls':
+ return orgList(orgname, username, opts)
+ default:
+ UsageError()
+ }
+ }).then(
+ x => cb(null, x),
+ err => cb(err.code === 'EUSAGE' ? err.message : err)
+ )
+}
+
+function orgSet (org, user, role, opts) {
+ role = role || 'developer'
+ if (!org) {
+ throw new Error('First argument `orgname` is required.')
+ }
+ if (!user) {
+ throw new Error('Second argument `username` is required.')
+ }
+ if (!['owner', 'admin', 'developer'].find(x => x === role)) {
+ throw new Error('Third argument `role` must be one of `owner`, `admin`, or `developer`, with `developer` being the default value if omitted.')
+ }
+ return liborg.set(org, user, role, opts).then(memDeets => {
+ if (opts.json) {
+ output(JSON.stringify(memDeets, null, 2))
+ } else if (opts.parseable) {
+ output(['org', 'orgsize', 'user', 'role'].join('\t'))
+ output([
+ memDeets.org.name,
+ memDeets.org.size,
+ memDeets.user,
+ memDeets.role
+ ])
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+      output(`Added ${memDeets.user} as ${memDeets.role} to ${memDeets.org.name}. You now have ${memDeets.org.size} member${memDeets.org.size === 1 ? '' : 's'} in this org.`)
+ }
+ return memDeets
+ })
+}
+
+function orgRm (org, user, opts) {
+ if (!org) {
+ throw new Error('First argument `orgname` is required.')
+ }
+ if (!user) {
+ throw new Error('Second argument `username` is required.')
+ }
+ return liborg.rm(org, user, opts).then(() => {
+ return liborg.ls(org, opts)
+ }).then(roster => {
+ user = user.replace(/^[~@]?/, '')
+ org = org.replace(/^[~@]?/, '')
+ const userCount = Object.keys(roster).length
+ if (opts.json) {
+ output(JSON.stringify({
+ user,
+ org,
+ userCount,
+ deleted: true
+ }))
+ } else if (opts.parseable) {
+ output(['user', 'org', 'userCount', 'deleted'].join('\t'))
+ output([user, org, userCount, true].join('\t'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`Successfully removed ${user} from ${org}. You now have ${userCount} member${userCount === 1 ? '' : 's'} in this org.`)
+ }
+ })
+}
+
+function orgList (org, user, opts) {
+ if (!org) {
+ throw new Error('First argument `orgname` is required.')
+ }
+ return liborg.ls(org, opts).then(roster => {
+ if (user) {
+ const newRoster = {}
+ if (roster[user]) {
+ newRoster[user] = roster[user]
+ }
+ roster = newRoster
+ }
+ if (opts.json) {
+ output(JSON.stringify(roster, null, 2))
+ } else if (opts.parseable) {
+ output(['user', 'role'].join('\t'))
+ Object.keys(roster).forEach(user => {
+ output([user, roster[user]].join('\t'))
+ })
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ const table = new Table({head: ['user', 'role']})
+ Object.keys(roster).sort().forEach(user => {
+ table.push([user, roster[user]])
+ })
+ output(table.toString())
+ }
+ })
+}
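The new `org.js` narrows the full npm config down to `OrgConfig`, a figgy-pudding that only exposes the declared keys (`json`, `loglevel`, `parseable`, `silent`). A rough sketch of that whitelisting behavior using a `Proxy` (this approximates the effect, not figgy-pudding's actual implementation):

```javascript
'use strict'

// Whitelist view over a config object: only declared keys are visible,
// roughly what figgyPudding({...}) gives OrgConfig.
function makePudding (whitelist) {
  return provider => new Proxy({}, {
    get: (_, key) =>
      whitelist.includes(key) ? provider[key] : undefined
  })
}

const OrgConfig = makePudding(['json', 'loglevel', 'parseable', 'silent'])
const opts = OrgConfig({ json: true, registry: 'https://example.com', loglevel: 'warn' })

console.log(opts.json)     // → true
console.log(opts.registry) // → undefined (not in the whitelist)
```

Declaring the keys a command actually reads keeps each command's dependency on the sprawling global config explicit and auditable.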
diff --git a/deps/npm/lib/outdated.js b/deps/npm/lib/outdated.js
index 024e076c4f9ad4..ebd67fb6b37d5d 100644
--- a/deps/npm/lib/outdated.js
+++ b/deps/npm/lib/outdated.js
@@ -29,13 +29,15 @@ var color = require('ansicolors')
var styles = require('ansistyles')
var table = require('text-table')
var semver = require('semver')
-var npa = require('npm-package-arg')
+var npa = require('libnpm/parse-arg')
var pickManifest = require('npm-pick-manifest')
var fetchPackageMetadata = require('./fetch-package-metadata.js')
var mutateIntoLogicalTree = require('./install/mutate-into-logical-tree.js')
var npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
+const figgyPudding = require('figgy-pudding')
+const packument = require('libnpm/packument')
var long = npm.config.get('long')
-var mapToRegistry = require('./utils/map-to-registry.js')
var isExtraneous = require('./install/is-extraneous.js')
var computeMetadata = require('./install/deps.js').computeMetadata
var computeVersionSpec = require('./install/deps.js').computeVersionSpec
@@ -43,6 +45,23 @@ var moduleName = require('./utils/module-name.js')
var output = require('./utils/output.js')
var ansiTrim = require('./utils/ansi-trim')
+const OutdatedConfig = figgyPudding({
+ also: {},
+ color: {},
+ depth: {},
+ dev: 'development',
+ development: {},
+ global: {},
+ json: {},
+ only: {},
+ parseable: {},
+ prod: 'production',
+ production: {},
+ save: {},
+ 'save-dev': {},
+ 'save-optional': {}
+})
+
function uniq (list) {
// we maintain the array because we need an array, not iterator, return
// value.
@@ -68,26 +87,27 @@ function outdated (args, silent, cb) {
cb = silent
silent = false
}
+ let opts = OutdatedConfig(npmConfig())
var dir = path.resolve(npm.dir, '..')
// default depth for `outdated` is 0 (cf. `ls`)
- if (npm.config.get('depth') === Infinity) npm.config.set('depth', 0)
+  if (opts.depth === Infinity) opts = opts.concat({depth: 0})
readPackageTree(dir, andComputeMetadata(function (er, tree) {
if (!tree) return cb(er)
mutateIntoLogicalTree(tree)
- outdated_(args, '', tree, {}, 0, function (er, list) {
+ outdated_(args, '', tree, {}, 0, opts, function (er, list) {
list = uniq(list || []).sort(function (aa, bb) {
return aa[0].path.localeCompare(bb[0].path) ||
aa[1].localeCompare(bb[1])
})
if (er || silent || list.length === 0) return cb(er, list)
- if (npm.config.get('json')) {
- output(makeJSON(list))
- } else if (npm.config.get('parseable')) {
- output(makeParseable(list))
+ if (opts.json) {
+ output(makeJSON(list, opts))
+ } else if (opts.parseable) {
+ output(makeParseable(list, opts))
} else {
- var outList = list.map(makePretty)
+ var outList = list.map(x => makePretty(x, opts))
var outHead = [ 'Package',
'Current',
'Wanted',
@@ -97,7 +117,7 @@ function outdated (args, silent, cb) {
if (long) outHead.push('Package Type', 'Homepage')
var outTable = [outHead].concat(outList)
- if (npm.color) {
+ if (opts.color) {
outTable[0] = outTable[0].map(function (heading) {
return styles.underline(heading)
})
@@ -116,14 +136,14 @@ function outdated (args, silent, cb) {
}
// [[ dir, dep, has, want, latest, type ]]
-function makePretty (p) {
+function makePretty (p, opts) {
var depname = p[1]
var has = p[2]
var want = p[3]
var latest = p[4]
var type = p[6]
var deppath = p[7]
- var homepage = p[0].package.homepage
+ var homepage = p[0].package.homepage || ''
var columns = [ depname,
has || 'MISSING',
@@ -136,7 +156,7 @@ function makePretty (p) {
columns[6] = homepage
}
- if (npm.color) {
+ if (opts.color) {
columns[0] = color[has === want || want === 'linked' ? 'yellow' : 'red'](columns[0]) // dep
columns[2] = color.green(columns[2]) // want
columns[3] = color.magenta(columns[3]) // latest
@@ -167,7 +187,7 @@ function makeParseable (list) {
}).join(os.EOL)
}
-function makeJSON (list) {
+function makeJSON (list, opts) {
var out = {}
list.forEach(function (p) {
var dep = p[0]
@@ -177,7 +197,7 @@ function makeJSON (list) {
var want = p[3]
var latest = p[4]
var type = p[6]
- if (!npm.config.get('global')) {
+ if (!opts.global) {
dir = path.relative(process.cwd(), dir)
}
out[depname] = { current: has,
@@ -193,11 +213,11 @@ function makeJSON (list) {
return JSON.stringify(out, null, 2)
}
-function outdated_ (args, path, tree, parentHas, depth, cb) {
+function outdated_ (args, path, tree, parentHas, depth, opts, cb) {
if (!tree.package) tree.package = {}
if (path && tree.package.name) path += ' > ' + tree.package.name
if (!path && tree.package.name) path = tree.package.name
- if (depth > npm.config.get('depth')) {
+ if (depth > opts.depth) {
return cb(null, [])
}
var types = {}
@@ -227,11 +247,14 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
// (All the save checking here is because this gets called from npm-update currently
// and that requires this logic around dev deps.)
// FIXME: Refactor npm update to not be in terms of outdated.
- var dev = npm.config.get('dev') || /^dev(elopment)?$/.test(npm.config.get('also'))
- var prod = npm.config.get('production') || /^prod(uction)?$/.test(npm.config.get('only'))
- if ((dev || !prod) &&
- (npm.config.get('save-dev') || (
- !npm.config.get('save') && !npm.config.get('save-optional')))) {
+ var dev = opts.dev || /^dev(elopment)?$/.test(opts.also)
+ var prod = opts.production || /^prod(uction)?$/.test(opts.only)
+ if (
+ (dev || !prod) &&
+ (
+ opts['save-dev'] || (!opts.save && !opts['save-optional'])
+ )
+ ) {
Object.keys(tree.missingDevDeps).forEach(function (name) {
deps.push({
package: { name: name },
@@ -245,15 +268,15 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
})
}
- if (npm.config.get('save-dev')) {
+ if (opts['save-dev']) {
deps = deps.filter(function (dep) { return pkg.devDependencies[moduleName(dep)] })
deps.forEach(function (dep) {
types[moduleName(dep)] = 'devDependencies'
})
- } else if (npm.config.get('save')) {
+ } else if (opts.save) {
// remove optional dependencies from dependencies during --save.
deps = deps.filter(function (dep) { return !pkg.optionalDependencies[moduleName(dep)] })
- } else if (npm.config.get('save-optional')) {
+ } else if (opts['save-optional']) {
deps = deps.filter(function (dep) { return pkg.optionalDependencies[moduleName(dep)] })
deps.forEach(function (dep) {
types[moduleName(dep)] = 'optionalDependencies'
@@ -262,7 +285,7 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
var doUpdate = dev || (
!prod &&
!Object.keys(parentHas).length &&
- !npm.config.get('global')
+ !opts.global
)
if (doUpdate) {
Object.keys(pkg.devDependencies || {}).forEach(function (k) {
@@ -300,13 +323,13 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
required = computeVersionSpec(tree, dep)
}
- if (!long) return shouldUpdate(args, dep, name, has, required, depth, path, cb)
+ if (!long) return shouldUpdate(args, dep, name, has, required, depth, path, opts, cb)
- shouldUpdate(args, dep, name, has, required, depth, path, cb, types[name])
+ shouldUpdate(args, dep, name, has, required, depth, path, opts, cb, types[name])
}, cb)
}
-function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
+function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, opts, cb, type) {
// look up the most recent version.
// if that's what we already have, or if it's not on the args list,
// then dive into it. Otherwise, cb() with the data.
@@ -322,6 +345,7 @@ function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
tree,
has,
depth + 1,
+ opts,
cb)
}
@@ -350,11 +374,9 @@ function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
} else if (parsed.type === 'file') {
return updateLocalDeps()
} else {
- return mapToRegistry(dep, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, updateDeps)
- })
+ return packument(dep, opts.concat({
+ 'prefer-online': true
+ })).nodeify(updateDeps)
}
function updateLocalDeps (latestRegistryVersion) {
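`OutdatedConfig` above declares aliases like `dev: 'development'` and `prod: 'production'`, so `opts.dev` reads through to the `development` key when `dev` itself is unset. A small sketch of that alias lookup (again approximating figgy-pudding's behavior, not reproducing it):

```javascript
'use strict'

// Spec values that are strings act as aliases: reading `dev` falls
// back to the `development` key, mirroring OutdatedConfig's declarations.
function makeConfig (spec) {
  return values => new Proxy({}, {
    get (_, key) {
      if (!(key in spec)) return undefined
      if (key in values) return values[key]
      const alias = spec[key]
      return typeof alias === 'string' ? values[alias] : undefined
    }
  })
}

const OutdatedConfig = makeConfig({ dev: 'development', development: null, depth: null })
const outdatedOpts = OutdatedConfig({ development: true, depth: 0 })

console.log(outdatedOpts.dev)   // → true (resolved via the `development` alias)
console.log(outdatedOpts.depth) // → 0
```

Threading this `opts` object through `outdated_` and `shouldUpdate` as an explicit parameter is what lets the rewrite drop every `npm.config.get(...)` call inside the recursion.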
diff --git a/deps/npm/lib/owner.js b/deps/npm/lib/owner.js
index 3c2660ace113d5..a64cb5e14ccefb 100644
--- a/deps/npm/lib/owner.js
+++ b/deps/npm/lib/owner.js
@@ -1,12 +1,17 @@
-/* eslint-disable standard/no-callback-literal */
module.exports = owner
-var npm = require('./npm.js')
-var log = require('npmlog')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var readLocalPkg = require('./utils/read-local-package.js')
-var usage = require('./utils/usage')
-var output = require('./utils/output.js')
+const BB = require('bluebird')
+
+const log = require('npmlog')
+const npa = require('libnpm/parse-arg')
+const npmConfig = require('./config/figgy-config.js')
+const npmFetch = require('libnpm/fetch')
+const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
+const packument = require('libnpm/packument')
+const readLocalPkg = BB.promisify(require('./utils/read-local-package.js'))
+const usage = require('./utils/usage')
+const whoami = BB.promisify(require('./whoami.js'))
owner.usage = usage(
'owner',
@@ -14,8 +19,9 @@ owner.usage = usage(
'\nnpm owner rm [<@scope>/]' +
'\nnpm owner ls [<@scope>/]'
)
+
owner.completion = function (opts, cb) {
- var argv = opts.conf.argv.remain
+ const argv = opts.conf.argv.remain
if (argv.length > 4) return cb()
if (argv.length <= 2) {
var subs = ['add', 'rm']
@@ -23,130 +29,109 @@ owner.completion = function (opts, cb) {
else subs.push('ls', 'list')
return cb(null, subs)
}
-
- npm.commands.whoami([], true, function (er, username) {
- if (er) return cb()
-
- var un = encodeURIComponent(username)
- var byUser, theUser
- switch (argv[2]) {
- case 'ls':
- // FIXME: there used to be registry completion here, but it stopped
- // making sense somewhere around 50,000 packages on the registry
- return cb()
-
- case 'rm':
- if (argv.length > 3) {
- theUser = encodeURIComponent(argv[3])
- byUser = '-/by-user/' + theUser + '|' + un
- return mapToRegistry(byUser, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- console.error(uri)
- npm.registry.get(uri, { auth: auth }, function (er, d) {
- if (er) return cb(er)
- // return the intersection
- return cb(null, d[theUser].filter(function (p) {
+ BB.try(() => {
+ const opts = npmConfig()
+ return whoami([], true).then(username => {
+ const un = encodeURIComponent(username)
+ let byUser, theUser
+ switch (argv[2]) {
+ case 'ls':
+ // FIXME: there used to be registry completion here, but it stopped
+ // making sense somewhere around 50,000 packages on the registry
+ return
+ case 'rm':
+ if (argv.length > 3) {
+ theUser = encodeURIComponent(argv[3])
+ byUser = `/-/by-user/${theUser}|${un}`
+ return npmFetch.json(byUser, opts).then(d => {
+ return d[theUser].filter(
// kludge for server adminery.
- return un === 'isaacs' || d[un].indexOf(p) === -1
- }))
+ p => un === 'isaacs' || d[un].indexOf(p) === -1
+ )
})
- })
- }
- // else fallthrough
- /* eslint no-fallthrough:0 */
- case 'add':
- if (argv.length > 3) {
- theUser = encodeURIComponent(argv[3])
- byUser = '-/by-user/' + theUser + '|' + un
- return mapToRegistry(byUser, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- console.error(uri)
- npm.registry.get(uri, { auth: auth }, function (er, d) {
- console.error(uri, er || d)
- // return mine that they're not already on.
- if (er) return cb(er)
+ }
+ // else fallthrough
+ /* eslint no-fallthrough:0 */
+ case 'add':
+ if (argv.length > 3) {
+ theUser = encodeURIComponent(argv[3])
+ byUser = `/-/by-user/${theUser}|${un}`
+ return npmFetch.json(byUser, opts).then(d => {
var mine = d[un] || []
var theirs = d[theUser] || []
- return cb(null, mine.filter(function (p) {
- return theirs.indexOf(p) === -1
- }))
+ return mine.filter(p => theirs.indexOf(p) === -1)
})
- })
- }
- // just list all users who aren't me.
- return mapToRegistry('-/users', npm.config, function (er, uri, auth) {
- if (er) return cb(er)
+ } else {
+ // just list all users who aren't me.
+ return npmFetch.json('/-/users', opts).then(list => {
+ return Object.keys(list).filter(n => n !== un)
+ })
+ }
- npm.registry.get(uri, { auth: auth }, function (er, list) {
- if (er) return cb()
- return cb(null, Object.keys(list).filter(function (n) {
- return n !== un
- }))
- })
- })
+ default:
+ return cb()
+ }
+ })
+ }).nodeify(cb)
+}
- default:
- return cb()
- }
- })
+function UsageError () {
+ throw Object.assign(new Error(owner.usage), {code: 'EUSAGE'})
}
-function owner (args, cb) {
- var action = args.shift()
- switch (action) {
- case 'ls': case 'list': return ls(args[0], cb)
- case 'add': return add(args[0], args[1], cb)
- case 'rm': case 'remove': return rm(args[0], args[1], cb)
- default: return unknown(action, cb)
- }
+function owner ([action, ...args], cb) {
+ const opts = npmConfig()
+ BB.try(() => {
+ switch (action) {
+ case 'ls': case 'list': return ls(args[0], opts)
+ case 'add': return add(args[0], args[1], opts)
+ case 'rm': case 'remove': return rm(args[0], args[1], opts)
+ default: UsageError()
+ }
+ }).then(
+ data => cb(null, data),
+ err => err.code === 'EUSAGE' ? cb(err.message) : cb(err)
+ )
}
-function ls (pkg, cb) {
+function ls (pkg, opts) {
if (!pkg) {
- return readLocalPkg(function (er, pkg) {
- if (er) return cb(er)
- if (!pkg) return cb(owner.usage)
- ls(pkg, cb)
+ return readLocalPkg().then(pkg => {
+ if (!pkg) { UsageError() }
+ return ls(pkg, opts)
})
}
- mapToRegistry(pkg, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, function (er, data) {
- var msg = ''
- if (er) {
- log.error('owner ls', "Couldn't get owner data", pkg)
- return cb(er)
- }
+ const spec = npa(pkg)
+ return packument(spec, opts.concat({fullMetadata: true})).then(
+ data => {
var owners = data.maintainers
if (!owners || !owners.length) {
- msg = 'admin party!'
+ output('admin party!')
} else {
- msg = owners.map(function (o) {
- return o.name + ' <' + o.email + '>'
- }).join('\n')
+ output(owners.map(o => `${o.name} <${o.email}>`).join('\n'))
}
- output(msg)
- cb(er, owners)
- })
- })
+ return owners
+ },
+ err => {
+ log.error('owner ls', "Couldn't get owner data", pkg)
+ throw err
+ }
+ )
}
-function add (user, pkg, cb) {
- if (!user) return cb(owner.usage)
+function add (user, pkg, opts) {
+ if (!user) { UsageError() }
if (!pkg) {
- return readLocalPkg(function (er, pkg) {
- if (er) return cb(er)
- if (!pkg) return cb(new Error(owner.usage))
- add(user, pkg, cb)
+ return readLocalPkg().then(pkg => {
+ if (!pkg) { UsageError() }
+ return add(user, pkg, opts)
})
}
-
log.verbose('owner add', '%s to %s', user, pkg)
- mutate(pkg, user, function (u, owners) {
+
+ const spec = npa(pkg)
+ return withMutation(spec, user, opts, (u, owners) => {
if (!owners) owners = []
for (var i = 0, l = owners.length; i < l; i++) {
var o = owners[i]
@@ -160,22 +145,23 @@ function add (user, pkg, cb) {
}
owners.push(u)
return owners
- }, cb)
+ })
}
-function rm (user, pkg, cb) {
+function rm (user, pkg, opts) {
+ if (!user) { UsageError() }
if (!pkg) {
- return readLocalPkg(function (er, pkg) {
- if (er) return cb(er)
- if (!pkg) return cb(new Error(owner.usage))
- rm(user, pkg, cb)
+ return readLocalPkg().then(pkg => {
+ if (!pkg) { UsageError() }
+      return rm(user, pkg, opts)
})
}
-
log.verbose('owner rm', '%s from %s', user, pkg)
- mutate(pkg, user, function (u, owners) {
- var found = false
- var m = owners.filter(function (o) {
+
+ const spec = npa(pkg)
+ return withMutation(spec, user, opts, function (u, owners) {
+ let found = false
+ const m = owners.filter(function (o) {
var match = (o.name === user)
found = found || match
return !match
@@ -187,92 +173,70 @@ function rm (user, pkg, cb) {
}
if (!m.length) {
- return new Error(
+ throw new Error(
'Cannot remove all owners of a package. Add someone else first.'
)
}
return m
- }, cb)
+ })
}
-function mutate (pkg, user, mutation, cb) {
- if (user) {
- var byUser = '-/user/org.couchdb.user:' + user
- mapToRegistry(byUser, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, mutate_)
- })
- } else {
- mutate_(null, null)
- }
+function withMutation (spec, user, opts, mutation) {
+ return BB.try(() => {
+ if (user) {
+ const uri = `/-/user/org.couchdb.user:${encodeURIComponent(user)}`
+ return npmFetch.json(uri, opts).then(mutate_, err => {
+ log.error('owner mutate', 'Error getting user data for %s', user)
+ throw err
+ })
+ } else {
+ return mutate_(null)
+ }
+ })
- function mutate_ (er, u) {
- if (!er && user && (!u || u.error)) {
- er = new Error(
+ function mutate_ (u) {
+ if (user && (!u || u.error)) {
+ throw new Error(
"Couldn't get user data for " + user + ': ' + JSON.stringify(u)
)
}
- if (er) {
- log.error('owner mutate', 'Error getting user data for %s', user)
- return cb(er)
- }
-
if (u) u = { name: u.name, email: u.email }
- mapToRegistry(pkg, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, function (er, data) {
- if (er) {
- log.error('owner mutate', 'Error getting package data for %s', pkg)
- return cb(er)
- }
-
- // save the number of maintainers before mutation so that we can figure
- // out if maintainers were added or removed
- var beforeMutation = data.maintainers.length
-
- var m = mutation(u, data.maintainers)
- if (!m) return cb() // handled
- if (m instanceof Error) return cb(m) // error
-
- data = {
- _id: data._id,
- _rev: data._rev,
- maintainers: m
- }
- var dataPath = pkg.replace('/', '%2f') + '/-rev/' + data._rev
- mapToRegistry(dataPath, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- var params = {
- method: 'PUT',
- body: data,
- auth: auth
- }
- npm.registry.request(uri, params, function (er, data) {
- if (!er && data.error) {
- er = new Error('Failed to update package metadata: ' + JSON.stringify(data))
- }
-
- if (er) {
- log.error('owner mutate', 'Failed to update package metadata')
- } else if (m.length > beforeMutation) {
- output('+ %s (%s)', user, pkg)
- } else if (m.length < beforeMutation) {
- output('- %s (%s)', user, pkg)
- }
-
- cb(er, data)
- })
+ return packument(spec, opts.concat({
+ fullMetadata: true
+ })).then(data => {
+ // save the number of maintainers before mutation so that we can figure
+ // out if maintainers were added or removed
+ const beforeMutation = data.maintainers.length
+
+ const m = mutation(u, data.maintainers)
+ if (!m) return // handled
+ if (m instanceof Error) throw m // error
+
+ data = {
+ _id: data._id,
+ _rev: data._rev,
+ maintainers: m
+ }
+ const dataPath = `/${spec.escapedName}/-rev/${encodeURIComponent(data._rev)}`
+ return otplease(opts, opts => {
+ const reqOpts = opts.concat({
+ method: 'PUT',
+ body: data,
+ spec
})
+ return npmFetch.json(dataPath, reqOpts)
+ }).then(data => {
+ if (data.error) {
+ throw new Error('Failed to update package metadata: ' + JSON.stringify(data))
+ } else if (m.length > beforeMutation) {
+ output('+ %s (%s)', user, spec.name)
+ } else if (m.length < beforeMutation) {
+ output('- %s (%s)', user, spec.name)
+ }
+ return data
})
})
}
}
-
-function unknown (action, cb) {
- cb('Usage: \n' + owner.usage)
-}
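`withMutation` above records the maintainer count before applying the mutation function, then compares lengths afterward to decide whether to print `+ user` or `- user`. A self-contained sketch of that add/remove detection (the helper names here are illustrative, not npm APIs):

```javascript
'use strict'

// Apply a mutation to the maintainers list and classify the change by
// comparing lengths before and after, as owner.js's withMutation does.
function applyMutation (maintainers, mutation, user) {
  const before = maintainers.length
  const m = mutation(user, maintainers)
  if (m.length > before) return `+ ${user.name}`
  if (m.length < before) return `- ${user.name}`
  return 'no change'
}

// Hypothetical mutation functions mirroring the add/rm callbacks:
const addOwner = (u, owners) =>
  owners.some(o => o.name === u.name) ? owners : owners.concat([u])
const rmOwner = (u, owners) => owners.filter(o => o.name !== u.name)

console.log(applyMutation([{ name: 'alice' }], addOwner, { name: 'bob' }))   // → '+ bob'
console.log(applyMutation([{ name: 'alice' }], rmOwner, { name: 'alice' }))  // → '- alice'
```

Passing the mutation as a function lets `add` and `rm` share all of the fetch/PUT plumbing and differ only in how they transform the maintainers array.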
diff --git a/deps/npm/lib/pack.js b/deps/npm/lib/pack.js
index 3b3f5b7bbc7007..78e5bfd174d7b7 100644
--- a/deps/npm/lib/pack.js
+++ b/deps/npm/lib/pack.js
@@ -18,9 +18,9 @@ const lifecycle = BB.promisify(require('./utils/lifecycle'))
const log = require('npmlog')
const move = require('move-concurrently')
const npm = require('./npm')
+const npmConfig = require('./config/figgy-config.js')
const output = require('./utils/output')
const pacote = require('pacote')
-const pacoteOpts = require('./config/pacote')
const path = require('path')
const PassThrough = require('stream').PassThrough
const pathIsInside = require('path-is-inside')
@@ -88,8 +88,8 @@ function pack_ (pkg, dir) {
}
function packFromPackage (arg, target, filename) {
- const opts = pacoteOpts()
- return pacote.tarball.toFile(arg, target, pacoteOpts())
+ const opts = npmConfig()
+ return pacote.tarball.toFile(arg, target, opts)
.then(() => cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'unpacking'}, (tmp) => {
const tmpTarget = path.join(tmp, filename)
return pacote.extract(arg, tmpTarget, opts)
diff --git a/deps/npm/lib/ping.js b/deps/npm/lib/ping.js
index 13f390397ce18c..3023bab00e9943 100644
--- a/deps/npm/lib/ping.js
+++ b/deps/npm/lib/ping.js
@@ -1,5 +1,16 @@
-var npm = require('./npm.js')
-var output = require('./utils/output.js')
+'use strict'
+
+const npmConfig = require('./config/figgy-config.js')
+const fetch = require('libnpm/fetch')
+const figgyPudding = require('figgy-pudding')
+const log = require('npmlog')
+const npm = require('./npm.js')
+const output = require('./utils/output.js')
+
+const PingConfig = figgyPudding({
+ json: {},
+ registry: {}
+})
module.exports = ping
@@ -10,18 +21,27 @@ function ping (args, silent, cb) {
cb = silent
silent = false
}
- var registry = npm.config.get('registry')
- if (!registry) return cb(new Error('no default registry set'))
- var auth = npm.config.getCredentialsByURI(registry)
- npm.registry.ping(registry, {auth: auth}, function (er, pong, data, res) {
- if (!silent) {
- if (er) {
- output('Ping error: ' + er)
- } else {
- output('Ping success: ' + JSON.stringify(pong))
+ const opts = PingConfig(npmConfig())
+ const registry = opts.registry
+ log.notice('PING', registry)
+ const start = Date.now()
+ return fetch('/-/ping?write=true', opts).then(
+ res => res.json().catch(() => ({}))
+ ).then(details => {
+ if (silent) {
+ } else {
+ const time = Date.now() - start
+ log.notice('PONG', `${time / 1000}ms`)
+ if (npm.config.get('json')) {
+ output(JSON.stringify({
+ registry,
+ time,
+ details
+ }, null, 2))
+ } else if (Object.keys(details).length) {
+ log.notice('PONG', `${JSON.stringify(details, null, 2)}`)
}
}
- cb(er, er ? null : pong, data, res)
- })
+ }).nodeify(cb)
}
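The ping.js, profile.js, publish.js, and search.js changes in this patch all funnel `npmConfig()` through a `figgyPudding({...})` factory (`PingConfig`, `ProfileOpts`, `PublishConfig`, `SearchOpts`) that whitelists known option keys, supplies defaults, and layers overrides via `.concat()`. The following standalone sketch approximates that subset of behavior; `pudding` is a hypothetical stand-in, and the real figgy-pudding API is richer (aliases, `forEach`, env handling):

```javascript
// Approximation of the figgy-pudding pattern used by PingConfig etc.:
// only declared keys are readable, defaults fill gaps, and concat()
// returns a new immutable overlay without mutating the original.
function pudding (spec) {
  return function make (...providers) {
    const layers = providers.filter(Boolean)
    const lookup = key => {
      for (const layer of layers) {
        if (layer[key] !== undefined) return layer[key]
      }
      const dfl = spec[key] && spec[key].default
      return typeof dfl === 'function' ? dfl() : dfl
    }
    const opts = {}
    for (const key of Object.keys(spec)) {
      Object.defineProperty(opts, key, { get: () => lookup(key), enumerable: true })
    }
    opts.concat = extra => make(extra, ...layers)
    return opts
  }
}

const PingConfig = pudding({ json: { default: false }, registry: {} })
// Undeclared keys such as `loglevel` are filtered out; declared keys pass through.
const opts = PingConfig({ registry: 'https://registry.example/', loglevel: 'silly' })
```

This filtering is why the new code can pass one `opts` object straight into `libnpm` calls without leaking unrelated npm config.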
diff --git a/deps/npm/lib/profile.js b/deps/npm/lib/profile.js
index ff01db90f722f4..7ce9cb5cce5df2 100644
--- a/deps/npm/lib/profile.js
+++ b/deps/npm/lib/profile.js
@@ -1,18 +1,23 @@
'use strict'
-const profile = require('npm-profile')
-const npm = require('./npm.js')
+
+const BB = require('bluebird')
+
+const ansistyles = require('ansistyles')
+const figgyPudding = require('figgy-pudding')
+const inspect = require('util').inspect
const log = require('npmlog')
+const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
+const otplease = require('./utils/otplease.js')
const output = require('./utils/output.js')
+const profile = require('libnpm/profile')
+const pulseTillDone = require('./utils/pulse-till-done.js')
+const qrcodeTerminal = require('qrcode-terminal')
+const queryString = require('query-string')
const qw = require('qw')
-const Table = require('cli-table3')
-const ansistyles = require('ansistyles')
-const Bluebird = require('bluebird')
const readUserInfo = require('./utils/read-user-info.js')
-const qrcodeTerminal = require('qrcode-terminal')
+const Table = require('cli-table3')
const url = require('url')
-const queryString = require('query-string')
-const pulseTillDone = require('./utils/pulse-till-done.js')
-const inspect = require('util').inspect
module.exports = profileCmd
@@ -48,6 +53,13 @@ function withCb (prom, cb) {
prom.then((value) => cb(null, value), cb)
}
+const ProfileOpts = figgyPudding({
+ json: {},
+ otp: {},
+ parseable: {},
+ registry: {}
+})
+
function profileCmd (args, cb) {
if (args.length === 0) return cb(new Error(profileCmd.usage))
log.gauge.show('profile')
@@ -75,36 +87,13 @@ function profileCmd (args, cb) {
}
}
-function config () {
- const conf = {
- json: npm.config.get('json'),
- parseable: npm.config.get('parseable'),
- registry: npm.config.get('registry'),
- otp: npm.config.get('otp')
- }
- const creds = npm.config.getCredentialsByURI(conf.registry)
- if (creds.token) {
- conf.auth = {token: creds.token}
- } else if (creds.username) {
- conf.auth = {basic: {username: creds.username, password: creds.password}}
- } else if (creds.auth) {
- const auth = Buffer.from(creds.auth, 'base64').toString().split(':', 2)
- conf.auth = {basic: {username: auth[0], password: auth[1]}}
- } else {
- conf.auth = {}
- }
-
- if (conf.otp) conf.auth.otp = conf.otp
- return conf
-}
-
const knownProfileKeys = qw`
name email ${'two-factor auth'} fullname homepage
freenode twitter github created updated`
function get (args) {
const tfa = 'two-factor auth'
- const conf = config()
+ const conf = ProfileOpts(npmConfig())
return pulseTillDone.withPromise(profile.get(conf)).then((info) => {
if (!info.cidr_whitelist) delete info.cidr_whitelist
if (conf.json) {
@@ -150,7 +139,7 @@ const writableProfileKeys = qw`
email password fullname homepage freenode twitter github`
function set (args) {
- const conf = config()
+ let conf = ProfileOpts(npmConfig())
const prop = (args[0] || '').toLowerCase().trim()
let value = args.length > 1 ? args.slice(1).join(' ') : null
if (prop !== 'password' && value === null) {
@@ -164,7 +153,7 @@ function set (args) {
if (writableProfileKeys.indexOf(prop) === -1) {
return Promise.reject(Error(`"${prop}" is not a property we can set. Valid properties are: ` + writableProfileKeys.join(', ')))
}
- return Bluebird.try(() => {
+ return BB.try(() => {
if (prop === 'password') {
return readUserInfo.password('Current password: ').then((current) => {
return readPasswords().then((newpassword) => {
@@ -193,23 +182,18 @@ function set (args) {
const newUser = {}
writableProfileKeys.forEach((k) => { newUser[k] = user[k] })
newUser[prop] = value
- return profile.set(newUser, conf).catch((err) => {
- if (err.code !== 'EOTP') throw err
- return readUserInfo.otp().then((otp) => {
- conf.auth.otp = otp
- return profile.set(newUser, conf)
+ return otplease(conf, conf => profile.set(newUser, conf))
+ .then((result) => {
+ if (conf.json) {
+ output(JSON.stringify({[prop]: result[prop]}, null, 2))
+ } else if (conf.parseable) {
+ output(prop + '\t' + result[prop])
+ } else if (result[prop] != null) {
+ output('Set', prop, 'to', result[prop])
+ } else {
+ output('Set', prop)
+ }
})
- }).then((result) => {
- if (conf.json) {
- output(JSON.stringify({[prop]: result[prop]}, null, 2))
- } else if (conf.parseable) {
- output(prop + '\t' + result[prop])
- } else if (result[prop] != null) {
- output('Set', prop, 'to', result[prop])
- } else {
- output('Set', prop)
- }
- })
}))
})
}
@@ -225,7 +209,7 @@ function enable2fa (args) {
' auth-only - Require two-factor authentication only when logging in\n' +
' auth-and-writes - Require two-factor authentication when logging in AND when publishing'))
}
- const conf = config()
+ const conf = ProfileOpts(npmConfig())
if (conf.json || conf.parseable) {
return Promise.reject(new Error(
'Enabling two-factor authentication is an interactive operation and ' +
@@ -238,15 +222,18 @@ function enable2fa (args) {
}
}
- return Bluebird.try(() => {
+ return BB.try(() => {
// if they're using legacy auth currently then we have to update them to a
// bearer token before continuing.
- if (conf.auth.basic) {
+ const auth = getAuth(conf)
+ if (auth.basic) {
log.info('profile', 'Updating authentication to bearer token')
- return profile.login(conf.auth.basic.username, conf.auth.basic.password, conf).then((result) => {
+ return profile.createToken(
+ auth.basic.password, false, [], conf
+ ).then((result) => {
if (!result.token) throw new Error('Your registry ' + conf.registry + ' does not seem to support bearer tokens. Bearer tokens are required for two-factor authentication')
npm.config.setCredentialsByURI(conf.registry, {token: result.token})
- return Bluebird.fromNode((cb) => npm.config.save('user', cb))
+ return BB.fromNode((cb) => npm.config.save('user', cb))
})
}
}).then(() => {
@@ -295,18 +282,36 @@ function enable2fa (args) {
})
}
+function getAuth (conf) {
+ const creds = npm.config.getCredentialsByURI(conf.registry)
+ let auth
+ if (creds.token) {
+ auth = {token: creds.token}
+ } else if (creds.username) {
+ auth = {basic: {username: creds.username, password: creds.password}}
+ } else if (creds.auth) {
+ const basic = Buffer.from(creds.auth, 'base64').toString().split(':', 2)
+ auth = {basic: {username: basic[0], password: basic[1]}}
+ } else {
+ auth = {}
+ }
+
+ if (conf.otp) auth.otp = conf.otp
+ return auth
+}
+
function disable2fa (args) {
- const conf = config()
+ let conf = ProfileOpts(npmConfig())
return pulseTillDone.withPromise(profile.get(conf)).then((info) => {
if (!info.tfa || info.tfa.pending) {
output('Two factor authentication not enabled.')
return
}
return readUserInfo.password().then((password) => {
- return Bluebird.try(() => {
- if (conf.auth.otp) return
+ return BB.try(() => {
+ if (conf.otp) return
return readUserInfo.otp('Enter one-time password from your authenticator: ').then((otp) => {
- conf.auth.otp = otp
+ conf = conf.concat({otp})
})
}).then(() => {
log.info('profile', 'disabling tfa')
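The profile.js hunks replace hand-rolled EOTP handling with `otplease(conf, conf => ...)`. The implementation of `./utils/otplease.js` is not shown in this patch; the sketch below captures the pattern the call sites imply — run the operation once, and on an OTP challenge obtain a one-time password and retry with it merged in. The real helper also handles E401 variants and checks for a TTY before prompting:

```javascript
// Simplified sketch of the otplease(opts, fn) retry pattern:
// retry exactly once with an OTP merged into the options.
function otplease (opts, fn, promptForOtp) {
  return Promise.resolve()
    .then(() => fn(opts))
    .catch(err => {
      if (err.code !== 'EOTP' || opts.otp) throw err
      return promptForOtp().then(otp => fn(Object.assign({}, opts, { otp })))
    })
}

// Fake registry call: rejects with EOTP until an OTP is supplied.
let attempts = 0
function fakeSetProfile (opts) {
  attempts++
  if (!opts.otp) {
    const err = new Error('one-time password required')
    err.code = 'EOTP'
    return Promise.reject(err)
  }
  return Promise.resolve({ email: 'me@example.com', otpUsed: opts.otp })
}
```

Because the retry merges the OTP into a fresh options object, the caller's original `conf` stays untouched, which matches the immutable figgy-pudding style used throughout the patch.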
diff --git a/deps/npm/lib/publish.js b/deps/npm/lib/publish.js
index 25f2134b1b16d6..e81fc1a0574546 100644
--- a/deps/npm/lib/publish.js
+++ b/deps/npm/lib/publish.js
@@ -3,20 +3,20 @@
const BB = require('bluebird')
const cacache = require('cacache')
-const createReadStream = require('graceful-fs').createReadStream
-const getPublishConfig = require('./utils/get-publish-config.js')
+const figgyPudding = require('figgy-pudding')
+const libpub = require('libnpm/publish')
+const libunpub = require('libnpm/unpublish')
const lifecycle = BB.promisify(require('./utils/lifecycle.js'))
const log = require('npmlog')
-const mapToRegistry = require('./utils/map-to-registry.js')
-const npa = require('npm-package-arg')
-const npm = require('./npm.js')
+const npa = require('libnpm/parse-arg')
+const npmConfig = require('./config/figgy-config.js')
const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
const pack = require('./pack')
-const pacote = require('pacote')
-const pacoteOpts = require('./config/pacote')
+const { tarball, extract } = require('libnpm')
const path = require('path')
+const readFileAsync = BB.promisify(require('graceful-fs').readFile)
const readJson = BB.promisify(require('read-package-json'))
-const readUserInfo = require('./utils/read-user-info.js')
const semver = require('semver')
const statAsync = BB.promisify(require('graceful-fs').stat)
@@ -31,6 +31,16 @@ publish.completion = function (opts, cb) {
return cb()
}
+const PublishConfig = figgyPudding({
+ dryRun: 'dry-run',
+ 'dry-run': { default: false },
+ force: { default: false },
+ json: { default: false },
+ Promise: { default: () => Promise },
+ tag: { default: 'latest' },
+ tmp: {}
+})
+
module.exports = publish
function publish (args, isRetry, cb) {
if (typeof cb !== 'function') {
@@ -42,15 +52,16 @@ function publish (args, isRetry, cb) {
log.verbose('publish', args)
- const t = npm.config.get('tag').trim()
+ const opts = PublishConfig(npmConfig())
+ const t = opts.tag.trim()
if (semver.validRange(t)) {
return cb(new Error('Tag name must not be a valid SemVer range: ' + t))
}
- return publish_(args[0])
+ return publish_(args[0], opts)
.then((tarball) => {
const silent = log.level === 'silent'
- if (!silent && npm.config.get('json')) {
+ if (!silent && opts.json) {
output(JSON.stringify(tarball, null, 2))
} else if (!silent) {
output(`+ ${tarball.id}`)
@@ -59,7 +70,7 @@ function publish (args, isRetry, cb) {
.nodeify(cb)
}
-function publish_ (arg) {
+function publish_ (arg, opts) {
return statAsync(arg).then((stat) => {
if (stat.isDirectory()) {
return stat
@@ -69,17 +80,17 @@ function publish_ (arg) {
throw err
}
}).then(() => {
- return publishFromDirectory(arg)
+ return publishFromDirectory(arg, opts)
}, (err) => {
if (err.code !== 'ENOENT' && err.code !== 'ENOTDIR') {
throw err
} else {
- return publishFromPackage(arg)
+ return publishFromPackage(arg, opts)
}
})
}
-function publishFromDirectory (arg) {
+function publishFromDirectory (arg, opts) {
// All this readJson is because any of the given scripts might modify the
// package.json in question, so we need to refresh after every step.
let contents
@@ -90,12 +101,12 @@ function publishFromDirectory (arg) {
}).then(() => {
return readJson(path.join(arg, 'package.json'))
}).then((pkg) => {
- return cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'fromDir'}, (tmpDir) => {
+ return cacache.tmp.withTmp(opts.tmp, {tmpPrefix: 'fromDir'}, (tmpDir) => {
const target = path.join(tmpDir, 'package.tgz')
return pack.packDirectory(pkg, arg, target, null, true)
.tap((c) => { contents = c })
- .then((c) => !npm.config.get('json') && pack.logContents(c))
- .then(() => upload(arg, pkg, false, target))
+ .then((c) => !opts.json && pack.logContents(c))
+ .then(() => upload(pkg, false, target, opts))
})
}).then(() => {
return readJson(path.join(arg, 'package.json'))
@@ -107,121 +118,50 @@ function publishFromDirectory (arg) {
.then(() => contents)
}
-function publishFromPackage (arg) {
- return cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'fromPackage'}, (tmp) => {
+function publishFromPackage (arg, opts) {
+ return cacache.tmp.withTmp(opts.tmp, {tmpPrefix: 'fromPackage'}, tmp => {
const extracted = path.join(tmp, 'package')
const target = path.join(tmp, 'package.json')
- const opts = pacoteOpts()
- return pacote.tarball.toFile(arg, target, opts)
- .then(() => pacote.extract(arg, extracted, opts))
+ return tarball.toFile(arg, target, opts)
+ .then(() => extract(arg, extracted, opts))
.then(() => readJson(path.join(extracted, 'package.json')))
.then((pkg) => {
return BB.resolve(pack.getContents(pkg, target))
- .tap((c) => !npm.config.get('json') && pack.logContents(c))
- .tap(() => upload(arg, pkg, false, target))
+ .tap((c) => !opts.json && pack.logContents(c))
+ .tap(() => upload(pkg, false, target, opts))
})
})
}
-function upload (arg, pkg, isRetry, cached) {
- if (!pkg) {
- return BB.reject(new Error('no package.json file found'))
- }
- if (pkg.private) {
- return BB.reject(new Error(
- 'This package has been marked as private\n' +
- "Remove the 'private' field from the package.json to publish it."
- ))
- }
- const mappedConfig = getPublishConfig(
- pkg.publishConfig,
- npm.config,
- npm.registry
- )
- const config = mappedConfig.config
- const registry = mappedConfig.client
-
- pkg._npmVersion = npm.version
- pkg._nodeVersion = process.versions.node
-
- delete pkg.modules
-
- return BB.fromNode((cb) => {
- mapToRegistry(pkg.name, config, (err, registryURI, auth, registryBase) => {
- if (err) { return cb(err) }
- cb(null, [registryURI, auth, registryBase])
- })
- }).spread((registryURI, auth, registryBase) => {
- // we just want the base registry URL in this case
- log.verbose('publish', 'registryBase', registryBase)
- log.silly('publish', 'uploading', cached)
-
- pkg._npmUser = {
- name: auth.username,
- email: auth.email
- }
-
- const params = {
- metadata: pkg,
- body: !npm.config.get('dry-run') && createReadStream(cached),
- auth: auth
- }
-
- function closeFile () {
- if (!npm.config.get('dry-run')) {
- params.body.close()
- }
- }
-
- // registry-frontdoor cares about the access level, which is only
- // configurable for scoped packages
- if (config.get('access')) {
- if (!npa(pkg.name).scope && config.get('access') === 'restricted') {
- throw new Error("Can't restrict access to unscoped packages.")
- }
-
- params.access = config.get('access')
- }
-
- if (npm.config.get('dry-run')) {
- log.verbose('publish', '--dry-run mode enabled. Skipping upload.')
- return BB.resolve()
- }
-
- log.showProgress('publish:' + pkg._id)
- return BB.fromNode((cb) => {
- registry.publish(registryBase, params, cb)
- }).catch((err) => {
- if (
- err.code === 'EPUBLISHCONFLICT' &&
- npm.config.get('force') &&
- !isRetry
- ) {
- log.warn('publish', 'Forced publish over ' + pkg._id)
- return BB.fromNode((cb) => {
- npm.commands.unpublish([pkg._id], cb)
- }).finally(() => {
- // close the file we are trying to upload, we will open it again.
- closeFile()
- // ignore errors. Use the force. Reach out with your feelings.
- return upload(arg, pkg, true, cached).catch(() => {
- // but if it fails again, then report the first error.
- throw err
+function upload (pkg, isRetry, cached, opts) {
+ if (!opts.dryRun) {
+ return readFileAsync(cached).then(tarball => {
+ return otplease(opts, opts => {
+ return libpub(pkg, tarball, opts)
+ }).catch(err => {
+ if (
+ err.code === 'EPUBLISHCONFLICT' &&
+ opts.force &&
+ !isRetry
+ ) {
+ log.warn('publish', 'Forced publish over ' + pkg._id)
+ return otplease(opts, opts => libunpub(
+ npa.resolve(pkg.name, pkg.version), opts
+ )).finally(() => {
+ // ignore errors. Use the force. Reach out with your feelings.
+ return otplease(opts, opts => {
+ return upload(pkg, true, tarball, opts)
+ }).catch(() => {
+ // but if it fails again, then report the first error.
+ throw err
+ })
})
- })
- } else {
- // close the file we are trying to upload, all attempts to resume will open it again
- closeFile()
- throw err
- }
- })
- }).catch((err) => {
- if (err.code !== 'EOTP' && !(err.code === 'E401' && /one-time pass/.test(err.message))) throw err
- // we prompt on stdout and read answers from stdin, so they need to be ttys.
- if (!process.stdin.isTTY || !process.stdout.isTTY) throw err
- return readUserInfo.otp().then((otp) => {
- npm.config.set('otp', otp)
- return upload(arg, pkg, isRetry, cached)
+ } else {
+ throw err
+ }
+ })
})
- })
+ } else {
+ return opts.Promise.resolve(true)
+ }
}
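The rewritten `upload()` keeps a subtle piece of error semantics from the old code: on `EPUBLISHCONFLICT` with `--force`, it unpublishes (ignoring unpublish errors) and retries once, and if the retry also fails it reports the original conflict error, not the retry's. A minimal sketch of that control flow, with `publish`/`unpublish` as stand-ins for the libnpm calls:

```javascript
// Forced-publish flow: retry once after unpublish, but surface the
// FIRST error if the retry also fails.
function publishWithForce (publish, unpublish, opts) {
  return publish(false).catch(err => {
    if (err.code === 'EPUBLISHCONFLICT' && opts.force) {
      return unpublish()
        .catch(() => {}) // ignore unpublish errors. Use the force.
        .then(() => publish(true))
        .catch(() => { throw err }) // report the first error
    }
    throw err
  })
}

// Simulated registry: the first attempt conflicts, the retry succeeds.
let tries = 0
const conflict = Object.assign(new Error('version already exists'), { code: 'EPUBLISHCONFLICT' })
const publish = isRetry => ++tries === 1 ? Promise.reject(conflict) : Promise.resolve('+ pkg@1.0.0')
const unpublish = () => Promise.resolve()
```

Note what the refactor drops: the old code had to `closeFile()` on the read stream between attempts, whereas the new version reads the tarball into a buffer once with `readFileAsync`, so no stream bookkeeping survives into the retry path.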
diff --git a/deps/npm/lib/repo.js b/deps/npm/lib/repo.js
index d5aa81a6a00ebd..b930402aedf953 100644
--- a/deps/npm/lib/repo.js
+++ b/deps/npm/lib/repo.js
@@ -2,10 +2,10 @@ module.exports = repo
repo.usage = 'npm repo [<pkg>]'
-var openUrl = require('./utils/open-url')
-var hostedGitInfo = require('hosted-git-info')
-var url_ = require('url')
-var fetchPackageMetadata = require('./fetch-package-metadata.js')
+const openUrl = require('./utils/open-url')
+const hostedGitInfo = require('hosted-git-info')
+const url_ = require('url')
+const fetchPackageMetadata = require('./fetch-package-metadata.js')
repo.completion = function (opts, cb) {
// FIXME: there used to be registry completion here, but it stopped making
@@ -14,7 +14,7 @@ repo.completion = function (opts, cb) {
}
function repo (args, cb) {
- var n = args.length ? args[0] : '.'
+ const n = args.length ? args[0] : '.'
fetchPackageMetadata(n, '.', {fullMetadata: true}, function (er, d) {
if (er) return cb(er)
getUrlAndOpen(d, cb)
@@ -22,12 +22,12 @@ function repo (args, cb) {
}
function getUrlAndOpen (d, cb) {
- var r = d.repository
+ const r = d.repository
if (!r) return cb(new Error('no repository'))
// XXX remove this when npm@v1.3.10 from node 0.10 is deprecated
// from https://github.com/npm/npm-www/issues/418
- var info = hostedGitInfo.fromUrl(r.url)
- var url = info ? info.browse() : unknownHostedUrl(r.url)
+ const info = hostedGitInfo.fromUrl(r.url)
+ const url = info ? info.browse() : unknownHostedUrl(r.url)
if (!url) return cb(new Error('no repository: could not get url'))
@@ -36,12 +36,12 @@ function getUrlAndOpen (d, cb) {
function unknownHostedUrl (url) {
try {
- var idx = url.indexOf('@')
+ const idx = url.indexOf('@')
if (idx !== -1) {
url = url.slice(idx + 1).replace(/:([^\d]+)/, '/$1')
}
url = url_.parse(url)
- var protocol = url.protocol === 'https:'
+ const protocol = url.protocol === 'https:'
? 'https:'
: 'http:'
return protocol + '//' + (url.host || '') +
diff --git a/deps/npm/lib/search.js b/deps/npm/lib/search.js
index 3987be135c9ae6..3c59f8b43d15bb 100644
--- a/deps/npm/lib/search.js
+++ b/deps/npm/lib/search.js
@@ -2,14 +2,16 @@
module.exports = exports = search
-var npm = require('./npm.js')
-var allPackageSearch = require('./search/all-package-search')
-var esearch = require('./search/esearch.js')
-var formatPackageStream = require('./search/format-package-stream.js')
-var usage = require('./utils/usage')
-var output = require('./utils/output.js')
-var log = require('npmlog')
-var ms = require('mississippi')
+const npm = require('./npm.js')
+const allPackageSearch = require('./search/all-package-search')
+const figgyPudding = require('figgy-pudding')
+const formatPackageStream = require('./search/format-package-stream.js')
+const libSearch = require('libnpm/search')
+const log = require('npmlog')
+const ms = require('mississippi')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const usage = require('./utils/usage')
search.usage = usage(
'search',
@@ -20,46 +22,50 @@ search.completion = function (opts, cb) {
cb(null, [])
}
+const SearchOpts = figgyPudding({
+ description: {},
+ exclude: {},
+ include: {},
+ limit: {},
+ log: {},
+ staleness: {},
+ unicode: {}
+})
+
function search (args, cb) {
- var searchOpts = {
+ const opts = SearchOpts(npmConfig()).concat({
description: npm.config.get('description'),
exclude: prepareExcludes(npm.config.get('searchexclude')),
include: prepareIncludes(args, npm.config.get('searchopts')),
- limit: npm.config.get('searchlimit'),
+ limit: npm.config.get('searchlimit') || 20,
log: log,
staleness: npm.config.get('searchstaleness'),
unicode: npm.config.get('unicode')
- }
-
- if (searchOpts.include.length === 0) {
+ })
+ if (opts.include.length === 0) {
return cb(new Error('search must be called with arguments'))
}
// Used later to figure out whether we had any packages go out
- var anyOutput = false
+ let anyOutput = false
- var entriesStream = ms.through.obj()
+ const entriesStream = ms.through.obj()
- var esearchWritten = false
- esearch(searchOpts).on('data', function (pkg) {
+ let esearchWritten = false
+ libSearch.stream(opts.include, opts).on('data', pkg => {
entriesStream.write(pkg)
!esearchWritten && (esearchWritten = true)
- }).on('error', function (e) {
+ }).on('error', err => {
if (esearchWritten) {
// If esearch errored after already starting output, we can't fall back.
- return entriesStream.emit('error', e)
+ return entriesStream.emit('error', err)
}
log.warn('search', 'fast search endpoint errored. Using old search.')
- allPackageSearch(searchOpts).on('data', function (pkg) {
- entriesStream.write(pkg)
- }).on('error', function (e) {
- entriesStream.emit('error', e)
- }).on('end', function () {
- entriesStream.end()
- })
- }).on('end', function () {
- entriesStream.end()
- })
+ allPackageSearch(opts)
+ .on('data', pkg => entriesStream.write(pkg))
+ .on('error', err => entriesStream.emit('error', err))
+ .on('end', () => entriesStream.end())
+ }).on('end', () => entriesStream.end())
// Grab a configured output stream that will spit out packages in the
// desired format.
@@ -71,14 +77,14 @@ function search (args, cb) {
parseable: npm.config.get('parseable'),
color: npm.color
})
- outputStream.on('data', function (chunk) {
+ outputStream.on('data', chunk => {
if (!anyOutput) { anyOutput = true }
output(chunk.toString('utf8'))
})
log.silly('search', 'searching packages')
- ms.pipe(entriesStream, outputStream, function (er) {
- if (er) return cb(er)
+ ms.pipe(entriesStream, outputStream, err => {
+ if (err) return cb(err)
if (!anyOutput && !npm.config.get('json') && !npm.config.get('parseable')) {
output('No matches found for ' + (args.map(JSON.stringify).join(' ')))
}
diff --git a/deps/npm/lib/search/all-package-metadata.js b/deps/npm/lib/search/all-package-metadata.js
index 5a27bdbcee658e..5883def5c72e3e 100644
--- a/deps/npm/lib/search/all-package-metadata.js
+++ b/deps/npm/lib/search/all-package-metadata.js
@@ -1,21 +1,28 @@
'use strict'
-var fs = require('graceful-fs')
-var path = require('path')
-var mkdir = require('mkdirp')
-var chownr = require('chownr')
-var npm = require('../npm.js')
-var log = require('npmlog')
-var cacheFile = require('npm-cache-filename')
-var correctMkdir = require('../utils/correct-mkdir.js')
-var mapToRegistry = require('../utils/map-to-registry.js')
-var jsonstream = require('JSONStream')
-var writeStreamAtomic = require('fs-write-stream-atomic')
-var ms = require('mississippi')
-var sortedUnionStream = require('sorted-union-stream')
-var once = require('once')
-var gunzip = require('../utils/gunzip-maybe')
+const BB = require('bluebird')
+const cacheFile = require('npm-cache-filename')
+const chownr = BB.promisify(require('chownr'))
+const correctMkdir = BB.promisify(require('../utils/correct-mkdir.js'))
+const figgyPudding = require('figgy-pudding')
+const fs = require('graceful-fs')
+const JSONStream = require('JSONStream')
+const log = require('npmlog')
+const mkdir = BB.promisify(require('mkdirp'))
+const ms = require('mississippi')
+const npmFetch = require('libnpm/fetch')
+const path = require('path')
+const sortedUnionStream = require('sorted-union-stream')
+const url = require('url')
+const writeStreamAtomic = require('fs-write-stream-atomic')
+
+const statAsync = BB.promisify(fs.stat)
+
+const APMOpts = figgyPudding({
+ cache: {},
+ registry: {}
+})
// Returns a sorted stream of all package metadata. Internally, takes care of
// maintaining its metadata cache and making partial or full remote requests,
// according to staleness, validity, etc.
@@ -27,63 +34,70 @@ var gunzip = require('../utils/gunzip-maybe')
// 4. It must include all entries that exist in the metadata endpoint as of
// the value in `_updated`
module.exports = allPackageMetadata
-function allPackageMetadata (staleness) {
- var stream = ms.through.obj()
-
- mapToRegistry('-/all', npm.config, function (er, uri, auth) {
- if (er) return stream.emit('error', er)
-
- var cacheBase = cacheFile(npm.config.get('cache'))(uri)
- var cachePath = path.join(cacheBase, '.cache.json')
+function allPackageMetadata (opts) {
+ const staleness = opts.staleness
+ const stream = ms.through.obj()
- createEntryStream(cachePath, uri, auth, staleness, function (err, entryStream, latest, newEntries) {
- if (err) return stream.emit('error', err)
- log.silly('all-package-metadata', 'entry stream created')
- if (entryStream && newEntries) {
- createCacheWriteStream(cachePath, latest, function (err, writeStream) {
- if (err) return stream.emit('error', err)
- log.silly('all-package-metadata', 'output stream created')
- ms.pipeline.obj(entryStream, writeStream, stream)
- })
- } else if (entryStream) {
- ms.pipeline.obj(entryStream, stream)
- } else {
- stream.emit('error', new Error('No search sources available'))
- }
- })
- })
+ opts = APMOpts(opts)
+ const cacheBase = cacheFile(path.resolve(path.dirname(opts.cache)))(url.resolve(opts.registry, '/-/all'))
+ const cachePath = path.join(cacheBase, '.cache.json')
+ createEntryStream(
+ cachePath, staleness, opts
+ ).then(({entryStream, latest, newEntries}) => {
+ log.silly('all-package-metadata', 'entry stream created')
+ if (entryStream && newEntries) {
+ return createCacheWriteStream(cachePath, latest, opts).then(writer => {
+ log.silly('all-package-metadata', 'output stream created')
+ ms.pipeline.obj(entryStream, writer, stream)
+ })
+ } else if (entryStream) {
+ ms.pipeline.obj(entryStream, stream)
+ } else {
+ stream.emit('error', new Error('No search sources available'))
+ }
+ }).catch(err => stream.emit('error', err))
return stream
}
// Creates a stream of the latest available package metadata.
// Metadata will come from a combination of the local cache and remote data.
module.exports._createEntryStream = createEntryStream
-function createEntryStream (cachePath, uri, auth, staleness, cb) {
- createCacheEntryStream(cachePath, function (err, cacheStream, cacheLatest) {
+function createEntryStream (cachePath, staleness, opts) {
+ return createCacheEntryStream(
+ cachePath, opts
+ ).catch(err => {
+ log.warn('', 'Failed to read search cache. Rebuilding')
+ log.silly('all-package-metadata', 'cache read error: ', err)
+ return {}
+ }).then(({
+ updateStream: cacheStream,
+ updatedLatest: cacheLatest
+ }) => {
cacheLatest = cacheLatest || 0
- if (err) {
- log.warn('', 'Failed to read search cache. Rebuilding')
- log.silly('all-package-metadata', 'cache read error: ', err)
- }
- createEntryUpdateStream(uri, auth, staleness, cacheLatest, function (err, updateStream, updatedLatest) {
+ return createEntryUpdateStream(staleness, cacheLatest, opts).catch(err => {
+ log.warn('', 'Search data request failed, search might be stale')
+ log.silly('all-package-metadata', 'update request error: ', err)
+ return {}
+ }).then(({updateStream, updatedLatest}) => {
updatedLatest = updatedLatest || 0
- var latest = updatedLatest || cacheLatest
+ const latest = updatedLatest || cacheLatest
if (!cacheStream && !updateStream) {
- return cb(new Error('No search sources available'))
- }
- if (err) {
- log.warn('', 'Search data request failed, search might be stale')
- log.silly('all-package-metadata', 'update request error: ', err)
+ throw new Error('No search sources available')
}
if (cacheStream && updateStream) {
// Deduped, unioned, sorted stream from the combination of both.
- cb(null,
- createMergedStream(cacheStream, updateStream),
+ return {
+ entryStream: createMergedStream(cacheStream, updateStream),
latest,
- !!updatedLatest)
+ newEntries: !!updatedLatest
+ }
} else {
// Either one works if one or the other failed
- cb(null, cacheStream || updateStream, latest, !!updatedLatest)
+ return {
+ entryStream: cacheStream || updateStream,
+ latest,
+ newEntries: !!updatedLatest
+ }
}
})
})
@@ -96,66 +110,51 @@ function createEntryStream (cachePath, uri, auth, staleness, cb) {
module.exports._createMergedStream = createMergedStream
function createMergedStream (a, b) {
linkStreams(a, b)
- return sortedUnionStream(b, a, function (pkg) { return pkg.name })
+ return sortedUnionStream(b, a, ({name}) => name)
}
// Reads the local index and returns a stream that spits out package data.
module.exports._createCacheEntryStream = createCacheEntryStream
-function createCacheEntryStream (cacheFile, cb) {
+function createCacheEntryStream (cacheFile, opts) {
log.verbose('all-package-metadata', 'creating entry stream from local cache')
log.verbose('all-package-metadata', cacheFile)
- fs.stat(cacheFile, function (err, stat) {
- if (err) return cb(err)
+ return statAsync(cacheFile).then(stat => {
// TODO - This isn't very helpful if `cacheFile` is empty or just `{}`
- var entryStream = ms.pipeline.obj(
+ const entryStream = ms.pipeline.obj(
fs.createReadStream(cacheFile),
- jsonstream.parse('*'),
+ JSONStream.parse('*'),
// I believe this passthrough is necessary cause `jsonstream` returns
// weird custom streams that behave funny sometimes.
ms.through.obj()
)
- extractUpdated(entryStream, 'cached-entry-stream', cb)
+ return extractUpdated(entryStream, 'cached-entry-stream', opts)
})
}
// Stream of entry updates from the server. If `latest` is `0`, streams the
// entire metadata object from the registry.
module.exports._createEntryUpdateStream = createEntryUpdateStream
-function createEntryUpdateStream (all, auth, staleness, latest, cb) {
+function createEntryUpdateStream (staleness, latest, opts) {
log.verbose('all-package-metadata', 'creating remote entry stream')
- var params = {
- timeout: 600,
- follow: true,
- staleOk: true,
- auth: auth,
- streaming: true
- }
- var partialUpdate = false
+ let partialUpdate = false
+ let uri = '/-/all'
if (latest && (Date.now() - latest < (staleness * 1000))) {
// Skip the request altogether if our `latest` isn't stale.
log.verbose('all-package-metadata', 'Local data up to date, skipping update')
- return cb(null)
+ return BB.resolve({})
} else if (latest === 0) {
log.warn('', 'Building the local index for the first time, please be patient')
log.verbose('all-package-metadata', 'No cached data: requesting full metadata db')
} else {
log.verbose('all-package-metadata', 'Cached data present with timestamp:', latest, 'requesting partial index update')
- all += '/since?stale=update_after&startkey=' + latest
+ uri += '/since?stale=update_after&startkey=' + latest
partialUpdate = true
}
- npm.registry.request(all, params, function (er, res) {
- if (er) return cb(er)
+ return npmFetch(uri, opts).then(res => {
log.silly('all-package-metadata', 'request stream opened, code:', res.statusCode)
- // NOTE - The stream returned by `request` seems to be very persnickety
- // and this is almost a magic incantation to get it to work.
- // Modify how `res` is used here at your own risk.
- var entryStream = ms.pipeline.obj(
- res,
- ms.through(function (chunk, enc, cb) {
- cb(null, chunk)
- }),
- gunzip(),
- jsonstream.parse('*', function (pkg, key) {
+ let entryStream = ms.pipeline.obj(
+ res.body,
+ JSONStream.parse('*', (pkg, key) => {
if (key[0] === '_updated' || key[0][0] !== '_') {
return pkg
}
@@ -164,9 +163,12 @@ function createEntryUpdateStream (all, auth, staleness, latest, cb) {
if (partialUpdate) {
// The `/all/since` endpoint doesn't return `_updated`, so we
// just use the request's own timestamp.
- cb(null, entryStream, Date.parse(res.headers.date))
+ return {
+ updateStream: entryStream,
+ updatedLatest: Date.parse(res.headers.get('date'))
+ }
} else {
- extractUpdated(entryStream, 'entry-update-stream', cb)
+ return extractUpdated(entryStream, 'entry-update-stream', opts)
}
})
}
@@ -175,36 +177,37 @@ function createEntryUpdateStream (all, auth, staleness, latest, cb) {
// first returned entries. This is the "latest" unix timestamp for the metadata
// in question. This code does a bit of juggling with the data streams
// so that we can pretend that field doesn't exist, but still extract `latest`
-function extractUpdated (entryStream, label, cb) {
- cb = once(cb)
+function extractUpdated (entryStream, label, opts) {
log.silly('all-package-metadata', 'extracting latest')
- function nope (msg) {
- return function () {
- log.warn('all-package-metadata', label, msg)
- entryStream.removeAllListeners()
- entryStream.destroy()
- cb(new Error(msg))
- }
- }
- var onErr = nope('Failed to read stream')
- var onEnd = nope('Empty or invalid stream')
- entryStream.on('error', onErr)
- entryStream.on('end', onEnd)
- entryStream.once('data', function (latest) {
- log.silly('all-package-metadata', 'got first stream entry for', label, latest)
- entryStream.removeListener('error', onErr)
- entryStream.removeListener('end', onEnd)
- // Because `.once()` unpauses the stream, we re-pause it after the first
- // entry so we don't vomit entries into the void.
- entryStream.pause()
- if (typeof latest === 'number') {
- // The extra pipeline is to return a stream that will implicitly unpause
- // after having an `.on('data')` listener attached, since using this
- // `data` event broke its initial state.
- cb(null, ms.pipeline.obj(entryStream, ms.through.obj()), latest)
- } else {
- cb(new Error('expected first entry to be _updated'))
+ return new BB((resolve, reject) => {
+ function nope (msg) {
+ return function () {
+ log.warn('all-package-metadata', label, msg)
+ entryStream.removeAllListeners()
+ entryStream.destroy()
+ reject(new Error(msg))
+ }
}
+ const onErr = nope('Failed to read stream')
+ const onEnd = nope('Empty or invalid stream')
+ entryStream.on('error', onErr)
+ entryStream.on('end', onEnd)
+ entryStream.once('data', latest => {
+ log.silly('all-package-metadata', 'got first stream entry for', label, latest)
+ entryStream.removeListener('error', onErr)
+ entryStream.removeListener('end', onEnd)
+ if (typeof latest === 'number') {
+ // The extra pipeline is to return a stream that will implicitly unpause
+ // after having an `.on('data')` listener attached, since using this
+ // `data` event broke its initial state.
+ resolve({
+ updateStream: entryStream.pipe(ms.through.obj()),
+ updatedLatest: latest
+ })
+ } else {
+ reject(new Error('expected first entry to be _updated'))
+ }
+ })
})
}
@@ -213,44 +216,43 @@ function extractUpdated (entryStream, label, cb) {
// The stream is also passthrough, so entries going through it will also
// be output from it.
module.exports._createCacheWriteStream = createCacheWriteStream
-function createCacheWriteStream (cacheFile, latest, cb) {
- _ensureCacheDirExists(cacheFile, function (err) {
- if (err) return cb(err)
+function createCacheWriteStream (cacheFile, latest, opts) {
+ return _ensureCacheDirExists(cacheFile, opts).then(() => {
log.silly('all-package-metadata', 'creating output stream')
- var outStream = _createCacheOutStream()
- var cacheFileStream = writeStreamAtomic(cacheFile)
- var inputStream = _createCacheInStream(cacheFileStream, outStream, latest)
+ const outStream = _createCacheOutStream()
+ const cacheFileStream = writeStreamAtomic(cacheFile)
+ const inputStream = _createCacheInStream(
+ cacheFileStream, outStream, latest
+ )
// Glue together the various streams so they fail together.
// `cacheFileStream` errors are already handled by the `inputStream`
// pipeline
- var errEmitted = false
- linkStreams(inputStream, outStream, function () { errEmitted = true })
+ let errEmitted = false
+ linkStreams(inputStream, outStream, () => { errEmitted = true })
- cacheFileStream.on('close', function () { !errEmitted && outStream.end() })
+ cacheFileStream.on('close', () => !errEmitted && outStream.end())
- cb(null, ms.duplex.obj(inputStream, outStream))
+ return ms.duplex.obj(inputStream, outStream)
})
}
-function _ensureCacheDirExists (cacheFile, cb) {
+function _ensureCacheDirExists (cacheFile, opts) {
var cacheBase = path.dirname(cacheFile)
log.silly('all-package-metadata', 'making sure cache dir exists at', cacheBase)
- correctMkdir(npm.cache, function (er, st) {
- if (er) return cb(er)
- mkdir(cacheBase, function (er, made) {
- if (er) return cb(er)
- chownr(made || cacheBase, st.uid, st.gid, cb)
+ return correctMkdir(opts.cache).then(st => {
+ return mkdir(cacheBase).then(made => {
+ return chownr(made || cacheBase, st.uid, st.gid)
})
})
}
function _createCacheOutStream () {
+ // NOTE: this looks goofy, but it's necessary in order to get
+ // JSONStream to play nice with the rest of everything.
return ms.pipeline.obj(
- // These two passthrough `through` streams compensate for some
- // odd behavior with `jsonstream`.
ms.through(),
- jsonstream.parse('*', function (obj, key) {
+ JSONStream.parse('*', (obj, key) => {
// This stream happens to get _updated passed through it, for
// implementation reasons. We make sure to filter it out cause
// the fact that it comes t
@@ -263,9 +265,9 @@ function _createCacheOutStream () {
}
function _createCacheInStream (writer, outStream, latest) {
- var updatedWritten = false
- var inStream = ms.pipeline.obj(
- ms.through.obj(function (pkg, enc, cb) {
+ let updatedWritten = false
+ const inStream = ms.pipeline.obj(
+ ms.through.obj((pkg, enc, cb) => {
if (!updatedWritten && typeof pkg === 'number') {
// This is the `_updated` value getting sent through.
updatedWritten = true
@@ -277,13 +279,11 @@ function _createCacheInStream (writer, outStream, latest) {
cb(null, [pkg.name, pkg])
}
}),
- jsonstream.stringifyObject('{', ',', '}'),
- ms.through(function (chunk, enc, cb) {
+ JSONStream.stringifyObject('{', ',', '}'),
+ ms.through((chunk, enc, cb) => {
// This tees off the buffer data to `outStream`, and then continues
// the pipeline as usual
- outStream.write(chunk, enc, function () {
- cb(null, chunk)
- })
+ outStream.write(chunk, enc, () => cb(null, chunk))
}),
// And finally, we write to the cache file.
writer
@@ -300,14 +300,14 @@ function linkStreams (a, b, cb) {
if (err !== lastError) {
lastError = err
b.emit('error', err)
- cb(err)
+ cb && cb(err)
}
})
b.on('error', function (err) {
if (err !== lastError) {
lastError = err
a.emit('error', err)
- cb(err)
+ cb && cb(err)
}
})
}
diff --git a/deps/npm/lib/search/all-package-search.js b/deps/npm/lib/search/all-package-search.js
index 7a893d517b82cd..fef343bcbc3ba3 100644
--- a/deps/npm/lib/search/all-package-search.js
+++ b/deps/npm/lib/search/all-package-search.js
@@ -8,7 +8,7 @@ function allPackageSearch (opts) {
// Get a stream with *all* the packages. This takes care of dealing
// with the local cache as well, but that's an internal detail.
- var allEntriesStream = allPackageMetadata(opts.staleness)
+ var allEntriesStream = allPackageMetadata(opts)
// Grab a stream that filters those packages according to given params.
var filterStream = streamFilter(function (pkg) {
diff --git a/deps/npm/lib/search/esearch.js b/deps/npm/lib/search/esearch.js
deleted file mode 100644
index f4beb7ade66b18..00000000000000
--- a/deps/npm/lib/search/esearch.js
+++ /dev/null
@@ -1,64 +0,0 @@
-'use strict'
-
-var npm = require('../npm.js')
-var log = require('npmlog')
-var mapToRegistry = require('../utils/map-to-registry.js')
-var jsonstream = require('JSONStream')
-var ms = require('mississippi')
-var gunzip = require('../utils/gunzip-maybe')
-
-module.exports = esearch
-
-function esearch (opts) {
- var stream = ms.through.obj()
-
- mapToRegistry('-/v1/search', npm.config, function (er, uri, auth) {
- if (er) return stream.emit('error', er)
- createResultStream(uri, auth, opts, function (err, resultStream) {
- if (err) return stream.emit('error', err)
- ms.pipeline.obj(resultStream, stream)
- })
- })
- return stream
-}
-
-function createResultStream (uri, auth, opts, cb) {
- log.verbose('esearch', 'creating remote entry stream')
- var params = {
- timeout: 600,
- follow: true,
- staleOk: true,
- auth: auth,
- streaming: true
- }
- var q = buildQuery(opts)
- npm.registry.request(uri + '?text=' + encodeURIComponent(q) + '&size=' + opts.limit, params, function (err, res) {
- if (err) return cb(err)
- log.silly('esearch', 'request stream opened, code:', res.statusCode)
- // NOTE - The stream returned by `request` seems to be very persnickety
- // and this is almost a magic incantation to get it to work.
- // Modify how `res` is used here at your own risk.
- var entryStream = ms.pipeline.obj(
- res,
- ms.through(function (chunk, enc, cb) {
- cb(null, chunk)
- }),
- gunzip(),
- jsonstream.parse('objects.*.package', function (data) {
- return {
- name: data.name,
- description: data.description,
- maintainers: data.maintainers,
- keywords: data.keywords,
- version: data.version,
- date: data.date ? new Date(data.date) : null
- }
- })
- )
- return cb(null, entryStream)
- })
-}
-
-function buildQuery (opts) {
- return opts.include.join(' ')
-}
diff --git a/deps/npm/lib/shrinkwrap.js b/deps/npm/lib/shrinkwrap.js
index 90a4426523cabc..dbb12b5bd4fba4 100644
--- a/deps/npm/lib/shrinkwrap.js
+++ b/deps/npm/lib/shrinkwrap.js
@@ -167,6 +167,8 @@ function childVersion (top, child, req) {
function childRequested (top, child, requested) {
if (requested.type === 'directory' || requested.type === 'file') {
return 'file:' + unixFormatPath(path.relative(top.path, child.package._resolved || requested.fetchSpec))
+ } else if (requested.type === 'git' && child.package._from) {
+ return child.package._from
} else if (!isRegistry(requested) && !child.fromBundle) {
return child.package._resolved || requested.saveSpec || requested.rawSpec
} else if (requested.type === 'tag') {
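The shrinkwrap hunk above records git dependencies by their original `_from` spec so the committish (e.g. a `#semver:` range) survives a round-trip through the lockfile, instead of being flattened to a resolved URL. A condensed decision function showing that precedence; this is illustrative, not npm's real code:

```javascript
'use strict'
// Pick the spec string a lockfile entry should record for a dependency:
// git deps keep the user's original `_from` spec when present, anything
// else falls back to the resolved URL or the raw requested spec.
function recordedSpec (requested, pkg) {
  if (requested.type === 'git' && pkg._from) return pkg._from
  return pkg._resolved || requested.rawSpec
}
```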
diff --git a/deps/npm/lib/star.js b/deps/npm/lib/star.js
index f19cb4b07bebb9..44a762b15c0c03 100644
--- a/deps/npm/lib/star.js
+++ b/deps/npm/lib/star.js
@@ -1,11 +1,20 @@
-module.exports = star
+'use strict'
+
+const BB = require('bluebird')
+
+const fetch = require('libnpm/fetch')
+const figgyPudding = require('figgy-pudding')
+const log = require('npmlog')
+const npa = require('libnpm/parse-arg')
+const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const usage = require('./utils/usage.js')
+const whoami = require('./whoami.js')
-var npm = require('./npm.js')
-var log = require('npmlog')
-var asyncMap = require('slide').asyncMap
-var mapToRegistry = require('./utils/map-to-registry.js')
-var usage = require('./utils/usage')
-var output = require('./utils/output.js')
+const StarConfig = figgyPudding({
+ 'unicode': {}
+})
star.usage = usage(
'star',
@@ -19,27 +28,50 @@ star.completion = function (opts, cb) {
cb()
}
+module.exports = star
function star (args, cb) {
- if (!args.length) return cb(star.usage)
- var s = npm.config.get('unicode') ? '\u2605 ' : '(*)'
- var u = npm.config.get('unicode') ? '\u2606 ' : '( )'
- var using = !(npm.command.match(/^un/))
- if (!using) s = u
- asyncMap(args, function (pkg, cb) {
- mapToRegistry(pkg, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
+ const opts = StarConfig(npmConfig())
+ return BB.try(() => {
+ if (!args.length) throw new Error(star.usage)
+ let s = opts.unicode ? '\u2605 ' : '(*)'
+ const u = opts.unicode ? '\u2606 ' : '( )'
+ const using = !(npm.command.match(/^un/))
+ if (!using) s = u
+ return BB.map(args.map(npa), pkg => {
+ return BB.all([
+ whoami([pkg], true, () => {}),
+ fetch.json(pkg.escapedName, opts.concat({
+ spec: pkg,
+ query: {write: true},
+ 'prefer-online': true
+ }))
+ ]).then(([username, fullData]) => {
+ if (!username) { throw new Error('You need to be logged in!') }
+ const body = {
+ _id: fullData._id,
+ _rev: fullData._rev,
+ users: fullData.users || {}
+ }
- var params = {
- starred: using,
- auth: auth
- }
- npm.registry.star(uri, params, function (er, data, raw, req) {
- if (!er) {
- output(s + ' ' + pkg)
- log.verbose('star', data)
+ if (using) {
+ log.info('star', 'starring', body._id)
+ body.users[username] = true
+ log.verbose('star', 'starring', body)
+ } else {
+ delete body.users[username]
+ log.info('star', 'unstarring', body._id)
+ log.verbose('star', 'unstarring', body)
}
- cb(er, data, raw, req)
+ return fetch.json(pkg.escapedName, opts.concat({
+ spec: pkg,
+ method: 'PUT',
+ body
+ }))
+ }).then(data => {
+ output(s + ' ' + pkg.name)
+ log.verbose('star', data)
+ return data
})
})
- }, cb)
+ }).nodeify(cb)
}
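The rewritten star command is a read-modify-write against the packument: fetch the full document (including `_rev`, which CouchDB-style registries require for updates), toggle `users[username]`, and PUT the trimmed body back. A sketch of that flow under stated assumptions; `fetchJson` and `putJson` are stand-ins injected for illustration, not libnpm's real API:

```javascript
'use strict'
// Read-modify-write of a package's `users` map. Only `_id`, `_rev`,
// and `users` are sent back, mirroring the trimmed body in the patch.
function setStar (fetchJson, putJson, name, username, starring) {
  return fetchJson(name).then(doc => {
    const body = { _id: doc._id, _rev: doc._rev, users: doc.users || {} }
    if (starring) {
      body.users[username] = true
    } else {
      delete body.users[username]
    }
    return putJson(name, body)
  })
}
```

Because `_rev` is echoed back, a concurrent update to the same document would be rejected by the registry rather than silently overwritten.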
diff --git a/deps/npm/lib/stars.js b/deps/npm/lib/stars.js
index 4771079356a174..ea3581f1d4b444 100644
--- a/deps/npm/lib/stars.js
+++ b/deps/npm/lib/stars.js
@@ -1,47 +1,37 @@
-module.exports = stars
-
-stars.usage = 'npm stars [<user>]'
-
-var npm = require('./npm.js')
-var log = require('npmlog')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var output = require('./utils/output.js')
+'use strict'
-function stars (args, cb) {
- npm.commands.whoami([], true, function (er, username) {
- var name = args.length === 1 ? args[0] : username
+const BB = require('bluebird')
- if (er) {
- if (er.code === 'ENEEDAUTH' && !name) {
- var needAuth = new Error("'npm stars' on your own user account requires auth")
- needAuth.code = 'ENEEDAUTH'
- return cb(needAuth)
- }
-
- if (er.code !== 'ENEEDAUTH') return cb(er)
- }
+const npmConfig = require('./config/figgy-config.js')
+const fetch = require('libnpm/fetch')
+const log = require('npmlog')
+const output = require('./utils/output.js')
+const whoami = require('./whoami.js')
- mapToRegistry('', npm.config, function (er, uri, auth) {
- if (er) return cb(er)
+stars.usage = 'npm stars [<user>]'
- var params = {
- username: name,
- auth: auth
+module.exports = stars
+function stars ([user], cb) {
+ const opts = npmConfig()
+ return BB.try(() => {
+ return (user ? BB.resolve(user) : whoami([], true, () => {})).then(usr => {
+ return fetch.json('/-/_view/starredByUser', opts.concat({
+ query: {key: `"${usr}"`} // WHY. WHY THE ""?!
+ }))
+ }).then(data => data.rows).then(stars => {
+ if (stars.length === 0) {
+ log.warn('stars', 'user has not starred any packages.')
+ } else {
+ stars.forEach(s => output(s.value))
}
- npm.registry.stars(uri, params, showstars)
})
- })
-
- function showstars (er, data) {
- if (er) return cb(er)
-
- if (data.rows.length === 0) {
- log.warn('stars', 'user has not starred any packages.')
- } else {
- data.rows.forEach(function (a) {
- output(a.value)
+ }).catch(err => {
+ if (err.code === 'ENEEDAUTH') {
+      throw Object.assign(new Error("'npm stars' on your own user account requires auth"), {
+ code: 'ENEEDAUTH'
})
+ } else {
+ throw err
}
- cb()
- }
+ }).nodeify(cb)
}
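The exasperated `// WHY. WHY THE ""?!` in the stars hunk has a mundane answer: CouchDB view keys are JSON values, so a string key must arrive quoted. `JSON.stringify` produces exactly the `"user"` form the patch builds by hand; a small sketch of constructing that query URL (path and layout mirror the hunk, but this is not npm's actual code):

```javascript
'use strict'
// Build the starredByUser view URL. JSON-encoding the username yields
// the quoted string CouchDB expects, then URL-encoding makes it safe
// to put in the query string.
function starredByUserQuery (username) {
  return '/-/_view/starredByUser?key=' +
    encodeURIComponent(JSON.stringify(username))
}
```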
diff --git a/deps/npm/lib/team.js b/deps/npm/lib/team.js
index 2d9e61cd4384b6..2b56e3b14f95bb 100644
--- a/deps/npm/lib/team.js
+++ b/deps/npm/lib/team.js
@@ -1,19 +1,37 @@
/* eslint-disable standard/no-callback-literal */
-var mapToRegistry = require('./utils/map-to-registry.js')
-var npm = require('./npm')
-var output = require('./utils/output.js')
+
+const columns = require('cli-columns')
+const figgyPudding = require('figgy-pudding')
+const libteam = require('libnpm/team')
+const npmConfig = require('./config/figgy-config.js')
+const output = require('./utils/output.js')
+const otplease = require('./utils/otplease.js')
+const usage = require('./utils/usage')
module.exports = team
team.subcommands = ['create', 'destroy', 'add', 'rm', 'ls', 'edit']
-team.usage =
+team.usage = usage(
+ 'team',
  'npm team create <scope:team>\n' +
  'npm team destroy <scope:team>\n' +
  'npm team add <scope:team> <user>\n' +
  'npm team rm <scope:team> <user>\n' +
  'npm team ls <scope>|<scope:team>\n' +
  'npm team edit <scope:team>'
+)
+
+const TeamConfig = figgyPudding({
+ json: {},
+ loglevel: {},
+ parseable: {},
+ silent: {}
+})
+
+function UsageError () {
+ throw Object.assign(new Error(team.usage), {code: 'EUSAGE'})
+}
team.completion = function (opts, cb) {
var argv = opts.conf.argv.remain
@@ -33,24 +51,121 @@ team.completion = function (opts, cb) {
}
}
-function team (args, cb) {
+function team ([cmd, entity = '', user = ''], cb) {
  // Entities are in the format <scope>:<team>
- var cmd = args.shift()
- var entity = (args.shift() || '').split(':')
- return mapToRegistry('/', npm.config, function (err, uri, auth) {
- if (err) { return cb(err) }
- try {
- return npm.registry.team(cmd, uri, {
- auth: auth,
- scope: entity[0].replace(/^@/, ''), // '@' prefix on scope is optional.
- team: entity[1],
- user: args.shift()
- }, function (err, data) {
- !err && data && output(JSON.stringify(data, undefined, 2))
- cb(err, data)
- })
- } catch (e) {
- cb(e.message + '\n\nUsage:\n' + team.usage)
+ otplease(npmConfig(), opts => {
+ opts = TeamConfig(opts).concat({description: null})
+ entity = entity.replace(/^@/, '')
+ switch (cmd) {
+ case 'create': return teamCreate(entity, opts)
+ case 'destroy': return teamDestroy(entity, opts)
+ case 'add': return teamAdd(entity, user, opts)
+ case 'rm': return teamRm(entity, user, opts)
+ case 'ls': {
+ const match = entity.match(/[^:]+:.+/)
+ if (match) {
+ return teamListUsers(entity, opts)
+ } else {
+ return teamListTeams(entity, opts)
+ }
+ }
+ case 'edit':
+ throw new Error('`npm team edit` is not implemented yet.')
+ default:
+ UsageError()
+ }
+ }).then(
+ data => cb(null, data),
+ err => err.code === 'EUSAGE' ? cb(err.message) : cb(err)
+ )
+}
+
+function teamCreate (entity, opts) {
+ return libteam.create(entity, opts).then(() => {
+ if (opts.json) {
+ output(JSON.stringify({
+ created: true,
+ team: entity
+ }))
+ } else if (opts.parseable) {
+ output(`${entity}\tcreated`)
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`+@${entity}`)
+ }
+ })
+}
+
+function teamDestroy (entity, opts) {
+ return libteam.destroy(entity, opts).then(() => {
+ if (opts.json) {
+ output(JSON.stringify({
+ deleted: true,
+ team: entity
+ }))
+ } else if (opts.parseable) {
+ output(`${entity}\tdeleted`)
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`-@${entity}`)
+ }
+ })
+}
+
+function teamAdd (entity, user, opts) {
+ return libteam.add(user, entity, opts).then(() => {
+ if (opts.json) {
+ output(JSON.stringify({
+ added: true,
+ team: entity,
+ user
+ }))
+ } else if (opts.parseable) {
+ output(`${user}\t${entity}\tadded`)
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`${user} added to @${entity}`)
+ }
+ })
+}
+
+function teamRm (entity, user, opts) {
+ return libteam.rm(user, entity, opts).then(() => {
+ if (opts.json) {
+ output(JSON.stringify({
+ removed: true,
+ team: entity,
+ user
+ }))
+ } else if (opts.parseable) {
+ output(`${user}\t${entity}\tremoved`)
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`${user} removed from @${entity}`)
+ }
+ })
+}
+
+function teamListUsers (entity, opts) {
+ return libteam.lsUsers(entity, opts).then(users => {
+ users = users.sort()
+ if (opts.json) {
+ output(JSON.stringify(users, null, 2))
+ } else if (opts.parseable) {
+ output(users.join('\n'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`\n@${entity} has ${users.length} user${users.length === 1 ? '' : 's'}:\n`)
+ output(columns(users, {padding: 1}))
+ }
+ })
+}
+
+function teamListTeams (entity, opts) {
+ return libteam.lsTeams(entity, opts).then(teams => {
+ teams = teams.sort()
+ if (opts.json) {
+ output(JSON.stringify(teams, null, 2))
+ } else if (opts.parseable) {
+ output(teams.join('\n'))
+ } else if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`\n@${entity} has ${teams.length} team${teams.length === 1 ? '' : 's'}:\n`)
+ output(columns(teams.map(t => `@${t}`), {padding: 1}))
}
})
}
diff --git a/deps/npm/lib/token.js b/deps/npm/lib/token.js
index d442d37eb806bc..cccbba2f9ad75e 100644
--- a/deps/npm/lib/token.js
+++ b/deps/npm/lib/token.js
@@ -1,5 +1,5 @@
'use strict'
-const profile = require('npm-profile')
+const profile = require('libnpm/profile')
const npm = require('./npm.js')
const output = require('./utils/output.js')
const Table = require('cli-table3')
diff --git a/deps/npm/lib/unpublish.js b/deps/npm/lib/unpublish.js
index c2e9edd8006f51..bf5867a2687f9d 100644
--- a/deps/npm/lib/unpublish.js
+++ b/deps/npm/lib/unpublish.js
@@ -1,119 +1,110 @@
/* eslint-disable standard/no-callback-literal */
+'use strict'
module.exports = unpublish
-var log = require('npmlog')
-var npm = require('./npm.js')
-var readJson = require('read-package-json')
-var path = require('path')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var npa = require('npm-package-arg')
-var getPublishConfig = require('./utils/get-publish-config.js')
-var output = require('./utils/output.js')
-
-unpublish.usage = 'npm unpublish [<@scope>/]<pkg>[@<version>]'
-
-unpublish.completion = function (opts, cb) {
- if (opts.conf.argv.remain.length >= 3) return cb()
- npm.commands.whoami([], true, function (er, username) {
- if (er) return cb()
-
- var un = encodeURIComponent(username)
- if (!un) return cb()
- var byUser = '-/by-user/' + un
- mapToRegistry(byUser, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, function (er, pkgs) {
- // do a bit of filtering at this point, so that we don't need
- // to fetch versions for more than one thing, but also don't
- // accidentally a whole project.
- pkgs = pkgs[un]
- if (!pkgs || !pkgs.length) return cb()
- var pp = npa(opts.partialWord).name
- pkgs = pkgs.filter(function (p) {
- return p.indexOf(pp) === 0
- })
- if (pkgs.length > 1) return cb(null, pkgs)
- mapToRegistry(pkgs[0], npm.config, function (er, uri, auth) {
- if (er) return cb(er)
+const BB = require('bluebird')
+
+const figgyPudding = require('figgy-pudding')
+const libaccess = require('libnpm/access')
+const libunpub = require('libnpm/unpublish')
+const log = require('npmlog')
+const npa = require('npm-package-arg')
+const npm = require('./npm.js')
+const npmConfig = require('./config/figgy-config.js')
+const npmFetch = require('npm-registry-fetch')
+const otplease = require('./utils/otplease.js')
+const output = require('./utils/output.js')
+const path = require('path')
+const readJson = BB.promisify(require('read-package-json'))
+const usage = require('./utils/usage.js')
+const whoami = BB.promisify(require('./whoami.js'))
+
+unpublish.usage = usage('npm unpublish [<@scope>/]<pkg>[@<version>]')
+
+function UsageError () {
+ throw Object.assign(new Error(`Usage: ${unpublish.usage}`), {
+ code: 'EUSAGE'
+ })
+}
- npm.registry.get(uri, { auth: auth }, function (er, d) {
- if (er) return cb(er)
- var vers = Object.keys(d.versions)
- if (!vers.length) return cb(null, pkgs)
- return cb(null, vers.map(function (v) {
- return pkgs[0] + '@' + v
- }))
- })
- })
+const UnpublishConfig = figgyPudding({
+ force: {},
+ loglevel: {},
+ silent: {}
+})
+
+unpublish.completion = function (cliOpts, cb) {
+ if (cliOpts.conf.argv.remain.length >= 3) return cb()
+
+ whoami([], true).then(username => {
+ if (!username) { return [] }
+ const opts = UnpublishConfig(npmConfig())
+ return libaccess.lsPackages(username, opts).then(access => {
+ // do a bit of filtering at this point, so that we don't need
+ // to fetch versions for more than one thing, but also don't
+ // accidentally a whole project.
+ let pkgs = Object.keys(access)
+ if (!cliOpts.partialWord || !pkgs.length) { return pkgs }
+ const pp = npa(cliOpts.partialWord).name
+ pkgs = pkgs.filter(p => !p.indexOf(pp))
+ if (pkgs.length > 1) return pkgs
+ return npmFetch.json(npa(pkgs[0]).escapedName, opts).then(doc => {
+ const vers = Object.keys(doc.versions)
+ if (!vers.length) {
+ return pkgs
+ } else {
+ return vers.map(v => `${pkgs[0]}@${v}`)
+ }
})
})
- })
+ }).nodeify(cb)
}
function unpublish (args, cb) {
if (args.length > 1) return cb(unpublish.usage)
- var thing = args.length ? npa(args[0]) : {}
- var project = thing.name
- var version = thing.rawSpec
-
- log.silly('unpublish', 'args[0]', args[0])
- log.silly('unpublish', 'thing', thing)
- if (!version && !npm.config.get('force')) {
- return cb(
- 'Refusing to delete entire project.\n' +
- 'Run with --force to do this.\n' +
- unpublish.usage
- )
- }
-
- if (!project || path.resolve(project) === npm.localPrefix) {
- // if there's a package.json in the current folder, then
- // read the package name and version out of that.
- var cwdJson = path.join(npm.localPrefix, 'package.json')
- return readJson(cwdJson, function (er, data) {
- if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er)
- if (er) return cb('Usage:\n' + unpublish.usage)
- log.verbose('unpublish', data)
- gotProject(data.name, data.version, data.publishConfig, cb)
- })
- }
- return gotProject(project, version, cb)
-}
-
-function gotProject (project, version, publishConfig, cb_) {
- if (typeof cb_ !== 'function') {
- cb_ = publishConfig
- publishConfig = null
- }
-
- function cb (er) {
- if (er) return cb_(er)
- output('- ' + project + (version ? '@' + version : ''))
- cb_()
- }
-
- var mappedConfig = getPublishConfig(publishConfig, npm.config, npm.registry)
- var config = mappedConfig.config
- var registry = mappedConfig.client
-
- // remove from the cache first
- // npm.commands.cache(['clean', project, version], function (er) {
- // if (er) {
- // log.error('unpublish', 'Failed to clean cache')
- // return cb(er)
- // }
-
- mapToRegistry(project, config, function (er, uri, auth) {
- if (er) return cb(er)
-
- var params = {
- version: version,
- auth: auth
+ const spec = args.length && npa(args[0])
+ const opts = UnpublishConfig(npmConfig())
+ const version = spec.rawSpec
+ BB.try(() => {
+ log.silly('unpublish', 'args[0]', args[0])
+ log.silly('unpublish', 'spec', spec)
+ if (!version && !opts.force) {
+ throw Object.assign(new Error(
+ 'Refusing to delete entire project.\n' +
+ 'Run with --force to do this.\n' +
+ unpublish.usage
+ ), {code: 'EUSAGE'})
}
- registry.unpublish(uri, params, cb)
- })
- // })
+ if (!spec || path.resolve(spec.name) === npm.localPrefix) {
+ // if there's a package.json in the current folder, then
+ // read the package name and version out of that.
+ const cwdJson = path.join(npm.localPrefix, 'package.json')
+ return readJson(cwdJson).then(data => {
+ log.verbose('unpublish', data)
+ return otplease(opts, opts => {
+ return libunpub(npa.resolve(data.name, data.version), opts.concat(data.publishConfig))
+ })
+ }, err => {
+ if (err && err.code !== 'ENOENT' && err.code !== 'ENOTDIR') {
+ throw err
+ } else {
+ UsageError()
+ }
+ })
+ } else {
+ return otplease(opts, opts => libunpub(spec, opts))
+ }
+ }).then(
+ ret => {
+ if (!opts.silent && opts.loglevel !== 'silent') {
+ output(`-${spec.name}${
+ spec.type === 'version' ? `@${spec.rawSpec}` : ''
+ }`)
+ }
+ cb(null, ret)
+ },
+ err => err.code === 'EUSAGE' ? cb(err.message) : cb(err)
+ )
}
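Several commands in this patch wrap registry writes in `otplease` (the new utility added at the end of this diff). The idea: run the operation once, and only if it fails with an `EOTP` error prompt for a one-time password and retry with `otp` merged into the options. A minimal sketch of that retry shape; `prompt` here is an injected stand-in for the real `readUserInfo` prompt:

```javascript
'use strict'
// Try the operation; on an EOTP rejection, ask for a one-time password
// and retry exactly once with `otp` added to the options. Any other
// error is rethrown untouched.
function otplease (opts, fn, prompt) {
  return Promise.resolve().then(() => fn(opts)).catch(err => {
    if (err.code !== 'EOTP') throw err
    return prompt().then(otp => fn(Object.assign({}, opts, { otp })))
  })
}
```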
diff --git a/deps/npm/lib/utils/error-handler.js b/deps/npm/lib/utils/error-handler.js
index c6481abf6737d6..ba9d9f8e252e58 100644
--- a/deps/npm/lib/utils/error-handler.js
+++ b/deps/npm/lib/utils/error-handler.js
@@ -202,7 +202,7 @@ function errorHandler (er) {
msg.summary.concat(msg.detail).forEach(function (errline) {
log.error.apply(log, errline)
})
- if (npm.config.get('json')) {
+ if (npm.config && npm.config.get('json')) {
var error = {
error: {
code: er.code,
diff --git a/deps/npm/lib/utils/error-message.js b/deps/npm/lib/utils/error-message.js
index 6e148981833d32..55c54634542fac 100644
--- a/deps/npm/lib/utils/error-message.js
+++ b/deps/npm/lib/utils/error-message.js
@@ -103,8 +103,7 @@ function errorMessage (er) {
case 'EOTP':
case 'E401':
- // the E401 message checking is a hack till we replace npm-registry-client with something
- // OTP aware.
+ // E401 is for places where we accidentally neglect OTP stuff
if (er.code === 'EOTP' || /one-time pass/.test(er.message)) {
short.push(['', 'This operation requires a one-time password from your authenticator.'])
detail.push([
diff --git a/deps/npm/lib/utils/get-publish-config.js b/deps/npm/lib/utils/get-publish-config.js
deleted file mode 100644
index ac0ef0934201ad..00000000000000
--- a/deps/npm/lib/utils/get-publish-config.js
+++ /dev/null
@@ -1,29 +0,0 @@
-'use strict'
-
-const clientConfig = require('../config/reg-client.js')
-const Conf = require('../config/core.js').Conf
-const log = require('npmlog')
-const npm = require('../npm.js')
-const RegClient = require('npm-registry-client')
-
-module.exports = getPublishConfig
-
-function getPublishConfig (publishConfig, defaultConfig, defaultClient) {
- let config = defaultConfig
- let client = defaultClient
- log.verbose('getPublishConfig', publishConfig)
- if (publishConfig) {
- config = new Conf(defaultConfig)
- config.save = defaultConfig.save.bind(defaultConfig)
-
- // don't modify the actual publishConfig object, in case we have
- // to set a login token or some other data.
- config.unshift(Object.keys(publishConfig).reduce(function (s, k) {
- s[k] = publishConfig[k]
- return s
- }, {}))
- client = new RegClient(clientConfig(npm, log, config))
- }
-
- return { config: config, client: client }
-}
diff --git a/deps/npm/lib/utils/map-to-registry.js b/deps/npm/lib/utils/map-to-registry.js
deleted file mode 100644
index d6e0a5b01f4d5f..00000000000000
--- a/deps/npm/lib/utils/map-to-registry.js
+++ /dev/null
@@ -1,103 +0,0 @@
-var url = require('url')
-
-var log = require('npmlog')
-var npa = require('npm-package-arg')
-var config
-
-module.exports = mapToRegistry
-
-function mapToRegistry (name, config, cb) {
- log.silly('mapToRegistry', 'name', name)
- var registry
-
- // the name itself takes precedence
- var data = npa(name)
- if (data.scope) {
- // the name is definitely scoped, so escape now
- name = name.replace('/', '%2f')
-
- log.silly('mapToRegistry', 'scope (from package name)', data.scope)
-
- registry = config.get(data.scope + ':registry')
- if (!registry) {
- log.verbose('mapToRegistry', 'no registry URL found in name for scope', data.scope)
- }
- }
-
- // ...then --scope=@scope or --scope=scope
- var scope = config.get('scope')
- if (!registry && scope) {
- // I'm an enabler, sorry
- if (scope.charAt(0) !== '@') scope = '@' + scope
-
- log.silly('mapToRegistry', 'scope (from config)', scope)
-
- registry = config.get(scope + ':registry')
- if (!registry) {
- log.verbose('mapToRegistry', 'no registry URL found in config for scope', scope)
- }
- }
-
- // ...and finally use the default registry
- if (!registry) {
- log.silly('mapToRegistry', 'using default registry')
- registry = config.get('registry')
- }
-
- log.silly('mapToRegistry', 'registry', registry)
-
- var auth = config.getCredentialsByURI(registry)
-
- // normalize registry URL so resolution doesn't drop a piece of registry URL
- var normalized = registry.slice(-1) !== '/' ? registry + '/' : registry
- var uri
- log.silly('mapToRegistry', 'data', data)
- if (data.type === 'remote') {
- uri = data.fetchSpec
- } else {
- uri = url.resolve(normalized, name)
- }
-
- log.silly('mapToRegistry', 'uri', uri)
-
- cb(null, uri, scopeAuth(uri, registry, auth), normalized)
-}
-
-function scopeAuth (uri, registry, auth) {
- var cleaned = {
- scope: auth.scope,
- email: auth.email,
- alwaysAuth: auth.alwaysAuth,
- token: undefined,
- username: undefined,
- password: undefined,
- auth: undefined
- }
-
- var requestHost
- var registryHost
-
- if (auth.token || auth.auth || (auth.username && auth.password)) {
- requestHost = url.parse(uri).hostname
- registryHost = url.parse(registry).hostname
-
- if (requestHost === registryHost) {
- cleaned.token = auth.token
- cleaned.auth = auth.auth
- cleaned.username = auth.username
- cleaned.password = auth.password
- } else if (auth.alwaysAuth) {
- log.verbose('scopeAuth', 'alwaysAuth set for', registry)
- cleaned.token = auth.token
- cleaned.auth = auth.auth
- cleaned.username = auth.username
- cleaned.password = auth.password
- } else {
- log.silly('scopeAuth', uri, "doesn't share host with registry", registry)
- }
- if (!config) config = require('../npm').config
- if (config.get('otp')) cleaned.otp = config.get('otp')
- }
-
- return cleaned
-}
diff --git a/deps/npm/lib/utils/metrics.js b/deps/npm/lib/utils/metrics.js
index c51136e78cdb72..0f99c841dbe26c 100644
--- a/deps/npm/lib/utils/metrics.js
+++ b/deps/npm/lib/utils/metrics.js
@@ -4,12 +4,13 @@ exports.stop = stopMetrics
exports.save = saveMetrics
exports.send = sendMetrics
-var fs = require('fs')
-var path = require('path')
-var npm = require('../npm.js')
-var uuid = require('uuid')
+const fs = require('fs')
+const path = require('path')
+const npm = require('../npm.js')
+const regFetch = require('libnpm/fetch')
+const uuid = require('uuid')
-var inMetrics = false
+let inMetrics = false
function startMetrics () {
if (inMetrics) return
@@ -59,15 +60,18 @@ function saveMetrics (itWorked) {
function sendMetrics (metricsFile, metricsRegistry) {
inMetrics = true
var cliMetrics = JSON.parse(fs.readFileSync(metricsFile))
- npm.load({}, function (err) {
- if (err) return
- npm.registry.config.retry.retries = 0
- npm.registry.sendAnonymousCLIMetrics(metricsRegistry, cliMetrics, function (err) {
- if (err) {
- fs.writeFileSync(path.join(path.dirname(metricsFile), 'last-send-metrics-error.txt'), err.stack)
- } else {
- fs.unlinkSync(metricsFile)
- }
- })
+ regFetch(
+ `/-/npm/anon-metrics/v1/${encodeURIComponent(cliMetrics.metricId)}`,
+ // NOTE: skip npmConfig() to prevent auth
+ {
+ registry: metricsRegistry,
+ method: 'PUT',
+ body: cliMetrics.metrics,
+ retry: false
+ }
+ ).then(() => {
+ fs.unlinkSync(metricsFile)
+ }, err => {
+ fs.writeFileSync(path.join(path.dirname(metricsFile), 'last-send-metrics-error.txt'), err.stack)
})
}
diff --git a/deps/npm/lib/utils/otplease.js b/deps/npm/lib/utils/otplease.js
new file mode 100644
index 00000000000000..d0477a896d0049
--- /dev/null
+++ b/deps/npm/lib/utils/otplease.js
@@ -0,0 +1,27 @@
+'use strict'
+
+const BB = require('bluebird')
+
+const optCheck = require('figgy-pudding')({
+ prompt: {default: 'This operation requires a one-time password.\nEnter OTP:'},
+ otp: {}
+})
+const readUserInfo = require('./read-user-info.js')
+
+module.exports = otplease
+function otplease (opts, fn) {
+ opts = opts.concat ? opts : optCheck(opts)
+ return BB.try(() => {
+ return fn(opts)
+ }).catch(err => {
+ if (err.code !== 'EOTP' && !(err.code === 'E401' && /one-time pass/.test(err.body))) {
+ throw err
+ } else if (!process.stdin.isTTY || !process.stdout.isTTY) {
+ throw err
+ } else {
+ return readUserInfo.otp(
+ optCheck(opts).prompt
+ ).then(otp => fn(opts.concat({otp})))
+ }
+ })
+}
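The new `otplease` helper wraps a registry operation and, when it fails with an OTP-related error on an interactive terminal, prompts for a one-time password and retries once with it. A self-contained sketch of the same retry pattern using plain promises; `promptForOtp` and `publish` are hypothetical stand-ins for npm's `readUserInfo.otp` and the wrapped operation:

```javascript
'use strict'

// Run fn(opts); if it fails with an OTP-related error, ask for a one-time
// password and retry once with opts.otp set. Any other error propagates.
function otplease (opts, fn, promptForOtp) {
  return Promise.resolve().then(() => fn(opts)).catch(err => {
    const otpNeeded = err.code === 'EOTP' ||
      (err.code === 'E401' && /one-time pass/.test(err.body || ''))
    if (!otpNeeded) throw err
    return promptForOtp().then(otp => fn(Object.assign({}, opts, {otp})))
  })
}

// Example operation: fails with EOTP until an otp is supplied.
function publish (opts) {
  if (!opts.otp) {
    const err = new Error('OTP required')
    err.code = 'EOTP'
    return Promise.reject(err)
  }
  return Promise.resolve('published with otp ' + opts.otp)
}

otplease({registry: 'https://example.test'}, publish, () => Promise.resolve('123456'))
  .then(result => console.log(result)) // prints "published with otp 123456"
```

The real helper adds two refinements visible in the diff: opts are figgy-pudding objects (`opts.concat({otp})`), and the prompt is skipped when stdin/stdout are not TTYs.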
diff --git a/deps/npm/lib/view.js b/deps/npm/lib/view.js
index b7d7f6ec803100..5dd605029b9d11 100644
--- a/deps/npm/lib/view.js
+++ b/deps/npm/lib/view.js
@@ -8,17 +8,27 @@ const BB = require('bluebird')
const byteSize = require('byte-size')
const color = require('ansicolors')
const columns = require('cli-columns')
+const npmConfig = require('./config/figgy-config.js')
+const log = require('npmlog')
+const figgyPudding = require('figgy-pudding')
+const npa = require('libnpm/parse-arg')
+const npm = require('./npm.js')
+const packument = require('libnpm/packument')
+const path = require('path')
+const readJson = require('libnpm/read-json')
const relativeDate = require('tiny-relative-date')
+const semver = require('semver')
const style = require('ansistyles')
-var npm = require('./npm.js')
-var readJson = require('read-package-json')
-var log = require('npmlog')
-var util = require('util')
-var semver = require('semver')
-var mapToRegistry = require('./utils/map-to-registry.js')
-var npa = require('npm-package-arg')
-var path = require('path')
-var usage = require('./utils/usage')
+const usage = require('./utils/usage')
+const util = require('util')
+const validateName = require('validate-npm-package-name')
+
+const ViewConfig = figgyPudding({
+ global: {},
+ json: {},
+ tag: {},
+ unicode: {}
+})
view.usage = usage(
'view',
@@ -32,19 +42,14 @@ view.completion = function (opts, cb) {
return cb()
}
// have the package, get the fields.
- var tag = npm.config.get('tag')
- mapToRegistry(opts.conf.argv.remain[2], npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, function (er, d) {
- if (er) return cb(er)
- var dv = d.versions[d['dist-tags'][tag]]
- var fields = []
- d.versions = Object.keys(d.versions).sort(semver.compareLoose)
- fields = getFields(d).concat(getFields(dv))
- cb(null, fields)
- })
- })
+ const config = ViewConfig(npmConfig())
+ const tag = config.tag
+ const spec = npa(opts.conf.argv.remain[2])
+ return packument(spec, config).then(d => {
+ const dv = d.versions[d['dist-tags'][tag]]
+ d.versions = Object.keys(d.versions).sort(semver.compareLoose)
+ return getFields(d).concat(getFields(dv))
+ }).nodeify(cb)
function getFields (d, f, pref) {
f = f || []
@@ -52,11 +57,11 @@ view.completion = function (opts, cb) {
pref = pref || []
Object.keys(d).forEach(function (k) {
if (k.charAt(0) === '_' || k.indexOf('.') !== -1) return
- var p = pref.concat(k).join('.')
+ const p = pref.concat(k).join('.')
f.push(p)
if (Array.isArray(d[k])) {
d[k].forEach(function (val, i) {
- var pi = p + '[' + i + ']'
+ const pi = p + '[' + i + ']'
if (val && typeof val === 'object') getFields(val, f, [p])
else f.push(pi)
})
@@ -76,113 +81,132 @@ function view (args, silent, cb) {
if (!args.length) args = ['.']
- var pkg = args.shift()
- var nv
+ const opts = ViewConfig(npmConfig())
+ const pkg = args.shift()
+ let nv
if (/^[.]@/.test(pkg)) {
nv = npa.resolve(null, pkg.slice(2))
} else {
nv = npa(pkg)
}
- var name = nv.name
- var local = (name === '.' || !name)
+ const name = nv.name
+ const local = (name === '.' || !name)
- if (npm.config.get('global') && local) {
+ if (opts.global && local) {
return cb(new Error('Cannot use view command in global mode.'))
}
if (local) {
- var dir = npm.prefix
- readJson(path.resolve(dir, 'package.json'), function (er, d) {
+ const dir = npm.prefix
+ BB.resolve(readJson(path.resolve(dir, 'package.json'))).nodeify((er, d) => {
d = d || {}
if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er)
if (!d.name) return cb(new Error('Invalid package.json'))
- var p = d.name
+ const p = d.name
nv = npa(p)
if (pkg && ~pkg.indexOf('@')) {
nv.rawSpec = pkg.split('@')[pkg.indexOf('@')]
}
- fetchAndRead(nv, args, silent, cb)
+ fetchAndRead(nv, args, silent, opts, cb)
})
} else {
- fetchAndRead(nv, args, silent, cb)
+ fetchAndRead(nv, args, silent, opts, cb)
}
}
-function fetchAndRead (nv, args, silent, cb) {
+function fetchAndRead (nv, args, silent, opts, cb) {
// get the data about this package
- var name = nv.name
- var version = nv.rawSpec || npm.config.get('tag')
-
- mapToRegistry(name, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, function (er, data) {
- if (er) return cb(er)
- if (data['dist-tags'] && data['dist-tags'][version]) {
- version = data['dist-tags'][version]
- }
-
- if (data.time && data.time.unpublished) {
- var u = data.time.unpublished
- er = new Error('Unpublished by ' + u.name + ' on ' + u.time)
- er.statusCode = 404
- er.code = 'E404'
- er.pkgid = data._id
- return cb(er, data)
+ let version = nv.rawSpec || npm.config.get('tag')
+
+ return packument(nv, opts.concat({
+ fullMetadata: true,
+ 'prefer-online': true
+ })).catch(err => {
+ // TODO - this should probably go into pacote, but the tests expect it.
+ if (err.code === 'E404') {
+ err.message = `'${nv.name}' is not in the npm registry.`
+ const validated = validateName(nv.name)
+ if (!validated.validForNewPackages) {
+ err.message += '\n'
+ err.message += (validated.errors || []).join('\n')
+ err.message += (validated.warnings || []).join('\n')
+ } else {
+ err.message += '\nYou should bug the author to publish it'
+ err.message += '\n(or use the name yourself!)'
+ err.message += '\n'
+ err.message += '\nNote that you can also install from a'
+ err.message += '\ntarball, folder, http url, or git url.'
}
+ }
+ throw err
+ }).then(data => {
+ if (data['dist-tags'] && data['dist-tags'][version]) {
+ version = data['dist-tags'][version]
+ }
- var results = []
- var error = null
- var versions = data.versions || {}
- data.versions = Object.keys(versions).sort(semver.compareLoose)
- if (!args.length) args = ['']
+ if (data.time && data.time.unpublished) {
+ const u = data.time.unpublished
+ let er = new Error('Unpublished by ' + u.name + ' on ' + u.time)
+ er.statusCode = 404
+ er.code = 'E404'
+ er.pkgid = data._id
+ throw er
+ }
- // remove readme unless we asked for it
- if (args.indexOf('readme') === -1) {
- delete data.readme
- }
+ const results = []
+ let error = null
+ const versions = data.versions || {}
+ data.versions = Object.keys(versions).sort(semver.compareLoose)
+ if (!args.length) args = ['']
- Object.keys(versions).forEach(function (v) {
- if (semver.satisfies(v, version, true)) {
- args.forEach(function (args) {
- // remove readme unless we asked for it
- if (args.indexOf('readme') !== -1) {
- delete versions[v].readme
- }
- results.push(showFields(data, versions[v], args))
- })
- }
- })
- var retval = results.reduce(reducer, {})
-
- if (args.length === 1 && args[0] === '') {
- retval = cleanBlanks(retval)
- log.silly('cleanup', retval)
- }
+ // remove readme unless we asked for it
+ if (args.indexOf('readme') === -1) {
+ delete data.readme
+ }
- if (error || silent) {
- cb(error, retval)
- } else if (
- !npm.config.get('json') &&
- args.length === 1 &&
- args[0] === ''
- ) {
- data.version = version
- BB.all(results.map((v) => prettyView(data, v[Object.keys(v)[0]][''])))
- .nodeify(cb)
- .then(() => retval)
- } else {
- printData(retval, data._id, cb.bind(null, error, retval))
+ Object.keys(versions).forEach(function (v) {
+ if (semver.satisfies(v, version, true)) {
+ args.forEach(function (args) {
+ // remove readme unless we asked for it
+ if (args.indexOf('readme') !== -1) {
+ delete versions[v].readme
+ }
+ results.push(showFields(data, versions[v], args))
+ })
}
})
- })
+ let retval = results.reduce(reducer, {})
+
+ if (args.length === 1 && args[0] === '') {
+ retval = cleanBlanks(retval)
+ log.silly('view', retval)
+ }
+
+ if (silent) {
+ } else if (error) {
+ throw error
+ } else if (
+ !opts.json &&
+ args.length === 1 &&
+ args[0] === ''
+ ) {
+ data.version = version
+ return BB.all(
+ results.map((v) => prettyView(data, v[Object.keys(v)[0]][''], opts))
+ ).then(() => retval)
+ } else {
+ return BB.fromNode(cb => {
+ printData(retval, data._id, opts, cb)
+ }).then(() => retval)
+ }
+ }).nodeify(cb)
}
-function prettyView (packument, manifest) {
+function prettyView (packument, manifest, opts) {
// More modern, pretty printing of default view
- const unicode = npm.config.get('unicode')
+ const unicode = opts.unicode
return BB.try(() => {
if (!manifest) {
log.error(
@@ -312,7 +336,7 @@ function prettyView (packument, manifest) {
}
function cleanBlanks (obj) {
- var clean = {}
+ const clean = {}
Object.keys(obj).forEach(function (version) {
clean[version] = obj[version]['']
})
@@ -334,7 +358,7 @@ function reducer (l, r) {
// return whatever was printed
function showFields (data, version, fields) {
- var o = {}
+ const o = {}
;[data, version].forEach(function (s) {
Object.keys(s).forEach(function (k) {
o[k] = s[k]
@@ -344,18 +368,18 @@ function showFields (data, version, fields) {
}
function search (data, fields, version, title) {
- var field
- var tail = fields
+ let field
+ const tail = fields
while (!field && fields.length) field = tail.shift()
fields = [field].concat(tail)
- var o
+ let o
if (!field && !tail.length) {
o = {}
o[version] = {}
o[version][title] = data
return o
}
- var index = field.match(/(.+)\[([^\]]+)\]$/)
+ let index = field.match(/(.+)\[([^\]]+)\]$/)
if (index) {
field = index[1]
index = index[2]
@@ -369,10 +393,10 @@ function search (data, fields, version, title) {
if (data.length === 1) {
return search(data[0], fields, version, title)
}
- var results = []
+ let results = []
data.forEach(function (data, i) {
- var tl = title.length
- var newt = title.substr(0, tl - fields.join('.').length - 1) +
+ const tl = title.length
+ const newt = title.substr(0, tl - fields.join('.').length - 1) +
'[' + i + ']' + [''].concat(fields).join('.')
results.push(search(data, fields.slice(), version, newt))
})
@@ -395,32 +419,32 @@ function search (data, fields, version, title) {
return o
}
-function printData (data, name, cb) {
- var versions = Object.keys(data)
- var msg = ''
- var msgJson = []
- var includeVersions = versions.length > 1
- var includeFields
+function printData (data, name, opts, cb) {
+ const versions = Object.keys(data)
+ let msg = ''
+ let msgJson = []
+ const includeVersions = versions.length > 1
+ let includeFields
versions.forEach(function (v) {
- var fields = Object.keys(data[v])
+ const fields = Object.keys(data[v])
includeFields = includeFields || (fields.length > 1)
- if (npm.config.get('json')) msgJson.push({})
+ if (opts.json) msgJson.push({})
fields.forEach(function (f) {
- var d = cleanup(data[v][f])
- if (fields.length === 1 && npm.config.get('json')) {
+ let d = cleanup(data[v][f])
+ if (fields.length === 1 && opts.json) {
msgJson[msgJson.length - 1][f] = d
}
if (includeVersions || includeFields || typeof d !== 'string') {
- if (npm.config.get('json')) {
+ if (opts.json) {
msgJson[msgJson.length - 1][f] = d
} else {
d = util.inspect(d, { showHidden: false, depth: 5, colors: npm.color, maxArrayLength: null })
}
- } else if (typeof d === 'string' && npm.config.get('json')) {
+ } else if (typeof d === 'string' && opts.json) {
d = JSON.stringify(d)
}
- if (!npm.config.get('json')) {
+ if (!opts.json) {
if (f && includeFields) f += ' = '
if (d.indexOf('\n') !== -1) d = ' \n' + d
msg += (includeVersions ? name + '@' + v + ' ' : '') +
@@ -429,9 +453,9 @@ function printData (data, name, cb) {
})
})
- if (npm.config.get('json')) {
+ if (opts.json) {
if (msgJson.length && Object.keys(msgJson[0]).length === 1) {
- var k = Object.keys(msgJson[0])[0]
+ const k = Object.keys(msgJson[0])[0]
msgJson = msgJson.map(function (m) { return m[k] })
}
@@ -465,7 +489,7 @@ function cleanup (data) {
data.versions = Object.keys(data.versions || {})
}
- var keys = Object.keys(data)
+ let keys = Object.keys(data)
keys.forEach(function (d) {
if (d.charAt(0) === '_') delete data[d]
else if (typeof data[d] === 'object') data[d] = cleanup(data[d])
diff --git a/deps/npm/lib/whoami.js b/deps/npm/lib/whoami.js
index e8af6595d15cc1..5145b447de4c6b 100644
--- a/deps/npm/lib/whoami.js
+++ b/deps/npm/lib/whoami.js
@@ -1,47 +1,63 @@
-var npm = require('./npm.js')
-var output = require('./utils/output.js')
+'use strict'
+
+const BB = require('bluebird')
+
+const npmConfig = require('./config/figgy-config.js')
+const fetch = require('libnpm/fetch')
+const figgyPudding = require('figgy-pudding')
+const npm = require('./npm.js')
+const output = require('./utils/output.js')
+
+const WhoamiConfig = figgyPudding({
+ json: {},
+ registry: {}
+})
module.exports = whoami
whoami.usage = 'npm whoami [--registry <registry>]\n(just prints username according to given registry)'
-function whoami (args, silent, cb) {
+function whoami ([spec], silent, cb) {
// FIXME: need tighter checking on this, but is a breaking change
if (typeof cb !== 'function') {
cb = silent
silent = false
}
-
- var registry = npm.config.get('registry')
- if (!registry) return cb(new Error('no default registry set'))
-
- var auth = npm.config.getCredentialsByURI(registry)
- if (auth) {
- if (auth.username) {
- if (!silent) output(auth.username)
- return process.nextTick(cb.bind(this, null, auth.username))
- } else if (auth.token) {
- return npm.registry.whoami(registry, { auth: auth }, function (er, username) {
- if (er) return cb(er)
- if (!username) {
- var needNewSession = new Error(
+ const opts = WhoamiConfig(npmConfig())
+ return BB.try(() => {
+ // First, check if we have a user/pass-based auth
+ const registry = opts.registry
+ if (!registry) throw new Error('no default registry set')
+ return npm.config.getCredentialsByURI(registry)
+ }).then(({username, token}) => {
+ if (username) {
+ return username
+ } else if (token) {
+ return fetch.json('/-/whoami', opts.concat({
+ spec
+ })).then(({username}) => {
+ if (username) {
+ return username
+ } else {
+ throw Object.assign(new Error(
'Your auth token is no longer valid. Please log in again.'
- )
- needNewSession.code = 'ENEEDAUTH'
- return cb(needNewSession)
+ ), {code: 'ENEEDAUTH'})
}
-
- if (!silent) output(username)
- cb(null, username)
})
+ } else {
+ // At this point, if they have a credentials object, it doesn't have a
+ // token or auth in it. Probably just the default registry.
+ throw Object.assign(new Error(
+ 'This command requires you to be logged in.'
+ ), {code: 'ENEEDAUTH'})
}
- }
-
- // At this point, if they have a credentials object, it doesn't have a token
- // or auth in it. Probably just the default registry.
- var needAuth = new Error(
- 'this command requires you to be logged in.'
- )
- needAuth.code = 'ENEEDAUTH'
- process.nextTick(cb.bind(this, needAuth))
+ }).then(username => {
+ if (silent) {
+ } else if (opts.json) {
+ output(JSON.stringify(username))
+ } else {
+ output(username)
+ }
+ return username
+ }).nodeify(cb)
}
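Both `whoami` and `view` now declare a figgy-pudding config (`WhoamiConfig`, `ViewConfig`) listing exactly the keys each command reads, then project the full npm config down through it. A rough stand-in for that pattern using a plain-object filter; `makeConfig` is hypothetical, not figgy-pudding's real API:

```javascript
'use strict'

// Declare the keys a command cares about, then project an arbitrary config
// object down to just those keys, applying per-key defaults.
function makeConfig (spec) {
  return function (providers) {
    const out = {}
    for (const key of Object.keys(spec)) {
      out[key] = key in providers ? providers[key] : spec[key].default
    }
    return out
  }
}

const WhoamiConfig = makeConfig({
  json: {default: false},
  registry: {default: 'https://registry.npmjs.org/'}
})

const opts = WhoamiConfig({json: true, loglevel: 'silly'})
console.log(opts.json)          // true: declared key passes through
console.log(opts.registry)      // default applied when the key is absent
console.log('loglevel' in opts) // false: undeclared keys are filtered out
```

Filtering at the boundary means each command documents its configuration surface in one place, instead of scattering `npm.config.get(...)` calls through the body as the removed code did.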
diff --git a/deps/npm/man/man1/npm-README.1 b/deps/npm/man/man1/npm-README.1
index ebb32ed4984dfa..d616e80ebb5319 100644
--- a/deps/npm/man/man1/npm-README.1
+++ b/deps/npm/man/man1/npm-README.1
@@ -1,4 +1,4 @@
-.TH "NPM" "1" "December 2018" "" ""
+.TH "NPM" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm\fR \- a JavaScript package manager
.P
diff --git a/deps/npm/man/man1/npm-access.1 b/deps/npm/man/man1/npm-access.1
index 70a9eb013aebea..db77a4092d972b 100644
--- a/deps/npm/man/man1/npm-access.1
+++ b/deps/npm/man/man1/npm-access.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ACCESS" "1" "December 2018" "" ""
+.TH "NPM\-ACCESS" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-access\fR \- Set access level on published packages
.SH SYNOPSIS
@@ -11,6 +11,9 @@ npm access restricted []
npm access grant <read\-only|read\-write> <scope:team> [<package>]
npm access revoke <scope:team> [<package>]
+npm access 2fa\-required [<package>]
+npm access 2fa\-not\-required [<package>]
+
npm access ls\-packages [<user>|<scope>|<scope:team>]
npm access ls\-collaborators [<package> [<user>]]
npm access edit [<package>]
@@ -32,6 +35,10 @@ grant / revoke:
Add or remove the ability of users and teams to have read\-only or read\-write
access to a package\.
.IP \(bu 2
+2fa\-required / 2fa\-not\-required:
+Configure whether a package requires that anyone publishing it have two\-factor
+authentication enabled on their account\.
+.IP \(bu 2
ls\-packages:
Show all of the packages a user or a team is able to access, along with the
access level, except for read\-only public packages (it won't print the whole
@@ -80,6 +87,8 @@ Management of teams and team memberships is done with the \fBnpm team\fP command
.SH SEE ALSO
.RS 0
.IP \(bu 2
+\fBlibnpmaccess\fP \fIhttps://npm\.im/libnpmaccess\fR
+.IP \(bu 2
npm help team
.IP \(bu 2
npm help publish
diff --git a/deps/npm/man/man1/npm-adduser.1 b/deps/npm/man/man1/npm-adduser.1
index 33f3c705e41fb8..a2158576572409 100644
--- a/deps/npm/man/man1/npm-adduser.1
+++ b/deps/npm/man/man1/npm-adduser.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ADDUSER" "1" "December 2018" "" ""
+.TH "NPM\-ADDUSER" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-adduser\fR \- Add a registry user account
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-audit.1 b/deps/npm/man/man1/npm-audit.1
index 90e3e7ad4b5f78..dee3bb03e1e751 100644
--- a/deps/npm/man/man1/npm-audit.1
+++ b/deps/npm/man/man1/npm-audit.1
@@ -1,4 +1,4 @@
-.TH "NPM\-AUDIT" "1" "December 2018" "" ""
+.TH "NPM\-AUDIT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-audit\fR \- Run a security audit
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-bin.1 b/deps/npm/man/man1/npm-bin.1
index e4ac9b1a782470..edec701bb2bc14 100644
--- a/deps/npm/man/man1/npm-bin.1
+++ b/deps/npm/man/man1/npm-bin.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BIN" "1" "December 2018" "" ""
+.TH "NPM\-BIN" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-bin\fR \- Display npm bin folder
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-bugs.1 b/deps/npm/man/man1/npm-bugs.1
index 67e57069c857cb..d34e0987414369 100644
--- a/deps/npm/man/man1/npm-bugs.1
+++ b/deps/npm/man/man1/npm-bugs.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUGS" "1" "December 2018" "" ""
+.TH "NPM\-BUGS" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-bugs\fR \- Bugs for a package in a web browser maybe
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-build.1 b/deps/npm/man/man1/npm-build.1
index 153fac58db013f..b4247d95bdfb44 100644
--- a/deps/npm/man/man1/npm-build.1
+++ b/deps/npm/man/man1/npm-build.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUILD" "1" "December 2018" "" ""
+.TH "NPM\-BUILD" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-build\fR \- Build a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-bundle.1 b/deps/npm/man/man1/npm-bundle.1
index cdc27fa591618c..2b00e67c35c75d 100644
--- a/deps/npm/man/man1/npm-bundle.1
+++ b/deps/npm/man/man1/npm-bundle.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUNDLE" "1" "December 2018" "" ""
+.TH "NPM\-BUNDLE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-bundle\fR \- REMOVED
.SH DESCRIPTION
diff --git a/deps/npm/man/man1/npm-cache.1 b/deps/npm/man/man1/npm-cache.1
index 0b82fac3a1c653..2dc3481048a05f 100644
--- a/deps/npm/man/man1/npm-cache.1
+++ b/deps/npm/man/man1/npm-cache.1
@@ -1,4 +1,4 @@
-.TH "NPM\-CACHE" "1" "December 2018" "" ""
+.TH "NPM\-CACHE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-cache\fR \- Manipulates packages cache
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-ci.1 b/deps/npm/man/man1/npm-ci.1
index d38f56eb5f6ccf..b872c00415b249 100644
--- a/deps/npm/man/man1/npm-ci.1
+++ b/deps/npm/man/man1/npm-ci.1
@@ -1,4 +1,4 @@
-.TH "NPM\-CI" "1" "December 2018" "" ""
+.TH "NPM\-CI" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-ci\fR \- Install a project with a clean slate
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-completion.1 b/deps/npm/man/man1/npm-completion.1
index 8a4cb41d3aac2c..9f424dc2e36b78 100644
--- a/deps/npm/man/man1/npm-completion.1
+++ b/deps/npm/man/man1/npm-completion.1
@@ -1,4 +1,4 @@
-.TH "NPM\-COMPLETION" "1" "December 2018" "" ""
+.TH "NPM\-COMPLETION" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-completion\fR \- Tab Completion for npm
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-config.1 b/deps/npm/man/man1/npm-config.1
index 81c797a1211cd8..cc9f9cff4e21e2 100644
--- a/deps/npm/man/man1/npm-config.1
+++ b/deps/npm/man/man1/npm-config.1
@@ -1,4 +1,4 @@
-.TH "NPM\-CONFIG" "1" "December 2018" "" ""
+.TH "NPM\-CONFIG" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-config\fR \- Manage the npm configuration files
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-dedupe.1 b/deps/npm/man/man1/npm-dedupe.1
index 0d6f80848e27b4..8f221481298c7a 100644
--- a/deps/npm/man/man1/npm-dedupe.1
+++ b/deps/npm/man/man1/npm-dedupe.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DEDUPE" "1" "December 2018" "" ""
+.TH "NPM\-DEDUPE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-dedupe\fR \- Reduce duplication
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-deprecate.1 b/deps/npm/man/man1/npm-deprecate.1
index 121f2b9f866d2b..1cf39ddb45152f 100644
--- a/deps/npm/man/man1/npm-deprecate.1
+++ b/deps/npm/man/man1/npm-deprecate.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DEPRECATE" "1" "December 2018" "" ""
+.TH "NPM\-DEPRECATE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-deprecate\fR \- Deprecate a version of a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-dist-tag.1 b/deps/npm/man/man1/npm-dist-tag.1
index f9e52f93042157..b10e8ce3583d00 100644
--- a/deps/npm/man/man1/npm-dist-tag.1
+++ b/deps/npm/man/man1/npm-dist-tag.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DIST\-TAG" "1" "December 2018" "" ""
+.TH "NPM\-DIST\-TAG" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-dist-tag\fR \- Modify package distribution tags
.SH SYNOPSIS
@@ -29,6 +29,7 @@ Clear a tag that is no longer in use from the package\.
ls:
Show all of the dist\-tags for a package, defaulting to the package in
the current prefix\.
+This is the default action if none is specified\.
.RE
.P
diff --git a/deps/npm/man/man1/npm-docs.1 b/deps/npm/man/man1/npm-docs.1
index 3ed38e648a25f0..6d7ac688b39528 100644
--- a/deps/npm/man/man1/npm-docs.1
+++ b/deps/npm/man/man1/npm-docs.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DOCS" "1" "December 2018" "" ""
+.TH "NPM\-DOCS" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-docs\fR \- Docs for a package in a web browser maybe
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-doctor.1 b/deps/npm/man/man1/npm-doctor.1
index c15ae915644c96..aa55d1518ee371 100644
--- a/deps/npm/man/man1/npm-doctor.1
+++ b/deps/npm/man/man1/npm-doctor.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DOCTOR" "1" "December 2018" "" ""
+.TH "NPM\-DOCTOR" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-doctor\fR \- Check your environments
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-edit.1 b/deps/npm/man/man1/npm-edit.1
index ef2a4145852946..9146f3b76d428f 100644
--- a/deps/npm/man/man1/npm-edit.1
+++ b/deps/npm/man/man1/npm-edit.1
@@ -1,4 +1,4 @@
-.TH "NPM\-EDIT" "1" "December 2018" "" ""
+.TH "NPM\-EDIT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-edit\fR \- Edit an installed package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-explore.1 b/deps/npm/man/man1/npm-explore.1
index 83ea6148f6ec54..89df72517841d2 100644
--- a/deps/npm/man/man1/npm-explore.1
+++ b/deps/npm/man/man1/npm-explore.1
@@ -1,4 +1,4 @@
-.TH "NPM\-EXPLORE" "1" "December 2018" "" ""
+.TH "NPM\-EXPLORE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-explore\fR \- Browse an installed package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-help-search.1 b/deps/npm/man/man1/npm-help-search.1
index c7214a8048890b..e298e8d98142e8 100644
--- a/deps/npm/man/man1/npm-help-search.1
+++ b/deps/npm/man/man1/npm-help-search.1
@@ -1,4 +1,4 @@
-.TH "NPM\-HELP\-SEARCH" "1" "December 2018" "" ""
+.TH "NPM\-HELP\-SEARCH" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-help-search\fR \- Search npm help documentation
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-help.1 b/deps/npm/man/man1/npm-help.1
index 3c4e92b824713f..c07f2859175f96 100644
--- a/deps/npm/man/man1/npm-help.1
+++ b/deps/npm/man/man1/npm-help.1
@@ -1,4 +1,4 @@
-.TH "NPM\-HELP" "1" "December 2018" "" ""
+.TH "NPM\-HELP" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-help\fR \- Get help on npm
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-hook.1 b/deps/npm/man/man1/npm-hook.1
index 6dcc942ea69eb4..47b175167378c6 100644
--- a/deps/npm/man/man1/npm-hook.1
+++ b/deps/npm/man/man1/npm-hook.1
@@ -1,4 +1,4 @@
-.TH "NPM\-HOOK" "1" "December 2018" "" ""
+.TH "NPM\-HOOK" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-hook\fR \- Manage registry hooks
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-init.1 b/deps/npm/man/man1/npm-init.1
index 2e9915a58b4790..988126a399a5e9 100644
--- a/deps/npm/man/man1/npm-init.1
+++ b/deps/npm/man/man1/npm-init.1
@@ -1,4 +1,4 @@
-.TH "NPM\-INIT" "1" "December 2018" "" ""
+.TH "NPM\-INIT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-init\fR \- create a package\.json file
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-install-ci-test.1 b/deps/npm/man/man1/npm-install-ci-test.1
index 70862f426c2f5a..6cdda35e2a2b83 100644
--- a/deps/npm/man/man1/npm-install-ci-test.1
+++ b/deps/npm/man/man1/npm-install-ci-test.1
@@ -1,4 +1,4 @@
-.TH "NPM" "" "December 2018" "" ""
+.TH "NPM" "" "January 2019" "" ""
.SH "NAME"
\fBnpm\fR
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-install-test.1 b/deps/npm/man/man1/npm-install-test.1
index 9e6124550308dd..ae8d5223f15d70 100644
--- a/deps/npm/man/man1/npm-install-test.1
+++ b/deps/npm/man/man1/npm-install-test.1
@@ -1,4 +1,4 @@
-.TH "NPM" "" "December 2018" "" ""
+.TH "NPM" "" "January 2019" "" ""
.SH "NAME"
\fBnpm\fR
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-install.1 b/deps/npm/man/man1/npm-install.1
index 7ecc6cf35dd2df..f2cdfeda49cf82 100644
--- a/deps/npm/man/man1/npm-install.1
+++ b/deps/npm/man/man1/npm-install.1
@@ -1,4 +1,4 @@
-.TH "NPM\-INSTALL" "1" "December 2018" "" ""
+.TH "NPM\-INSTALL" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-install\fR \- Install a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-link.1 b/deps/npm/man/man1/npm-link.1
index a70f0b4eebf540..292b9e50a03850 100644
--- a/deps/npm/man/man1/npm-link.1
+++ b/deps/npm/man/man1/npm-link.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LINK" "1" "December 2018" "" ""
+.TH "NPM\-LINK" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-link\fR \- Symlink a package folder
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-logout.1 b/deps/npm/man/man1/npm-logout.1
index f0dddf165fdcb5..df27d4b55b973c 100644
--- a/deps/npm/man/man1/npm-logout.1
+++ b/deps/npm/man/man1/npm-logout.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LOGOUT" "1" "December 2018" "" ""
+.TH "NPM\-LOGOUT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-logout\fR \- Log out of the registry
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-ls.1 b/deps/npm/man/man1/npm-ls.1
index c68d86ce9f5f5c..59fa5878268686 100644
--- a/deps/npm/man/man1/npm-ls.1
+++ b/deps/npm/man/man1/npm-ls.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LS" "1" "December 2018" "" ""
+.TH "NPM\-LS" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-ls\fR \- List installed packages
.SH SYNOPSIS
@@ -22,7 +22,7 @@ For example, running \fBnpm ls promzard\fP in npm's source tree will show:
.P
.RS 2
.nf
-npm@6.5.0 /path/to/npm
+npm@6.7.0 /path/to/npm
└─┬ init\-package\-json@0\.0\.4
└── promzard@0\.1\.5
.fi
diff --git a/deps/npm/man/man1/npm-org.1 b/deps/npm/man/man1/npm-org.1
new file mode 100644
index 00000000000000..2cef7e05e62596
--- /dev/null
+++ b/deps/npm/man/man1/npm-org.1
@@ -0,0 +1,72 @@
+.TH "NPM\-ORG" "1" "January 2019" "" ""
+.SH "NAME"
+\fBnpm-org\fR \- Manage orgs
+.SH SYNOPSIS
+.P
+.RS 2
+.nf
+npm org set <orgname> <username> [developer | admin | owner]
+npm org rm <orgname> <username>
+npm org ls <orgname> [<username>]
+.fi
+.RE
+.SH EXAMPLE
+.P
+Add a new developer to an org:
+.P
+.RS 2
+.nf
+$ npm org set my\-org @mx\-smith
+.fi
+.RE
+.P
+Add a new admin to an org (or change a developer to an admin):
+.P
+.RS 2
+.nf
+$ npm org set my\-org @mx\-santos admin
+.fi
+.RE
+.P
+Remove a user from an org:
+.P
+.RS 2
+.nf
+$ npm org rm my\-org mx\-santos
+.fi
+.RE
+.P
+List all users in an org:
+.P
+.RS 2
+.nf
+$ npm org ls my\-org
+.fi
+.RE
+.P
+List all users in JSON format:
+.P
+.RS 2
+.nf
+$ npm org ls my\-org \-\-json
+.fi
+.RE
+.P
+See what role a user has in an org:
+.P
+.RS 2
+.nf
+$ npm org ls my\-org @mx\-santos
+.fi
+.RE
+.SH DESCRIPTION
+.P
+You can use the \fBnpm org\fP commands to manage and view users of an organization\.
+It supports adding and removing users, changing their roles, listing them, and
+finding specific ones and their roles\.
+.SH SEE ALSO
+.RS 0
+.IP \(bu 2
+Documentation on npm Orgs \fIhttps://docs\.npmjs\.com/orgs/\fR
+
+.RE
diff --git a/deps/npm/man/man1/npm-outdated.1 b/deps/npm/man/man1/npm-outdated.1
index ca26a1a3ca6fa2..7e0a18bef9a007 100644
--- a/deps/npm/man/man1/npm-outdated.1
+++ b/deps/npm/man/man1/npm-outdated.1
@@ -1,4 +1,4 @@
-.TH "NPM\-OUTDATED" "1" "December 2018" "" ""
+.TH "NPM\-OUTDATED" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-outdated\fR \- Check for outdated packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-owner.1 b/deps/npm/man/man1/npm-owner.1
index b188cb5c03fd5f..c7209ddc9c4bb2 100644
--- a/deps/npm/man/man1/npm-owner.1
+++ b/deps/npm/man/man1/npm-owner.1
@@ -1,4 +1,4 @@
-.TH "NPM\-OWNER" "1" "December 2018" "" ""
+.TH "NPM\-OWNER" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-owner\fR \- Manage package owners
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-pack.1 b/deps/npm/man/man1/npm-pack.1
index f17f2139e9a131..34af68565e2a75 100644
--- a/deps/npm/man/man1/npm-pack.1
+++ b/deps/npm/man/man1/npm-pack.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PACK" "1" "December 2018" "" ""
+.TH "NPM\-PACK" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-pack\fR \- Create a tarball from a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-ping.1 b/deps/npm/man/man1/npm-ping.1
index 7a6c461b492f36..200f51383dd4c4 100644
--- a/deps/npm/man/man1/npm-ping.1
+++ b/deps/npm/man/man1/npm-ping.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PING" "1" "December 2018" "" ""
+.TH "NPM\-PING" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-ping\fR \- Ping npm registry
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-prefix.1 b/deps/npm/man/man1/npm-prefix.1
index e26ab20c5d9b8b..372647ecaa04f8 100644
--- a/deps/npm/man/man1/npm-prefix.1
+++ b/deps/npm/man/man1/npm-prefix.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PREFIX" "1" "December 2018" "" ""
+.TH "NPM\-PREFIX" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-prefix\fR \- Display prefix
.SH SYNOPSIS
@@ -11,7 +11,8 @@ npm prefix [\-g]
.SH DESCRIPTION
.P
Print the local prefix to standard out\. This is the closest parent directory
-to contain a package\.json file unless \fB\-g\fP is also specified\.
+to contain a \fBpackage\.json\fP file or \fBnode_modules\fP directory, unless \fB\-g\fP is
+also specified\.
.P
If \fB\-g\fP is specified, this will be the value of the global prefix\. See
npm help 7 \fBnpm\-config\fP for more detail\.
diff --git a/deps/npm/man/man1/npm-profile.1 b/deps/npm/man/man1/npm-profile.1
index 91eb3e5b84bda4..1bd7dcc2e117a7 100644
--- a/deps/npm/man/man1/npm-profile.1
+++ b/deps/npm/man/man1/npm-profile.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PROFILE" "1" "December 2018" "" ""
+.TH "NPM\-PROFILE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-profile\fR \- Change settings on your registry profile
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-prune.1 b/deps/npm/man/man1/npm-prune.1
index 29f4e3f7638198..6a3ed6d62f8a38 100644
--- a/deps/npm/man/man1/npm-prune.1
+++ b/deps/npm/man/man1/npm-prune.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PRUNE" "1" "December 2018" "" ""
+.TH "NPM\-PRUNE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-prune\fR \- Remove extraneous packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-publish.1 b/deps/npm/man/man1/npm-publish.1
index ea7ea69aad1e46..934a9ad810b98f 100644
--- a/deps/npm/man/man1/npm-publish.1
+++ b/deps/npm/man/man1/npm-publish.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PUBLISH" "1" "December 2018" "" ""
+.TH "NPM\-PUBLISH" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-publish\fR \- Publish a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-rebuild.1 b/deps/npm/man/man1/npm-rebuild.1
index daab56a1f6fff7..8005c9513f2497 100644
--- a/deps/npm/man/man1/npm-rebuild.1
+++ b/deps/npm/man/man1/npm-rebuild.1
@@ -1,4 +1,4 @@
-.TH "NPM\-REBUILD" "1" "December 2018" "" ""
+.TH "NPM\-REBUILD" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-rebuild\fR \- Rebuild a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-repo.1 b/deps/npm/man/man1/npm-repo.1
index 5e105e109573c7..261d44a86087c1 100644
--- a/deps/npm/man/man1/npm-repo.1
+++ b/deps/npm/man/man1/npm-repo.1
@@ -1,4 +1,4 @@
-.TH "NPM\-REPO" "1" "December 2018" "" ""
+.TH "NPM\-REPO" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-repo\fR \- Open package repository page in the browser
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-restart.1 b/deps/npm/man/man1/npm-restart.1
index 8058ae62bd0445..084f5690eda38e 100644
--- a/deps/npm/man/man1/npm-restart.1
+++ b/deps/npm/man/man1/npm-restart.1
@@ -1,4 +1,4 @@
-.TH "NPM\-RESTART" "1" "December 2018" "" ""
+.TH "NPM\-RESTART" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-restart\fR \- Restart a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-root.1 b/deps/npm/man/man1/npm-root.1
index 7c123431a5d984..fd4246d8791f6a 100644
--- a/deps/npm/man/man1/npm-root.1
+++ b/deps/npm/man/man1/npm-root.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ROOT" "1" "December 2018" "" ""
+.TH "NPM\-ROOT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-root\fR \- Display npm root
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-run-script.1 b/deps/npm/man/man1/npm-run-script.1
index 98917db085412c..e349dab426526e 100644
--- a/deps/npm/man/man1/npm-run-script.1
+++ b/deps/npm/man/man1/npm-run-script.1
@@ -1,4 +1,4 @@
-.TH "NPM\-RUN\-SCRIPT" "1" "December 2018" "" ""
+.TH "NPM\-RUN\-SCRIPT" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-run-script\fR \- Run arbitrary package scripts
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-search.1 b/deps/npm/man/man1/npm-search.1
index 1ee0f51b1709ef..95a8d0966bc7df 100644
--- a/deps/npm/man/man1/npm-search.1
+++ b/deps/npm/man/man1/npm-search.1
@@ -1,4 +1,4 @@
-.TH "NPM\-SEARCH" "1" "December 2018" "" ""
+.TH "NPM\-SEARCH" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-search\fR \- Search for packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-shrinkwrap.1 b/deps/npm/man/man1/npm-shrinkwrap.1
index a2a4afdacbdd25..3f9658febf76c8 100644
--- a/deps/npm/man/man1/npm-shrinkwrap.1
+++ b/deps/npm/man/man1/npm-shrinkwrap.1
@@ -1,4 +1,4 @@
-.TH "NPM\-SHRINKWRAP" "1" "December 2018" "" ""
+.TH "NPM\-SHRINKWRAP" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-shrinkwrap\fR \- Lock down dependency versions for publication
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-star.1 b/deps/npm/man/man1/npm-star.1
index 106c50a8c0c924..ae8c61667da3c5 100644
--- a/deps/npm/man/man1/npm-star.1
+++ b/deps/npm/man/man1/npm-star.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STAR" "1" "December 2018" "" ""
+.TH "NPM\-STAR" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-star\fR \- Mark your favorite packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-stars.1 b/deps/npm/man/man1/npm-stars.1
index 38042d2c1e226b..2f6038cb5d2be2 100644
--- a/deps/npm/man/man1/npm-stars.1
+++ b/deps/npm/man/man1/npm-stars.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STARS" "1" "December 2018" "" ""
+.TH "NPM\-STARS" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-stars\fR \- View packages marked as favorites
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-start.1 b/deps/npm/man/man1/npm-start.1
index f36f19af52f84c..7ea6362bdc90f0 100644
--- a/deps/npm/man/man1/npm-start.1
+++ b/deps/npm/man/man1/npm-start.1
@@ -1,4 +1,4 @@
-.TH "NPM\-START" "1" "December 2018" "" ""
+.TH "NPM\-START" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-start\fR \- Start a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-stop.1 b/deps/npm/man/man1/npm-stop.1
index 65d257d8f41b50..2fa4d88ab3804b 100644
--- a/deps/npm/man/man1/npm-stop.1
+++ b/deps/npm/man/man1/npm-stop.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STOP" "1" "December 2018" "" ""
+.TH "NPM\-STOP" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-stop\fR \- Stop a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-team.1 b/deps/npm/man/man1/npm-team.1
index 692f9e3f5a271f..6ed65ab44f8de3 100644
--- a/deps/npm/man/man1/npm-team.1
+++ b/deps/npm/man/man1/npm-team.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TEAM" "1" "December 2018" "" ""
+.TH "NPM\-TEAM" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-team\fR \- Manage organization teams and team memberships
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-test.1 b/deps/npm/man/man1/npm-test.1
index e644339a405be7..0183a8e079968c 100644
--- a/deps/npm/man/man1/npm-test.1
+++ b/deps/npm/man/man1/npm-test.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TEST" "1" "December 2018" "" ""
+.TH "NPM\-TEST" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-test\fR \- Test a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-token.1 b/deps/npm/man/man1/npm-token.1
index 2a00d6273607aa..d7ab2e18b7478a 100644
--- a/deps/npm/man/man1/npm-token.1
+++ b/deps/npm/man/man1/npm-token.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TOKEN" "1" "December 2018" "" ""
+.TH "NPM\-TOKEN" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-token\fR \- Manage your authentication tokens
.SH SYNOPSIS
@@ -12,7 +12,7 @@ npm token revoke
.RE
.SH DESCRIPTION
.P
-This list you list, create and revoke authentication tokens\.
+This lets you list, create and revoke authentication tokens\.
.RS 0
.IP \(bu 2
\fBnpm token list\fP:
diff --git a/deps/npm/man/man1/npm-uninstall.1 b/deps/npm/man/man1/npm-uninstall.1
index 18115b4141a400..07c413e299beea 100644
--- a/deps/npm/man/man1/npm-uninstall.1
+++ b/deps/npm/man/man1/npm-uninstall.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UNINSTALL" "1" "December 2018" "" ""
+.TH "NPM\-UNINSTALL" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-uninstall\fR \- Remove a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-unpublish.1 b/deps/npm/man/man1/npm-unpublish.1
index c4f628c791e475..25eb7044acb32a 100644
--- a/deps/npm/man/man1/npm-unpublish.1
+++ b/deps/npm/man/man1/npm-unpublish.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UNPUBLISH" "1" "December 2018" "" ""
+.TH "NPM\-UNPUBLISH" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-unpublish\fR \- Remove a package from the registry
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-update.1 b/deps/npm/man/man1/npm-update.1
index 4084d91124fd4d..6486b8aef073b4 100644
--- a/deps/npm/man/man1/npm-update.1
+++ b/deps/npm/man/man1/npm-update.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UPDATE" "1" "December 2018" "" ""
+.TH "NPM\-UPDATE" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-update\fR \- Update a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-version.1 b/deps/npm/man/man1/npm-version.1
index 1fb0950052ac7e..bae79b58186a5f 100644
--- a/deps/npm/man/man1/npm-version.1
+++ b/deps/npm/man/man1/npm-version.1
@@ -1,4 +1,4 @@
-.TH "NPM\-VERSION" "1" "December 2018" "" ""
+.TH "NPM\-VERSION" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-version\fR \- Bump a package version
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-view.1 b/deps/npm/man/man1/npm-view.1
index f5d8b6a84d9bdf..713de6e3cb22a8 100644
--- a/deps/npm/man/man1/npm-view.1
+++ b/deps/npm/man/man1/npm-view.1
@@ -1,4 +1,4 @@
-.TH "NPM\-VIEW" "1" "December 2018" "" ""
+.TH "NPM\-VIEW" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-view\fR \- View registry info
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-whoami.1 b/deps/npm/man/man1/npm-whoami.1
index f44ff37ce0ffcf..08f23ddf0cbc23 100644
--- a/deps/npm/man/man1/npm-whoami.1
+++ b/deps/npm/man/man1/npm-whoami.1
@@ -1,4 +1,4 @@
-.TH "NPM\-WHOAMI" "1" "December 2018" "" ""
+.TH "NPM\-WHOAMI" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-whoami\fR \- Display npm username
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm.1 b/deps/npm/man/man1/npm.1
index 9dc282959eaa9c..38cf7686a434b8 100644
--- a/deps/npm/man/man1/npm.1
+++ b/deps/npm/man/man1/npm.1
@@ -1,4 +1,4 @@
-.TH "NPM" "1" "December 2018" "" ""
+.TH "NPM" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm\fR \- javascript package manager
.SH SYNOPSIS
@@ -10,7 +10,7 @@ npm [args]
.RE
.SH VERSION
.P
-6.5.0
+6.7.0
.SH DESCRIPTION
.P
npm is the package manager for the Node JavaScript platform\. It puts
diff --git a/deps/npm/man/man5/npm-folders.5 b/deps/npm/man/man5/npm-folders.5
index 901e658c2ff434..0fa03d9715058f 100644
--- a/deps/npm/man/man5/npm-folders.5
+++ b/deps/npm/man/man5/npm-folders.5
@@ -1,4 +1,4 @@
-.TH "NPM\-FOLDERS" "5" "December 2018" "" ""
+.TH "NPM\-FOLDERS" "5" "January 2019" "" ""
.SH "NAME"
\fBnpm-folders\fR \- Folder Structures Used by npm
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/npm-global.5 b/deps/npm/man/man5/npm-global.5
index 901e658c2ff434..0fa03d9715058f 100644
--- a/deps/npm/man/man5/npm-global.5
+++ b/deps/npm/man/man5/npm-global.5
@@ -1,4 +1,4 @@
-.TH "NPM\-FOLDERS" "5" "December 2018" "" ""
+.TH "NPM\-FOLDERS" "5" "January 2019" "" ""
.SH "NAME"
\fBnpm-folders\fR \- Folder Structures Used by npm
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/npm-json.5 b/deps/npm/man/man5/npm-json.5
index 64d5408cdb3e47..dd20f7cb5f3ca1 100644
--- a/deps/npm/man/man5/npm-json.5
+++ b/deps/npm/man/man5/npm-json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\.JSON" "5" "December 2018" "" ""
+.TH "PACKAGE\.JSON" "5" "January 2019" "" ""
.SH "NAME"
\fBpackage.json\fR \- Specifics of npm's package\.json handling
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/npm-package-locks.5 b/deps/npm/man/man5/npm-package-locks.5
index cddd15648ee4a4..b3692cebb2e139 100644
--- a/deps/npm/man/man5/npm-package-locks.5
+++ b/deps/npm/man/man5/npm-package-locks.5
@@ -1,4 +1,4 @@
-.TH "NPM\-PACKAGE\-LOCKS" "5" "December 2018" "" ""
+.TH "NPM\-PACKAGE\-LOCKS" "5" "January 2019" "" ""
.SH "NAME"
\fBnpm-package-locks\fR \- An explanation of npm lockfiles
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/npm-shrinkwrap.json.5 b/deps/npm/man/man5/npm-shrinkwrap.json.5
index b346fb33d71917..c1d4db9af34b74 100644
--- a/deps/npm/man/man5/npm-shrinkwrap.json.5
+++ b/deps/npm/man/man5/npm-shrinkwrap.json.5
@@ -1,4 +1,4 @@
-.TH "NPM\-SHRINKWRAP\.JSON" "5" "December 2018" "" ""
+.TH "NPM\-SHRINKWRAP\.JSON" "5" "January 2019" "" ""
.SH "NAME"
\fBnpm-shrinkwrap.json\fR \- A publishable lockfile
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/npmrc.5 b/deps/npm/man/man5/npmrc.5
index e1af5ff4806f04..3d6bf7d568d132 100644
--- a/deps/npm/man/man5/npmrc.5
+++ b/deps/npm/man/man5/npmrc.5
@@ -1,4 +1,4 @@
-.TH "NPMRC" "5" "December 2018" "" ""
+.TH "NPMRC" "5" "January 2019" "" ""
.SH "NAME"
\fBnpmrc\fR \- The npm config files
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/package-lock.json.5 b/deps/npm/man/man5/package-lock.json.5
index eeb5bc28609ecd..dcb215423d7303 100644
--- a/deps/npm/man/man5/package-lock.json.5
+++ b/deps/npm/man/man5/package-lock.json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\-LOCK\.JSON" "5" "December 2018" "" ""
+.TH "PACKAGE\-LOCK\.JSON" "5" "January 2019" "" ""
.SH "NAME"
\fBpackage-lock.json\fR \- A manifestation of the manifest
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/package.json.5 b/deps/npm/man/man5/package.json.5
index 64d5408cdb3e47..dd20f7cb5f3ca1 100644
--- a/deps/npm/man/man5/package.json.5
+++ b/deps/npm/man/man5/package.json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\.JSON" "5" "December 2018" "" ""
+.TH "PACKAGE\.JSON" "5" "January 2019" "" ""
.SH "NAME"
\fBpackage.json\fR \- Specifics of npm's package\.json handling
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-coding-style.7 b/deps/npm/man/man7/npm-coding-style.7
index 1d8ba4055361a8..9268b14cf9f4b9 100644
--- a/deps/npm/man/man7/npm-coding-style.7
+++ b/deps/npm/man/man7/npm-coding-style.7
@@ -1,4 +1,4 @@
-.TH "NPM\-CODING\-STYLE" "7" "December 2018" "" ""
+.TH "NPM\-CODING\-STYLE" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-coding-style\fR \- npm's "funny" coding style
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-config.7 b/deps/npm/man/man7/npm-config.7
index 82e6e7e8a27d19..9b05942bb0b28f 100644
--- a/deps/npm/man/man7/npm-config.7
+++ b/deps/npm/man/man7/npm-config.7
@@ -1,4 +1,4 @@
-.TH "NPM\-CONFIG" "7" "December 2018" "" ""
+.TH "NPM\-CONFIG" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-config\fR \- More than you probably want to know about npm configuration
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-developers.7 b/deps/npm/man/man7/npm-developers.7
index 88c8ac2c9e925a..bd36cbabb7991a 100644
--- a/deps/npm/man/man7/npm-developers.7
+++ b/deps/npm/man/man7/npm-developers.7
@@ -1,4 +1,4 @@
-.TH "NPM\-DEVELOPERS" "7" "December 2018" "" ""
+.TH "NPM\-DEVELOPERS" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-developers\fR \- Developer Guide
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-disputes.7 b/deps/npm/man/man7/npm-disputes.7
index 9c63852455de11..95716579e50527 100644
--- a/deps/npm/man/man7/npm-disputes.7
+++ b/deps/npm/man/man7/npm-disputes.7
@@ -1,4 +1,4 @@
-.TH "NPM\-DISPUTES" "7" "December 2018" "" ""
+.TH "NPM\-DISPUTES" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-disputes\fR \- Handling Module Name Disputes
.P
diff --git a/deps/npm/man/man7/npm-index.7 b/deps/npm/man/man7/npm-index.7
index dd7c7317345ec6..1d7d5d0bf46cac 100644
--- a/deps/npm/man/man7/npm-index.7
+++ b/deps/npm/man/man7/npm-index.7
@@ -1,4 +1,4 @@
-.TH "NPM\-INDEX" "7" "December 2018" "" ""
+.TH "NPM\-INDEX" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-index\fR \- Index of all npm documentation
.SS npm help README
@@ -94,6 +94,9 @@ Log out of the registry
.SS npm help ls
.P
List installed packages
+.SS npm help org
+.P
+Manage orgs
.SS npm help outdated
.P
Check for outdated packages
diff --git a/deps/npm/man/man7/npm-orgs.7 b/deps/npm/man/man7/npm-orgs.7
index 619b42e863035b..9f34c3cb7940ef 100644
--- a/deps/npm/man/man7/npm-orgs.7
+++ b/deps/npm/man/man7/npm-orgs.7
@@ -1,4 +1,4 @@
-.TH "NPM\-ORGS" "7" "December 2018" "" ""
+.TH "NPM\-ORGS" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-orgs\fR \- Working with Teams & Orgs
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-registry.7 b/deps/npm/man/man7/npm-registry.7
index 3cb1c86577507d..4c454c81f0c313 100644
--- a/deps/npm/man/man7/npm-registry.7
+++ b/deps/npm/man/man7/npm-registry.7
@@ -1,4 +1,4 @@
-.TH "NPM\-REGISTRY" "7" "December 2018" "" ""
+.TH "NPM\-REGISTRY" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-registry\fR \- The JavaScript Package Registry
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-scope.7 b/deps/npm/man/man7/npm-scope.7
index 8fdede6cca3ae7..6ba7c309224567 100644
--- a/deps/npm/man/man7/npm-scope.7
+++ b/deps/npm/man/man7/npm-scope.7
@@ -1,4 +1,4 @@
-.TH "NPM\-SCOPE" "7" "December 2018" "" ""
+.TH "NPM\-SCOPE" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-scope\fR \- Scoped packages
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-scripts.7 b/deps/npm/man/man7/npm-scripts.7
index 3c2b95f1af4596..9f5462e8a876d0 100644
--- a/deps/npm/man/man7/npm-scripts.7
+++ b/deps/npm/man/man7/npm-scripts.7
@@ -1,4 +1,4 @@
-.TH "NPM\-SCRIPTS" "7" "December 2018" "" ""
+.TH "NPM\-SCRIPTS" "7" "January 2019" "" ""
.SH "NAME"
\fBnpm-scripts\fR \- How npm handles the "scripts" field
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/removing-npm.7 b/deps/npm/man/man7/removing-npm.7
index b93f704ffcaf42..f7d0781055cea5 100644
--- a/deps/npm/man/man7/removing-npm.7
+++ b/deps/npm/man/man7/removing-npm.7
@@ -1,4 +1,4 @@
-.TH "NPM\-REMOVAL" "1" "December 2018" "" ""
+.TH "NPM\-REMOVAL" "1" "January 2019" "" ""
.SH "NAME"
\fBnpm-removal\fR \- Cleaning the Slate
.SH SYNOPSIS
diff --git a/deps/npm/man/man7/semver.7 b/deps/npm/man/man7/semver.7
index 64b836ae9d6207..60f62c7e23d013 100644
--- a/deps/npm/man/man7/semver.7
+++ b/deps/npm/man/man7/semver.7
@@ -1,4 +1,4 @@
-.TH "SEMVER" "7" "December 2018" "" ""
+.TH "SEMVER" "7" "January 2019" "" ""
.SH "NAME"
\fBsemver\fR \- The semantic versioner for npm
.SH Install
@@ -34,8 +34,6 @@ As a command\-line utility:
.nf
$ semver \-h
-SemVer 5\.3\.0
-
A JavaScript implementation of the http://semver\.org/ specification
Copyright Isaac Z\. Schlueter
@@ -59,6 +57,9 @@ Options:
\-l \-\-loose
Interpret versions and ranges loosely
+\-p \-\-include\-prerelease
+ Always include prerelease versions in range matching
+
\-c \-\-coerce
Coerce a string into SemVer if possible
(does not imply \-\-loose)
@@ -344,9 +345,23 @@ part ::= nr | [\-0\-9A\-Za\-z]+
.RE
.SH Functions
.P
-All methods and classes take a final \fBloose\fP boolean argument that, if
-true, will be more forgiving about not\-quite\-valid semver strings\.
-The resulting output will always be 100% strict, of course\.
+All methods and classes take a final \fBoptions\fP object argument\. All
+options in this object are \fBfalse\fP by default\. The options supported
+are:
+.RS 0
+.IP \(bu 2
+\fBloose\fP Be more forgiving about not\-quite\-valid semver strings\.
+(Any resulting output will always be 100% strict compliant, of
+course\.) For backwards compatibility reasons, if the \fBoptions\fP
+argument is a boolean value instead of an object, it is interpreted
+to be the \fBloose\fP param\.
+.IP \(bu 2
+\fBincludePrerelease\fP Set to suppress the default
+behavior \fIhttps://github\.com/npm/node\-semver#prerelease\-tags\fR of
+excluding prerelease tagged versions from ranges unless they are
+explicitly opted into\.
+
+.RE
.P
Strict\-mode Comparators and Ranges will be strict about the SemVer
strings that they parse\.
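The semver doc hunk above describes the new `options` object (with `loose` and `includePrerelease`) and its back-compat rule: a bare boolean is still accepted and treated as `loose`. That normalization can be sketched in plain JavaScript; `parseOptions` here is an illustrative helper name, not necessarily semver's internal API.

```javascript
// Normalize semver-style options per the rule in the docs above:
// a boolean argument means { loose: <bool> }; an object is read for
// its `loose` and `includePrerelease` flags, both defaulting to false.
function parseOptions (options) {
  if (typeof options !== 'object' || options === null) {
    return { loose: !!options, includePrerelease: false }
  }
  return {
    loose: !!options.loose,
    includePrerelease: !!options.includePrerelease
  }
}

console.log(parseOptions(true))
// → { loose: true, includePrerelease: false }
console.log(parseOptions({ includePrerelease: true }))
// → { loose: false, includePrerelease: true }
```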
diff --git a/deps/npm/node_modules/JSONStream/index.js b/deps/npm/node_modules/JSONStream/index.js
index a92967f568a4c2..8c587af769a2d7 100755
--- a/deps/npm/node_modules/JSONStream/index.js
+++ b/deps/npm/node_modules/JSONStream/index.js
@@ -3,6 +3,8 @@
var Parser = require('jsonparse')
, through = require('through')
+var bufferFrom = Buffer.from && Buffer.from !== Uint8Array.from
+
/*
the value of this.stack that creationix's jsonparse has is weird.
@@ -17,7 +19,7 @@ exports.parse = function (path, map) {
var parser = new Parser()
var stream = through(function (chunk) {
if('string' === typeof chunk)
- chunk = new Buffer(chunk)
+ chunk = bufferFrom ? Buffer.from(chunk) : new Buffer(chunk)
parser.write(chunk)
},
function (data) {
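The JSONStream hunk above replaces the deprecated `new Buffer(chunk)` constructor with a feature-detected `Buffer.from`. The guard pattern works because on modern Node `Buffer.from` is a distinct function, while on very old Node it is either absent or just the inherited `Uint8Array.from` (which lacks string support). A minimal standalone sketch of the same pattern:

```javascript
// True on Node versions where Buffer.from safely accepts strings;
// falsy where only the legacy constructor is available.
var bufferFrom = Buffer.from && Buffer.from !== Uint8Array.from

function toBuffer (str) {
  return bufferFrom ? Buffer.from(str) : new Buffer(str)
}

console.log(toBuffer('{"a":1}').toString()) // → {"a":1}
```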
diff --git a/deps/npm/node_modules/JSONStream/package.json b/deps/npm/node_modules/JSONStream/package.json
index 23588d31b13a4a..91783af0b0ab64 100644
--- a/deps/npm/node_modules/JSONStream/package.json
+++ b/deps/npm/node_modules/JSONStream/package.json
@@ -1,28 +1,29 @@
{
- "_from": "JSONStream@1.3.4",
- "_id": "JSONStream@1.3.4",
+ "_from": "JSONStream@1.3.5",
+ "_id": "JSONStream@1.3.5",
"_inBundle": false,
- "_integrity": "sha512-Y7vfi3I5oMOYIr+WxV8NZxDSwcbNgzdKYsTNInmycOq9bUYwGg9ryu57Wg5NLmCjqdFPNUmpMBo3kSJN9tCbXg==",
+ "_integrity": "sha512-E+iruNOY8VV9s4JEbe1aNEm6MiszPRr/UfcHMz0TQh1BXSxHK+ASV1R6W4HpjBhSeS+54PIsAMCBmwD06LLsqQ==",
"_location": "/JSONStream",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
- "raw": "JSONStream@1.3.4",
+ "raw": "JSONStream@1.3.5",
"name": "JSONStream",
"escapedName": "JSONStream",
- "rawSpec": "1.3.4",
+ "rawSpec": "1.3.5",
"saveSpec": null,
- "fetchSpec": "1.3.4"
+ "fetchSpec": "1.3.5"
},
"_requiredBy": [
"#USER",
- "/"
+ "/",
+ "/npm-registry-fetch"
],
- "_resolved": "https://registry.npmjs.org/JSONStream/-/JSONStream-1.3.4.tgz",
- "_shasum": "615bb2adb0cd34c8f4c447b5f6512fa1d8f16a2e",
- "_spec": "JSONStream@1.3.4",
- "_where": "/Users/zkat/Documents/code/work/npm",
+ "_resolved": "https://registry.npmjs.org/JSONStream/-/JSONStream-1.3.5.tgz",
+ "_shasum": "3208c1f08d3a4d99261ab64f92302bc15e111ca0",
+ "_spec": "JSONStream@1.3.5",
+ "_where": "/Users/aeschright/code/cli",
"author": {
"name": "Dominic Tarr",
"email": "dominic.tarr@gmail.com",
@@ -69,7 +70,7 @@
"url": "git://github.com/dominictarr/JSONStream.git"
},
"scripts": {
- "test": "set -e; for t in test/*.js; do echo '***' $t '***'; node $t; done"
+ "test": "node test/run.js"
},
- "version": "1.3.4"
+ "version": "1.3.5"
}
diff --git a/deps/npm/node_modules/JSONStream/test/parsejson.js b/deps/npm/node_modules/JSONStream/test/parsejson.js
index 7f157175f5c48d..df4fbbe73a40d6 100644
--- a/deps/npm/node_modules/JSONStream/test/parsejson.js
+++ b/deps/npm/node_modules/JSONStream/test/parsejson.js
@@ -9,6 +9,9 @@ var r = Math.random()
, p = new Parser()
, assert = require('assert')
, times = 20
+ , bufferFrom = Buffer.from && Buffer.from !== Uint8Array.from
+ , str
+
while (times --) {
assert.equal(JSON.parse(JSON.stringify(r)), r, 'core JSON')
@@ -18,7 +21,8 @@ while (times --) {
assert.equal(v,r)
}
console.error('correct', r)
- p.write (new Buffer(JSON.stringify([r])))
+ str = JSON.stringify([r])
+ p.write (bufferFrom ? Buffer.from(str) : new Buffer(str))
diff --git a/deps/npm/node_modules/JSONStream/test/run.js b/deps/npm/node_modules/JSONStream/test/run.js
new file mode 100644
index 00000000000000..7d62e7385bd44f
--- /dev/null
+++ b/deps/npm/node_modules/JSONStream/test/run.js
@@ -0,0 +1,13 @@
+var readdirSync = require('fs').readdirSync
+var spawnSync = require('child_process').spawnSync
+var extname = require('path').extname
+
+var files = readdirSync(__dirname)
+files.forEach(function(file){
+ if (extname(file) !== '.js' || file === 'run.js')
+ return
+ console.log(`*** ${file} ***`)
+ var result = spawnSync(process.argv0, [file], { stdio: 'inherit', cwd: __dirname} )
+ if (result.status !== 0)
+ process.exit(result.status)
+})
diff --git a/deps/npm/node_modules/aproba/CHANGELOG.md b/deps/npm/node_modules/aproba/CHANGELOG.md
new file mode 100644
index 00000000000000..bab30ecb7e625d
--- /dev/null
+++ b/deps/npm/node_modules/aproba/CHANGELOG.md
@@ -0,0 +1,4 @@
+2.0.0
+ * Drop support for 0.10 and 0.12. They haven't been in travis but still,
+ since we _know_ we'll break with them now it's only polite to do a
+ major bump.
diff --git a/deps/npm/node_modules/aproba/index.js b/deps/npm/node_modules/aproba/index.js
index 6f3f797c09a750..fd947481ba5575 100644
--- a/deps/npm/node_modules/aproba/index.js
+++ b/deps/npm/node_modules/aproba/index.js
@@ -1,38 +1,39 @@
'use strict'
+module.exports = validate
function isArguments (thingy) {
return thingy != null && typeof thingy === 'object' && thingy.hasOwnProperty('callee')
}
-var types = {
- '*': {label: 'any', check: function () { return true }},
- A: {label: 'array', check: function (thingy) { return Array.isArray(thingy) || isArguments(thingy) }},
- S: {label: 'string', check: function (thingy) { return typeof thingy === 'string' }},
- N: {label: 'number', check: function (thingy) { return typeof thingy === 'number' }},
- F: {label: 'function', check: function (thingy) { return typeof thingy === 'function' }},
- O: {label: 'object', check: function (thingy) { return typeof thingy === 'object' && thingy != null && !types.A.check(thingy) && !types.E.check(thingy) }},
- B: {label: 'boolean', check: function (thingy) { return typeof thingy === 'boolean' }},
- E: {label: 'error', check: function (thingy) { return thingy instanceof Error }},
- Z: {label: 'null', check: function (thingy) { return thingy == null }}
+const types = {
+ '*': {label: 'any', check: () => true},
+ A: {label: 'array', check: _ => Array.isArray(_) || isArguments(_)},
+ S: {label: 'string', check: _ => typeof _ === 'string'},
+ N: {label: 'number', check: _ => typeof _ === 'number'},
+ F: {label: 'function', check: _ => typeof _ === 'function'},
+ O: {label: 'object', check: _ => typeof _ === 'object' && _ != null && !types.A.check(_) && !types.E.check(_)},
+ B: {label: 'boolean', check: _ => typeof _ === 'boolean'},
+ E: {label: 'error', check: _ => _ instanceof Error},
+ Z: {label: 'null', check: _ => _ == null}
}
function addSchema (schema, arity) {
- var group = arity[schema.length] = arity[schema.length] || []
+ const group = arity[schema.length] = arity[schema.length] || []
if (group.indexOf(schema) === -1) group.push(schema)
}
-var validate = module.exports = function (rawSchemas, args) {
+function validate (rawSchemas, args) {
if (arguments.length !== 2) throw wrongNumberOfArgs(['SA'], arguments.length)
if (!rawSchemas) throw missingRequiredArg(0, 'rawSchemas')
if (!args) throw missingRequiredArg(1, 'args')
if (!types.S.check(rawSchemas)) throw invalidType(0, ['string'], rawSchemas)
if (!types.A.check(args)) throw invalidType(1, ['array'], args)
- var schemas = rawSchemas.split('|')
- var arity = {}
+ const schemas = rawSchemas.split('|')
+ const arity = {}
- schemas.forEach(function (schema) {
- for (var ii = 0; ii < schema.length; ++ii) {
- var type = schema[ii]
+ schemas.forEach(schema => {
+ for (let ii = 0; ii < schema.length; ++ii) {
+ const type = schema[ii]
if (!types[type]) throw unknownType(ii, type)
}
if (/E.*E/.test(schema)) throw moreThanOneError(schema)
@@ -43,20 +44,18 @@ var validate = module.exports = function (rawSchemas, args) {
if (schema.length === 1) addSchema('', arity)
}
})
- var matching = arity[args.length]
+ let matching = arity[args.length]
if (!matching) {
throw wrongNumberOfArgs(Object.keys(arity), args.length)
}
- for (var ii = 0; ii < args.length; ++ii) {
- var newMatching = matching.filter(function (schema) {
- var type = schema[ii]
- var typeCheck = types[type].check
+ for (let ii = 0; ii < args.length; ++ii) {
+ let newMatching = matching.filter(schema => {
+ const type = schema[ii]
+ const typeCheck = types[type].check
return typeCheck(args[ii])
})
if (!newMatching.length) {
- var labels = matching.map(function (schema) {
- return types[schema[ii]].label
- }).filter(function (schema) { return schema != null })
+ const labels = matching.map(_ => types[_[ii]].label).filter(_ => _ != null)
throw invalidType(ii, labels, args[ii])
}
matching = newMatching
@@ -72,8 +71,8 @@ function unknownType (num, type) {
}
function invalidType (num, expectedTypes, value) {
- var valueType
- Object.keys(types).forEach(function (typeCode) {
+ let valueType
+ Object.keys(types).forEach(typeCode => {
if (types[typeCode].check(value)) valueType = types[typeCode].label
})
return newException('EINVALIDTYPE', 'Argument #' + (num + 1) + ': Expected ' +
@@ -85,8 +84,8 @@ function englishList (list) {
}
function wrongNumberOfArgs (expected, got) {
- var english = englishList(expected)
- var args = expected.every(function (ex) { return ex.length === 1 })
+ const english = englishList(expected)
+ const args = expected.every(ex => ex.length === 1)
? 'argument'
: 'arguments'
return newException('EWRONGARGCOUNT', 'Expected ' + english + ' ' + args + ' but got ' + got)
@@ -98,8 +97,9 @@ function moreThanOneError (schema) {
}
function newException (code, msg) {
- var e = new Error(msg)
- e.code = code
- if (Error.captureStackTrace) Error.captureStackTrace(e, validate)
- return e
+ const err = new Error(msg)
+ err.code = code
+ /* istanbul ignore else */
+ if (Error.captureStackTrace) Error.captureStackTrace(err, validate)
+ return err
}
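The aproba 2.0.0 rewrite above converts its type table to arrow-function predicates keyed by one-letter codes. The core idea can be distilled into a few lines; this is an illustrative miniature of the pattern, not aproba's full validator (which also handles alternate schemas and rich error codes).

```javascript
// One-letter type codes mapped to label/predicate pairs, in the style
// of aproba's `types` table.
const types = {
  S: { label: 'string', check: _ => typeof _ === 'string' },
  N: { label: 'number', check: _ => typeof _ === 'number' },
  A: { label: 'array', check: _ => Array.isArray(_) }
}

// Check an argument list against a schema string such as 'SNA':
// the i-th argument must satisfy the i-th code's predicate.
function checkArgs (schema, args) {
  return schema.length === args.length &&
    schema.split('').every((code, i) => types[code].check(args[i]))
}

console.log(checkArgs('SN', ['id', 42]))  // → true
console.log(checkArgs('SN', [42, 'id']))  // → false
```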
diff --git a/deps/npm/node_modules/aproba/package.json b/deps/npm/node_modules/aproba/package.json
index 534c6beb57e4ee..42a7798b0e7829 100644
--- a/deps/npm/node_modules/aproba/package.json
+++ b/deps/npm/node_modules/aproba/package.json
@@ -1,38 +1,29 @@
{
- "_args": [
- [
- "aproba@1.2.0",
- "/Users/rebecca/code/npm"
- ]
- ],
- "_from": "aproba@1.2.0",
- "_id": "aproba@1.2.0",
+ "_from": "aproba@2.0.0",
+ "_id": "aproba@2.0.0",
"_inBundle": false,
- "_integrity": "sha512-Y9J6ZjXtoYh8RnXVCMOU/ttDmk1aBjunq9vO0ta5x85WDQiQfUF9sIPBITdbiiIVcBo03Hi3jMxigBtsddlXRw==",
+ "_integrity": "sha512-lYe4Gx7QT+MKGbDsA+Z+he/Wtef0BiwDOlK/XkBrdfsh9J/jPPXbX0tE9x9cl27Tmu5gg3QUbUrQYa/y+KOHPQ==",
"_location": "/aproba",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
- "raw": "aproba@1.2.0",
+ "raw": "aproba@2.0.0",
"name": "aproba",
"escapedName": "aproba",
- "rawSpec": "1.2.0",
+ "rawSpec": "2.0.0",
"saveSpec": null,
- "fetchSpec": "1.2.0"
+ "fetchSpec": "2.0.0"
},
"_requiredBy": [
+ "#USER",
"/",
- "/copy-concurrently",
- "/gauge",
- "/gentle-fs",
- "/move-concurrently",
- "/npm-profile",
- "/run-queue"
+ "/npm-profile"
],
- "_resolved": "https://registry.npmjs.org/aproba/-/aproba-1.2.0.tgz",
- "_spec": "1.2.0",
- "_where": "/Users/rebecca/code/npm",
+ "_resolved": "https://registry.npmjs.org/aproba/-/aproba-2.0.0.tgz",
+ "_shasum": "52520b8ae5b569215b354efc0caa3fe1e45a8adc",
+ "_spec": "aproba@2.0.0",
+ "_where": "/Users/aeschright/code/cli",
"author": {
"name": "Rebecca Turner",
"email": "me@re-becca.org"
@@ -40,11 +31,13 @@
"bugs": {
"url": "https://github.com/iarna/aproba/issues"
},
+ "bundleDependencies": false,
"dependencies": {},
+ "deprecated": false,
"description": "A ridiculously light-weight argument validator (now browser friendly)",
"devDependencies": {
- "standard": "^10.0.3",
- "tap": "^10.0.2"
+ "standard": "^11.0.1",
+ "tap": "^12.0.1"
},
"directories": {
"test": "test"
@@ -65,7 +58,8 @@
"url": "git+https://github.com/iarna/aproba.git"
},
"scripts": {
- "test": "standard && tap -j3 test/*.js"
+ "pretest": "standard",
+ "test": "tap --100 -J test/*.js"
},
- "version": "1.2.0"
+ "version": "2.0.0"
}
diff --git a/deps/npm/node_modules/readable-stream/.travis.yml b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/.travis.yml
similarity index 100%
rename from deps/npm/node_modules/readable-stream/.travis.yml
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/.travis.yml
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/CONTRIBUTING.md b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/CONTRIBUTING.md
new file mode 100644
index 00000000000000..f478d58dca85b2
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/CONTRIBUTING.md
@@ -0,0 +1,38 @@
+# Developer's Certificate of Origin 1.1
+
+By making a contribution to this project, I certify that:
+
+* (a) The contribution was created in whole or in part by me and I
+ have the right to submit it under the open source license
+ indicated in the file; or
+
+* (b) The contribution is based upon previous work that, to the best
+ of my knowledge, is covered under an appropriate open source
+ license and I have the right under that license to submit that
+ work with modifications, whether created in whole or in part
+ by me, under the same open source license (unless I am
+ permitted to submit under a different license), as indicated
+ in the file; or
+
+* (c) The contribution was provided directly to me by some other
+ person who certified (a), (b) or (c) and I have not modified
+ it.
+
+* (d) I understand and agree that this project and the contribution
+ are public and that a record of the contribution (including all
+ personal information I submit with it, including my sign-off) is
+ maintained indefinitely and may be redistributed consistent with
+ this project or the open source license(s) involved.
+
+## Moderation Policy
+
+The [Node.js Moderation Policy] applies to this WG.
+
+## Code of Conduct
+
+The [Node.js Code of Conduct][] applies to this WG.
+
+[Node.js Code of Conduct]:
+https://github.com/nodejs/node/blob/master/CODE_OF_CONDUCT.md
+[Node.js Moderation Policy]:
+https://github.com/nodejs/TSC/blob/master/Moderation-Policy.md
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/GOVERNANCE.md b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/GOVERNANCE.md
new file mode 100644
index 00000000000000..16ffb93f24bece
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/GOVERNANCE.md
@@ -0,0 +1,136 @@
+### Streams Working Group
+
+The Node.js Streams is jointly governed by a Working Group
+(WG)
+that is responsible for high-level guidance of the project.
+
+The WG has final authority over this project including:
+
+* Technical direction
+* Project governance and process (including this policy)
+* Contribution policy
+* GitHub repository hosting
+* Conduct guidelines
+* Maintaining the list of additional Collaborators
+
+For the current list of WG members, see the project
+[README.md](./README.md#current-project-team-members).
+
+### Collaborators
+
+The readable-stream GitHub repository is
+maintained by the WG and additional Collaborators who are added by the
+WG on an ongoing basis.
+
+Individuals making significant and valuable contributions are made
+Collaborators and given commit-access to the project. These
+individuals are identified by the WG and their addition as
+Collaborators is discussed during the WG meeting.
+
+_Note:_ If you make a significant contribution and are not considered
+for commit-access log an issue or contact a WG member directly and it
+will be brought up in the next WG meeting.
+
+Modifications of the contents of the readable-stream repository are
+made on
+a collaborative basis. Anybody with a GitHub account may propose a
+modification via pull request and it will be considered by the project
+Collaborators. All pull requests must be reviewed and accepted by a
+Collaborator with sufficient expertise who is able to take full
+responsibility for the change. In the case of pull requests proposed
+by an existing Collaborator, an additional Collaborator is required
+for sign-off. Consensus should be sought if additional Collaborators
+participate and there is disagreement around a particular
+modification. See _Consensus Seeking Process_ below for further detail
+on the consensus model used for governance.
+
+Collaborators may opt to elevate significant or controversial
+modifications, or modifications that have not found consensus to the
+WG for discussion by assigning the ***WG-agenda*** tag to a pull
+request or issue. The WG should serve as the final arbiter where
+required.
+
+For the current list of Collaborators, see the project
+[README.md](./README.md#members).
+
+### WG Membership
+
+WG seats are not time-limited. There is no fixed size of the WG.
+However, the expected target is between 6 and 12, to ensure adequate
+coverage of important areas of expertise, balanced with the ability to
+make decisions efficiently.
+
+There is no specific set of requirements or qualifications for WG
+membership beyond these rules.
+
+The WG may add additional members to the WG by unanimous consensus.
+
+A WG member may be removed from the WG by voluntary resignation, or by
+unanimous consensus of all other WG members.
+
+Changes to WG membership should be posted in the agenda, and may be
+suggested as any other agenda item (see "WG Meetings" below).
+
+If an addition or removal is proposed during a meeting, and the full
+WG is not in attendance to participate, then the addition or removal
+is added to the agenda for the subsequent meeting. This is to ensure
+that all members are given the opportunity to participate in all
+membership decisions. If a WG member is unable to attend a meeting
+where a planned membership decision is being made, then their consent
+is assumed.
+
+No more than 1/3 of the WG members may be affiliated with the same
+employer. If removal or resignation of a WG member, or a change of
+employment by a WG member, creates a situation where more than 1/3 of
+the WG membership shares an employer, then the situation must be
+immediately remedied by the resignation or removal of one or more WG
+members affiliated with the over-represented employer(s).
+
+### WG Meetings
+
+The WG meets occasionally on a Google Hangout On Air. A designated moderator
+approved by the WG runs the meeting. Each meeting should be
+published to YouTube.
+
+Items are added to the WG agenda that are considered contentious or
+are modifications of governance, contribution policy, WG membership,
+or release process.
+
+The intention of the agenda is not to approve or review all patches;
+that should happen continuously on GitHub and be handled by the larger
+group of Collaborators.
+
+Any community member or contributor can ask that something be added to
+the next meeting's agenda by logging a GitHub Issue. Any Collaborator,
+WG member or the moderator can add the item to the agenda by adding
+the ***WG-agenda*** tag to the issue.
+
+Prior to each WG meeting the moderator will share the Agenda with
+members of the WG. WG members can add any items they like to the
+agenda at the beginning of each meeting. The moderator and the WG
+cannot veto or remove items.
+
+The WG may invite persons or representatives from certain projects to
+participate in a non-voting capacity.
+
+The moderator is responsible for summarizing the discussion of each
+agenda item and sends it as a pull request after the meeting.
+
+### Consensus Seeking Process
+
+The WG follows a
+[Consensus
+Seeking](http://en.wikipedia.org/wiki/Consensus-seeking_decision-making)
+decision-making model.
+
+When an agenda item has appeared to reach a consensus the moderator
+will ask "Does anyone object?" as a final call for dissent from the
+consensus.
+
+If an agenda item cannot reach a consensus a WG member can call for
+either a closing vote or a vote to table the issue to the next
+meeting. The call for a vote must be seconded by a majority of the WG
+or else the discussion will continue. Simple majority wins.
+
+Note that changes to WG membership require a majority consensus. See
+"WG Membership" above.
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSE b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSE
new file mode 100644
index 00000000000000..2873b3b2e59507
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSE
@@ -0,0 +1,47 @@
+Node.js is licensed for use as follows:
+
+"""
+Copyright Node.js contributors. All rights reserved.
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to
+deal in the Software without restriction, including without limitation the
+rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
+sell copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
+IN THE SOFTWARE.
+"""
+
+This license applies to parts of Node.js originating from the
+https://github.com/joyent/node repository:
+
+"""
+Copyright Joyent, Inc. and other Node contributors. All rights reserved.
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to
+deal in the Software without restriction, including without limitation the
+rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
+sell copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
+IN THE SOFTWARE.
+"""
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/README.md b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/README.md
new file mode 100644
index 00000000000000..23fe3f3e3009a2
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/README.md
@@ -0,0 +1,58 @@
+# readable-stream
+
+***Node-core v8.11.1 streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream)
+
+
+[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
+[![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/)
+
+
+[![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream)
+
+```bash
+npm install --save readable-stream
+```
+
+***Node-core streams for userland***
+
+This package is a mirror of the Streams2 and Streams3 implementations in
+Node-core.
+
+Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v8.11.1/docs/api/stream.html).
+
+If you want to guarantee a stable streams base, regardless of what version of
+Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).
+
+As of version 2.0.0 **readable-stream** uses semantic versioning.
+
+# Streams Working Group
+
+`readable-stream` is maintained by the Streams Working Group, which
+oversees the development and maintenance of the Streams API within
+Node.js. The responsibilities of the Streams Working Group include:
+
+* Addressing stream issues on the Node.js issue tracker.
+* Authoring and editing stream documentation within the Node.js project.
+* Reviewing changes to stream subclasses within the Node.js project.
+* Redirecting changes to streams from the Node.js project to this
+ project.
+* Assisting in the implementation of stream providers within Node.js.
+* Recommending versions of `readable-stream` to be included in Node.js.
+* Messaging about the future of streams to give the community advance
+ notice of changes.
+
+
+## Team Members
+
+* **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com>
+ - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B
+* **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com>
+ - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242
+* **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org>
+ - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D
+* **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com>
+* **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com>
+* **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me>
+* **Matteo Collina** ([@mcollina](https://github.com/mcollina)) <matteo.collina@gmail.com>
+ - Release GPG key: 3ABC01543F22DD2239285CDD818674489FBC127E
+* **Irina Shestak** ([@lrlna](https://github.com/lrlna)) <shestak.irina@gmail.com>
diff --git a/deps/npm/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md
similarity index 99%
rename from deps/npm/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md
index 83275f192e4077..c141a99c26c638 100644
--- a/deps/npm/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md
@@ -56,5 +56,3 @@ simpler stream creation
* add isPaused/isFlowing
* add new docs section
* move isPaused to that section
-
-
diff --git a/deps/npm/node_modules/readable-stream/duplex-browser.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/duplex-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/duplex-browser.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/duplex-browser.js
diff --git a/deps/npm/node_modules/readable-stream/duplex.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/duplex.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/duplex.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/duplex.js
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js
new file mode 100644
index 00000000000000..a1ca813e5acbd8
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js
@@ -0,0 +1,131 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+// a duplex stream is just a stream that is both readable and writable.
+// Since JS doesn't have multiple prototypal inheritance, this class
+// prototypally inherits from Readable, and then parasitically from
+// Writable.
+
+'use strict';
+
+/**/
+
+var pna = require('process-nextick-args');
+/* */
+
+/**/
+var objectKeys = Object.keys || function (obj) {
+ var keys = [];
+ for (var key in obj) {
+ keys.push(key);
+ }return keys;
+};
+/* */
+
+module.exports = Duplex;
+
+/**/
+var util = require('core-util-is');
+util.inherits = require('inherits');
+/* */
+
+var Readable = require('./_stream_readable');
+var Writable = require('./_stream_writable');
+
+util.inherits(Duplex, Readable);
+
+{
+ // avoid scope creep, the keys array can then be collected
+ var keys = objectKeys(Writable.prototype);
+ for (var v = 0; v < keys.length; v++) {
+ var method = keys[v];
+ if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method];
+ }
+}
+
+function Duplex(options) {
+ if (!(this instanceof Duplex)) return new Duplex(options);
+
+ Readable.call(this, options);
+ Writable.call(this, options);
+
+ if (options && options.readable === false) this.readable = false;
+
+ if (options && options.writable === false) this.writable = false;
+
+ this.allowHalfOpen = true;
+ if (options && options.allowHalfOpen === false) this.allowHalfOpen = false;
+
+ this.once('end', onend);
+}
+
+Object.defineProperty(Duplex.prototype, 'writableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function () {
+ return this._writableState.highWaterMark;
+ }
+});
+
+// the no-half-open enforcer
+function onend() {
+ // if we allow half-open state, or if the writable side ended,
+ // then we're ok.
+ if (this.allowHalfOpen || this._writableState.ended) return;
+
+ // no more data can be written.
+ // But allow more writes to happen in this tick.
+ pna.nextTick(onEndNT, this);
+}
+
+function onEndNT(self) {
+ self.end();
+}
+
+Object.defineProperty(Duplex.prototype, 'destroyed', {
+ get: function () {
+ if (this._readableState === undefined || this._writableState === undefined) {
+ return false;
+ }
+ return this._readableState.destroyed && this._writableState.destroyed;
+ },
+ set: function (value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (this._readableState === undefined || this._writableState === undefined) {
+ return;
+ }
+
+ // backward compatibility, the user is explicitly
+ // managing destroyed
+ this._readableState.destroyed = value;
+ this._writableState.destroyed = value;
+ }
+});
+
+Duplex.prototype._destroy = function (err, cb) {
+ this.push(null);
+ this.end();
+
+ pna.nextTick(cb, err);
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js
new file mode 100644
index 00000000000000..a9c835884828d8
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js
@@ -0,0 +1,47 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+// a passthrough stream.
+// basically just the most minimal sort of Transform stream.
+// Every written chunk gets output as-is.
+
+'use strict';
+
+module.exports = PassThrough;
+
+var Transform = require('./_stream_transform');
+
+/**/
+var util = require('core-util-is');
+util.inherits = require('inherits');
+/* */
+
+util.inherits(PassThrough, Transform);
+
+function PassThrough(options) {
+ if (!(this instanceof PassThrough)) return new PassThrough(options);
+
+ Transform.call(this, options);
+}
+
+PassThrough.prototype._transform = function (chunk, encoding, cb) {
+ cb(null, chunk);
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js
new file mode 100644
index 00000000000000..bf34ac65e1108f
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js
@@ -0,0 +1,1019 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+'use strict';
+
+/**/
+
+var pna = require('process-nextick-args');
+/* */
+
+module.exports = Readable;
+
+/**/
+var isArray = require('isarray');
+/* */
+
+/**/
+var Duplex;
+/* */
+
+Readable.ReadableState = ReadableState;
+
+/**/
+var EE = require('events').EventEmitter;
+
+var EElistenerCount = function (emitter, type) {
+ return emitter.listeners(type).length;
+};
+/* */
+
+/**/
+var Stream = require('./internal/streams/stream');
+/* */
+
+/**/
+
+var Buffer = require('safe-buffer').Buffer;
+var OurUint8Array = global.Uint8Array || function () {};
+function _uint8ArrayToBuffer(chunk) {
+ return Buffer.from(chunk);
+}
+function _isUint8Array(obj) {
+ return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
+}
+
+/* */
+
+/**/
+var util = require('core-util-is');
+util.inherits = require('inherits');
+/* */
+
+/**/
+var debugUtil = require('util');
+var debug = void 0;
+if (debugUtil && debugUtil.debuglog) {
+ debug = debugUtil.debuglog('stream');
+} else {
+ debug = function () {};
+}
+/* */
+
+var BufferList = require('./internal/streams/BufferList');
+var destroyImpl = require('./internal/streams/destroy');
+var StringDecoder;
+
+util.inherits(Readable, Stream);
+
+var kProxyEvents = ['error', 'close', 'destroy', 'pause', 'resume'];
+
+function prependListener(emitter, event, fn) {
+ // Sadly this is not cacheable as some libraries bundle their own
+ // event emitter implementation with them.
+ if (typeof emitter.prependListener === 'function') return emitter.prependListener(event, fn);
+
+ // This is a hack to make sure that our error handler is attached before any
+ // userland ones. NEVER DO THIS. This is here only because this code needs
+ // to continue to work with older versions of Node.js that do not include
+ // the prependListener() method. The goal is to eventually remove this hack.
+ if (!emitter._events || !emitter._events[event]) emitter.on(event, fn);else if (isArray(emitter._events[event])) emitter._events[event].unshift(fn);else emitter._events[event] = [fn, emitter._events[event]];
+}
+
+function ReadableState(options, stream) {
+ Duplex = Duplex || require('./_stream_duplex');
+
+ options = options || {};
+
+ // Duplex streams are both readable and writable, but share
+ // the same options object.
+ // However, some cases require setting options to different
+ // values for the readable and the writable sides of the duplex stream.
+ // These options can be provided separately as readableXXX and writableXXX.
+ var isDuplex = stream instanceof Duplex;
+
+ // object stream flag. Used to make read(n) ignore n and to
+ // make all the buffer merging and length checks go away
+ this.objectMode = !!options.objectMode;
+
+ if (isDuplex) this.objectMode = this.objectMode || !!options.readableObjectMode;
+
+ // the point at which it stops calling _read() to fill the buffer
+ // Note: 0 is a valid value, means "don't call _read preemptively ever"
+ var hwm = options.highWaterMark;
+ var readableHwm = options.readableHighWaterMark;
+ var defaultHwm = this.objectMode ? 16 : 16 * 1024;
+
+ if (hwm || hwm === 0) this.highWaterMark = hwm;else if (isDuplex && (readableHwm || readableHwm === 0)) this.highWaterMark = readableHwm;else this.highWaterMark = defaultHwm;
+
+ // cast to ints.
+ this.highWaterMark = Math.floor(this.highWaterMark);
+
+ // A linked list is used to store data chunks instead of an array because the
+ // linked list can remove elements from the beginning faster than
+ // array.shift()
+ this.buffer = new BufferList();
+ this.length = 0;
+ this.pipes = null;
+ this.pipesCount = 0;
+ this.flowing = null;
+ this.ended = false;
+ this.endEmitted = false;
+ this.reading = false;
+
+ // a flag to be able to tell if the event 'readable'/'data' is emitted
+ // immediately, or on a later tick. We set this to true at first, because
+ // any actions that shouldn't happen until "later" should generally also
+ // not happen before the first read call.
+ this.sync = true;
+
+ // whenever we return null, then we set a flag to say
+ // that we're awaiting a 'readable' event emission.
+ this.needReadable = false;
+ this.emittedReadable = false;
+ this.readableListening = false;
+ this.resumeScheduled = false;
+
+ // has it been destroyed
+ this.destroyed = false;
+
+ // Crypto is kind of old and crusty. Historically, its default string
+ // encoding is 'binary' so we have to make this configurable.
+ // Everything else in the universe uses 'utf8', though.
+ this.defaultEncoding = options.defaultEncoding || 'utf8';
+
+ // the number of writers that are awaiting a drain event in .pipe()s
+ this.awaitDrain = 0;
+
+ // if true, a maybeReadMore has been scheduled
+ this.readingMore = false;
+
+ this.decoder = null;
+ this.encoding = null;
+ if (options.encoding) {
+ if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
+ this.decoder = new StringDecoder(options.encoding);
+ this.encoding = options.encoding;
+ }
+}
+
+function Readable(options) {
+ Duplex = Duplex || require('./_stream_duplex');
+
+ if (!(this instanceof Readable)) return new Readable(options);
+
+ this._readableState = new ReadableState(options, this);
+
+ // legacy
+ this.readable = true;
+
+ if (options) {
+ if (typeof options.read === 'function') this._read = options.read;
+
+ if (typeof options.destroy === 'function') this._destroy = options.destroy;
+ }
+
+ Stream.call(this);
+}
+
+Object.defineProperty(Readable.prototype, 'destroyed', {
+ get: function () {
+ if (this._readableState === undefined) {
+ return false;
+ }
+ return this._readableState.destroyed;
+ },
+ set: function (value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (!this._readableState) {
+ return;
+ }
+
+ // backward compatibility, the user is explicitly
+ // managing destroyed
+ this._readableState.destroyed = value;
+ }
+});
+
+Readable.prototype.destroy = destroyImpl.destroy;
+Readable.prototype._undestroy = destroyImpl.undestroy;
+Readable.prototype._destroy = function (err, cb) {
+ this.push(null);
+ cb(err);
+};
+
+// Manually shove something into the read() buffer.
+// This returns true if the highWaterMark has not been hit yet,
+// similar to how Writable.write() returns true if you should
+// write() some more.
+Readable.prototype.push = function (chunk, encoding) {
+ var state = this._readableState;
+ var skipChunkCheck;
+
+ if (!state.objectMode) {
+ if (typeof chunk === 'string') {
+ encoding = encoding || state.defaultEncoding;
+ if (encoding !== state.encoding) {
+ chunk = Buffer.from(chunk, encoding);
+ encoding = '';
+ }
+ skipChunkCheck = true;
+ }
+ } else {
+ skipChunkCheck = true;
+ }
+
+ return readableAddChunk(this, chunk, encoding, false, skipChunkCheck);
+};
+
+// Unshift should *always* be something directly out of read()
+Readable.prototype.unshift = function (chunk) {
+ return readableAddChunk(this, chunk, null, true, false);
+};
+
+function readableAddChunk(stream, chunk, encoding, addToFront, skipChunkCheck) {
+ var state = stream._readableState;
+ if (chunk === null) {
+ state.reading = false;
+ onEofChunk(stream, state);
+ } else {
+ var er;
+ if (!skipChunkCheck) er = chunkInvalid(state, chunk);
+ if (er) {
+ stream.emit('error', er);
+ } else if (state.objectMode || chunk && chunk.length > 0) {
+ if (typeof chunk !== 'string' && !state.objectMode && Object.getPrototypeOf(chunk) !== Buffer.prototype) {
+ chunk = _uint8ArrayToBuffer(chunk);
+ }
+
+ if (addToFront) {
+ if (state.endEmitted) stream.emit('error', new Error('stream.unshift() after end event'));else addChunk(stream, state, chunk, true);
+ } else if (state.ended) {
+ stream.emit('error', new Error('stream.push() after EOF'));
+ } else {
+ state.reading = false;
+ if (state.decoder && !encoding) {
+ chunk = state.decoder.write(chunk);
+ if (state.objectMode || chunk.length !== 0) addChunk(stream, state, chunk, false);else maybeReadMore(stream, state);
+ } else {
+ addChunk(stream, state, chunk, false);
+ }
+ }
+ } else if (!addToFront) {
+ state.reading = false;
+ }
+ }
+
+ return needMoreData(state);
+}
+
+function addChunk(stream, state, chunk, addToFront) {
+ if (state.flowing && state.length === 0 && !state.sync) {
+ stream.emit('data', chunk);
+ stream.read(0);
+ } else {
+ // update the buffer info.
+ state.length += state.objectMode ? 1 : chunk.length;
+ if (addToFront) state.buffer.unshift(chunk);else state.buffer.push(chunk);
+
+ if (state.needReadable) emitReadable(stream);
+ }
+ maybeReadMore(stream, state);
+}
+
+function chunkInvalid(state, chunk) {
+ var er;
+ if (!_isUint8Array(chunk) && typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
+ er = new TypeError('Invalid non-string/buffer chunk');
+ }
+ return er;
+}
+
+// if it's past the high water mark, we can push in some more.
+// Also, if we have no data yet, we can stand some
+// more bytes. This is to work around cases where hwm=0,
+// such as the repl. Also, if the push() triggered a
+// readable event, and the user called read(largeNumber) such that
+// needReadable was set, then we ought to push more, so that another
+// 'readable' event will be triggered.
+function needMoreData(state) {
+ return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0);
+}
+
+Readable.prototype.isPaused = function () {
+ return this._readableState.flowing === false;
+};
+
+// backwards compatibility.
+Readable.prototype.setEncoding = function (enc) {
+ if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
+ this._readableState.decoder = new StringDecoder(enc);
+ this._readableState.encoding = enc;
+ return this;
+};
+
+// Don't raise the hwm > 8MB
+var MAX_HWM = 0x800000;
+function computeNewHighWaterMark(n) {
+ if (n >= MAX_HWM) {
+ n = MAX_HWM;
+ } else {
+ // Get the next highest power of 2 to prevent increasing hwm excessively in
+ // tiny amounts
+ n--;
+ n |= n >>> 1;
+ n |= n >>> 2;
+ n |= n >>> 4;
+ n |= n >>> 8;
+ n |= n >>> 16;
+ n++;
+ }
+ return n;
+}
+
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+function howMuchToRead(n, state) {
+ if (n <= 0 || state.length === 0 && state.ended) return 0;
+ if (state.objectMode) return 1;
+ if (n !== n) {
+ // Only flow one buffer at a time
+ if (state.flowing && state.length) return state.buffer.head.data.length;else return state.length;
+ }
+ // If we're asking for more than the current hwm, then raise the hwm.
+ if (n > state.highWaterMark) state.highWaterMark = computeNewHighWaterMark(n);
+ if (n <= state.length) return n;
+ // Don't have enough
+ if (!state.ended) {
+ state.needReadable = true;
+ return 0;
+ }
+ return state.length;
+}
+
+// you can override either this method, or the async _read(n) below.
+Readable.prototype.read = function (n) {
+ debug('read', n);
+ n = parseInt(n, 10);
+ var state = this._readableState;
+ var nOrig = n;
+
+ if (n !== 0) state.emittedReadable = false;
+
+ // if we're doing read(0) to trigger a readable event, but we
+ // already have a bunch of data in the buffer, then just trigger
+ // the 'readable' event and move on.
+ if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) {
+ debug('read: emitReadable', state.length, state.ended);
+ if (state.length === 0 && state.ended) endReadable(this);else emitReadable(this);
+ return null;
+ }
+
+ n = howMuchToRead(n, state);
+
+ // if we've ended, and we're now clear, then finish it up.
+ if (n === 0 && state.ended) {
+ if (state.length === 0) endReadable(this);
+ return null;
+ }
+
+ // All the actual chunk generation logic needs to be
+ // *below* the call to _read. The reason is that in certain
+ // synthetic stream cases, such as passthrough streams, _read
+ // may be a completely synchronous operation which may change
+ // the state of the read buffer, providing enough data when
+ // before there was *not* enough.
+ //
+ // So, the steps are:
+ // 1. Figure out what the state of things will be after we do
+ // a read from the buffer.
+ //
+ // 2. If that resulting state will trigger a _read, then call _read.
+ // Note that this may be asynchronous, or synchronous. Yes, it is
+ // deeply ugly to write APIs this way, but that still doesn't mean
+ // that the Readable class should behave improperly, as streams are
+ // designed to be sync/async agnostic.
+ // Take note if the _read call is sync or async (ie, if the read call
+ // has returned yet), so that we know whether or not it's safe to emit
+ // 'readable' etc.
+ //
+ // 3. Actually pull the requested chunks out of the buffer and return.
+
+ // if we need a readable event, then we need to do some reading.
+ var doRead = state.needReadable;
+ debug('need readable', doRead);
+
+ // if we currently have less than the highWaterMark, then also read some
+ if (state.length === 0 || state.length - n < state.highWaterMark) {
+ doRead = true;
+ debug('length less than watermark', doRead);
+ }
+
+ // however, if we've ended, then there's no point, and if we're already
+ // reading, then it's unnecessary.
+ if (state.ended || state.reading) {
+ doRead = false;
+ debug('reading or ended', doRead);
+ } else if (doRead) {
+ debug('do read');
+ state.reading = true;
+ state.sync = true;
+ // if the length is currently zero, then we *need* a readable event.
+ if (state.length === 0) state.needReadable = true;
+ // call internal read method
+ this._read(state.highWaterMark);
+ state.sync = false;
+ // If _read pushed data synchronously, then `reading` will be false,
+ // and we need to re-evaluate how much data we can return to the user.
+ if (!state.reading) n = howMuchToRead(nOrig, state);
+ }
+
+ var ret;
+ if (n > 0) ret = fromList(n, state);else ret = null;
+
+ if (ret === null) {
+ state.needReadable = true;
+ n = 0;
+ } else {
+ state.length -= n;
+ }
+
+ if (state.length === 0) {
+ // If we have nothing in the buffer, then we want to know
+ // as soon as we *do* get something into the buffer.
+ if (!state.ended) state.needReadable = true;
+
+ // If we tried to read() past the EOF, then emit end on the next tick.
+ if (nOrig !== n && state.ended) endReadable(this);
+ }
+
+ if (ret !== null) this.emit('data', ret);
+
+ return ret;
+};
+
+function onEofChunk(stream, state) {
+ if (state.ended) return;
+ if (state.decoder) {
+ var chunk = state.decoder.end();
+ if (chunk && chunk.length) {
+ state.buffer.push(chunk);
+ state.length += state.objectMode ? 1 : chunk.length;
+ }
+ }
+ state.ended = true;
+
+ // emit 'readable' now to make sure it gets picked up.
+ emitReadable(stream);
+}
+
+// Don't emit readable right away in sync mode, because this can trigger
+// another read() call => stack overflow. This way, it might trigger
+// a nextTick recursion warning, but that's not so bad.
+function emitReadable(stream) {
+ var state = stream._readableState;
+ state.needReadable = false;
+ if (!state.emittedReadable) {
+ debug('emitReadable', state.flowing);
+ state.emittedReadable = true;
+ if (state.sync) pna.nextTick(emitReadable_, stream);else emitReadable_(stream);
+ }
+}
+
+function emitReadable_(stream) {
+ debug('emit readable');
+ stream.emit('readable');
+ flow(stream);
+}
+
+// at this point, the user has presumably seen the 'readable' event,
+// and called read() to consume some data. that may have triggered
+// in turn another _read(n) call, in which case reading = true if
+// it's in progress.
+// However, if we're not ended, or reading, and the length < hwm,
+// then go ahead and try to read some more preemptively.
+function maybeReadMore(stream, state) {
+ if (!state.readingMore) {
+ state.readingMore = true;
+ pna.nextTick(maybeReadMore_, stream, state);
+ }
+}
+
+function maybeReadMore_(stream, state) {
+ var len = state.length;
+ while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) {
+ debug('maybeReadMore read 0');
+ stream.read(0);
+ if (len === state.length)
+ // didn't get any data, stop spinning.
+ break;else len = state.length;
+ }
+ state.readingMore = false;
+}
+
+// abstract method. to be overridden in specific implementation classes.
+// call cb(er, data) where data is <= n in length.
+// for virtual (non-string, non-buffer) streams, "length" is somewhat
+// arbitrary, and perhaps not very meaningful.
+Readable.prototype._read = function (n) {
+ this.emit('error', new Error('_read() is not implemented'));
+};
+
+Readable.prototype.pipe = function (dest, pipeOpts) {
+ var src = this;
+ var state = this._readableState;
+
+ switch (state.pipesCount) {
+ case 0:
+ state.pipes = dest;
+ break;
+ case 1:
+ state.pipes = [state.pipes, dest];
+ break;
+ default:
+ state.pipes.push(dest);
+ break;
+ }
+ state.pipesCount += 1;
+ debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts);
+
+ var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr;
+
+ var endFn = doEnd ? onend : unpipe;
+ if (state.endEmitted) pna.nextTick(endFn);else src.once('end', endFn);
+
+ dest.on('unpipe', onunpipe);
+ function onunpipe(readable, unpipeInfo) {
+ debug('onunpipe');
+ if (readable === src) {
+ if (unpipeInfo && unpipeInfo.hasUnpiped === false) {
+ unpipeInfo.hasUnpiped = true;
+ cleanup();
+ }
+ }
+ }
+
+ function onend() {
+ debug('onend');
+ dest.end();
+ }
+
+ // when the dest drains, it reduces the awaitDrain counter
+ // on the source. This would be more elegant with a .once()
+ // handler in flow(), but adding and removing repeatedly is
+ // too slow.
+ var ondrain = pipeOnDrain(src);
+ dest.on('drain', ondrain);
+
+ var cleanedUp = false;
+ function cleanup() {
+ debug('cleanup');
+ // cleanup event handlers once the pipe is broken
+ dest.removeListener('close', onclose);
+ dest.removeListener('finish', onfinish);
+ dest.removeListener('drain', ondrain);
+ dest.removeListener('error', onerror);
+ dest.removeListener('unpipe', onunpipe);
+ src.removeListener('end', onend);
+ src.removeListener('end', unpipe);
+ src.removeListener('data', ondata);
+
+ cleanedUp = true;
+
+ // if the reader is waiting for a drain event from this
+ // specific writer, then it would cause it to never start
+ // flowing again.
+ // So, if this is awaiting a drain, then we just call it now.
+ // If we don't know, then assume that we are waiting for one.
+ if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain();
+ }
+
+ // If the user pushes more data while we're writing to dest then we'll end up
+ // in ondata again. However, we only want to increase awaitDrain once because
+ // dest will only emit one 'drain' event for the multiple writes.
+ // => Introduce a guard on increasing awaitDrain.
+ var increasedAwaitDrain = false;
+ src.on('data', ondata);
+ function ondata(chunk) {
+ debug('ondata');
+ increasedAwaitDrain = false;
+ var ret = dest.write(chunk);
+ if (false === ret && !increasedAwaitDrain) {
+ // If the user unpiped during `dest.write()`, it is possible
+ // to get stuck in a permanently paused state if that write
+ // also returned false.
+ // => Check whether `dest` is still a piping destination.
+ if ((state.pipesCount === 1 && state.pipes === dest || state.pipesCount > 1 && indexOf(state.pipes, dest) !== -1) && !cleanedUp) {
+ debug('false write response, pause', src._readableState.awaitDrain);
+ src._readableState.awaitDrain++;
+ increasedAwaitDrain = true;
+ }
+ src.pause();
+ }
+ }
+
+ // if the dest has an error, then stop piping into it.
+ // however, don't suppress the throwing behavior for this.
+ function onerror(er) {
+ debug('onerror', er);
+ unpipe();
+ dest.removeListener('error', onerror);
+ if (EElistenerCount(dest, 'error') === 0) dest.emit('error', er);
+ }
+
+ // Make sure our error handler is attached before userland ones.
+ prependListener(dest, 'error', onerror);
+
+ // Both close and finish should trigger unpipe, but only once.
+ function onclose() {
+ dest.removeListener('finish', onfinish);
+ unpipe();
+ }
+ dest.once('close', onclose);
+ function onfinish() {
+ debug('onfinish');
+ dest.removeListener('close', onclose);
+ unpipe();
+ }
+ dest.once('finish', onfinish);
+
+ function unpipe() {
+ debug('unpipe');
+ src.unpipe(dest);
+ }
+
+ // tell the dest that it's being piped to
+ dest.emit('pipe', src);
+
+ // start the flow if it hasn't been started already.
+ if (!state.flowing) {
+ debug('pipe resume');
+ src.resume();
+ }
+
+ return dest;
+};
+
+function pipeOnDrain(src) {
+ return function () {
+ var state = src._readableState;
+ debug('pipeOnDrain', state.awaitDrain);
+ if (state.awaitDrain) state.awaitDrain--;
+ if (state.awaitDrain === 0 && EElistenerCount(src, 'data')) {
+ state.flowing = true;
+ flow(src);
+ }
+ };
+}
+
+Readable.prototype.unpipe = function (dest) {
+ var state = this._readableState;
+ var unpipeInfo = { hasUnpiped: false };
+
+ // if we're not piping anywhere, then do nothing.
+ if (state.pipesCount === 0) return this;
+
+ // just one destination. most common case.
+ if (state.pipesCount === 1) {
+ // passed in one, but it's not the right one.
+ if (dest && dest !== state.pipes) return this;
+
+ if (!dest) dest = state.pipes;
+
+ // got a match.
+ state.pipes = null;
+ state.pipesCount = 0;
+ state.flowing = false;
+ if (dest) dest.emit('unpipe', this, unpipeInfo);
+ return this;
+ }
+
+ // slow case. multiple pipe destinations.
+
+ if (!dest) {
+ // remove all.
+ var dests = state.pipes;
+ var len = state.pipesCount;
+ state.pipes = null;
+ state.pipesCount = 0;
+ state.flowing = false;
+
+ for (var i = 0; i < len; i++) {
+ dests[i].emit('unpipe', this, unpipeInfo);
+ }return this;
+ }
+
+ // try to find the right one.
+ var index = indexOf(state.pipes, dest);
+ if (index === -1) return this;
+
+ state.pipes.splice(index, 1);
+ state.pipesCount -= 1;
+ if (state.pipesCount === 1) state.pipes = state.pipes[0];
+
+ dest.emit('unpipe', this, unpipeInfo);
+
+ return this;
+};
+
+// set up data events if they are asked for
+// Ensure readable listeners eventually get something
+Readable.prototype.on = function (ev, fn) {
+ var res = Stream.prototype.on.call(this, ev, fn);
+
+ if (ev === 'data') {
+ // Start flowing on next tick if stream isn't explicitly paused
+ if (this._readableState.flowing !== false) this.resume();
+ } else if (ev === 'readable') {
+ var state = this._readableState;
+ if (!state.endEmitted && !state.readableListening) {
+ state.readableListening = state.needReadable = true;
+ state.emittedReadable = false;
+ if (!state.reading) {
+ pna.nextTick(nReadingNextTick, this);
+ } else if (state.length) {
+ emitReadable(this);
+ }
+ }
+ }
+
+ return res;
+};
+Readable.prototype.addListener = Readable.prototype.on;
+
+function nReadingNextTick(self) {
+ debug('readable nexttick read 0');
+ self.read(0);
+}
+
+// pause() and resume() are remnants of the legacy readable stream API
+// If the user uses them, then switch into old mode.
+Readable.prototype.resume = function () {
+ var state = this._readableState;
+ if (!state.flowing) {
+ debug('resume');
+ state.flowing = true;
+ resume(this, state);
+ }
+ return this;
+};
+
+function resume(stream, state) {
+ if (!state.resumeScheduled) {
+ state.resumeScheduled = true;
+ pna.nextTick(resume_, stream, state);
+ }
+}
+
+function resume_(stream, state) {
+ if (!state.reading) {
+ debug('resume read 0');
+ stream.read(0);
+ }
+
+ state.resumeScheduled = false;
+ state.awaitDrain = 0;
+ stream.emit('resume');
+ flow(stream);
+ if (state.flowing && !state.reading) stream.read(0);
+}
+
+Readable.prototype.pause = function () {
+ debug('call pause flowing=%j', this._readableState.flowing);
+ if (false !== this._readableState.flowing) {
+ debug('pause');
+ this._readableState.flowing = false;
+ this.emit('pause');
+ }
+ return this;
+};
+
+function flow(stream) {
+ var state = stream._readableState;
+ debug('flow', state.flowing);
+ while (state.flowing && stream.read() !== null) {}
+}
+
+// wrap an old-style stream as the async data source.
+// This is *not* part of the readable stream interface.
+// It is an ugly unfortunate mess of history.
+Readable.prototype.wrap = function (stream) {
+ var _this = this;
+
+ var state = this._readableState;
+ var paused = false;
+
+ stream.on('end', function () {
+ debug('wrapped end');
+ if (state.decoder && !state.ended) {
+ var chunk = state.decoder.end();
+ if (chunk && chunk.length) _this.push(chunk);
+ }
+
+ _this.push(null);
+ });
+
+ stream.on('data', function (chunk) {
+ debug('wrapped data');
+ if (state.decoder) chunk = state.decoder.write(chunk);
+
+ // don't skip over falsy values in objectMode
+ if (state.objectMode && (chunk === null || chunk === undefined)) return;else if (!state.objectMode && (!chunk || !chunk.length)) return;
+
+ var ret = _this.push(chunk);
+ if (!ret) {
+ paused = true;
+ stream.pause();
+ }
+ });
+
+ // proxy all the other methods.
+ // important when wrapping filters and duplexes.
+ for (var i in stream) {
+ if (this[i] === undefined && typeof stream[i] === 'function') {
+ this[i] = function (method) {
+ return function () {
+ return stream[method].apply(stream, arguments);
+ };
+ }(i);
+ }
+ }
+
+ // proxy certain important events.
+ for (var n = 0; n < kProxyEvents.length; n++) {
+ stream.on(kProxyEvents[n], this.emit.bind(this, kProxyEvents[n]));
+ }
+
+ // when we try to consume some more bytes, simply unpause the
+ // underlying stream.
+ this._read = function (n) {
+ debug('wrapped _read', n);
+ if (paused) {
+ paused = false;
+ stream.resume();
+ }
+ };
+
+ return this;
+};
+
+Object.defineProperty(Readable.prototype, 'readableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function () {
+ return this._readableState.highWaterMark;
+ }
+});
+
+// exposed for testing purposes only.
+Readable._fromList = fromList;
+
+// Pluck off n bytes from an array of buffers.
+// Length is the combined lengths of all the buffers in the list.
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+function fromList(n, state) {
+ // nothing buffered
+ if (state.length === 0) return null;
+
+ var ret;
+ if (state.objectMode) ret = state.buffer.shift();else if (!n || n >= state.length) {
+ // read it all, truncate the list
+ if (state.decoder) ret = state.buffer.join('');else if (state.buffer.length === 1) ret = state.buffer.head.data;else ret = state.buffer.concat(state.length);
+ state.buffer.clear();
+ } else {
+ // read part of list
+ ret = fromListPartial(n, state.buffer, state.decoder);
+ }
+
+ return ret;
+}
+
+// Extracts only enough buffered data to satisfy the amount requested.
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+function fromListPartial(n, list, hasStrings) {
+ var ret;
+ if (n < list.head.data.length) {
+ // slice is the same for buffers and strings
+ ret = list.head.data.slice(0, n);
+ list.head.data = list.head.data.slice(n);
+ } else if (n === list.head.data.length) {
+ // first chunk is a perfect match
+ ret = list.shift();
+ } else {
+ // result spans more than one buffer
+ ret = hasStrings ? copyFromBufferString(n, list) : copyFromBuffer(n, list);
+ }
+ return ret;
+}
+
+// Copies a specified amount of characters from the list of buffered data
+// chunks.
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+function copyFromBufferString(n, list) {
+ var p = list.head;
+ var c = 1;
+ var ret = p.data;
+ n -= ret.length;
+ while (p = p.next) {
+ var str = p.data;
+ var nb = n > str.length ? str.length : n;
+ if (nb === str.length) ret += str;else ret += str.slice(0, n);
+ n -= nb;
+ if (n === 0) {
+ if (nb === str.length) {
+ ++c;
+ if (p.next) list.head = p.next;else list.head = list.tail = null;
+ } else {
+ list.head = p;
+ p.data = str.slice(nb);
+ }
+ break;
+ }
+ ++c;
+ }
+ list.length -= c;
+ return ret;
+}
+
+// Copies a specified amount of bytes from the list of buffered data chunks.
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+function copyFromBuffer(n, list) {
+ var ret = Buffer.allocUnsafe(n);
+ var p = list.head;
+ var c = 1;
+ p.data.copy(ret);
+ n -= p.data.length;
+ while (p = p.next) {
+ var buf = p.data;
+ var nb = n > buf.length ? buf.length : n;
+ buf.copy(ret, ret.length - n, 0, nb);
+ n -= nb;
+ if (n === 0) {
+ if (nb === buf.length) {
+ ++c;
+ if (p.next) list.head = p.next;else list.head = list.tail = null;
+ } else {
+ list.head = p;
+ p.data = buf.slice(nb);
+ }
+ break;
+ }
+ ++c;
+ }
+ list.length -= c;
+ return ret;
+}
+
+function endReadable(stream) {
+ var state = stream._readableState;
+
+ // If we get here before consuming all the bytes, then that is a
+ // bug in node. Should never happen.
+ if (state.length > 0) throw new Error('"endReadable()" called on non-empty stream');
+
+ if (!state.endEmitted) {
+ state.ended = true;
+ pna.nextTick(endReadableNT, state, stream);
+ }
+}
+
+function endReadableNT(state, stream) {
+ // Check that we didn't get one last unshift.
+ if (!state.endEmitted && state.length === 0) {
+ state.endEmitted = true;
+ stream.readable = false;
+ stream.emit('end');
+ }
+}
+
+function indexOf(xs, x) {
+ for (var i = 0, l = xs.length; i < l; i++) {
+ if (xs[i] === x) return i;
+ }
+ return -1;
+}
\ No newline at end of file
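As an aside for reviewers, the bit-twiddling in `computeNewHighWaterMark` above can be hard to parse at a glance. Here is a standalone sketch of the same next-power-of-two rounding (with the same 8MB `MAX_HWM` cap): the shift-or cascade smears the highest set bit into every lower position, so incrementing the result yields the next power of two.

```javascript
// Standalone sketch of the rounding used by computeNewHighWaterMark.
var MAX_HWM = 0x800000; // 8MB cap, same as the vendored file

function nextPowerOfTwo(n) {
  if (n >= MAX_HWM) return MAX_HWM;
  n--;           // so exact powers of two map to themselves
  n |= n >>> 1;  // smear the top set bit downward...
  n |= n >>> 2;
  n |= n >>> 4;
  n |= n >>> 8;
  n |= n >>> 16; // ...until every bit below it is set
  return n + 1;  // all-ones below the top bit, plus one = next power of two
}

console.log(nextPowerOfTwo(100));      // 128
console.log(nextPowerOfTwo(1024));     // 1024
console.log(nextPowerOfTwo(0x900000)); // 8388608 (capped at MAX_HWM)
```

This is why a `read(largeNumber)` call never inflates the high-water mark in tiny increments: each raise jumps to a power-of-two boundary.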
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js
new file mode 100644
index 00000000000000..5d1f8b876d98c7
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js
@@ -0,0 +1,214 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+// a transform stream is a readable/writable stream where you do
+// something with the data. Sometimes it's called a "filter",
+// but that's not a great name for it, since that implies a thing where
+// some bits pass through, and others are simply ignored. (That would
+// be a valid example of a transform, of course.)
+//
+// While the output is causally related to the input, it's not a
+// necessarily symmetric or synchronous transformation. For example,
+// a zlib stream might take multiple plain-text writes(), and then
+// emit a single compressed chunk some time in the future.
+//
+// Here's how this works:
+//
+// The Transform stream has all the aspects of the readable and writable
+// stream classes. When you write(chunk), that calls _write(chunk,cb)
+// internally, and returns false if there's a lot of pending writes
+// buffered up. When you call read(), that calls _read(n) until
+// there's enough pending readable data buffered up.
+//
+// In a transform stream, the written data is placed in a buffer. When
+// _read(n) is called, it transforms the queued up data, calling the
+// buffered _write cb's as it consumes chunks. If consuming a single
+// written chunk would result in multiple output chunks, then the first
+// outputted bit calls the readcb, and subsequent chunks just go into
+// the read buffer, and will cause it to emit 'readable' if necessary.
+//
+// This way, back-pressure is actually determined by the reading side,
+// since _read has to be called to start processing a new chunk. However,
+// a pathological inflate type of transform can cause excessive buffering
+// here. For example, imagine a stream where every byte of input is
+// interpreted as an integer from 0-255, and then results in that many
+// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+// 1kb of data being output. In this case, you could write a very small
+// amount of input, and end up with a very large amount of output. In
+// such a pathological inflating mechanism, there'd be no way to tell
+// the system to stop doing the transform. A single 4MB write could
+// cause the system to run out of memory.
+//
+// However, even in such a pathological case, only a single written chunk
+// would be consumed, and then the rest would wait (un-transformed) until
+// the results of the previous transformed chunk were consumed.
+
+'use strict';
+
+module.exports = Transform;
+
+var Duplex = require('./_stream_duplex');
+
+/**/
+var util = require('core-util-is');
+util.inherits = require('inherits');
+/* */
+
+util.inherits(Transform, Duplex);
+
+function afterTransform(er, data) {
+ var ts = this._transformState;
+ ts.transforming = false;
+
+ var cb = ts.writecb;
+
+ if (!cb) {
+ return this.emit('error', new Error('write callback called multiple times'));
+ }
+
+ ts.writechunk = null;
+ ts.writecb = null;
+
+ if (data != null) // single equals check for both `null` and `undefined`
+ this.push(data);
+
+ cb(er);
+
+ var rs = this._readableState;
+ rs.reading = false;
+ if (rs.needReadable || rs.length < rs.highWaterMark) {
+ this._read(rs.highWaterMark);
+ }
+}
+
+function Transform(options) {
+ if (!(this instanceof Transform)) return new Transform(options);
+
+ Duplex.call(this, options);
+
+ this._transformState = {
+ afterTransform: afterTransform.bind(this),
+ needTransform: false,
+ transforming: false,
+ writecb: null,
+ writechunk: null,
+ writeencoding: null
+ };
+
+ // start out asking for a readable event once data is transformed.
+ this._readableState.needReadable = true;
+
+ // we have implemented the _read method, and done the other things
+ // that Readable wants before the first _read call, so unset the
+ // sync guard flag.
+ this._readableState.sync = false;
+
+ if (options) {
+ if (typeof options.transform === 'function') this._transform = options.transform;
+
+ if (typeof options.flush === 'function') this._flush = options.flush;
+ }
+
+ // When the writable side finishes, then flush out anything remaining.
+ this.on('prefinish', prefinish);
+}
+
+function prefinish() {
+ var _this = this;
+
+ if (typeof this._flush === 'function') {
+ this._flush(function (er, data) {
+ done(_this, er, data);
+ });
+ } else {
+ done(this, null, null);
+ }
+}
+
+Transform.prototype.push = function (chunk, encoding) {
+ this._transformState.needTransform = false;
+ return Duplex.prototype.push.call(this, chunk, encoding);
+};
+
+// This is the part where you do stuff!
+// override this function in implementation classes.
+// 'chunk' is an input chunk.
+//
+// Call `push(newChunk)` to pass along transformed output
+// to the readable side. You may call 'push' zero or more times.
+//
+// Call `cb(err)` when you are done with this chunk. If you pass
+// an error, then that'll put the hurt on the whole operation. If you
+// never call cb(), then you'll never get another chunk.
+Transform.prototype._transform = function (chunk, encoding, cb) {
+ throw new Error('_transform() is not implemented');
+};
+
+Transform.prototype._write = function (chunk, encoding, cb) {
+ var ts = this._transformState;
+ ts.writecb = cb;
+ ts.writechunk = chunk;
+ ts.writeencoding = encoding;
+ if (!ts.transforming) {
+ var rs = this._readableState;
+ if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark);
+ }
+};
+
+// Doesn't matter what the args are here.
+// _transform does all the work.
+// That we got here means that the readable side wants more data.
+Transform.prototype._read = function (n) {
+ var ts = this._transformState;
+
+ if (ts.writechunk !== null && ts.writecb && !ts.transforming) {
+ ts.transforming = true;
+ this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
+ } else {
+ // mark that we need a transform, so that any data that comes in
+ // will get processed, now that we've asked for it.
+ ts.needTransform = true;
+ }
+};
+
+Transform.prototype._destroy = function (err, cb) {
+ var _this2 = this;
+
+ Duplex.prototype._destroy.call(this, err, function (err2) {
+ cb(err2);
+ _this2.emit('close');
+ });
+};
+
+function done(stream, er, data) {
+ if (er) return stream.emit('error', er);
+
+ if (data != null) // single equals check for both `null` and `undefined`
+ stream.push(data);
+
+ // if there's nothing in the write buffer, then that means
+ // that nothing more will ever be provided
+ if (stream._writableState.length) throw new Error('Calling transform done when ws.length != 0');
+
+ if (stream._transformState.transforming) throw new Error('Calling transform done when still transforming');
+
+ return stream.push(null);
+}
\ No newline at end of file
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js
new file mode 100644
index 00000000000000..b3f4e85a2f6e35
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js
@@ -0,0 +1,687 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+// A bit simpler than readable streams.
+// Implement an async ._write(chunk, encoding, cb), and it'll handle all
+// the drain event emission and buffering.
+
+'use strict';
+
+/**/
+
+var pna = require('process-nextick-args');
+/* */
+
+module.exports = Writable;
+
+/* */
+function WriteReq(chunk, encoding, cb) {
+ this.chunk = chunk;
+ this.encoding = encoding;
+ this.callback = cb;
+ this.next = null;
+}
+
+// It looks like a linked list, but it is not:
+// there will only ever be two of these per stream
+function CorkedRequest(state) {
+ var _this = this;
+
+ this.next = null;
+ this.entry = null;
+ this.finish = function () {
+ onCorkedFinish(_this, state);
+ };
+}
+/* */
+
+/**/
+var asyncWrite = !process.browser && ['v0.10', 'v0.9.'].indexOf(process.version.slice(0, 5)) > -1 ? setImmediate : pna.nextTick;
+/* */
+
+/**/
+var Duplex;
+/* */
+
+Writable.WritableState = WritableState;
+
+/**/
+var util = require('core-util-is');
+util.inherits = require('inherits');
+/* */
+
+/**/
+var internalUtil = {
+ deprecate: require('util-deprecate')
+};
+/* */
+
+/**/
+var Stream = require('./internal/streams/stream');
+/* */
+
+/**/
+
+var Buffer = require('safe-buffer').Buffer;
+var OurUint8Array = global.Uint8Array || function () {};
+function _uint8ArrayToBuffer(chunk) {
+ return Buffer.from(chunk);
+}
+function _isUint8Array(obj) {
+ return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
+}
+
+/* */
+
+var destroyImpl = require('./internal/streams/destroy');
+
+util.inherits(Writable, Stream);
+
+function nop() {}
+
+function WritableState(options, stream) {
+ Duplex = Duplex || require('./_stream_duplex');
+
+ options = options || {};
+
+ // Duplex streams are both readable and writable, but share
+ // the same options object.
+ // However, some cases require setting options to different
+ // values for the readable and the writable sides of the duplex stream.
+ // These options can be provided separately as readableXXX and writableXXX.
+ var isDuplex = stream instanceof Duplex;
+
+ // object stream flag to indicate whether or not this stream
+ // contains buffers or objects.
+ this.objectMode = !!options.objectMode;
+
+ if (isDuplex) this.objectMode = this.objectMode || !!options.writableObjectMode;
+
+ // the point at which write() starts returning false
+ // Note: 0 is a valid value, means that we always return false if
+ // the entire buffer is not flushed immediately on write()
+ var hwm = options.highWaterMark;
+ var writableHwm = options.writableHighWaterMark;
+ var defaultHwm = this.objectMode ? 16 : 16 * 1024;
+
+ if (hwm || hwm === 0) this.highWaterMark = hwm;else if (isDuplex && (writableHwm || writableHwm === 0)) this.highWaterMark = writableHwm;else this.highWaterMark = defaultHwm;
+
+ // cast to ints.
+ this.highWaterMark = Math.floor(this.highWaterMark);
+
+ // if _final has been called
+ this.finalCalled = false;
+
+ // drain event flag.
+ this.needDrain = false;
+ // at the start of calling end()
+ this.ending = false;
+ // when end() has been called, and returned
+ this.ended = false;
+ // when 'finish' is emitted
+ this.finished = false;
+
+ // has it been destroyed
+ this.destroyed = false;
+
+ // should we decode strings into buffers before passing to _write?
+ // this is here so that some node-core streams can optimize string
+ // handling at a lower level.
+ var noDecode = options.decodeStrings === false;
+ this.decodeStrings = !noDecode;
+
+ // Crypto is kind of old and crusty. Historically, its default string
+ // encoding is 'binary' so we have to make this configurable.
+ // Everything else in the universe uses 'utf8', though.
+ this.defaultEncoding = options.defaultEncoding || 'utf8';
+
+ // not an actual buffer we keep track of, but a measurement
+ // of how much we're waiting to get pushed to some underlying
+ // socket or file.
+ this.length = 0;
+
+ // a flag to see when we're in the middle of a write.
+ this.writing = false;
+
+ // when true all writes will be buffered until .uncork() call
+ this.corked = 0;
+
+ // a flag to be able to tell if the onwrite cb is called immediately,
+ // or on a later tick. We set this to true at first, because any
+ // actions that shouldn't happen until "later" should generally also
+ // not happen before the first write call.
+ this.sync = true;
+
+ // a flag to know if we're processing previously buffered items, which
+ // may call the _write() callback in the same tick, so that we don't
+ // end up in an overlapped onwrite situation.
+ this.bufferProcessing = false;
+
+ // the callback that's passed to _write(chunk,cb)
+ this.onwrite = function (er) {
+ onwrite(stream, er);
+ };
+
+ // the callback that the user supplies to write(chunk,encoding,cb)
+ this.writecb = null;
+
+ // the amount that is being written when _write is called.
+ this.writelen = 0;
+
+ this.bufferedRequest = null;
+ this.lastBufferedRequest = null;
+
+ // number of pending user-supplied write callbacks
+ // this must be 0 before 'finish' can be emitted
+ this.pendingcb = 0;
+
+ // emit prefinish if the only thing we're waiting for is _write cbs
+ // This is relevant for synchronous Transform streams
+ this.prefinished = false;
+
+ // True if the error was already emitted and should not be thrown again
+ this.errorEmitted = false;
+
+ // count buffered requests
+ this.bufferedRequestCount = 0;
+
+ // allocate the first CorkedRequest, there is always
+ // one allocated and free to use, and we maintain at most two
+ this.corkedRequestsFree = new CorkedRequest(this);
+}
+
+WritableState.prototype.getBuffer = function getBuffer() {
+ var current = this.bufferedRequest;
+ var out = [];
+ while (current) {
+ out.push(current);
+ current = current.next;
+ }
+ return out;
+};
+
+(function () {
+ try {
+ Object.defineProperty(WritableState.prototype, 'buffer', {
+ get: internalUtil.deprecate(function () {
+ return this.getBuffer();
+ }, '_writableState.buffer is deprecated. Use _writableState.getBuffer ' + 'instead.', 'DEP0003')
+ });
+ } catch (_) {}
+})();
+
+// Test _writableState for inheritance to account for Duplex streams,
+// whose prototype chain only points to Readable.
+var realHasInstance;
+if (typeof Symbol === 'function' && Symbol.hasInstance && typeof Function.prototype[Symbol.hasInstance] === 'function') {
+ realHasInstance = Function.prototype[Symbol.hasInstance];
+ Object.defineProperty(Writable, Symbol.hasInstance, {
+ value: function (object) {
+ if (realHasInstance.call(this, object)) return true;
+ if (this !== Writable) return false;
+
+ return object && object._writableState instanceof WritableState;
+ }
+ });
+} else {
+ realHasInstance = function (object) {
+ return object instanceof this;
+ };
+}
+
+function Writable(options) {
+ Duplex = Duplex || require('./_stream_duplex');
+
+ // Writable ctor is applied to Duplexes, too.
+ // `realHasInstance` is necessary because using plain `instanceof`
+ // would return false, as no `_writableState` property is attached.
+
+ // Trying to use the custom `instanceof` for Writable here will also break the
+ // Node.js LazyTransform implementation, which has a non-trivial getter for
+ // `_writableState` that would lead to infinite recursion.
+ if (!realHasInstance.call(Writable, this) && !(this instanceof Duplex)) {
+ return new Writable(options);
+ }
+
+ this._writableState = new WritableState(options, this);
+
+ // legacy.
+ this.writable = true;
+
+ if (options) {
+ if (typeof options.write === 'function') this._write = options.write;
+
+ if (typeof options.writev === 'function') this._writev = options.writev;
+
+ if (typeof options.destroy === 'function') this._destroy = options.destroy;
+
+ if (typeof options.final === 'function') this._final = options.final;
+ }
+
+ Stream.call(this);
+}
+
+// Otherwise people can pipe Writable streams, which is just wrong.
+Writable.prototype.pipe = function () {
+ this.emit('error', new Error('Cannot pipe, not readable'));
+};
+
+function writeAfterEnd(stream, cb) {
+ var er = new Error('write after end');
+ // TODO: defer error events consistently everywhere, not just the cb
+ stream.emit('error', er);
+ pna.nextTick(cb, er);
+}
+
+// Checks that a user-supplied chunk is valid, especially for the particular
+// mode the stream is in. Currently this means that `null` is never accepted
+// and undefined/non-string values are only allowed in object mode.
+function validChunk(stream, state, chunk, cb) {
+ var valid = true;
+ var er = false;
+
+ if (chunk === null) {
+ er = new TypeError('May not write null values to stream');
+ } else if (typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
+ er = new TypeError('Invalid non-string/buffer chunk');
+ }
+ if (er) {
+ stream.emit('error', er);
+ pna.nextTick(cb, er);
+ valid = false;
+ }
+ return valid;
+}
+
+Writable.prototype.write = function (chunk, encoding, cb) {
+ var state = this._writableState;
+ var ret = false;
+ var isBuf = !state.objectMode && _isUint8Array(chunk);
+
+ if (isBuf && !Buffer.isBuffer(chunk)) {
+ chunk = _uint8ArrayToBuffer(chunk);
+ }
+
+ if (typeof encoding === 'function') {
+ cb = encoding;
+ encoding = null;
+ }
+
+ if (isBuf) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
+
+ if (typeof cb !== 'function') cb = nop;
+
+ if (state.ended) writeAfterEnd(this, cb);else if (isBuf || validChunk(this, state, chunk, cb)) {
+ state.pendingcb++;
+ ret = writeOrBuffer(this, state, isBuf, chunk, encoding, cb);
+ }
+
+ return ret;
+};
+
+Writable.prototype.cork = function () {
+ var state = this._writableState;
+
+ state.corked++;
+};
+
+Writable.prototype.uncork = function () {
+ var state = this._writableState;
+
+ if (state.corked) {
+ state.corked--;
+
+ if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state);
+ }
+};
+
+Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
+ // node::ParseEncoding() requires lower case.
+ if (typeof encoding === 'string') encoding = encoding.toLowerCase();
+ if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2', 'utf16le', 'utf-16le', 'raw'].indexOf((encoding + '').toLowerCase()) > -1)) throw new TypeError('Unknown encoding: ' + encoding);
+ this._writableState.defaultEncoding = encoding;
+ return this;
+};
+
+function decodeChunk(state, chunk, encoding) {
+ if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') {
+ chunk = Buffer.from(chunk, encoding);
+ }
+ return chunk;
+}
+
+Object.defineProperty(Writable.prototype, 'writableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function () {
+ return this._writableState.highWaterMark;
+ }
+});
+
+// if we're already writing something, then just put this
+// in the queue, and wait our turn. Otherwise, call _write
+// If we return false, then we need a drain event, so set that flag.
+function writeOrBuffer(stream, state, isBuf, chunk, encoding, cb) {
+ if (!isBuf) {
+ var newChunk = decodeChunk(state, chunk, encoding);
+ if (chunk !== newChunk) {
+ isBuf = true;
+ encoding = 'buffer';
+ chunk = newChunk;
+ }
+ }
+ var len = state.objectMode ? 1 : chunk.length;
+
+ state.length += len;
+
+ var ret = state.length < state.highWaterMark;
+ // we must ensure that previous needDrain will not be reset to false.
+ if (!ret) state.needDrain = true;
+
+ if (state.writing || state.corked) {
+ var last = state.lastBufferedRequest;
+ state.lastBufferedRequest = {
+ chunk: chunk,
+ encoding: encoding,
+ isBuf: isBuf,
+ callback: cb,
+ next: null
+ };
+ if (last) {
+ last.next = state.lastBufferedRequest;
+ } else {
+ state.bufferedRequest = state.lastBufferedRequest;
+ }
+ state.bufferedRequestCount += 1;
+ } else {
+ doWrite(stream, state, false, len, chunk, encoding, cb);
+ }
+
+ return ret;
+}
+
+function doWrite(stream, state, writev, len, chunk, encoding, cb) {
+ state.writelen = len;
+ state.writecb = cb;
+ state.writing = true;
+ state.sync = true;
+ if (writev) stream._writev(chunk, state.onwrite);else stream._write(chunk, encoding, state.onwrite);
+ state.sync = false;
+}
+
+function onwriteError(stream, state, sync, er, cb) {
+ --state.pendingcb;
+
+ if (sync) {
+ // defer the callback if we are being called synchronously
+ // to avoid piling up things on the stack
+ pna.nextTick(cb, er);
+ // this can emit finish, and it will always happen
+ // after error
+ pna.nextTick(finishMaybe, stream, state);
+ stream._writableState.errorEmitted = true;
+ stream.emit('error', er);
+ } else {
+    // the caller expects this to happen
+    // first if it is async
+ cb(er);
+ stream._writableState.errorEmitted = true;
+ stream.emit('error', er);
+ // this can emit finish, but finish must
+ // always follow error
+ finishMaybe(stream, state);
+ }
+}
+
+function onwriteStateUpdate(state) {
+ state.writing = false;
+ state.writecb = null;
+ state.length -= state.writelen;
+ state.writelen = 0;
+}
+
+function onwrite(stream, er) {
+ var state = stream._writableState;
+ var sync = state.sync;
+ var cb = state.writecb;
+
+ onwriteStateUpdate(state);
+
+ if (er) onwriteError(stream, state, sync, er, cb);else {
+ // Check if we're actually ready to finish, but don't emit yet
+ var finished = needFinish(state);
+
+ if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) {
+ clearBuffer(stream, state);
+ }
+
+ if (sync) {
+ /**/
+ asyncWrite(afterWrite, stream, state, finished, cb);
+ /* */
+ } else {
+ afterWrite(stream, state, finished, cb);
+ }
+ }
+}
+
+function afterWrite(stream, state, finished, cb) {
+ if (!finished) onwriteDrain(stream, state);
+ state.pendingcb--;
+ cb();
+ finishMaybe(stream, state);
+}
+
+// Must force callback to be called on nextTick, so that we don't
+// emit 'drain' before the write() consumer gets the 'false' return
+// value, and has a chance to attach a 'drain' listener.
+function onwriteDrain(stream, state) {
+ if (state.length === 0 && state.needDrain) {
+ state.needDrain = false;
+ stream.emit('drain');
+ }
+}
+
+// if there's something in the buffer waiting, then process it
+function clearBuffer(stream, state) {
+ state.bufferProcessing = true;
+ var entry = state.bufferedRequest;
+
+ if (stream._writev && entry && entry.next) {
+ // Fast case, write everything using _writev()
+ var l = state.bufferedRequestCount;
+ var buffer = new Array(l);
+ var holder = state.corkedRequestsFree;
+ holder.entry = entry;
+
+ var count = 0;
+ var allBuffers = true;
+ while (entry) {
+ buffer[count] = entry;
+ if (!entry.isBuf) allBuffers = false;
+ entry = entry.next;
+ count += 1;
+ }
+ buffer.allBuffers = allBuffers;
+
+ doWrite(stream, state, true, state.length, buffer, '', holder.finish);
+
+ // doWrite is almost always async, defer these to save a bit of time
+ // as the hot path ends with doWrite
+ state.pendingcb++;
+ state.lastBufferedRequest = null;
+ if (holder.next) {
+ state.corkedRequestsFree = holder.next;
+ holder.next = null;
+ } else {
+ state.corkedRequestsFree = new CorkedRequest(state);
+ }
+ state.bufferedRequestCount = 0;
+ } else {
+ // Slow case, write chunks one-by-one
+ while (entry) {
+ var chunk = entry.chunk;
+ var encoding = entry.encoding;
+ var cb = entry.callback;
+ var len = state.objectMode ? 1 : chunk.length;
+
+ doWrite(stream, state, false, len, chunk, encoding, cb);
+ entry = entry.next;
+ state.bufferedRequestCount--;
+ // if we didn't call the onwrite immediately, then
+ // it means that we need to wait until it does.
+ // also, that means that the chunk and cb are currently
+ // being processed, so move the buffer counter past them.
+ if (state.writing) {
+ break;
+ }
+ }
+
+ if (entry === null) state.lastBufferedRequest = null;
+ }
+
+ state.bufferedRequest = entry;
+ state.bufferProcessing = false;
+}
+
+Writable.prototype._write = function (chunk, encoding, cb) {
+ cb(new Error('_write() is not implemented'));
+};
+
+Writable.prototype._writev = null;
+
+Writable.prototype.end = function (chunk, encoding, cb) {
+ var state = this._writableState;
+
+ if (typeof chunk === 'function') {
+ cb = chunk;
+ chunk = null;
+ encoding = null;
+ } else if (typeof encoding === 'function') {
+ cb = encoding;
+ encoding = null;
+ }
+
+ if (chunk !== null && chunk !== undefined) this.write(chunk, encoding);
+
+ // .end() fully uncorks
+ if (state.corked) {
+ state.corked = 1;
+ this.uncork();
+ }
+
+ // ignore unnecessary end() calls.
+ if (!state.ending && !state.finished) endWritable(this, state, cb);
+};
+
+function needFinish(state) {
+ return state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing;
+}
+function callFinal(stream, state) {
+ stream._final(function (err) {
+ state.pendingcb--;
+ if (err) {
+ stream.emit('error', err);
+ }
+ state.prefinished = true;
+ stream.emit('prefinish');
+ finishMaybe(stream, state);
+ });
+}
+function prefinish(stream, state) {
+ if (!state.prefinished && !state.finalCalled) {
+ if (typeof stream._final === 'function') {
+ state.pendingcb++;
+ state.finalCalled = true;
+ pna.nextTick(callFinal, stream, state);
+ } else {
+ state.prefinished = true;
+ stream.emit('prefinish');
+ }
+ }
+}
+
+function finishMaybe(stream, state) {
+ var need = needFinish(state);
+ if (need) {
+ prefinish(stream, state);
+ if (state.pendingcb === 0) {
+ state.finished = true;
+ stream.emit('finish');
+ }
+ }
+ return need;
+}
+
+function endWritable(stream, state, cb) {
+ state.ending = true;
+ finishMaybe(stream, state);
+ if (cb) {
+ if (state.finished) pna.nextTick(cb);else stream.once('finish', cb);
+ }
+ state.ended = true;
+ stream.writable = false;
+}
+
+function onCorkedFinish(corkReq, state, err) {
+ var entry = corkReq.entry;
+ corkReq.entry = null;
+ while (entry) {
+ var cb = entry.callback;
+ state.pendingcb--;
+ cb(err);
+ entry = entry.next;
+ }
+ if (state.corkedRequestsFree) {
+ state.corkedRequestsFree.next = corkReq;
+ } else {
+ state.corkedRequestsFree = corkReq;
+ }
+}
+
+Object.defineProperty(Writable.prototype, 'destroyed', {
+ get: function () {
+ if (this._writableState === undefined) {
+ return false;
+ }
+ return this._writableState.destroyed;
+ },
+ set: function (value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (!this._writableState) {
+ return;
+ }
+
+ // backward compatibility, the user is explicitly
+ // managing destroyed
+ this._writableState.destroyed = value;
+ }
+});
+
+Writable.prototype.destroy = destroyImpl.destroy;
+Writable.prototype._undestroy = destroyImpl.undestroy;
+Writable.prototype._destroy = function (err, cb) {
+ this.end();
+ cb(err);
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/BufferList.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/BufferList.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/lib/internal/streams/BufferList.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/BufferList.js
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js
new file mode 100644
index 00000000000000..5a0a0d88cec6f3
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js
@@ -0,0 +1,74 @@
+'use strict';
+
+/**/
+
+var pna = require('process-nextick-args');
+/* */
+
+// undocumented cb() API, needed for core, not for public API
+function destroy(err, cb) {
+ var _this = this;
+
+ var readableDestroyed = this._readableState && this._readableState.destroyed;
+ var writableDestroyed = this._writableState && this._writableState.destroyed;
+
+ if (readableDestroyed || writableDestroyed) {
+ if (cb) {
+ cb(err);
+ } else if (err && (!this._writableState || !this._writableState.errorEmitted)) {
+ pna.nextTick(emitErrorNT, this, err);
+ }
+ return this;
+ }
+
+ // we set destroyed to true before firing error callbacks in order
+ // to make it re-entrance safe in case destroy() is called within callbacks
+
+ if (this._readableState) {
+ this._readableState.destroyed = true;
+ }
+
+ // if this is a duplex stream mark the writable part as destroyed as well
+ if (this._writableState) {
+ this._writableState.destroyed = true;
+ }
+
+ this._destroy(err || null, function (err) {
+ if (!cb && err) {
+ pna.nextTick(emitErrorNT, _this, err);
+ if (_this._writableState) {
+ _this._writableState.errorEmitted = true;
+ }
+ } else if (cb) {
+ cb(err);
+ }
+ });
+
+ return this;
+}
+
+function undestroy() {
+ if (this._readableState) {
+ this._readableState.destroyed = false;
+ this._readableState.reading = false;
+ this._readableState.ended = false;
+ this._readableState.endEmitted = false;
+ }
+
+ if (this._writableState) {
+ this._writableState.destroyed = false;
+ this._writableState.ended = false;
+ this._writableState.ending = false;
+ this._writableState.finished = false;
+ this._writableState.errorEmitted = false;
+ }
+}
+
+function emitErrorNT(self, err) {
+ self.emit('error', err);
+}
+
+module.exports = {
+ destroy: destroy,
+ undestroy: undestroy
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream-browser.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream-browser.js
new file mode 100644
index 00000000000000..9332a3fdae7060
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream-browser.js
@@ -0,0 +1 @@
+module.exports = require('events').EventEmitter;
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream.js
new file mode 100644
index 00000000000000..ce2ad5b6ee57f4
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/stream.js
@@ -0,0 +1 @@
+module.exports = require('stream');
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json
new file mode 100644
index 00000000000000..387f98ab910c0e
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json
@@ -0,0 +1,81 @@
+{
+ "_from": "readable-stream@^2.0.6",
+ "_id": "readable-stream@2.3.6",
+ "_inBundle": false,
+ "_integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
+ "_location": "/are-we-there-yet/readable-stream",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "readable-stream@^2.0.6",
+ "name": "readable-stream",
+ "escapedName": "readable-stream",
+ "rawSpec": "^2.0.6",
+ "saveSpec": null,
+ "fetchSpec": "^2.0.6"
+ },
+ "_requiredBy": [
+ "/are-we-there-yet"
+ ],
+ "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
+ "_shasum": "b11c27d88b8ff1fbe070643cf94b0c79ae1b0aaf",
+ "_spec": "readable-stream@^2.0.6",
+ "_where": "/Users/aeschright/code/cli/node_modules/are-we-there-yet",
+ "browser": {
+ "util": false,
+ "./readable.js": "./readable-browser.js",
+ "./writable.js": "./writable-browser.js",
+ "./duplex.js": "./duplex-browser.js",
+ "./lib/internal/streams/stream.js": "./lib/internal/streams/stream-browser.js"
+ },
+ "bugs": {
+ "url": "https://github.com/nodejs/readable-stream/issues"
+ },
+ "bundleDependencies": false,
+ "dependencies": {
+ "core-util-is": "~1.0.0",
+ "inherits": "~2.0.3",
+ "isarray": "~1.0.0",
+ "process-nextick-args": "~2.0.0",
+ "safe-buffer": "~5.1.1",
+ "string_decoder": "~1.1.1",
+ "util-deprecate": "~1.0.1"
+ },
+ "deprecated": false,
+ "description": "Streams3, a user-land copy of the stream library from Node.js",
+ "devDependencies": {
+ "assert": "^1.4.0",
+ "babel-polyfill": "^6.9.1",
+ "buffer": "^4.9.0",
+ "lolex": "^2.3.2",
+ "nyc": "^6.4.0",
+ "tap": "^0.7.0",
+ "tape": "^4.8.0"
+ },
+ "homepage": "https://github.com/nodejs/readable-stream#readme",
+ "keywords": [
+ "readable",
+ "stream",
+ "pipe"
+ ],
+ "license": "MIT",
+ "main": "readable.js",
+ "name": "readable-stream",
+ "nyc": {
+ "include": [
+ "lib/**.js"
+ ]
+ },
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/nodejs/readable-stream.git"
+ },
+ "scripts": {
+ "ci": "tap test/parallel/*.js test/ours/*.js --tap | tee test.tap && node test/verify-dependencies.js",
+ "cover": "nyc npm test",
+ "report": "nyc report --reporter=lcov",
+ "test": "tap test/parallel/*.js test/ours/*.js && node test/verify-dependencies.js"
+ },
+ "version": "2.3.6"
+}
diff --git a/deps/npm/node_modules/readable-stream/passthrough.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/passthrough.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/passthrough.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/passthrough.js
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable-browser.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable-browser.js
new file mode 100644
index 00000000000000..e50372592ee6c6
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable-browser.js
@@ -0,0 +1,7 @@
+exports = module.exports = require('./lib/_stream_readable.js');
+exports.Stream = exports;
+exports.Readable = exports;
+exports.Writable = require('./lib/_stream_writable.js');
+exports.Duplex = require('./lib/_stream_duplex.js');
+exports.Transform = require('./lib/_stream_transform.js');
+exports.PassThrough = require('./lib/_stream_passthrough.js');
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable.js
new file mode 100644
index 00000000000000..ec89ec53306497
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/readable.js
@@ -0,0 +1,19 @@
+var Stream = require('stream');
+if (process.env.READABLE_STREAM === 'disable' && Stream) {
+ module.exports = Stream;
+ exports = module.exports = Stream.Readable;
+ exports.Readable = Stream.Readable;
+ exports.Writable = Stream.Writable;
+ exports.Duplex = Stream.Duplex;
+ exports.Transform = Stream.Transform;
+ exports.PassThrough = Stream.PassThrough;
+ exports.Stream = Stream;
+} else {
+ exports = module.exports = require('./lib/_stream_readable.js');
+ exports.Stream = Stream || exports;
+ exports.Readable = exports;
+ exports.Writable = require('./lib/_stream_writable.js');
+ exports.Duplex = require('./lib/_stream_duplex.js');
+ exports.Transform = require('./lib/_stream_transform.js');
+ exports.PassThrough = require('./lib/_stream_passthrough.js');
+}
diff --git a/deps/npm/node_modules/readable-stream/transform.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/transform.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/transform.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/transform.js
diff --git a/deps/npm/node_modules/readable-stream/writable-browser.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/writable-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/writable-browser.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/writable-browser.js
diff --git a/deps/npm/node_modules/readable-stream/writable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/writable.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/writable.js
rename to deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/writable.js
diff --git a/deps/npm/node_modules/string_decoder/.travis.yml b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/.travis.yml
similarity index 100%
rename from deps/npm/node_modules/string_decoder/.travis.yml
rename to deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/.travis.yml
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/LICENSE b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/LICENSE
new file mode 100644
index 00000000000000..2873b3b2e59507
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/LICENSE
@@ -0,0 +1,47 @@
+Node.js is licensed for use as follows:
+
+"""
+Copyright Node.js contributors. All rights reserved.
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to
+deal in the Software without restriction, including without limitation the
+rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
+sell copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
+IN THE SOFTWARE.
+"""
+
+This license applies to parts of Node.js originating from the
+https://github.com/joyent/node repository:
+
+"""
+Copyright Joyent, Inc. and other Node contributors. All rights reserved.
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to
+deal in the Software without restriction, including without limitation the
+rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
+sell copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
+IN THE SOFTWARE.
+"""
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/README.md b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/README.md
new file mode 100644
index 00000000000000..5fd58315ed5880
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/README.md
@@ -0,0 +1,47 @@
+# string_decoder
+
+***Node-core v8.9.4 string_decoder for userland***
+
+
+[![NPM](https://nodei.co/npm/string_decoder.png?downloads=true&downloadRank=true)](https://nodei.co/npm/string_decoder/)
+[![NPM](https://nodei.co/npm-dl/string_decoder.png?&months=6&height=3)](https://nodei.co/npm/string_decoder/)
+
+
+```bash
+npm install --save string_decoder
+```
+
+***Node-core string_decoder for userland***
+
+This package is a mirror of the string_decoder implementation in Node-core.
+
+Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v8.9.4/docs/api/).
+
+As of version 1.0.0 **string_decoder** uses semantic versioning.
+
+## Previous versions
+
+Previous version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10.
+
+## Update
+
+The *build/* directory contains a build script that will scrape the source from the [nodejs/node](https://github.com/nodejs/node) repo given a specific Node version.
+
+## Streams Working Group
+
+`string_decoder` is maintained by the Streams Working Group, which
+oversees the development and maintenance of the Streams API within
+Node.js. The responsibilities of the Streams Working Group include:
+
+* Addressing stream issues on the Node.js issue tracker.
+* Authoring and editing stream documentation within the Node.js project.
+* Reviewing changes to stream subclasses within the Node.js project.
+* Redirecting changes to streams from the Node.js project to this
+ project.
+* Assisting in the implementation of stream providers within Node.js.
+* Recommending versions of `readable-stream` to be included in Node.js.
+* Messaging about the future of streams to give the community advance
+ notice of changes.
+
+See [readable-stream](https://github.com/nodejs/readable-stream) for
+more details.
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/lib/string_decoder.js b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/lib/string_decoder.js
new file mode 100644
index 00000000000000..2e89e63f7933e4
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/lib/string_decoder.js
@@ -0,0 +1,296 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+'use strict';
+
+/*<replacement>*/
+
+var Buffer = require('safe-buffer').Buffer;
+/*</replacement>*/
+
+var isEncoding = Buffer.isEncoding || function (encoding) {
+ encoding = '' + encoding;
+ switch (encoding && encoding.toLowerCase()) {
+ case 'hex':case 'utf8':case 'utf-8':case 'ascii':case 'binary':case 'base64':case 'ucs2':case 'ucs-2':case 'utf16le':case 'utf-16le':case 'raw':
+ return true;
+ default:
+ return false;
+ }
+};
+
+function _normalizeEncoding(enc) {
+ if (!enc) return 'utf8';
+ var retried;
+ while (true) {
+ switch (enc) {
+ case 'utf8':
+ case 'utf-8':
+ return 'utf8';
+ case 'ucs2':
+ case 'ucs-2':
+ case 'utf16le':
+ case 'utf-16le':
+ return 'utf16le';
+ case 'latin1':
+ case 'binary':
+ return 'latin1';
+ case 'base64':
+ case 'ascii':
+ case 'hex':
+ return enc;
+ default:
+ if (retried) return; // undefined
+ enc = ('' + enc).toLowerCase();
+ retried = true;
+ }
+ }
+};
+
+// Do not cache `Buffer.isEncoding` when checking encoding names as some
+// modules monkey-patch it to support additional encodings
+function normalizeEncoding(enc) {
+ var nenc = _normalizeEncoding(enc);
+ if (typeof nenc !== 'string' && (Buffer.isEncoding === isEncoding || !isEncoding(enc))) throw new Error('Unknown encoding: ' + enc);
+ return nenc || enc;
+}
+
+// StringDecoder provides an interface for efficiently splitting a series of
+// buffers into a series of JS strings without breaking apart multi-byte
+// characters.
+exports.StringDecoder = StringDecoder;
+function StringDecoder(encoding) {
+ this.encoding = normalizeEncoding(encoding);
+ var nb;
+ switch (this.encoding) {
+ case 'utf16le':
+ this.text = utf16Text;
+ this.end = utf16End;
+ nb = 4;
+ break;
+ case 'utf8':
+ this.fillLast = utf8FillLast;
+ nb = 4;
+ break;
+ case 'base64':
+ this.text = base64Text;
+ this.end = base64End;
+ nb = 3;
+ break;
+ default:
+ this.write = simpleWrite;
+ this.end = simpleEnd;
+ return;
+ }
+ this.lastNeed = 0;
+ this.lastTotal = 0;
+ this.lastChar = Buffer.allocUnsafe(nb);
+}
+
+StringDecoder.prototype.write = function (buf) {
+ if (buf.length === 0) return '';
+ var r;
+ var i;
+ if (this.lastNeed) {
+ r = this.fillLast(buf);
+ if (r === undefined) return '';
+ i = this.lastNeed;
+ this.lastNeed = 0;
+ } else {
+ i = 0;
+ }
+ if (i < buf.length) return r ? r + this.text(buf, i) : this.text(buf, i);
+ return r || '';
+};
+
+StringDecoder.prototype.end = utf8End;
+
+// Returns only complete characters in a Buffer
+StringDecoder.prototype.text = utf8Text;
+
+// Attempts to complete a partial non-UTF-8 character using bytes from a Buffer
+StringDecoder.prototype.fillLast = function (buf) {
+ if (this.lastNeed <= buf.length) {
+ buf.copy(this.lastChar, this.lastTotal - this.lastNeed, 0, this.lastNeed);
+ return this.lastChar.toString(this.encoding, 0, this.lastTotal);
+ }
+ buf.copy(this.lastChar, this.lastTotal - this.lastNeed, 0, buf.length);
+ this.lastNeed -= buf.length;
+};
+
+// Checks the type of a UTF-8 byte, whether it's ASCII, a leading byte, or a
+// continuation byte. If an invalid byte is detected, -2 is returned.
+function utf8CheckByte(byte) {
+ if (byte <= 0x7F) return 0;else if (byte >> 5 === 0x06) return 2;else if (byte >> 4 === 0x0E) return 3;else if (byte >> 3 === 0x1E) return 4;
+ return byte >> 6 === 0x02 ? -1 : -2;
+}
+
+// Checks at most 3 bytes at the end of a Buffer in order to detect an
+// incomplete multi-byte UTF-8 character. The total number of bytes (2, 3, or 4)
+// needed to complete the UTF-8 character (if applicable) are returned.
+function utf8CheckIncomplete(self, buf, i) {
+ var j = buf.length - 1;
+ if (j < i) return 0;
+ var nb = utf8CheckByte(buf[j]);
+ if (nb >= 0) {
+ if (nb > 0) self.lastNeed = nb - 1;
+ return nb;
+ }
+ if (--j < i || nb === -2) return 0;
+ nb = utf8CheckByte(buf[j]);
+ if (nb >= 0) {
+ if (nb > 0) self.lastNeed = nb - 2;
+ return nb;
+ }
+ if (--j < i || nb === -2) return 0;
+ nb = utf8CheckByte(buf[j]);
+ if (nb >= 0) {
+ if (nb > 0) {
+ if (nb === 2) nb = 0;else self.lastNeed = nb - 3;
+ }
+ return nb;
+ }
+ return 0;
+}
+
+// Validates as many continuation bytes for a multi-byte UTF-8 character as
+// needed or are available. If we see a non-continuation byte where we expect
+// one, we "replace" the validated continuation bytes we've seen so far with
+// a single UTF-8 replacement character ('\ufffd'), to match v8's UTF-8 decoding
+// behavior. The continuation byte check is included three times in the case
+// where all of the continuation bytes for a character exist in the same buffer.
+// It is also done this way as a slight performance increase instead of using a
+// loop.
+function utf8CheckExtraBytes(self, buf, p) {
+ if ((buf[0] & 0xC0) !== 0x80) {
+ self.lastNeed = 0;
+ return '\ufffd';
+ }
+ if (self.lastNeed > 1 && buf.length > 1) {
+ if ((buf[1] & 0xC0) !== 0x80) {
+ self.lastNeed = 1;
+ return '\ufffd';
+ }
+ if (self.lastNeed > 2 && buf.length > 2) {
+ if ((buf[2] & 0xC0) !== 0x80) {
+ self.lastNeed = 2;
+ return '\ufffd';
+ }
+ }
+ }
+}
+
+// Attempts to complete a multi-byte UTF-8 character using bytes from a Buffer.
+function utf8FillLast(buf) {
+ var p = this.lastTotal - this.lastNeed;
+ var r = utf8CheckExtraBytes(this, buf, p);
+ if (r !== undefined) return r;
+ if (this.lastNeed <= buf.length) {
+ buf.copy(this.lastChar, p, 0, this.lastNeed);
+ return this.lastChar.toString(this.encoding, 0, this.lastTotal);
+ }
+ buf.copy(this.lastChar, p, 0, buf.length);
+ this.lastNeed -= buf.length;
+}
+
+// Returns all complete UTF-8 characters in a Buffer. If the Buffer ended on a
+// partial character, the character's bytes are buffered until the required
+// number of bytes are available.
+function utf8Text(buf, i) {
+ var total = utf8CheckIncomplete(this, buf, i);
+ if (!this.lastNeed) return buf.toString('utf8', i);
+ this.lastTotal = total;
+ var end = buf.length - (total - this.lastNeed);
+ buf.copy(this.lastChar, 0, end);
+ return buf.toString('utf8', i, end);
+}
+
+// For UTF-8, a replacement character is added when ending on a partial
+// character.
+function utf8End(buf) {
+ var r = buf && buf.length ? this.write(buf) : '';
+ if (this.lastNeed) return r + '\ufffd';
+ return r;
+}
+
+// UTF-16LE typically needs two bytes per character, but even if we have an even
+// number of bytes available, we need to check if we end on a leading/high
+// surrogate. In that case, we need to wait for the next two bytes in order to
+// decode the last character properly.
+function utf16Text(buf, i) {
+ if ((buf.length - i) % 2 === 0) {
+ var r = buf.toString('utf16le', i);
+ if (r) {
+ var c = r.charCodeAt(r.length - 1);
+ if (c >= 0xD800 && c <= 0xDBFF) {
+ this.lastNeed = 2;
+ this.lastTotal = 4;
+ this.lastChar[0] = buf[buf.length - 2];
+ this.lastChar[1] = buf[buf.length - 1];
+ return r.slice(0, -1);
+ }
+ }
+ return r;
+ }
+ this.lastNeed = 1;
+ this.lastTotal = 2;
+ this.lastChar[0] = buf[buf.length - 1];
+ return buf.toString('utf16le', i, buf.length - 1);
+}
+
+// For UTF-16LE we do not explicitly append special replacement characters if we
+// end on a partial character, we simply let v8 handle that.
+function utf16End(buf) {
+ var r = buf && buf.length ? this.write(buf) : '';
+ if (this.lastNeed) {
+ var end = this.lastTotal - this.lastNeed;
+ return r + this.lastChar.toString('utf16le', 0, end);
+ }
+ return r;
+}
+
+function base64Text(buf, i) {
+ var n = (buf.length - i) % 3;
+ if (n === 0) return buf.toString('base64', i);
+ this.lastNeed = 3 - n;
+ this.lastTotal = 3;
+ if (n === 1) {
+ this.lastChar[0] = buf[buf.length - 1];
+ } else {
+ this.lastChar[0] = buf[buf.length - 2];
+ this.lastChar[1] = buf[buf.length - 1];
+ }
+ return buf.toString('base64', i, buf.length - n);
+}
+
+function base64End(buf) {
+ var r = buf && buf.length ? this.write(buf) : '';
+ if (this.lastNeed) return r + this.lastChar.toString('base64', 0, 3 - this.lastNeed);
+ return r;
+}
+
+// Pass bytes on through for single-byte encodings (e.g. ascii, latin1, hex)
+function simpleWrite(buf) {
+ return buf.toString(this.encoding);
+}
+
+function simpleEnd(buf) {
+ return buf && buf.length ? this.write(buf) : '';
+}
\ No newline at end of file
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/package.json b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/package.json
new file mode 100644
index 00000000000000..1e76b8bffe4086
--- /dev/null
+++ b/deps/npm/node_modules/are-we-there-yet/node_modules/string_decoder/package.json
@@ -0,0 +1,59 @@
+{
+ "_from": "string_decoder@~1.1.1",
+ "_id": "string_decoder@1.1.1",
+ "_inBundle": false,
+ "_integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
+ "_location": "/are-we-there-yet/string_decoder",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "string_decoder@~1.1.1",
+ "name": "string_decoder",
+ "escapedName": "string_decoder",
+ "rawSpec": "~1.1.1",
+ "saveSpec": null,
+ "fetchSpec": "~1.1.1"
+ },
+ "_requiredBy": [
+ "/are-we-there-yet/readable-stream"
+ ],
+ "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz",
+ "_shasum": "9cf1611ba62685d7030ae9e4ba34149c3af03fc8",
+ "_spec": "string_decoder@~1.1.1",
+ "_where": "/Users/aeschright/code/cli/node_modules/are-we-there-yet/node_modules/readable-stream",
+ "bugs": {
+ "url": "https://github.com/nodejs/string_decoder/issues"
+ },
+ "bundleDependencies": false,
+ "dependencies": {
+ "safe-buffer": "~5.1.0"
+ },
+ "deprecated": false,
+ "description": "The string_decoder module from Node core",
+ "devDependencies": {
+ "babel-polyfill": "^6.23.0",
+ "core-util-is": "^1.0.2",
+ "inherits": "^2.0.3",
+ "tap": "~0.4.8"
+ },
+ "homepage": "https://github.com/nodejs/string_decoder",
+ "keywords": [
+ "string",
+ "decoder",
+ "browser",
+ "browserify"
+ ],
+ "license": "MIT",
+ "main": "lib/string_decoder.js",
+ "name": "string_decoder",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/nodejs/string_decoder.git"
+ },
+ "scripts": {
+ "ci": "tap test/parallel/*.js test/ours/*.js --tap | tee test.tap && node test/verify-dependencies.js",
+ "test": "tap test/parallel/*.js && node test/verify-dependencies"
+ },
+ "version": "1.1.1"
+}
diff --git a/deps/npm/node_modules/byte-size/README.hbs b/deps/npm/node_modules/byte-size/README.hbs
index ebf5101b37002e..8a57e4a9c784a8 100644
--- a/deps/npm/node_modules/byte-size/README.hbs
+++ b/deps/npm/node_modules/byte-size/README.hbs
@@ -7,6 +7,34 @@
{{>main}}
+### Load anywhere
+
+This library is compatible with Node.js, the Web and any style of module loader. It can be loaded anywhere, natively without transpilation.
+
+Node.js:
+
+```js
+const byteSize = require('byte-size')
+```
+
+Within Node.js with ECMAScript Module support enabled:
+
+```js
+import byteSize from 'byte-size'
+```
+
+Within a modern browser ECMAScript Module:
+
+```js
+import byteSize from './node_modules/byte-size/index.mjs'
+```
+
+Old browser (adds `window.byteSize`):
+
+```html
+<script nomodule src="./node_modules/byte-size/dist/index.js"></script>
+```
+
* * *
© 2014-18 Lloyd Brookes \<75pound@gmail.com\>. Documented by [jsdoc-to-markdown](https://github.com/jsdoc2md/jsdoc-to-markdown).
diff --git a/deps/npm/node_modules/byte-size/README.md b/deps/npm/node_modules/byte-size/README.md
index 768f21d347da20..4d383ad76b6a5a 100644
--- a/deps/npm/node_modules/byte-size/README.md
+++ b/deps/npm/node_modules/byte-size/README.md
@@ -8,7 +8,7 @@
## byte-size
-Convert a bytes value to a more human-readable format. Choose between [metric or IEC units](https://en.wikipedia.org/wiki/Gigabyte), summarised below.
+An isomorphic, load-anywhere function to convert a bytes value into a more human-readable format. Choose between [metric or IEC units](https://en.wikipedia.org/wiki/Gigabyte), summarised below.
Value | Metric
----- | -------------
@@ -100,6 +100,34 @@ const byteSize = require('byte-size')
'1.5 Kio'
```
+### Load anywhere
+
+This library is compatible with Node.js, the Web and any style of module loader. It can be loaded anywhere, natively without transpilation.
+
+Node.js:
+
+```js
+const byteSize = require('byte-size')
+```
+
+Within Node.js with ECMAScript Module support enabled:
+
+```js
+import byteSize from 'byte-size'
+```
+
+Within a modern browser ECMAScript Module:
+
+```js
+import byteSize from './node_modules/byte-size/index.mjs'
+```
+
+Old browser (adds `window.byteSize`):
+
+```html
+<script nomodule src="./node_modules/byte-size/dist/index.js"></script>
+```
+
* * *
© 2014-18 Lloyd Brookes \<75pound@gmail.com\>. Documented by [jsdoc-to-markdown](https://github.com/jsdoc2md/jsdoc-to-markdown).
diff --git a/deps/npm/node_modules/byte-size/dist/index.js b/deps/npm/node_modules/byte-size/dist/index.js
new file mode 100644
index 00000000000000..8253a63545b3ac
--- /dev/null
+++ b/deps/npm/node_modules/byte-size/dist/index.js
@@ -0,0 +1,152 @@
+(function (global, factory) {
+ typeof exports === 'object' && typeof module !== 'undefined' ? module.exports = factory() :
+ typeof define === 'function' && define.amd ? define(factory) :
+ (global = global || self, global.byteSize = factory());
+}(this, function () { 'use strict';
+
+ /**
+ * An isomorphic, load-anywhere function to convert a bytes value into a more human-readable format. Choose between [metric or IEC units](https://en.wikipedia.org/wiki/Gigabyte), summarised below.
+ *
+ * Value | Metric
+ * ----- | -------------
+ * 1000 | kB kilobyte
+ * 1000^2 | MB megabyte
+ * 1000^3 | GB gigabyte
+ * 1000^4 | TB terabyte
+ * 1000^5 | PB petabyte
+ * 1000^6 | EB exabyte
+ * 1000^7 | ZB zettabyte
+ * 1000^8 | YB yottabyte
+ *
+ * Value | IEC
+ * ----- | ------------
+ * 1024 | KiB kibibyte
+ * 1024^2 | MiB mebibyte
+ * 1024^3 | GiB gibibyte
+ * 1024^4 | TiB tebibyte
+ * 1024^5 | PiB pebibyte
+ * 1024^6 | EiB exbibyte
+ * 1024^7 | ZiB zebibyte
+ * 1024^8 | YiB yobibyte
+ *
+ * Value | Metric (octet)
+ * ----- | -------------
+ * 1000 | ko kilooctet
+ * 1000^2 | Mo megaoctet
+ * 1000^3 | Go gigaoctet
+ * 1000^4 | To teraoctet
+ * 1000^5 | Po petaoctet
+ * 1000^6 | Eo exaoctet
+ * 1000^7 | Zo zettaoctet
+ * 1000^8 | Yo yottaoctet
+ *
+ * Value | IEC (octet)
+ * ----- | ------------
+ * 1024 | Kio kilooctet
+ * 1024^2 | Mio mebioctet
+ * 1024^3 | Gio gibioctet
+ * 1024^4 | Tio tebioctet
+ * 1024^5 | Pio pebioctet
+ * 1024^6 | Eio exbioctet
+ * 1024^7 | Zio zebioctet
+ * 1024^8 | Yio yobioctet
+ *
+ * @module byte-size
+ * @example
+ * ```js
+ * const byteSize = require('byte-size')
+ * ```
+ */
+
+ class ByteSize {
+ constructor (bytes, options) {
+ options = options || {};
+ options.units = options.units || 'metric';
+ options.precision = typeof options.precision === 'undefined' ? 1 : options.precision;
+
+ const table = [
+ { expFrom: 0, expTo: 1, metric: 'B', iec: 'B', metric_octet: 'o', iec_octet: 'o' },
+ { expFrom: 1, expTo: 2, metric: 'kB', iec: 'KiB', metric_octet: 'ko', iec_octet: 'Kio' },
+ { expFrom: 2, expTo: 3, metric: 'MB', iec: 'MiB', metric_octet: 'Mo', iec_octet: 'Mio' },
+ { expFrom: 3, expTo: 4, metric: 'GB', iec: 'GiB', metric_octet: 'Go', iec_octet: 'Gio' },
+ { expFrom: 4, expTo: 5, metric: 'TB', iec: 'TiB', metric_octet: 'To', iec_octet: 'Tio' },
+ { expFrom: 5, expTo: 6, metric: 'PB', iec: 'PiB', metric_octet: 'Po', iec_octet: 'Pio' },
+ { expFrom: 6, expTo: 7, metric: 'EB', iec: 'EiB', metric_octet: 'Eo', iec_octet: 'Eio' },
+ { expFrom: 7, expTo: 8, metric: 'ZB', iec: 'ZiB', metric_octet: 'Zo', iec_octet: 'Zio' },
+ { expFrom: 8, expTo: 9, metric: 'YB', iec: 'YiB', metric_octet: 'Yo', iec_octet: 'Yio' }
+ ];
+
+ const base = options.units === 'metric' || options.units === 'metric_octet' ? 1000 : 1024;
+ const prefix = bytes < 0 ? '-' : '';
+ bytes = Math.abs(bytes);
+
+ for (let i = 0; i < table.length; i++) {
+ const lower = Math.pow(base, table[i].expFrom);
+ const upper = Math.pow(base, table[i].expTo);
+ if (bytes >= lower && bytes < upper) {
+ const units = table[i][options.units];
+ if (i === 0) {
+ this.value = prefix + bytes;
+ this.unit = units;
+ return
+ } else {
+ this.value = prefix + (bytes / lower).toFixed(options.precision);
+ this.unit = units;
+ return
+ }
+ }
+ }
+
+ this.value = prefix + bytes;
+ this.unit = '';
+ }
+
+ toString () {
+ return `${this.value} ${this.unit}`.trim()
+ }
+ }
+
+ /**
+ * @param {number} - the bytes value to convert.
+ * @param [options] {object} - optional config.
+ * @param [options.precision=1] {number} - number of decimal places.
+ * @param [options.units=metric] {string} - select `'metric'`, `'iec'`, `'metric_octet'` or `'iec_octet'` units.
+ * @returns {{ value: string, unit: string}}
+ * @alias module:byte-size
+ * @example
+ * ```js
+ * > const byteSize = require('byte-size')
+ *
+ * > byteSize(1580)
+ * { value: '1.6', unit: 'kB' }
+ *
+ * > byteSize(1580, { units: 'iec' })
+ * { value: '1.5', unit: 'KiB' }
+ *
+ * > byteSize(1580, { units: 'iec', precision: 3 })
+ * { value: '1.543', unit: 'KiB' }
+ *
+ * > byteSize(1580, { units: 'iec', precision: 0 })
+ * { value: '2', unit: 'KiB' }
+ *
+ * > byteSize(1580, { units: 'metric_octet' })
+ * { value: '1.6', unit: 'ko' }
+ *
+ * > byteSize(1580, { units: 'iec_octet' })
+ * { value: '1.5', unit: 'Kio' }
+ *
+ * > byteSize(1580, { units: 'iec_octet' }).toString()
+ * '1.5 Kio'
+ *
+ * > const { value, unit } = byteSize(1580, { units: 'iec_octet' })
+ * > `${value} ${unit}`
+ * '1.5 Kio'
+ * ```
+ */
+ function byteSize (bytes, options) {
+ return new ByteSize(bytes, options)
+ }
+
+ return byteSize;
+
+}));
diff --git a/deps/npm/node_modules/byte-size/index.js b/deps/npm/node_modules/byte-size/index.mjs
similarity index 90%
rename from deps/npm/node_modules/byte-size/index.js
rename to deps/npm/node_modules/byte-size/index.mjs
index ec1a0293892e62..2de3e205b087c6 100644
--- a/deps/npm/node_modules/byte-size/index.js
+++ b/deps/npm/node_modules/byte-size/index.mjs
@@ -1,7 +1,5 @@
-'use strict'
-
/**
- * Convert a bytes value to a more human-readable format. Choose between [metric or IEC units](https://en.wikipedia.org/wiki/Gigabyte), summarised below.
+ * An isomorphic, load-anywhere function to convert a bytes value into a more human-readable format. Choose between [metric or IEC units](https://en.wikipedia.org/wiki/Gigabyte), summarised below.
*
* Value | Metric
* ----- | -------------
@@ -53,7 +51,6 @@
* const byteSize = require('byte-size')
* ```
*/
-module.exports = byteSize
class ByteSize {
constructor (bytes, options) {
@@ -74,6 +71,8 @@ class ByteSize {
]
const base = options.units === 'metric' || options.units === 'metric_octet' ? 1000 : 1024
+ const prefix = bytes < 0 ? '-' : '';
+ bytes = Math.abs(bytes);
for (let i = 0; i < table.length; i++) {
const lower = Math.pow(base, table[i].expFrom)
@@ -81,18 +80,18 @@ class ByteSize {
if (bytes >= lower && bytes < upper) {
const units = table[i][options.units]
if (i === 0) {
- this.value = String(bytes)
+ this.value = prefix + bytes
this.unit = units
return
} else {
- this.value = (bytes / lower).toFixed(options.precision)
+ this.value = prefix + (bytes / lower).toFixed(options.precision)
this.unit = units
return
}
}
}
- this.value = String(bytes)
+ this.value = prefix + bytes
this.unit = ''
}
@@ -141,3 +140,5 @@ class ByteSize {
function byteSize (bytes, options) {
return new ByteSize(bytes, options)
}
+
+export default byteSize
diff --git a/deps/npm/node_modules/byte-size/package.json b/deps/npm/node_modules/byte-size/package.json
index f69fc683f38cc6..57e46ba988d9b5 100644
--- a/deps/npm/node_modules/byte-size/package.json
+++ b/deps/npm/node_modules/byte-size/package.json
@@ -1,32 +1,28 @@
{
- "_args": [
- [
- "byte-size@4.0.3",
- "/Users/rebecca/code/npm"
- ]
- ],
- "_from": "byte-size@4.0.3",
- "_id": "byte-size@4.0.3",
+ "_from": "byte-size@5.0.1",
+ "_id": "byte-size@5.0.1",
"_inBundle": false,
- "_integrity": "sha512-JGC3EV2bCzJH/ENSh3afyJrH4vwxbHTuO5ljLoI5+2iJOcEpMgP8T782jH9b5qGxf2mSUIp1lfGnfKNrRHpvVg==",
+ "_integrity": "sha512-/XuKeqWocKsYa/cBY1YbSJSWWqTi4cFgr9S6OyM7PBaPbr9zvNGwWP33vt0uqGhwDdN+y3yhbXVILEUpnwEWGw==",
"_location": "/byte-size",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
- "raw": "byte-size@4.0.3",
+ "raw": "byte-size@5.0.1",
"name": "byte-size",
"escapedName": "byte-size",
- "rawSpec": "4.0.3",
+ "rawSpec": "5.0.1",
"saveSpec": null,
- "fetchSpec": "4.0.3"
+ "fetchSpec": "5.0.1"
},
"_requiredBy": [
+ "#USER",
"/"
],
- "_resolved": "https://registry.npmjs.org/byte-size/-/byte-size-4.0.3.tgz",
- "_spec": "4.0.3",
- "_where": "/Users/rebecca/code/npm",
+ "_resolved": "https://registry.npmjs.org/byte-size/-/byte-size-5.0.1.tgz",
+ "_shasum": "4b651039a5ecd96767e71a3d7ed380e48bed4191",
+ "_spec": "byte-size@5.0.1",
+ "_where": "/Users/aeschright/code/cli",
"author": {
"name": "Lloyd Brookes",
"email": "75pound@gmail.com"
@@ -34,6 +30,7 @@
"bugs": {
"url": "https://github.com/75lb/byte-size/issues"
},
+ "bundleDependencies": false,
"contributors": [
{
"name": "Raul Perez",
@@ -41,14 +38,20 @@
"url": "http://repejota.com"
}
],
+ "deprecated": false,
"description": "Convert a bytes (and octets) value to a more human-readable format. Choose between metric or IEC units.",
"devDependencies": {
- "coveralls": "^3.0.1",
+ "coveralls": "^3.0.2",
"jsdoc-to-markdown": "^4.0.1",
- "test-runner": "^0.5.0"
+ "rollup": "^0.68.1",
+ "test-runner": "^0.5.1"
+ },
+ "engines": {
+ "node": ">=6.0.0"
},
"files": [
- "index.js"
+ "index.mjs",
+ "dist/index.js"
],
"homepage": "https://github.com/75lb/byte-size#readme",
"keywords": [
@@ -62,6 +65,7 @@
"IEC"
],
"license": "MIT",
+ "main": "dist/index.js",
"name": "byte-size",
"repository": {
"type": "git",
@@ -69,8 +73,11 @@
},
"scripts": {
"cover": "istanbul cover ./node_modules/.bin/test-runner test.js && cat coverage/lcov.info | ./node_modules/.bin/coveralls",
- "docs": "jsdoc2md -t README.hbs index.js > README.md; echo",
- "test": "test-runner test.js"
+ "dist": "rollup -c dist/index.config.js",
+ "docs": "jsdoc2md -t README.hbs dist/index.js > README.md; echo",
+ "test": "npm run test:js && npm run test:mjs",
+ "test:js": "rollup -c dist/test.config.js && node dist/test.js",
+ "test:mjs": "node --experimental-modules test/test.mjs"
},
- "version": "4.0.3"
+ "version": "5.0.1"
}
diff --git a/deps/npm/node_modules/cacache/CHANGELOG.md b/deps/npm/node_modules/cacache/CHANGELOG.md
index ec9174f80d76cb..847174be70f4de 100644
--- a/deps/npm/node_modules/cacache/CHANGELOG.md
+++ b/deps/npm/node_modules/cacache/CHANGELOG.md
@@ -2,6 +2,36 @@
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
+
+## [11.3.2](https://github.com/zkat/cacache/compare/v11.3.1...v11.3.2) (2018-12-21)
+
+
+### Bug Fixes
+
+* **get:** make sure to handle errors in the .then ([b10bcd0](https://github.com/zkat/cacache/commit/b10bcd0))
+
+
+
+
+## [11.3.1](https://github.com/zkat/cacache/compare/v11.3.0...v11.3.1) (2018-11-05)
+
+
+### Bug Fixes
+
+* **get:** export hasContent.sync properly ([d76c920](https://github.com/zkat/cacache/commit/d76c920))
+
+
+
+
+# [11.3.0](https://github.com/zkat/cacache/compare/v11.2.0...v11.3.0) (2018-11-05)
+
+
+### Features
+
+* **get:** add sync API for reading ([db1e094](https://github.com/zkat/cacache/commit/db1e094))
+
+
+
# [11.2.0](https://github.com/zkat/cacache/compare/v11.1.0...v11.2.0) (2018-08-08)
diff --git a/deps/npm/node_modules/cacache/get.js b/deps/npm/node_modules/cacache/get.js
index 7bafe128e4bf38..008cb83a9ed87a 100644
--- a/deps/npm/node_modules/cacache/get.js
+++ b/deps/npm/node_modules/cacache/get.js
@@ -63,6 +63,55 @@ function getData (byDigest, cache, key, opts) {
})
}
+module.exports.sync = function get (cache, key, opts) {
+ return getDataSync(false, cache, key, opts)
+}
+module.exports.sync.byDigest = function getByDigest (cache, digest, opts) {
+ return getDataSync(true, cache, digest, opts)
+}
+function getDataSync (byDigest, cache, key, opts) {
+ opts = GetOpts(opts)
+ const memoized = (
+ byDigest
+ ? memo.get.byDigest(cache, key, opts)
+ : memo.get(cache, key, opts)
+ )
+ if (memoized && opts.memoize !== false) {
+ return byDigest ? memoized : {
+ metadata: memoized.entry.metadata,
+ data: memoized.data,
+ integrity: memoized.entry.integrity,
+ size: memoized.entry.size
+ }
+ }
+ const entry = !byDigest && index.find.sync(cache, key, opts)
+ if (!entry && !byDigest) {
+ throw new index.NotFoundError(cache, key)
+ }
+ const data = read.sync(
+ cache,
+ byDigest ? key : entry.integrity,
+ {
+ integrity: opts.integrity,
+ size: opts.size
+ }
+ )
+ const res = byDigest
+ ? data
+ : {
+ metadata: entry.metadata,
+ data: data,
+ size: entry.size,
+ integrity: entry.integrity
+ }
+ if (opts.memoize && byDigest) {
+ memo.put.byDigest(cache, key, res, opts)
+ } else if (opts.memoize) {
+ memo.put(cache, entry, res.data, opts)
+ }
+ return res
+}
+
module.exports.stream = getStream
function getStream (cache, key, opts) {
opts = GetOpts(opts)
@@ -113,7 +162,7 @@ function getStream (cache, key, opts) {
memoStream,
stream
)
- }, err => stream.emit('error', err))
+ }).catch(err => stream.emit('error', err))
return stream
}
diff --git a/deps/npm/node_modules/cacache/lib/entry-index.js b/deps/npm/node_modules/cacache/lib/entry-index.js
index 43fa7b95b1d0fe..29a688eea26abe 100644
--- a/deps/npm/node_modules/cacache/lib/entry-index.js
+++ b/deps/npm/node_modules/cacache/lib/entry-index.js
@@ -75,10 +75,36 @@ function insert (cache, key, integrity, opts) {
})
}
+module.exports.insert.sync = insertSync
+function insertSync (cache, key, integrity, opts) {
+ opts = IndexOpts(opts)
+ const bucket = bucketPath(cache, key)
+ const entry = {
+ key,
+ integrity: integrity && ssri.stringify(integrity),
+ time: Date.now(),
+ size: opts.size,
+ metadata: opts.metadata
+ }
+ fixOwner.mkdirfix.sync(path.dirname(bucket), opts.uid, opts.gid)
+ const stringified = JSON.stringify(entry)
+ fs.appendFileSync(
+ bucket, `\n${hashEntry(stringified)}\t${stringified}`
+ )
+ try {
+ fixOwner.chownr.sync(bucket, opts.uid, opts.gid)
+ } catch (err) {
+ if (err.code !== 'ENOENT') {
+ throw err
+ }
+ }
+ return formatEntry(cache, entry)
+}
+
module.exports.find = find
function find (cache, key) {
const bucket = bucketPath(cache, key)
- return bucketEntries(cache, bucket).then(entries => {
+ return bucketEntries(bucket).then(entries => {
return entries.reduce((latest, next) => {
if (next && next.key === key) {
return formatEntry(cache, next)
@@ -95,11 +121,36 @@ function find (cache, key) {
})
}
+module.exports.find.sync = findSync
+function findSync (cache, key) {
+ const bucket = bucketPath(cache, key)
+ try {
+ return bucketEntriesSync(bucket).reduce((latest, next) => {
+ if (next && next.key === key) {
+ return formatEntry(cache, next)
+ } else {
+ return latest
+ }
+ }, null)
+ } catch (err) {
+ if (err.code === 'ENOENT') {
+ return null
+ } else {
+ throw err
+ }
+ }
+}
+
module.exports.delete = del
function del (cache, key, opts) {
return insert(cache, key, null, opts)
}
+module.exports.delete.sync = delSync
+function delSync (cache, key, opts) {
+ return insertSync(cache, key, null, opts)
+}
+
module.exports.lsStream = lsStream
function lsStream (cache) {
const indexDir = bucketDir(cache)
@@ -116,7 +167,6 @@ function lsStream (cache) {
     // "/cachename/<bucket 0xFF>/<bucket 0xFF>/*"
return readdirOrEmpty(subbucketPath).map(entry => {
const getKeyToEntry = bucketEntries(
- cache,
path.join(subbucketPath, entry)
).reduce((acc, entry) => {
acc.set(entry.key, entry)
@@ -152,32 +202,39 @@ function ls (cache) {
})
}
-function bucketEntries (cache, bucket, filter) {
+function bucketEntries (bucket, filter) {
return readFileAsync(
bucket, 'utf8'
- ).then(data => {
- let entries = []
- data.split('\n').forEach(entry => {
- if (!entry) { return }
- const pieces = entry.split('\t')
- if (!pieces[1] || hashEntry(pieces[1]) !== pieces[0]) {
- // Hash is no good! Corruption or malice? Doesn't matter!
- // EJECT EJECT
- return
- }
- let obj
- try {
- obj = JSON.parse(pieces[1])
- } catch (e) {
- // Entry is corrupted!
- return
- }
- if (obj) {
- entries.push(obj)
- }
- })
- return entries
+ ).then(data => _bucketEntries(data, filter))
+}
+
+function bucketEntriesSync (bucket, filter) {
+ const data = fs.readFileSync(bucket, 'utf8')
+ return _bucketEntries(data, filter)
+}
+
+function _bucketEntries (data, filter) {
+ let entries = []
+ data.split('\n').forEach(entry => {
+ if (!entry) { return }
+ const pieces = entry.split('\t')
+ if (!pieces[1] || hashEntry(pieces[1]) !== pieces[0]) {
+ // Hash is no good! Corruption or malice? Doesn't matter!
+ // EJECT EJECT
+ return
+ }
+ let obj
+ try {
+ obj = JSON.parse(pieces[1])
+ } catch (e) {
+ // Entry is corrupted!
+ return
+ }
+ if (obj) {
+ entries.push(obj)
+ }
})
+ return entries
}
module.exports._bucketDir = bucketDir
diff --git a/deps/npm/node_modules/cacache/lib/util/fix-owner.js b/deps/npm/node_modules/cacache/lib/util/fix-owner.js
index 7000bff04807a0..0c8f9f87537b0b 100644
--- a/deps/npm/node_modules/cacache/lib/util/fix-owner.js
+++ b/deps/npm/node_modules/cacache/lib/util/fix-owner.js
@@ -31,6 +31,34 @@ function fixOwner (filepath, uid, gid) {
)
}
+module.exports.chownr.sync = fixOwnerSync
+function fixOwnerSync (filepath, uid, gid) {
+ if (!process.getuid) {
+ // This platform doesn't need ownership fixing
+ return
+ }
+ if (typeof uid !== 'number' && typeof gid !== 'number') {
+ // There's no permissions override. Nothing to do here.
+ return
+ }
+ if ((typeof uid === 'number' && process.getuid() === uid) &&
+ (typeof gid === 'number' && process.getgid() === gid)) {
+ // No need to override if it's already what we used.
+ return
+ }
+ try {
+ chownr.sync(
+ filepath,
+ typeof uid === 'number' ? uid : process.getuid(),
+ typeof gid === 'number' ? gid : process.getgid()
+ )
+ } catch (err) {
+ if (err.code === 'ENOENT') {
+ return null
+ }
+ }
+}
+
module.exports.mkdirfix = mkdirfix
function mkdirfix (p, uid, gid, cb) {
return mkdirp(p).then(made => {
@@ -42,3 +70,21 @@ function mkdirfix (p, uid, gid, cb) {
return fixOwner(p, uid, gid).then(() => null)
})
}
+
+module.exports.mkdirfix.sync = mkdirfixSync
+function mkdirfixSync (p, uid, gid) {
+ try {
+ const made = mkdirp.sync(p)
+ if (made) {
+ fixOwnerSync(made, uid, gid)
+ return made
+ }
+ } catch (err) {
+ if (err.code === 'EEXIST') {
+ fixOwnerSync(p, uid, gid)
+ return null
+ } else {
+ throw err
+ }
+ }
+}
diff --git a/deps/npm/node_modules/cacache/locales/en.js b/deps/npm/node_modules/cacache/locales/en.js
index 22025cf0e895e6..1715fdb53cd3f6 100644
--- a/deps/npm/node_modules/cacache/locales/en.js
+++ b/deps/npm/node_modules/cacache/locales/en.js
@@ -18,12 +18,15 @@ x.ls.stream = cache => ls.stream(cache)
x.get = (cache, key, opts) => get(cache, key, opts)
x.get.byDigest = (cache, hash, opts) => get.byDigest(cache, hash, opts)
+x.get.sync = (cache, key, opts) => get.sync(cache, key, opts)
+x.get.sync.byDigest = (cache, key, opts) => get.sync.byDigest(cache, key, opts)
x.get.stream = (cache, key, opts) => get.stream(cache, key, opts)
x.get.stream.byDigest = (cache, hash, opts) => get.stream.byDigest(cache, hash, opts)
x.get.copy = (cache, key, dest, opts) => get.copy(cache, key, dest, opts)
x.get.copy.byDigest = (cache, hash, dest, opts) => get.copy.byDigest(cache, hash, dest, opts)
x.get.info = (cache, key) => get.info(cache, key)
x.get.hasContent = (cache, hash) => get.hasContent(cache, hash)
+x.get.hasContent.sync = (cache, hash) => get.hasContent.sync(cache, hash)
x.put = (cache, key, data, opts) => put(cache, key, data, opts)
x.put.stream = (cache, key, opts) => put.stream(cache, key, opts)
diff --git a/deps/npm/node_modules/cacache/locales/es.js b/deps/npm/node_modules/cacache/locales/es.js
index 9a27de6585a231..ac4e4cfe7d2f46 100644
--- a/deps/npm/node_modules/cacache/locales/es.js
+++ b/deps/npm/node_modules/cacache/locales/es.js
@@ -18,12 +18,15 @@ x.ls.flujo = cache => ls.stream(cache)
x.saca = (cache, clave, ops) => get(cache, clave, ops)
x.saca.porHacheo = (cache, hacheo, ops) => get.byDigest(cache, hacheo, ops)
+x.saca.sinc = (cache, clave, ops) => get.sync(cache, clave, ops)
+x.saca.sinc.porHacheo = (cache, hacheo, ops) => get.sync.byDigest(cache, hacheo, ops)
x.saca.flujo = (cache, clave, ops) => get.stream(cache, clave, ops)
x.saca.flujo.porHacheo = (cache, hacheo, ops) => get.stream.byDigest(cache, hacheo, ops)
x.sava.copia = (cache, clave, destino, opts) => get.copy(cache, clave, destino, opts)
x.sava.copia.porHacheo = (cache, hacheo, destino, opts) => get.copy.byDigest(cache, hacheo, destino, opts)
x.saca.info = (cache, clave) => get.info(cache, clave)
x.saca.tieneDatos = (cache, hacheo) => get.hasContent(cache, hacheo)
+x.saca.tieneDatos.sinc = (cache, hacheo) => get.hasContent.sync(cache, hacheo)
x.mete = (cache, clave, datos, ops) => put(cache, clave, datos, ops)
x.mete.flujo = (cache, clave, ops) => put.stream(cache, clave, ops)
diff --git a/deps/npm/node_modules/npm-registry-client/LICENSE b/deps/npm/node_modules/cacache/node_modules/chownr/LICENSE
similarity index 100%
rename from deps/npm/node_modules/npm-registry-client/LICENSE
rename to deps/npm/node_modules/cacache/node_modules/chownr/LICENSE
diff --git a/deps/npm/node_modules/cacache/node_modules/chownr/README.md b/deps/npm/node_modules/cacache/node_modules/chownr/README.md
new file mode 100644
index 00000000000000..70e9a54a32b8e0
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/chownr/README.md
@@ -0,0 +1,3 @@
+Like `chown -R`.
+
+Takes the same arguments as `fs.chown()`.
diff --git a/deps/npm/node_modules/cacache/node_modules/chownr/chownr.js b/deps/npm/node_modules/cacache/node_modules/chownr/chownr.js
new file mode 100644
index 00000000000000..7e63928827e2c6
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/chownr/chownr.js
@@ -0,0 +1,88 @@
+'use strict'
+const fs = require('fs')
+const path = require('path')
+
+/* istanbul ignore next */
+const LCHOWN = fs.lchown ? 'lchown' : 'chown'
+/* istanbul ignore next */
+const LCHOWNSYNC = fs.lchownSync ? 'lchownSync' : 'chownSync'
+
+// fs.readdir only started accepting an options object in node v6
+const nodeVersion = process.version
+let readdir = (path, options, cb) => fs.readdir(path, options, cb)
+let readdirSync = (path, options) => fs.readdirSync(path, options)
+/* istanbul ignore next */
+if (/^v4\./.test(nodeVersion))
+ readdir = (path, options, cb) => fs.readdir(path, cb)
+
+const chownrKid = (p, child, uid, gid, cb) => {
+ if (typeof child === 'string')
+ return fs.lstat(path.resolve(p, child), (er, stats) => {
+ if (er)
+ return cb(er)
+ stats.name = child
+ chownrKid(p, stats, uid, gid, cb)
+ })
+
+ if (child.isDirectory()) {
+ chownr(path.resolve(p, child.name), uid, gid, er => {
+ if (er)
+ return cb(er)
+ fs[LCHOWN](path.resolve(p, child.name), uid, gid, cb)
+ })
+ } else
+ fs[LCHOWN](path.resolve(p, child.name), uid, gid, cb)
+}
+
+
+const chownr = (p, uid, gid, cb) => {
+ readdir(p, { withFileTypes: true }, (er, children) => {
+ // any error other than ENOTDIR or ENOTSUP means it's not readable,
+ // or doesn't exist. give up.
+ if (er && er.code !== 'ENOTDIR' && er.code !== 'ENOTSUP')
+ return cb(er)
+ if (er || !children.length) return fs[LCHOWN](p, uid, gid, cb)
+
+ let len = children.length
+ let errState = null
+ const then = er => {
+ if (errState) return
+ if (er) return cb(errState = er)
+ if (-- len === 0) return fs[LCHOWN](p, uid, gid, cb)
+ }
+
+ children.forEach(child => chownrKid(p, child, uid, gid, then))
+ })
+}
+
+const chownrKidSync = (p, child, uid, gid) => {
+ if (typeof child === 'string') {
+ const stats = fs.lstatSync(path.resolve(p, child))
+ stats.name = child
+ child = stats
+ }
+
+ if (child.isDirectory())
+ chownrSync(path.resolve(p, child.name), uid, gid)
+
+ fs[LCHOWNSYNC](path.resolve(p, child.name), uid, gid)
+}
+
+const chownrSync = (p, uid, gid) => {
+ let children
+ try {
+ children = readdirSync(p, { withFileTypes: true })
+ } catch (er) {
+    if (er && (er.code === 'ENOTDIR' || er.code === 'ENOTSUP'))
+ return fs[LCHOWNSYNC](p, uid, gid)
+ throw er
+ }
+
+ if (children.length)
+ children.forEach(child => chownrKidSync(p, child, uid, gid))
+
+ return fs[LCHOWNSYNC](p, uid, gid)
+}
+
+module.exports = chownr
+chownr.sync = chownrSync
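`chownr` above fans out one async call per directory entry and only chowns the parent once every child's callback has fired, using the `len`/`errState` countdown in its `then` helper. The same aggregation idiom, isolated as a sketch (the stand-in tasks call back synchronously instead of hitting `fs.lstat`/`lchown`, so results are available immediately):

```javascript
'use strict'
// The len/errState countdown from chownr, isolated: call `done`
// exactly once, either on the first error or after every task's
// callback has come back.
function runAll (tasks, done) {
  if (!tasks.length) return done(null, [])
  let len = tasks.length
  let errState = null
  const results = []
  const then = (i, er, value) => {
    if (errState) return            // a prior task already failed
    if (er) return done(errState = er)
    results[i] = value
    if (--len === 0) done(null, results)
  }
  tasks.forEach((task, i) => task((er, value) => then(i, er, value)))
}

// Stand-in tasks that invoke their callbacks synchronously
// (chownr's real tasks are fs.lstat/lchown calls).
const tasks = [1, 2, 3].map(n => cb => cb(null, n * 10))
let out = null
runAll(tasks, (er, results) => { out = results })

let caught = null
runAll([cb => cb(new Error('boom'))], er => { caught = er })
```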
diff --git a/deps/npm/node_modules/cacache/node_modules/chownr/package.json b/deps/npm/node_modules/cacache/node_modules/chownr/package.json
new file mode 100644
index 00000000000000..4871f94bf391d7
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/chownr/package.json
@@ -0,0 +1,59 @@
+{
+ "_from": "chownr@^1.1.1",
+ "_id": "chownr@1.1.1",
+ "_inBundle": false,
+ "_integrity": "sha512-j38EvO5+LHX84jlo6h4UzmOwi0UgW61WRyPtJz4qaadK5eY3BTS5TY/S1Stc3Uk2lIM6TPevAlULiEJwie860g==",
+ "_location": "/cacache/chownr",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "chownr@^1.1.1",
+ "name": "chownr",
+ "escapedName": "chownr",
+ "rawSpec": "^1.1.1",
+ "saveSpec": null,
+ "fetchSpec": "^1.1.1"
+ },
+ "_requiredBy": [
+ "/cacache"
+ ],
+ "_resolved": "https://registry.npmjs.org/chownr/-/chownr-1.1.1.tgz",
+ "_shasum": "54726b8b8fff4df053c42187e801fb4412df1494",
+ "_spec": "chownr@^1.1.1",
+ "_where": "/Users/aeschright/code/cli/node_modules/cacache",
+ "author": {
+ "name": "Isaac Z. Schlueter",
+ "email": "i@izs.me",
+ "url": "http://blog.izs.me/"
+ },
+ "bugs": {
+ "url": "https://github.com/isaacs/chownr/issues"
+ },
+ "bundleDependencies": false,
+ "deprecated": false,
+ "description": "like `chown -R`",
+ "devDependencies": {
+ "mkdirp": "0.3",
+ "rimraf": "",
+ "tap": "^12.0.1"
+ },
+ "files": [
+ "chownr.js"
+ ],
+ "homepage": "https://github.com/isaacs/chownr#readme",
+ "license": "ISC",
+ "main": "chownr.js",
+ "name": "chownr",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/isaacs/chownr.git"
+ },
+ "scripts": {
+ "postpublish": "git push origin --all; git push origin --tags",
+ "postversion": "npm publish",
+ "preversion": "npm test",
+ "test": "tap test/*.js --cov"
+ },
+ "version": "1.1.1"
+}
diff --git a/deps/npm/node_modules/cacache/node_modules/lru-cache/LICENSE b/deps/npm/node_modules/cacache/node_modules/lru-cache/LICENSE
new file mode 100644
index 00000000000000..19129e315fe593
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/lru-cache/LICENSE
@@ -0,0 +1,15 @@
+The ISC License
+
+Copyright (c) Isaac Z. Schlueter and Contributors
+
+Permission to use, copy, modify, and/or distribute this software for any
+purpose with or without fee is hereby granted, provided that the above
+copyright notice and this permission notice appear in all copies.
+
+THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
+IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
diff --git a/deps/npm/node_modules/cacache/node_modules/lru-cache/README.md b/deps/npm/node_modules/cacache/node_modules/lru-cache/README.md
new file mode 100644
index 00000000000000..435dfebb7e27d0
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/lru-cache/README.md
@@ -0,0 +1,166 @@
+# lru cache
+
+A cache object that deletes the least-recently-used items.
+
+[![Build Status](https://travis-ci.org/isaacs/node-lru-cache.svg?branch=master)](https://travis-ci.org/isaacs/node-lru-cache) [![Coverage Status](https://coveralls.io/repos/isaacs/node-lru-cache/badge.svg?service=github)](https://coveralls.io/github/isaacs/node-lru-cache)
+
+## Installation:
+
+```shell
+npm install lru-cache --save
+```
+
+## Usage:
+
+```javascript
+var LRU = require("lru-cache")
+ , options = { max: 500
+ , length: function (n, key) { return n * 2 + key.length }
+ , dispose: function (key, n) { n.close() }
+ , maxAge: 1000 * 60 * 60 }
+ , cache = new LRU(options)
+ , otherCache = new LRU(50) // sets just the max size
+
+cache.set("key", "value")
+cache.get("key") // "value"
+
+// non-string keys ARE fully supported
+// but note that it must be THE SAME object, not
+// just a JSON-equivalent object.
+var someObject = { a: 1 }
+cache.set(someObject, 'a value')
+// Object keys are not toString()-ed
+cache.set('[object Object]', 'a different value')
+assert.equal(cache.get(someObject), 'a value')
+// A similar object with same keys/values won't work,
+// because it's a different object identity
+assert.equal(cache.get({ a: 1 }), undefined)
+
+cache.reset() // empty the cache
+```
+
+If you put more stuff in it, then items will fall out.
+
+If you try to put an oversized thing in it, then it'll fall out right
+away.
+
+## Options
+
+* `max` The maximum size of the cache, checked by applying the length
+ function to all values in the cache. Not setting this is kind of
+ silly, since that's the whole purpose of this lib, but it defaults
+ to `Infinity`. Setting it to a non-number or negative number will
+ throw a `TypeError`. Setting it to 0 makes it be `Infinity`.
+* `maxAge` Maximum age in ms. Items are not pro-actively pruned out
+ as they age, but if you try to get an item that is too old, it'll
+ drop it and return undefined instead of giving it to you.
+ Setting this to a negative value will make everything seem old!
+ Setting it to a non-number will throw a `TypeError`.
+* `length` Function that is used to calculate the length of stored
+ items. If you're storing strings or buffers, then you probably want
+ to do something like `function(n, key){return n.length}`. The default is
+ `function(){return 1}`, which is fine if you want to store `max`
+ like-sized things. The item is passed as the first argument, and
+  the key is passed as the second argument.
+* `dispose` Function that is called on items when they are dropped
+ from the cache. This can be handy if you want to close file
+ descriptors or do other cleanup tasks when items are no longer
+ accessible. Called with `key, value`. It's called *before*
+ actually removing the item from the internal cache, so if you want
+ to immediately put it back in, you'll have to do that in a
+ `nextTick` or `setTimeout` callback or it won't do anything.
+* `stale` By default, if you set a `maxAge`, it'll only actually pull
+ stale items out of the cache when you `get(key)`. (That is, it's
+ not pre-emptively doing a `setTimeout` or anything.) If you set
+ `stale:true`, it'll return the stale value before deleting it. If
+ you don't set this, then it'll return `undefined` when you try to
+ get a stale entry, as if it had already been deleted.
+* `noDisposeOnSet` By default, if you set a `dispose()` method, then
+ it'll be called whenever a `set()` operation overwrites an existing
+ key. If you set this option, `dispose()` will only be called when a
+ key falls out of the cache, not when it is overwritten.
+* `updateAgeOnGet` When using time-expiring entries with `maxAge`,
+ setting this to `true` will make each item's effective time update
+ to the current time whenever it is retrieved from cache, causing it
+ to not expire. (It can still fall out of cache based on recency of
+ use, of course.)
+
+## API
+
+* `set(key, value, maxAge)`
+* `get(key) => value`
+
+ Both of these will update the "recently used"-ness of the key.
+ They do what you think. `maxAge` is optional and overrides the
+ cache `maxAge` option if provided.
+
+ If the key is not found, `get()` will return `undefined`.
+
+ The key and val can be any value.
+
+* `peek(key)`
+
+ Returns the key value (or `undefined` if not found) without
+ updating the "recently used"-ness of the key.
+
+ (If you find yourself using this a lot, you *might* be using the
+ wrong sort of data structure, but there are some use cases where
+ it's handy.)
+
+* `del(key)`
+
+ Deletes a key out of the cache.
+
+* `reset()`
+
+ Clear the cache entirely, throwing away all values.
+
+* `has(key)`
+
+ Check if a key is in the cache, without updating the recent-ness
+ or deleting it for being stale.
+
+* `forEach(function(value,key,cache), [thisp])`
+
+ Just like `Array.prototype.forEach`. Iterates over all the keys
+ in the cache, in order of recent-ness. (Ie, more recently used
+ items are iterated over first.)
+
+* `rforEach(function(value,key,cache), [thisp])`
+
+ The same as `cache.forEach(...)` but items are iterated over in
+ reverse order. (ie, less recently used items are iterated over
+ first.)
+
+* `keys()`
+
+ Return an array of the keys in the cache.
+
+* `values()`
+
+ Return an array of the values in the cache.
+
+* `length`
+
+ Return total length of objects in cache taking into account
+ `length` options function.
+
+* `itemCount`
+
+  Return total quantity of objects currently in cache. Note that
+ `stale` (see options) items are returned as part of this item
+ count.
+
+* `dump()`
+
+ Return an array of the cache entries ready for serialization and usage
+  with `destinationCache.load(arr)`.
+
+* `load(cacheEntriesArray)`
+
+ Loads another cache entries array, obtained with `sourceCache.dump()`,
+  into the cache. The destination cache is reset before loading new entries.
+
+* `prune()`
+
+  Manually iterates over the entire cache, proactively pruning old entries.
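The recency rules this README describes (gets and sets refresh an entry; the least recently used entry falls out when the cache is over `max`) can be seen with a toy `Map`-based cache. This is not the `lru-cache` API itself, just a sketch of the eviction policy it documents, relying on `Map`'s insertion-order iteration:

```javascript
'use strict'
// Toy illustration of the eviction policy described above: a Map
// iterates in insertion order, so re-inserting on every get/set
// keeps the most recently used key last and the LRU key first.
class TinyLRU {
  constructor (max) { this.max = max; this.map = new Map() }
  get (key) {
    if (!this.map.has(key)) return undefined
    const value = this.map.get(key)
    this.map.delete(key)          // refresh recency
    this.map.set(key, value)
    return value
  }
  set (key, value) {
    this.map.delete(key)
    this.map.set(key, value)
    if (this.map.size > this.max) {
      // evict the least recently used entry (first in iteration order)
      this.map.delete(this.map.keys().next().value)
    }
  }
}

const cache = new TinyLRU(2)
cache.set('a', 1)
cache.set('b', 2)
cache.get('a')        // touch 'a' so 'b' is now least recently used
cache.set('c', 3)     // over max: evicts 'b'
```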
diff --git a/deps/npm/node_modules/cacache/node_modules/lru-cache/index.js b/deps/npm/node_modules/cacache/node_modules/lru-cache/index.js
new file mode 100644
index 00000000000000..573b6b85b9779d
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/lru-cache/index.js
@@ -0,0 +1,334 @@
+'use strict'
+
+// A linked list to keep track of recently-used-ness
+const Yallist = require('yallist')
+
+const MAX = Symbol('max')
+const LENGTH = Symbol('length')
+const LENGTH_CALCULATOR = Symbol('lengthCalculator')
+const ALLOW_STALE = Symbol('allowStale')
+const MAX_AGE = Symbol('maxAge')
+const DISPOSE = Symbol('dispose')
+const NO_DISPOSE_ON_SET = Symbol('noDisposeOnSet')
+const LRU_LIST = Symbol('lruList')
+const CACHE = Symbol('cache')
+const UPDATE_AGE_ON_GET = Symbol('updateAgeOnGet')
+
+const naiveLength = () => 1
+
+// lruList is a yallist where the head is the youngest
+// item, and the tail is the oldest. the list contains the Hit
+// objects as the entries.
+// Each Hit object has a reference to its Yallist.Node. This
+// never changes.
+//
+// cache is a Map (or PseudoMap) that matches the keys to
+// the Yallist.Node object.
+class LRUCache {
+ constructor (options) {
+ if (typeof options === 'number')
+ options = { max: options }
+
+ if (!options)
+ options = {}
+
+ if (options.max && (typeof options.max !== 'number' || options.max < 0))
+ throw new TypeError('max must be a non-negative number')
+ // Kind of weird to have a default max of Infinity, but oh well.
+ const max = this[MAX] = options.max || Infinity
+
+ const lc = options.length || naiveLength
+ this[LENGTH_CALCULATOR] = (typeof lc !== 'function') ? naiveLength : lc
+ this[ALLOW_STALE] = options.stale || false
+ if (options.maxAge && typeof options.maxAge !== 'number')
+ throw new TypeError('maxAge must be a number')
+ this[MAX_AGE] = options.maxAge || 0
+ this[DISPOSE] = options.dispose
+ this[NO_DISPOSE_ON_SET] = options.noDisposeOnSet || false
+ this[UPDATE_AGE_ON_GET] = options.updateAgeOnGet || false
+ this.reset()
+ }
+
+ // resize the cache when the max changes.
+ set max (mL) {
+ if (typeof mL !== 'number' || mL < 0)
+ throw new TypeError('max must be a non-negative number')
+
+ this[MAX] = mL || Infinity
+ trim(this)
+ }
+ get max () {
+ return this[MAX]
+ }
+
+ set allowStale (allowStale) {
+ this[ALLOW_STALE] = !!allowStale
+ }
+ get allowStale () {
+ return this[ALLOW_STALE]
+ }
+
+ set maxAge (mA) {
+ if (typeof mA !== 'number')
+ throw new TypeError('maxAge must be a non-negative number')
+
+ this[MAX_AGE] = mA
+ trim(this)
+ }
+ get maxAge () {
+ return this[MAX_AGE]
+ }
+
+ // resize the cache when the lengthCalculator changes.
+ set lengthCalculator (lC) {
+ if (typeof lC !== 'function')
+ lC = naiveLength
+
+ if (lC !== this[LENGTH_CALCULATOR]) {
+ this[LENGTH_CALCULATOR] = lC
+ this[LENGTH] = 0
+ this[LRU_LIST].forEach(hit => {
+ hit.length = this[LENGTH_CALCULATOR](hit.value, hit.key)
+ this[LENGTH] += hit.length
+ })
+ }
+ trim(this)
+ }
+ get lengthCalculator () { return this[LENGTH_CALCULATOR] }
+
+ get length () { return this[LENGTH] }
+ get itemCount () { return this[LRU_LIST].length }
+
+ rforEach (fn, thisp) {
+ thisp = thisp || this
+ for (let walker = this[LRU_LIST].tail; walker !== null;) {
+ const prev = walker.prev
+ forEachStep(this, fn, walker, thisp)
+ walker = prev
+ }
+ }
+
+ forEach (fn, thisp) {
+ thisp = thisp || this
+ for (let walker = this[LRU_LIST].head; walker !== null;) {
+ const next = walker.next
+ forEachStep(this, fn, walker, thisp)
+ walker = next
+ }
+ }
+
+ keys () {
+ return this[LRU_LIST].toArray().map(k => k.key)
+ }
+
+ values () {
+ return this[LRU_LIST].toArray().map(k => k.value)
+ }
+
+ reset () {
+ if (this[DISPOSE] &&
+ this[LRU_LIST] &&
+ this[LRU_LIST].length) {
+ this[LRU_LIST].forEach(hit => this[DISPOSE](hit.key, hit.value))
+ }
+
+ this[CACHE] = new Map() // hash of items by key
+ this[LRU_LIST] = new Yallist() // list of items in order of use recency
+ this[LENGTH] = 0 // length of items in the list
+ }
+
+ dump () {
+ return this[LRU_LIST].map(hit =>
+ isStale(this, hit) ? false : {
+ k: hit.key,
+ v: hit.value,
+ e: hit.now + (hit.maxAge || 0)
+ }).toArray().filter(h => h)
+ }
+
+ dumpLru () {
+ return this[LRU_LIST]
+ }
+
+ set (key, value, maxAge) {
+ maxAge = maxAge || this[MAX_AGE]
+
+ if (maxAge && typeof maxAge !== 'number')
+ throw new TypeError('maxAge must be a number')
+
+ const now = maxAge ? Date.now() : 0
+ const len = this[LENGTH_CALCULATOR](value, key)
+
+ if (this[CACHE].has(key)) {
+ if (len > this[MAX]) {
+ del(this, this[CACHE].get(key))
+ return false
+ }
+
+ const node = this[CACHE].get(key)
+ const item = node.value
+
+ // dispose of the old one before overwriting
+ // split out into 2 ifs for better coverage tracking
+ if (this[DISPOSE]) {
+ if (!this[NO_DISPOSE_ON_SET])
+ this[DISPOSE](key, item.value)
+ }
+
+ item.now = now
+ item.maxAge = maxAge
+ item.value = value
+ this[LENGTH] += len - item.length
+ item.length = len
+ this.get(key)
+ trim(this)
+ return true
+ }
+
+ const hit = new Entry(key, value, len, now, maxAge)
+
+ // oversized objects fall out of cache automatically.
+ if (hit.length > this[MAX]) {
+ if (this[DISPOSE])
+ this[DISPOSE](key, value)
+
+ return false
+ }
+
+ this[LENGTH] += hit.length
+ this[LRU_LIST].unshift(hit)
+ this[CACHE].set(key, this[LRU_LIST].head)
+ trim(this)
+ return true
+ }
+
+ has (key) {
+ if (!this[CACHE].has(key)) return false
+ const hit = this[CACHE].get(key).value
+ return !isStale(this, hit)
+ }
+
+ get (key) {
+ return get(this, key, true)
+ }
+
+ peek (key) {
+ return get(this, key, false)
+ }
+
+ pop () {
+ const node = this[LRU_LIST].tail
+ if (!node)
+ return null
+
+ del(this, node)
+ return node.value
+ }
+
+ del (key) {
+ del(this, this[CACHE].get(key))
+ }
+
+ load (arr) {
+ // reset the cache
+ this.reset()
+
+ const now = Date.now()
+ // A previous serialized cache has the most recent items first
+ for (let l = arr.length - 1; l >= 0; l--) {
+ const hit = arr[l]
+ const expiresAt = hit.e || 0
+ if (expiresAt === 0)
+ // the item was created without expiration in a non aged cache
+ this.set(hit.k, hit.v)
+ else {
+ const maxAge = expiresAt - now
+ // dont add already expired items
+ if (maxAge > 0) {
+ this.set(hit.k, hit.v, maxAge)
+ }
+ }
+ }
+ }
+
+ prune () {
+ this[CACHE].forEach((value, key) => get(this, key, false))
+ }
+}
+
+const get = (self, key, doUse) => {
+ const node = self[CACHE].get(key)
+ if (node) {
+ const hit = node.value
+ if (isStale(self, hit)) {
+ del(self, node)
+ if (!self[ALLOW_STALE])
+ return undefined
+ } else {
+ if (doUse) {
+ if (self[UPDATE_AGE_ON_GET])
+ node.value.now = Date.now()
+ self[LRU_LIST].unshiftNode(node)
+ }
+ }
+ return hit.value
+ }
+}
+
+const isStale = (self, hit) => {
+ if (!hit || (!hit.maxAge && !self[MAX_AGE]))
+ return false
+
+ const diff = Date.now() - hit.now
+ return hit.maxAge ? diff > hit.maxAge
+ : self[MAX_AGE] && (diff > self[MAX_AGE])
+}
+
+const trim = self => {
+ if (self[LENGTH] > self[MAX]) {
+ for (let walker = self[LRU_LIST].tail;
+ self[LENGTH] > self[MAX] && walker !== null;) {
+ // We know that we're about to delete this one, and also
+ // what the next least recently used key will be, so just
+ // go ahead and set it now.
+ const prev = walker.prev
+ del(self, walker)
+ walker = prev
+ }
+ }
+}
+
+const del = (self, node) => {
+ if (node) {
+ const hit = node.value
+ if (self[DISPOSE])
+ self[DISPOSE](hit.key, hit.value)
+
+ self[LENGTH] -= hit.length
+ self[CACHE].delete(hit.key)
+ self[LRU_LIST].removeNode(node)
+ }
+}
+
+class Entry {
+ constructor (key, value, length, now, maxAge) {
+ this.key = key
+ this.value = value
+ this.length = length
+ this.now = now
+ this.maxAge = maxAge || 0
+ }
+}
+
+const forEachStep = (self, fn, node, thisp) => {
+ let hit = node.value
+ if (isStale(self, hit)) {
+ del(self, node)
+ if (!self[ALLOW_STALE])
+ hit = undefined
+ }
+ if (hit)
+ fn.call(thisp, hit.value, hit.key, self)
+}
+
+module.exports = LRUCache
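The `isStale` check above reduces to a single age comparison, with a per-entry `maxAge` taking precedence over the cache-wide one. A standalone sketch of the same logic (plain arguments instead of the Symbol-keyed fields, and an explicit clock value so the example stays deterministic):

```javascript
'use strict'
// Mirrors isStale above: an entry is stale when its age exceeds its
// own maxAge, falling back to the cache-wide maxAge when the entry
// has none. Entries with no applicable maxAge are never stale.
function isStaleSketch (cacheMaxAge, hit, now) {
  if (!hit || (!hit.maxAge && !cacheMaxAge)) return false
  const diff = now - hit.now
  return hit.maxAge ? diff > hit.maxAge
    : !!(cacheMaxAge && diff > cacheMaxAge)
}

const t0 = 1000
const fresh = { now: t0, maxAge: 500 }    // own limit, not yet reached
const aged = { now: t0, maxAge: 100 }     // own limit, exceeded
const inherits = { now: t0, maxAge: 0 }   // falls back to cache-wide limit
```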
diff --git a/deps/npm/node_modules/cacache/node_modules/lru-cache/package.json b/deps/npm/node_modules/cacache/node_modules/lru-cache/package.json
new file mode 100644
index 00000000000000..1d41a07afbda4c
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/lru-cache/package.json
@@ -0,0 +1,67 @@
+{
+ "_from": "lru-cache@^5.1.1",
+ "_id": "lru-cache@5.1.1",
+ "_inBundle": false,
+ "_integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==",
+ "_location": "/cacache/lru-cache",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "lru-cache@^5.1.1",
+ "name": "lru-cache",
+ "escapedName": "lru-cache",
+ "rawSpec": "^5.1.1",
+ "saveSpec": null,
+ "fetchSpec": "^5.1.1"
+ },
+ "_requiredBy": [
+ "/cacache"
+ ],
+ "_resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
+ "_shasum": "1da27e6710271947695daf6848e847f01d84b920",
+ "_spec": "lru-cache@^5.1.1",
+ "_where": "/Users/aeschright/code/cli/node_modules/cacache",
+ "author": {
+ "name": "Isaac Z. Schlueter",
+ "email": "i@izs.me"
+ },
+ "bugs": {
+ "url": "https://github.com/isaacs/node-lru-cache/issues"
+ },
+ "bundleDependencies": false,
+ "dependencies": {
+ "yallist": "^3.0.2"
+ },
+ "deprecated": false,
+ "description": "A cache object that deletes the least-recently-used items.",
+ "devDependencies": {
+ "benchmark": "^2.1.4",
+ "tap": "^12.1.0"
+ },
+ "files": [
+ "index.js"
+ ],
+ "homepage": "https://github.com/isaacs/node-lru-cache#readme",
+ "keywords": [
+ "mru",
+ "lru",
+ "cache"
+ ],
+ "license": "ISC",
+ "main": "index.js",
+ "name": "lru-cache",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/isaacs/node-lru-cache.git"
+ },
+ "scripts": {
+ "coveragerport": "tap --coverage-report=html",
+ "postpublish": "git push origin --all; git push origin --tags",
+ "postversion": "npm publish",
+ "preversion": "npm test",
+ "snap": "TAP_SNAPSHOT=1 tap test/*.js -J",
+ "test": "tap test/*.js --100 -J"
+ },
+ "version": "5.1.1"
+}
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/LICENSE b/deps/npm/node_modules/cacache/node_modules/unique-filename/LICENSE
new file mode 100644
index 00000000000000..69619c125ea7ef
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/LICENSE
@@ -0,0 +1,5 @@
+Copyright npm, Inc
+
+Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
+
+THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/README.md b/deps/npm/node_modules/cacache/node_modules/unique-filename/README.md
new file mode 100644
index 00000000000000..74b62b2ab4426e
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/README.md
@@ -0,0 +1,33 @@
+unique-filename
+===============
+
+Generate a unique filename for use in temporary directories or caches.
+
+```javascript
+var uniqueFilename = require('unique-filename')
+
+// returns something like: /tmp/912ec803b2ce49e4a541068d495ab570
+var randomTmpfile = uniqueFilename(os.tmpdir())
+
+// returns something like: /tmp/my-test-912ec803b2ce49e4a541068d495ab570
+var randomPrefixedTmpfile = uniqueFilename(os.tmpdir(), 'my-test')
+
+var uniqueTmpfile = uniqueFilename('/tmp', 'testing', '/my/thing/to/uniq/on')
+```
+
+### uniqueFilename(*dir*, *fileprefix*, *uniqstr*) → String
+
+Returns the full path of a unique filename that looks like:
+`dir/prefix-7ddd44c0`
+or `dir/7ddd44c0`
+
+*dir* – The path you want the filename in. `os.tmpdir()` is a good choice for this.
+
+*fileprefix* – A string to prepend to the unique part of the filename.
+The parameter is required if *uniqstr* is also passed in but is otherwise
+optional and can be `undefined`/`null`/`''`. If present and not empty
+then this string plus a hyphen is prepended to the unique part.
+
+*uniqstr* – Optional; if not passed, the unique part of the resulting
+filename will be random. If passed in, it will be generated from this string
+in a reproducible way.
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.html b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.html
new file mode 100644
index 00000000000000..cd55391a67a4ce
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.html
@@ -0,0 +1,73 @@
+
+
+
+ Code coverage report for __root__/
+
+
+
+
+
+
+
+ Code coverage report for __root__/
+
+ Statements: 100% (4 / 4)
+ Branches: 100% (2 / 2)
+ Functions: 100% (1 / 1)
+ Lines: 100% (4 / 4)
+ Ignored: none
+
+ All files » __root__/
+
+
+
+
+
+
+ File
+
+ Statements
+
+ Branches
+
+ Functions
+
+ Lines
+
+
+
+
+ index.js
+
+ 100%
+ (4 / 4)
+ 100%
+ (2 / 2)
+ 100%
+ (1 / 1)
+ 100%
+ (4 / 4)
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.js.html b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.js.html
new file mode 100644
index 00000000000000..02e5768d3fb647
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/__root__/index.js.html
@@ -0,0 +1,69 @@
+
+
+
+ Code coverage report for index.js
+
+
+
+
+
+
+
+ Code coverage report for index.js
+
+ Statements: 100% (4 / 4)
+ Branches: 100% (2 / 2)
+ Functions: 100% (1 / 1)
+ Lines: 100% (4 / 4)
+ Ignored: none
+
+
+
+
+
+1
+2
+3
+4
+5
+6
+7
+8
+9
+1
+
+1
+
+1
+6
+
+ 'use strict'
+var path = require('path')
+
+var uniqueSlug = require('unique-slug')
+
+module.exports = function (filepath, prefix, uniq) {
+ return path.join(filepath, (prefix ? prefix + '-' : '') + uniqueSlug(uniq))
+}
+
+
+
+
+
+
+
+
+
+
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/base.css b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/base.css
new file mode 100644
index 00000000000000..a6a2f3284d0221
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/base.css
@@ -0,0 +1,182 @@
+body, html {
+ margin:0; padding: 0;
+}
+body {
+ font-family: Helvetica Neue, Helvetica,Arial;
+ font-size: 10pt;
+}
+div.header, div.footer {
+ background: #eee;
+ padding: 1em;
+}
+div.header {
+ z-index: 100;
+ position: fixed;
+ top: 0;
+ border-bottom: 1px solid #666;
+ width: 100%;
+}
+div.footer {
+ border-top: 1px solid #666;
+}
+div.body {
+ margin-top: 10em;
+}
+div.meta {
+ font-size: 90%;
+ text-align: center;
+}
+h1, h2, h3 {
+ font-weight: normal;
+}
+h1 {
+ font-size: 12pt;
+}
+h2 {
+ font-size: 10pt;
+}
+pre {
+ font-family: Consolas, Menlo, Monaco, monospace;
+ margin: 0;
+ padding: 0;
+ line-height: 1.3;
+ font-size: 14px;
+ -moz-tab-size: 2;
+ -o-tab-size: 2;
+ tab-size: 2;
+}
+
+div.path { font-size: 110%; }
+div.path a:link, div.path a:visited { color: #000; }
+table.coverage { border-collapse: collapse; margin:0; padding: 0 }
+
+table.coverage td {
+ margin: 0;
+ padding: 0;
+ color: #111;
+ vertical-align: top;
+}
+table.coverage td.line-count {
+ width: 50px;
+ text-align: right;
+ padding-right: 5px;
+}
+table.coverage td.line-coverage {
+ color: #777 !important;
+ text-align: right;
+ border-left: 1px solid #666;
+ border-right: 1px solid #666;
+}
+
+table.coverage td.text {
+}
+
+table.coverage td span.cline-any {
+ display: inline-block;
+ padding: 0 5px;
+ width: 40px;
+}
+table.coverage td span.cline-neutral {
+ background: #eee;
+}
+table.coverage td span.cline-yes {
+ background: #b5d592;
+ color: #999;
+}
+table.coverage td span.cline-no {
+ background: #fc8c84;
+}
+
+.cstat-yes { color: #111; }
+.cstat-no { background: #fc8c84; color: #111; }
+.fstat-no { background: #ffc520; color: #111 !important; }
+.cbranch-no { background: yellow !important; color: #111; }
+
+.cstat-skip { background: #ddd; color: #111; }
+.fstat-skip { background: #ddd; color: #111 !important; }
+.cbranch-skip { background: #ddd !important; color: #111; }
+
+.missing-if-branch {
+ display: inline-block;
+ margin-right: 10px;
+ position: relative;
+ padding: 0 4px;
+ background: black;
+ color: yellow;
+}
+
+.skip-if-branch {
+ display: none;
+ margin-right: 10px;
+ position: relative;
+ padding: 0 4px;
+ background: #ccc;
+ color: white;
+}
+
+.missing-if-branch .typ, .skip-if-branch .typ {
+ color: inherit !important;
+}
+
+.entity, .metric { font-weight: bold; }
+.metric { display: inline-block; border: 1px solid #333; padding: 0.3em; background: white; }
+.metric small { font-size: 80%; font-weight: normal; color: #666; }
+
+div.coverage-summary table { border-collapse: collapse; margin: 3em; font-size: 110%; }
+div.coverage-summary td, div.coverage-summary table th { margin: 0; padding: 0.25em 1em; border-top: 1px solid #666; border-bottom: 1px solid #666; }
+div.coverage-summary th { text-align: left; border: 1px solid #666; background: #eee; font-weight: normal; }
+div.coverage-summary th.file { border-right: none !important; }
+div.coverage-summary th.pic { border-left: none !important; text-align: right; }
+div.coverage-summary th.pct { border-right: none !important; }
+div.coverage-summary th.abs { border-left: none !important; text-align: right; }
+div.coverage-summary td.pct { text-align: right; border-left: 1px solid #666; }
+div.coverage-summary td.abs { text-align: right; font-size: 90%; color: #444; border-right: 1px solid #666; }
+div.coverage-summary td.file { border-left: 1px solid #666; white-space: nowrap; }
+div.coverage-summary td.pic { min-width: 120px !important; }
+div.coverage-summary a:link { text-decoration: none; color: #000; }
+div.coverage-summary a:visited { text-decoration: none; color: #777; }
+div.coverage-summary a:hover { text-decoration: underline; }
+div.coverage-summary tfoot td { border-top: 1px solid #666; }
+
+div.coverage-summary .sorter {
+ height: 10px;
+ width: 7px;
+ display: inline-block;
+ margin-left: 0.5em;
+ background: url(sort-arrow-sprite.png) no-repeat scroll 0 0 transparent;
+}
+div.coverage-summary .sorted .sorter {
+ background-position: 0 -20px;
+}
+div.coverage-summary .sorted-desc .sorter {
+ background-position: 0 -10px;
+}
+
+.high { background: #b5d592 !important; }
+.medium { background: #ffe87c !important; }
+.low { background: #fc8c84 !important; }
+
+span.cover-fill, span.cover-empty {
+ display:inline-block;
+ border:1px solid #444;
+ background: white;
+ height: 12px;
+}
+span.cover-fill {
+ background: #ccc;
+ border-right: 1px solid #444;
+}
+span.cover-empty {
+ background: white;
+ border-left: none;
+}
+span.cover-full {
+ border-right: none !important;
+}
+pre.prettyprint {
+ border: none !important;
+ padding: 0 !important;
+ margin: 0 !important;
+}
+.com { color: #999 !important; }
+.ignore-none { color: #999; font-weight: normal; }
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/index.html b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/index.html
new file mode 100644
index 00000000000000..b10d186cc3978e
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/index.html
@@ -0,0 +1,73 @@
+
+
+
+ Code coverage report for All files
+
+
+
+
+
+
+
+ Code coverage report for All files
+
+ Statements: 100% (4 / 4)
+ Branches: 100% (2 / 2)
+ Functions: 100% (1 / 1)
+ Lines: 100% (4 / 4)
+ Ignored: none
+
+
+
+
+
+
+
+
+ File
+
+ Statements
+
+ Branches
+
+ Functions
+
+ Lines
+
+
+
+
+ __root__/
+
+ 100%
+ (4 / 4)
+ 100%
+ (2 / 2)
+ 100%
+ (1 / 1)
+ 100%
+ (4 / 4)
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.css b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.css
new file mode 100644
index 00000000000000..b317a7cda31a44
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.css
@@ -0,0 +1 @@
+.pln{color:#000}@media screen{.str{color:#080}.kwd{color:#008}.com{color:#800}.typ{color:#606}.lit{color:#066}.pun,.opn,.clo{color:#660}.tag{color:#008}.atn{color:#606}.atv{color:#080}.dec,.var{color:#606}.fun{color:red}}@media print,projection{.str{color:#060}.kwd{color:#006;font-weight:bold}.com{color:#600;font-style:italic}.typ{color:#404;font-weight:bold}.lit{color:#044}.pun,.opn,.clo{color:#440}.tag{color:#006;font-weight:bold}.atn{color:#404}.atv{color:#060}}pre.prettyprint{padding:2px;border:1px solid #888}ol.linenums{margin-top:0;margin-bottom:0}li.L0,li.L1,li.L2,li.L3,li.L5,li.L6,li.L7,li.L8{list-style-type:none}li.L1,li.L3,li.L5,li.L7,li.L9{background:#eee}
diff --git a/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.js b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.js
new file mode 100644
index 00000000000000..ef51e03866898f
--- /dev/null
+++ b/deps/npm/node_modules/cacache/node_modules/unique-filename/coverage/prettify.js
@@ -0,0 +1 @@
+window.PR_SHOULD_USE_CONTINUATION=true;(function(){var h=["break,continue,do,else,for,if,return,while"];var u=[h,"auto,case,char,const,default,double,enum,extern,float,goto,int,long,register,short,signed,sizeof,static,struct,switch,typedef,union,unsigned,void,volatile"];var p=[u,"catch,class,delete,false,import,new,operator,private,protected,public,this,throw,true,try,typeof"];var l=[p,"alignof,align_union,asm,axiom,bool,concept,concept_map,const_cast,constexpr,decltype,dynamic_cast,explicit,export,friend,inline,late_check,mutable,namespace,nullptr,reinterpret_cast,static_assert,static_cast,template,typeid,typename,using,virtual,where"];var x=[p,"abstract,boolean,byte,extends,final,finally,implements,import,instanceof,null,native,package,strictfp,super,synchronized,throws,transient"];var R=[x,"as,base,by,checked,decimal,delegate,descending,dynamic,event,fixed,foreach,from,group,implicit,in,interface,internal,into,is,lock,object,out,override,orderby,params,partial,readonly,ref,sbyte,sealed,stackalloc,string,select,uint,ulong,unchecked,unsafe,ushort,var"];var r="all,and,by,catch,class,else,extends,false,finally,for,if,in,is,isnt,loop,new,no,not,null,of,off,on,or,return,super,then,true,try,unless,until,when,while,yes";var w=[p,"debugger,eval,export,function,get,null,set,undefined,var,with,Infinity,NaN"];var s="caller,delete,die,do,dump,elsif,eval,exit,foreach,for,goto,if,import,last,local,my,next,no,our,print,package,redo,require,sub,undef,unless,until,use,wantarray,while,BEGIN,END";var I=[h,"and,as,assert,class,def,del,elif,except,exec,finally,from,global,import,in,is,lambda,nonlocal,not,or,pass,print,raise,try,with,yield,False,True,None"];var f=[h,"alias,and,begin,case,class,def,defined,elsif,end,ensure,false,in,module,next,nil,not,or,redo,rescue,retry,self,super,then,true,undef,unless,until,when,yield,BEGIN,END"];var H=[h,"case,done,elif,esac,eval,fi,function,in,local,set,then,until"];var A=[l,R,w,s+I,f,H];var 
e=/^(DIR|FILE|vector|(de|priority_)?queue|list|stack|(const_)?iterator|(multi)?(set|map)|bitset|u?(int|float)\d*)/;var C="str";var z="kwd";var j="com";var O="typ";var G="lit";var L="pun";var F="pln";var m="tag";var E="dec";var J="src";var P="atn";var n="atv";var N="nocode";var M="(?:^^\\.?|[+-]|\\!|\\!=|\\!==|\\#|\\%|\\%=|&|&&|&&=|&=|\\(|\\*|\\*=|\\+=|\\,|\\-=|\\->|\\/|\\/=|:|::|\\;|<|<<|<<=|<=|=|==|===|>|>=|>>|>>=|>>>|>>>=|\\?|\\@|\\[|\\^|\\^=|\\^\\^|\\^\\^=|\\{|\\||\\|=|\\|\\||\\|\\|=|\\~|break|case|continue|delete|do|else|finally|instanceof|return|throw|try|typeof)\\s*";function k(Z){var ad=0;var S=false;var ac=false;for(var V=0,U=Z.length;V122)){if(!(al<65||ag>90)){af.push([Math.max(65,ag)|32,Math.min(al,90)|32])}if(!(al<97||ag>122)){af.push([Math.max(97,ag)&~32,Math.min(al,122)&~32])}}}}af.sort(function(av,au){return(av[0]-au[0])||(au[1]-av[1])});var ai=[];var ap=[NaN,NaN];for(var ar=0;arat[0]){if(at[1]+1>at[0]){an.push("-")}an.push(T(at[1]))}}an.push("]");return an.join("")}function W(al){var aj=al.source.match(new RegExp("(?:\\[(?:[^\\x5C\\x5D]|\\\\[\\s\\S])*\\]|\\\\u[A-Fa-f0-9]{4}|\\\\x[A-Fa-f0-9]{2}|\\\\[0-9]+|\\\\[^ux0-9]|\\(\\?[:!=]|[\\(\\)\\^]|[^\\x5B\\x5C\\(\\)\\^]+)","g"));var ah=aj.length;var an=[];for(var ak=0,am=0;ak=2&&ai==="["){aj[ak]=X(ag)}else{if(ai!=="\\"){aj[ak]=ag.replace(/[a-zA-Z]/g,function(ao){var ap=ao.charCodeAt(0);return"["+String.fromCharCode(ap&~32,ap|32)+"]"})}}}}return aj.join("")}var aa=[];for(var V=0,U=Z.length;V=0;){S[ac.charAt(ae)]=Y}}var af=Y[1];var aa=""+af;if(!ag.hasOwnProperty(aa)){ah.push(af);ag[aa]=null}}ah.push(/[\0-\uffff]/);V=k(ah)})();var X=T.length;var W=function(ah){var Z=ah.sourceCode,Y=ah.basePos;var ad=[Y,F];var af=0;var an=Z.match(V)||[];var aj={};for(var ae=0,aq=an.length;ae=5&&"lang-"===ap.substring(0,5);if(am&&!(ai&&typeof ai[1]==="string")){am=false;ap=J}if(!am){aj[ag]=ap}}var ab=af;af+=ag.length;if(!am){ad.push(Y+ab,ap)}else{var al=ai[1];var ak=ag.indexOf(al);var 
ac=ak+al.length;if(ai[2]){ac=ag.length-ai[2].length;ak=ac-al.length}var ar=ap.substring(5);B(Y+ab,ag.substring(0,ak),W,ad);B(Y+ab+ak,al,q(ar,al),ad);B(Y+ab+ac,ag.substring(ac),W,ad)}}ah.decorations=ad};return W}function i(T){var W=[],S=[];if(T.tripleQuotedStrings){W.push([C,/^(?:\'\'\'(?:[^\'\\]|\\[\s\S]|\'{1,2}(?=[^\']))*(?:\'\'\'|$)|\"\"\"(?:[^\"\\]|\\[\s\S]|\"{1,2}(?=[^\"]))*(?:\"\"\"|$)|\'(?:[^\\\']|\\[\s\S])*(?:\'|$)|\"(?:[^\\\"]|\\[\s\S])*(?:\"|$))/,null,"'\""])}else{if(T.multiLineStrings){W.push([C,/^(?:\'(?:[^\\\']|\\[\s\S])*(?:\'|$)|\"(?:[^\\\"]|\\[\s\S])*(?:\"|$)|\`(?:[^\\\`]|\\[\s\S])*(?:\`|$))/,null,"'\"`"])}else{W.push([C,/^(?:\'(?:[^\\\'\r\n]|\\.)*(?:\'|$)|\"(?:[^\\\"\r\n]|\\.)*(?:\"|$))/,null,"\"'"])}}if(T.verbatimStrings){S.push([C,/^@\"(?:[^\"]|\"\")*(?:\"|$)/,null])}var Y=T.hashComments;if(Y){if(T.cStyleComments){if(Y>1){W.push([j,/^#(?:##(?:[^#]|#(?!##))*(?:###|$)|.*)/,null,"#"])}else{W.push([j,/^#(?:(?:define|elif|else|endif|error|ifdef|include|ifndef|line|pragma|undef|warning)\b|[^\r\n]*)/,null,"#"])}S.push([C,/^<(?:(?:(?:\.\.\/)*|\/?)(?:[\w-]+(?:\/[\w-]+)+)?[\w-]+\.h|[a-z]\w*)>/,null])}else{W.push([j,/^#[^\r\n]*/,null,"#"])}}if(T.cStyleComments){S.push([j,/^\/\/[^\r\n]*/,null]);S.push([j,/^\/\*[\s\S]*?(?:\*\/|$)/,null])}if(T.regexLiterals){var X=("/(?=[^/*])(?:[^/\\x5B\\x5C]|\\x5C[\\s\\S]|\\x5B(?:[^\\x5C\\x5D]|\\x5C[\\s\\S])*(?:\\x5D|$))+/");S.push(["lang-regex",new RegExp("^"+M+"("+X+")")])}var V=T.types;if(V){S.push([O,V])}var U=(""+T.keywords).replace(/^ | $/g,"");if(U.length){S.push([z,new RegExp("^(?:"+U.replace(/[\s,]+/g,"|")+")\\b"),null])}W.push([F,/^\s+/,null," \r\n\t\xA0"]);S.push([G,/^@[a-z_$][a-z_$@0-9]*/i,null],[O,/^(?:[@_]?[A-Z]+[a-z][A-Za-z_$@0-9]*|\w+_t\b)/,null],[F,/^[a-z_$][a-z_$@0-9]*/i,null],[G,new RegExp("^(?:0x[a-f0-9]+|(?:\\d(?:_\\d+)*\\d*(?:\\.\\d*)?|\\.\\d\\+)(?:e[+\\-]?\\d+)?)[a-z]*","i"),null,"0123456789"],[F,/^\\[\s\S]?/,null],[L,/^.[^\s\w\.$@\'\"\`\/\#\\]*/,null]);return g(W,S)}var 
K=i({keywords:A,hashComments:true,cStyleComments:true,multiLineStrings:true,regexLiterals:true});function Q(V,ag){var U=/(?:^|\s)nocode(?:\s|$)/;var ab=/\r\n?|\n/;var ac=V.ownerDocument;var S;if(V.currentStyle){S=V.currentStyle.whiteSpace}else{if(window.getComputedStyle){S=ac.defaultView.getComputedStyle(V,null).getPropertyValue("white-space")}}var Z=S&&"pre"===S.substring(0,3);var af=ac.createElement("LI");while(V.firstChild){af.appendChild(V.firstChild)}var W=[af];function ae(al){switch(al.nodeType){case 1:if(U.test(al.className)){break}if("BR"===al.nodeName){ad(al);if(al.parentNode){al.parentNode.removeChild(al)}}else{for(var an=al.firstChild;an;an=an.nextSibling){ae(an)}}break;case 3:case 4:if(Z){var am=al.nodeValue;var aj=am.match(ab);if(aj){var ai=am.substring(0,aj.index);al.nodeValue=ai;var ah=am.substring(aj.index+aj[0].length);if(ah){var ak=al.parentNode;ak.insertBefore(ac.createTextNode(ah),al.nextSibling)}ad(al);if(!ai){al.parentNode.removeChild(al)}}}break}}function ad(ak){while(!ak.nextSibling){ak=ak.parentNode;if(!ak){return}}function ai(al,ar){var aq=ar?al.cloneNode(false):al;var ao=al.parentNode;if(ao){var ap=ai(ao,1);var an=al.nextSibling;ap.appendChild(aq);for(var am=an;am;am=an){an=am.nextSibling;ap.appendChild(am)}}return aq}var ah=ai(ak.nextSibling,0);for(var aj;(aj=ah.parentNode)&&aj.nodeType===1;){ah=aj}W.push(ah)}for(var Y=0;Y=S){ah+=2}if(V>=ap){Z+=2}}}var t={};function c(U,V){for(var S=V.length;--S>=0;){var T=V[S];if(!t.hasOwnProperty(T)){t[T]=U}else{if(window.console){console.warn("cannot override language handler %s",T)}}}}function q(T,S){if(!(T&&t.hasOwnProperty(T))){T=/^\s*]*(?:>|$)/],[j,/^<\!--[\s\S]*?(?:-\->|$)/],["lang-",/^<\?([\s\S]+?)(?:\?>|$)/],["lang-",/^<%([\s\S]+?)(?:%>|$)/],[L,/^(?:<[%?]|[%?]>)/],["lang-",/^]*>([\s\S]+?)<\/xmp\b[^>]*>/i],["lang-js",/^\n> data.parsed.open; // false\n> data.parsed.port; // 8888\n> data.parsed.name; // 'namealert(123)'; -> stripped\n> data.errors; // {}\n> data.warnings; // {}\n```\n\nWarnings 
and errors:\n\n```\n$ node test.js --port 888 --no-open --name name\n> data.parsed.open; // false\n> data.parsed.port; // undefined; -> error\n> data.parsed.name; // 'name'\n> data.errors; // {port: ['port < 100 which is forbidden']}\n> data.warnings; // {port: ['port < 8000 which is dangerous']}\n```\n\n\n\n\n\n\n","_id":"clean@1.1.4","dist":{"shasum":"5fe3f1d48a22842316eba0a19c51f26428015c80","tarball":"http://localhost:1337/clean/-/clean-1.1.4.tgz"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.1":{"name":"clean","version":"2.1.1","description":"clean parses and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: 
shorthands\n}).argv(['node', 'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. 
Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.1","dist":{"shasum":"f78cb9f6a9b3156e537fc2cbb7caf271636ecb09","tarball":"http://localhost:1337/clean/-/clean-2.1.1.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.2":{"name":"clean","version":"2.1.2","description":"clean parses and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![NPM version](https://badge.fury.io/js/clean.png)](http://badge.fury.io/js/clean)\n[![Build 
Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n[![Dependency Status](https://gemnasium.com/kaelzhang/node-clean.png)](https://gemnasium.com/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: shorthands\n}).argv(['node', 'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". 
That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.2","dist":{"shasum":"1b331a23e2352a0ef4e145a72cfce1b461f36c41","tarball":"http://localhost:1337/clean/-/clean-2.1.2.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.3":{"name":"clean","version":"2.1.3","description":"clean parses and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make 
test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![NPM version](https://badge.fury.io/js/clean.png)](http://badge.fury.io/js/clean)\n[![Build Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n[![Dependency Status](https://gemnasium.com/kaelzhang/node-clean.png)](https://gemnasium.com/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: shorthands\n}).argv(['node', 'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 
'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.3","dist":{"shasum":"b7cc64b5f6254daed3a285ae6845a826f1e16c71","tarball":"http://localhost:1337/clean/-/clean-2.1.3.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.4":{"name":"clean","version":"2.1.4","description":"clean parses 
and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![NPM version](https://badge.fury.io/js/clean.png)](http://badge.fury.io/js/clean)\n[![Build Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n[![Dependency Status](https://gemnasium.com/kaelzhang/node-clean.png)](https://gemnasium.com/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: shorthands\n}).argv(['node', 'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 
'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, 
typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.4","dist":{"shasum":"4c93d479f635b64e0df1230729811030b71ed2e0","tarball":"http://localhost:1337/clean/-/clean-2.1.4.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.5":{"name":"clean","version":"2.1.5","description":"clean parses and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![NPM version](https://badge.fury.io/js/clean.png)](http://badge.fury.io/js/clean)\n[![Build Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n[![Dependency Status](https://gemnasium.com/kaelzhang/node-clean.png)](https://gemnasium.com/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: shorthands\n}).argv(['node', 
'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. 
Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.5","dist":{"shasum":"62c230d6c08ab4d21388b9cbdfd2519bfd43bde9","tarball":"http://localhost:1337/clean/-/clean-2.1.5.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"2.1.6":{"name":"clean","version":"2.1.6","description":"clean parses and santitize argv or options for node, supporting fully extendable types, shorthands, validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"keywords":["argv","parser","argument-vector","cleaner","simple"],"author":{"name":"Kael"},"license":"MIT","readmeFilename":"README.md","bugs":{"url":"https://github.com/kaelzhang/node-clean/issues"},"dependencies":{"checker":"~0.5.1","minimist":"~0.0.5"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"readme":"[![NPM version](https://badge.fury.io/js/clean.png)](http://badge.fury.io/js/clean)\n[![Build 
Status](https://travis-ci.org/kaelzhang/node-clean.png?branch=master)](https://travis-ci.org/kaelzhang/node-clean)\n[![Dependency Status](https://gemnasium.com/kaelzhang/node-clean.png)](https://gemnasium.com/kaelzhang/node-clean)\n\n# clean\n\nClean is small but powerful node.js module that parses and santitize argv or options for node, supporting:\n\n- fully extendable types\n- shorthands\n- validatiors\n- setters\n\n# Installation and Usage\n\n```sh\nnpm install clean --save\n```\n\n```js\nvar clean = require('clean')(options);\n```\n\n# Usage\n\n## Argv Shorthands\n\nWe can define shorthands with the option `options.shorthands`.\n\n```js\nvar shorthands = {\n\t// if `String`, define a shorthand for a key name\n\tc: 'cwd',\n\t// if `Array`, define a pattern slice of argv\n\tnr: ['--no-recursive'],\n\t// if `Object`, define a specific value\n\tr3: {\n\t\tretry: 3,\n\t\tstrict: false\n\t}\n};\nclean({\n\tshorthands: shorthands\n}).argv(['node', 'xxx', '-c', 'abc', '--nr', '--r3']); \n// notice that '-nr' will be considered as '-n -r'\n// The result is:\n// {\n//\t\tcwd: 'abc',\n//\t\trecursive: false,\n//\t\tretry: 3,\n//\t\tstrict: false \n// }\n```\n\n## Types\n\n```js\nclean({\n\tschema: {\n\t\tcwd: {\n\t\t\ttype: require('path')\n\t\t},\n\t\t\n\t\tretry: {\n\t\t\ttype: Boolean\n\t\t}\t\t\n\t}\n}).parseArgv(\n\t['node', 'xxx', '--cwd', 'abc', 'retry', 'false'], \n\tfunction(err, results, details){\n\t\tconsole.log(results.cwd); // the `path.resolved()`d 'abc'\n\t\tconsole.log(results.retry === false); // is a boolean, not a string\n\t}\n)\n```\n\nHow to extend a custom type ? See the \"advanced section\".\n\n## Validators and Setters\n\nValidators and setters of `clean` is implemented by `[checker](https://github.com/kaelzhang/node-checker)`, check the apis of `checker` for details.\n\nYou could check out the demo located at \"example/clean.js\". 
That is a very complicated situation of usage.\n\n```sh\nnode example/clean.js --username guest\n```\n\n\n\n# Programatical Details\n\n## constructor: clean(schema, options)\n\n\n### options\n\n#### options.offset `Number=`\n\nThe offset from which the parser should start to parse. Optional. Default to `2`.\n\n#### options.shorthands `Object=`\n\nThe shorthands used to parse the argv.\n\n#### options.schema `Object=`\n\nThe schema used to clean the given object or the parsred argv\n\n#### options.check_all `Boolean=false`\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n\n## .argv(argv)\n\nParses the argument vector, without cleaning the data.\n\n### argv `Array`\n\n### returns `Object`\n\nThe parsed object with shorthand rules applied.\n\n\n## .clean(data, callback)\n\nCleans the given data according to the `schema`.\n\n### data `Object`\n\nThe given data.\n\n### callback `function(err, results, details)`\n\n\n## .parseArgv(argv, callback)\n\nParses argument vector (argv) or something like argv, and cleans the parsed data according to the `schema`.\n\nThis method is equivalent to `c.clean(c.argv(argv), callback)`.\n\n# Advanced Section\n\n## .registerType(type, typeDef)\n\n\n\n\n\n\n","_id":"clean@2.1.6","dist":{"shasum":"41c80b2b6f5432c60cddb81932ab56563b444f52","tarball":"http://localhost:1337/clean/-/clean-2.1.6.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}}},"readme":"ERROR: No README data 
found!","maintainers":[{"name":"kael","email":"i@kael.me"}],"time":{"modified":"2013-10-29T10:27:09.584Z","created":"2013-10-09T14:58:35.029Z","0.0.0":"2013-10-09T14:58:56.786Z","1.1.3":"2013-10-09T17:11:53.515Z","1.1.4":"2013-10-09T17:14:02.537Z","2.1.1":"2013-10-10T04:10:32.004Z","2.1.2":"2013-10-14T13:43:09.309Z","2.1.3":"2013-10-14T15:49:01.158Z","2.1.4":"2013-10-17T03:15:37.028Z","2.1.5":"2013-10-17T03:21:04.145Z","2.1.6":"2013-10-29T10:27:09.584Z"},"author":{"name":"Kael"},"repository":{"type":"git","url":"git@github.com:kaelzhang/node-clean.git"},"_attachments":{}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/83/a4/d747b806caae7385778145bcf999fae69eeb6f14343f6801b79b6b7853538961694ac8b4791c7675c27928b5495d12d2f944867db1105e424d5fa9b1e015 b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/83/a4/d747b806caae7385778145bcf999fae69eeb6f14343f6801b79b6b7853538961694ac8b4791c7675c27928b5495d12d2f944867db1105e424d5fa9b1e015
new file mode 100644
index 00000000000000..3dd7b758016759
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/83/a4/d747b806caae7385778145bcf999fae69eeb6f14343f6801b79b6b7853538961694ac8b4791c7675c27928b5495d12d2f944867db1105e424d5fa9b1e015
@@ -0,0 +1 @@
+{"_id":"minimist","_rev":"14-5a3ee715a591f6fb1267503070d3d114","name":"minimist","description":"parse argument options","dist-tags":{"latest":"0.0.5"},"versions":{"0.0.0":{"name":"minimist","version":"0.0.0","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/3.6","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will 
end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.0","dist":{"shasum":"0f62459b3333ea881e554e400243e130ef123568","tarball":"http://localhost:1337/minimist/-/minimist-0.0.0.tgz"},"_from":".","_npmVersion":"1.3.0","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}},"0.0.1":{"name":"minimist","version":"0.0.1","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/3.6","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = 
require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.1","dist":{"shasum":"fa2439fbf7da8525c51b2a74e2815b380abc8ab6","tarball":"http://localhost:1337/minimist/-/minimist-0.0.1.tgz"},"_from":".","_npmVersion":"1.3.0","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}},"0.0.2":{"name":"minimist","version":"0.0.2","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap 
test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/5","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - 
an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.2","dist":{"shasum":"3297e0500be195b8fcb56668c45b925bc9bca7ab","tarball":"http://localhost:1337/minimist/-/minimist-0.0.2.tgz"},"_from":".","_npmVersion":"1.3.7","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}},"0.0.3":{"name":"minimist","version":"0.0.3","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/5","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = 
require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.3","dist":{"shasum":"a7a2ef8fbafecbae6c1baa4e56ad81e77acacb94","tarball":"http://localhost:1337/minimist/-/minimist-0.0.3.tgz"},"_from":".","_npmVersion":"1.3.7","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}},"0.0.4":{"name":"minimist","version":"0.0.4","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/5","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# 
minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# 
license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.4","dist":{"shasum":"db41b1028484927a9425765b954075f5082f5048","tarball":"http://localhost:1337/minimist/-/minimist-0.0.4.tgz"},"_from":".","_npmVersion":"1.3.7","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}},"0.0.5":{"name":"minimist","version":"0.0.5","description":"parse argument options","main":"index.js","devDependencies":{"tape":"~1.0.4","tap":"~0.4.0"},"scripts":{"test":"tap test/*.js"},"testling":{"files":"test/*.js","browsers":["ie/6..latest","ff/5","firefox/latest","chrome/10","chrome/latest","safari/5.1","safari/latest","opera/12"]},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"homepage":"https://github.com/substack/minimist","keywords":["argv","getopt","parser","optimist"],"author":{"name":"James Halliday","email":"mail@substack.net","url":"http://substack.net"},"license":"MIT","readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from 
`args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","readmeFilename":"readme.markdown","bugs":{"url":"https://github.com/substack/minimist/issues"},"_id":"minimist@0.0.5","dist":{"shasum":"d7aa327bcecf518f9106ac6b8f003fa3bcea8566","tarball":"http://localhost:1337/minimist/-/minimist-0.0.5.tgz"},"_from":".","_npmVersion":"1.3.7","_npmUser":{"name":"substack","email":"mail@substack.net"},"maintainers":[{"name":"substack","email":"mail@substack.net"}],"directories":{}}},"readme":"# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = 
require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n","maintainers":[{"name":"substack","email":"mail@substack.net"}],"time":{"0.0.0":"2013-06-25T08:17:18.123Z","0.0.1":"2013-06-25T08:22:05.384Z","0.0.2":"2013-08-28T23:00:17.595Z","0.0.3":"2013-09-12T16:27:07.340Z","0.0.4":"2013-09-17T15:13:28.184Z","0.0.5":"2013-09-19T06:45:40.016Z"},"author":{"name":"James 
Halliday","email":"mail@substack.net","url":"http://substack.net"},"repository":{"type":"git","url":"git://github.com/substack/minimist.git"},"users":{"chrisdickinson":true},"_attachments":{"minimist-0.0.5.tgz":{"content_type":"application/octet-stream","revpos":12,"digest":"md5-7fj5eF/2Az1v1uUELt0w5Q==","length":5977,"stub":true},"minimist-0.0.4.tgz":{"content_type":"application/octet-stream","revpos":10,"digest":"md5-qnJpUhb/cbsf0S5JujcKvA==","length":5952,"stub":true},"minimist-0.0.3.tgz":{"content_type":"application/octet-stream","revpos":8,"digest":"md5-2LmD2Da9atq7rqeguE/HPQ==","length":5871,"stub":true},"minimist-0.0.2.tgz":{"content_type":"application/octet-stream","revpos":6,"digest":"md5-8HHOFLuI1T4ko3lvq362pA==","length":5821,"stub":true},"minimist-0.0.1.tgz":{"content_type":"application/octet-stream","revpos":4,"digest":"md5-sEDD6p3usslEbK4/f4Rbpw==","length":5691,"stub":true},"minimist-0.0.0.tgz":{"content_type":"application/octet-stream","revpos":2,"digest":"md5-g7dzol87egGxtducC9bdFw==","length":5687,"stub":true}}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/87/44/66cb46d039cc912bd3ee29bfae97ac7f4dd4051cd240c1b25548747f9f1c6fdc3a2a9e65b058ab28f0a22b4aaee58075e0c77fd00ddf656536bc543290be b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/87/44/66cb46d039cc912bd3ee29bfae97ac7f4dd4051cd240c1b25548747f9f1c6fdc3a2a9e65b058ab28f0a22b4aaee58075e0c77fd00ddf656536bc543290be
new file mode 100644
index 00000000000000..b4e8a5f74ed1a4
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/87/44/66cb46d039cc912bd3ee29bfae97ac7f4dd4051cd240c1b25548747f9f1c6fdc3a2a9e65b058ab28f0a22b4aaee58075e0c77fd00ddf656536bc543290be
@@ -0,0 +1 @@
+{"_id":"npm-test-peer-deps","_rev":"2-fb584f2e7674d4ae2a93f2f9086e7268","name":"npm-test-peer-deps","dist-tags":{"latest":"0.0.0"},"versions":{"0.0.0":{"author":{"name":"Domenic Denicola"},"name":"npm-test-peer-deps","version":"0.0.0","peerDependencies":{"request":"0.9.x"},"dependencies":{"underscore":"1.3.1"},"_id":"npm-test-peer-deps@0.0.0","dist":{"shasum":"82f3ccba11914dc88bcb185ee3b1b33b564272bc","tarball":"http://localhost:1337/npm-test-peer-deps/-/npm-test-peer-deps-0.0.0.tgz"},"_from":".","_npmVersion":"1.3.25","_npmUser":{"name":"domenic","email":"domenic@domenicdenicola.com"},"maintainers":[{"name":"domenic","email":"domenic@domenicdenicola.com"}],"directories":{}}},"readme":"ERROR: No README data found!","maintainers":[{"name":"domenic","email":"domenic@domenicdenicola.com"}],"time":{"0.0.0":"2014-02-08T04:56:36.743Z"},"readmeFilename":"","_attachments":{}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/88/49/914fc692dc5441fec8231a33caec98409b6d522fa46bed4a673127876224b9cb8bc35e51e251c8a897a1d71dd9d7f46b6217ec8ef30ec4d83f4b9a43098d b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/88/49/914fc692dc5441fec8231a33caec98409b6d522fa46bed4a673127876224b9cb8bc35e51e251c8a897a1d71dd9d7f46b6217ec8ef30ec4d83f4b9a43098d
new file mode 100644
index 00000000000000..9de484272a7327
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/88/49/914fc692dc5441fec8231a33caec98409b6d522fa46bed4a673127876224b9cb8bc35e51e251c8a897a1d71dd9d7f46b6217ec8ef30ec4d83f4b9a43098d
@@ -0,0 +1 @@
+{"name":"add-named-update-protocol-porti","versions":{"0.0.0":{"name":"add-named-update-protocol-porti","version":"0.0.0","dist":{"tarball":"http://127.0.0.1:1338/registry/add-named-update-protocol-porti/-/add-named-update-protocol-porti-0.0.0.tgz","shasum":"356a192b7913b04c54574d18c28d46e6395428ab"}}}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/9d/5b/15d0ad75fc513f4a327b5e441803dd220edeb4f9660e454fe9d263b543ba356c71330a5964f864d1c24aada16bea028eb40106762b142b30d448cdc08593 b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/9d/5b/15d0ad75fc513f4a327b5e441803dd220edeb4f9660e454fe9d263b543ba356c71330a5964f864d1c24aada16bea028eb40106762b142b30d448cdc08593
new file mode 100644
index 0000000000000000000000000000000000000000..b4bac32b24d86ca4afa12aecdeaa0d9a7dc23148
GIT binary patch
literal 27629
zcmV)5K*_%!iwFP!000006YPC!d)vm9@cS9RVnnBFk|sf2VrQ$?ab;Uhw3Q`4lDtk<
zQbG^}DMTc|0-$AC*8cYQ+~xuhlw{X++O65tB7wQjnRB0WWY4ZO*
z^JjH+_0hNAs=DX!SzG;XRsFw>hmTg*)*d}tg>vxaA(mhL+yklm%=041U?_iYJ3JX>
z{{D-6t~{k0!Mr$2v&N=s?BqcZs{irTVR$x~M@5vWKgWNG-{<}``d&59iek3e>kTI9
z*yo46R-=PWB*8So2F8gL1ZolUxT#Vi09q=xvncQeeYh^B?ll
zESjZxT%_40yucB{v*L2b9mE9{f=BagB2QpU@bB4t;D_n77vm)IUXl)@F81bQ9jJ0m
zjZcF((IW+L=N_YE94Aq3yTi&@s%7xZS$?cwe7+D?jKMvXxG{)TR3dOnOZc-~mmPqKKL1tWOk
zw8@*`wiV?CHU@1f{N8KhDwy(vc`}?tm0=va^HitawfB3&?bpxe`3THNMS08RH;Qx(B-+jgZU*aU{fK+`}McnlfRpQ47zI7eEp)pzwz+lqqXY!Uww!;;p_ST$A5Y~wKbb%X%L=mdc7X}r`CM+LsY2Z
zEK-K(t2`?3-MX({q?wAKxS9n8@P~$Qb%WTuHe5{2iq@pe)6%PQad9o2u;9?Bvn%!brvDV2#v{j
z0bYtG*J`OC8FI~FIE2>C6yp($Q04P5jG|!<-EHqZr#`X>O2(OOAct2+2Rqw`W(1=k
zp5z^@VIIj8Y2vy1VR!%O;nCIszg7zVoCV0W(HeyTO!R_gfj{wJBZ%|RuzEC|6_-xk
z!KIpIbDSZLIi17CKx@Ps;xlX{i}Lw|e#uC5^~z=)ulhcW@qr!igEi&*>YLNn;$Tx;
zY8r0t>$f8SE?keF5vpWiR5M8nHB5_9Jc)AC5H$teASjwiK2Qa$o=Owg3mkZq6xpS%
zv23{9r88_hXs#ShU{T>4jzo8M_c3<>$l*_3*++Qo%;pHzTiOZ60CDikS;bRZ(1J(^
z^Nf>uRC+fH@*KNdHdj13%XSDGdKSEmf?|`~dT5xgjAJVx`54iOZ$x{bO2yRu=p;{i--zB%g|~RqS)63Sq!FdfzX_&-vCFe
z^C+5eOZ_67M}1m$SzWAq9G=CKVHPDF-}g`Ht!^&HGVr@U3Sb+lCbv|aA&y|gF-(*D
z1&_z~;4@8!^GW1K?`CNRb9_wO?s;zk$VT`xi~cr;_M0U^ujO?doaq#@(i0YW=@c<<
zeDf4aXaUh}!w8Mo#P+uP3RX737ydSW*`CI@w@?Z;xik#?P1WQ4d7QzUt84gy0FxDN
zIer2qz0o`gQG}=ID-BGroFv#I3=Uc(SpvpuqKux7^gH$V@nhA{4K!L%6#6g~O(`4%
z*I$umt@6toT2JOv)Fre;xCId%wmc}JVY5XKvZ$B?wgTZ6Tfxm6Lz_u-p?-k*)KACL
z)zsL}FOyI`fiI&Jxy@j?{XT9u0)#Tw7XIXkdeQ#%3nl(uSaUWjC0nbyl=))oD^ZE8G>Rgnv8@?@j!L5Z|(s{NWDJxY8_$#sek|0K>q1
zu~th@P)2ksEL7nSp|}Fr+#vSmrwN=!+}?(+@4iTe@YqbEDcpi?fG8T^^gKnx!!QE9
zUBqyRr6^R_m842IXeRT?r1Jc_b=yXHGOlUx?s+sLG@*CSJc5#Iz|1;DccWxnEi2EU
z@J=!U90tsVR6v@+6vS4r&&uNvxPNvrgu;ZV<-msY1X$D%Xo3&F2p(N@;lsJvzerwX
zDR7p>B`n&-umStb?NW94*Apu4Y!<)Mt#FyT;UL=df!d{$w^7q6-cl=%b=!6(S(+A3
zv-A_@T53dh-%*WTqgAQ7SQ1~fOb1Ci1#E;RSIiQ~GYzXNntL|Wt>B(1(o|T~xA8F2
zyg@qyhXlx*fvWt;e+Erf@+QUo8_0(qKMw;CVW0>to_3*aO75S9lC56sM?
z!F*hvJ3Rq-wU2ORpW|cGB(Pl-_~S>K$WEt-`Dy(eRtNXcFgZ0;Kx0aSv8Ge~!*jQi
zO4*-KN^h!ef{-Wn=&U60oEcJMja}gR_O_4ADxI;Gz`Y=vS`oJ_4)au#g_-*B(w`7g
zls1pftniEMZ;s;d5M{rsKO+l^#*^QtHep$u^zu5#t!e;NJ_e~GD#73*qSZVmzi_2VpY7zU99*;#4z2Q`Y
zMO<26Lq{1|+OpS9p>DA7ROVnB;%4eQar@Bq
z)c4Tv6Sal_2Ph7|ck#z+4Q3|tUxSw`yn&n?XZG}t1@8rHubBn6SwoVBC_|wt2Y6m9
zncheT6zfRBfkSHCO_gyP;058&99ZfOR-`1=EtITPrq2RYaiyahRq4LN@SsR|aR!`=
zk%}G7CV@3qGO(Qh>G2qNNs9^$J+c!x3|b#K=Q5@=iFgQ5q@AQi8@87l^kojH6^TVz
ze;?J7m!!)hJSpRxWzi^p2S0KUpn+v;CJxqty3np@^b0Ws0ONrg={%m!CX8?hwQ1pn
zsp+PiKW<)G+2%JuUrmwb@WUJgSqr#qi9JC*V=j6)7XiT7qL_ve7x#0XASo|nT#e*D
zv0tVr%z>s&hKeEuirdlEF92heXdZXz_QlV6a47I*TSar#pe4k7nh20=S*}`Mfis@I
z5mr0$w&m2gcD|Px4upWy1vZ4AYPHf0HIKQk3gI(_`53eIb#Yv_ZTFTx;K$Jjf=0((CuCF&r)7~J+I<_eQ<3<)uf+BvaOEr8S3cYVE
zJ5@|Afw@gu3q=@ktLKgvz$!Avk>)fJvy8|j6h{V~;qrU#Ng()VCKKPG>cGe05etUJ
zfDe$FN)1P)7!1w8N(TeRR|dVC>GXZQmYeFnuGdlS>C(ng!-tmozMT`eufgsQzqOzX
z{M9i^F{dksOfy2$^|*#BaCP+3x|$tV4io!XVC_6B_6;H7X%r-EkLg&rqv?eJt$^o8
z1p!=)sIwqJHK8(WH>p3z*Ph1uMt6mh>?{i5Sek)W2$&Tiw6(P~4>XR_G0<|1Zkk0L
z!T>n1OGRcWH2_4d2GJzFux+p{4cRyA>YO{s*B_-VcjtN=;n#E+}>ybKYK^E
zlWPY2mgfTO1+Z+FN%(>nP^E7L0-ZfVsYFZL@)%VM*&z`BsEnTLc4C~rDr;-;Q`U2s
zBEnYU;uzMlUh82`Adh2V`|JU=C&n4HRgLJP-R?xRBa5WcA(quubh@=ikr6>@8{z2=
z9x9ajlF6kwETC>yfLMD~SR#bD1koY#MU{ooH!yG*KYTCkS>h`~7pxOAXodAk`Mnx3@4NgD>IDDcKGd4sIA)}fl8j=g*+AV!gV-Fb)_^8ZbsHBG
z;l-hiH<3C|2#W}Gxh
zfe`*>i3FiFCN11T%>$7d1palIa)Rno8TYKPmWd;g%kh_y>?epJbwWtA
zAb-?IAJGDmDbb&4#109L?x)CAN8hQ-k
zn>c6RA!PMr4EJV2J5r_vJWA{5m*#mq5K;P0a7HM&p8LNezfcmsjg{bIsOF6{^Uf4g
z)6422Rrhz%tWbgtKxcJU2C?bcDU9x{V8SH^=>ltukT6<2Wk%sk1#1t~$dNsas9~+<
zH7zXKzPXLtWfhz*2r0)-&5pYN7#>0Mr!6xN!G%@_#;ysj%_GXhUM0rlSLh*y_BRm4
zQ#DDEsx!Q10B1wo*2ebU*30dq{r%njy&oF-rMOm&Jm>f_)RB;aFoC-Q1H`S-mZq|f
z)~Khe9t@*aY0RX#S=4EdPUVbhZ!RggsIu0&AHZU|F;ZQZMNwpPYeBW6R#wUyD|gn~
zH6sCnc@&4|O;>%nShZ#us9$%FI+pAfAVcZHsllEX0iB$hu)cC=KpG@9pMy?lvm?(E
zmYX+-dv)NN+mY^?RI<{7J5k%%a`T|6SUtUq!??>~W^_c>dY7sjfYE+5rTLh*SjDGJ
zVQCx?_!G4b4CtC1%LRC=gFy>ao>~euWvHuJ7|rwib0IZH)nkvO?dbEMzuWPjEY1Q7
z{HMi!2cjhUObD>Xo;R|xSsB%Zjy>)!n2nw1!U`jH3r^FOO2P(N7DGG^Y#CfDX^ccI
zzrDA=y?3O@WqkkW`ObmE`N)c+!)Zg_HsCZixQ;oqb
zPvFnU*pw^V=uiv6t}InEq%A8jsL9mQ)=fym>^6?gqY+BQ!zdul
zrLWW;3X`^%s%7SO8YnFD0vUAl2Y|TvsC@!4yz8;GdR;wg+wB5TlqQza#bq?{pdSC%
z4&%<0QPv4XM$6;Z5jA(E**a)p26{T@jU@Jiw7P6{LjfzF174>K2Bb?sH%SAgc|MFad>6(Z-Rhc}|AYyohGH
zjnO<~TO!_v+1}mXe+3d`meJLEt57H@3;tdu!N`(ZL?ok``$gVIi&q{jQdMvn6PnWEK21{>}O&~4EB`?8N
zCcYRmVBgxqG|JohZb|KA2@17-@J768I!2PPsa2`8s{`nKp3L*Pm^Gv*ZhK?1RN0~E
z?u*xZf8E=Ev)9{5}$*MIN4ZlpnKFHe*;H<6TiGp
z&_ri_T=-RUX74UYA$VlSs2l~f;?Bu2iZwaI=xrQEz1Mrs(2C19B7p<}`b{r#<~+F;
zy}`@fK+h)epvMewF5GiIo5aO)YU==IV-98Sm*#QMeZSTHpVQ3`$0sKzz0<4p>*HVO
z+XH@}f3==8`9bSRPb5>mB8IYt+rA&Yi$a*26yhnU$5Q%ujbw`Z_ln%u0V7@#Qr
zra63Z*1ErGD>&P-vLdanQ>)w#$4yg_EuG&579
zu|(g0{?Yj73!I_ve;;gbJ%73FPlx}2K5o4Ky}tHveSPu%_qS_b?|*-Z&pp0tMYjNZC8q^R|G^>Nx&p
zimqI$h_70za1!8!)fvgJq@@@3LhI`3va?t4h11J)e*Z1|G~iN2?`F`6MbL`xp61vs
zG%<+SafLjBlN77X;t6fc3a#qv98R$3_51zbV9p%yhv?IoZH4x#Dh6k2d3gC%PY@Q5
zzXH!dm-FzR3`GQTF2a?)Jop+mVLc=(_=_M1kRnRpI8=S9As5=i_v%%jy|??)vRG+(
zp~RJq%U>W@fP$;A)uxkUGrUtS8D-KpU-{tuLRk(w^1bU8E(@*F3*a`8S@*ok&B|i~
zz}_$o=LYIeDcn_F8(Yd$X^|8_{&&L&@1EnX9nNQbYnY=B&^C;u`%TYltEYHpSZ%+X
z0f&g}W*fd27;tIGh4ra{7mXkEy(ZOfxBK>$cNj&~SbqxB&*AjR*FoU?*idp$)jWBq
zU^?*_xYP)r+1aa4m_4m*`BLYxfXyjifW&U)U0+=tlA-rnz0!`RDS)>ajD}cLDnn6r
zOuxp5;Q)TW5FkY16`sLh8GuZXg=fB*HHqytQ~w;ZW{zU>@12IaaT3oEZ*3}b_tx{CAy*Cz
z5nr*D^E9{=8QVmJ=$k5iMt*)B0{WpA0dY@sH{uSrvFZHo6pVG<`|Wa@M7%+Q$+;dI
zVT;Qg;~cm<8W}g431}ZM|oK(xWL`uSUm%b
zjx+}`oW)>aIL|mqU(+afbA@@
z0(bygHNbEmi2x2?GyqH%;e~VA2GRVY5{3|!;VJm?oWW57klK4fX$}J5&t3H+J4juH
z;$jS822_$3o6GUKM@fk81VG=1z~Hu;OFqfRGXjE5OVmfVhYNvxo#n
z&(b+M0RoMP^DxMU=yC*u5O@B*dYXbj8sKfam}9iyT4M+kkog!skORUU<2ub&pY%U4
z926MIcLf9(qZvg75!TlMLNr?^Fv@O})Dp$D8O%{>1zZVfp|Yrp7k%m0mbt7AjgfZB
zQA?S7i8|;>?&f;+l%#d+GQ10zb)H~w+P1-gS)kTJbPfMr8_w=Fc
z}Rah47l7NniSBAqdCSvqu2?aMu1
zV6-PLis6%vU^$5N5eC?^5mZQTiaJpDJ?WU~0FWGE(6j--L4KW;n&a0k0SjSr3W9(--%&B!U9RKj^C_HU+1cG-FCJ^%fDT
z|7`Q5k6_F<7UoC=67_~PTz(5jExurYNS#Fw%(j(e
z14vOw1~>_fDCJ%@H1Rf=05Wtq4Kk&XSFm>5L+{f_5Heu}#F@ZB4<g7r`2R
z9=OmSZGmV*Gi_5=%nyKK<{`&)Bkg$-;zHt1U62<)ZXlwH6cq}@5zLb$LZ}0I-5iX*
zn3D*t%&?0off6&ygymo;c>!QvoFQxv@%SMIV_DP2kz(ZpZD|HLYgod9Ji}Cm@)k4
z96H}p;=7PEcnTL{xDPg{B%1xe$U|h7vcPQTB%`x)B6?zYF>Z*bC86U4Xo*jO6C@TL
z=tRdS6Cja;>X>;t*gh;2k}hc2flT+{AUB9GhOXz#$f3oBtvsZaQnniJl$`nwu~bvw
z%aAZt7%I~CAP|!D+@ih&@q4PuyQ20dp*8CuwZ>*kPQ>o~X#Eq4U?S8{>&^6Znuk%O
z8V88ppQQ=oioC(MD;S?L<9`Pkl5h$~D(I`7gi$o@3viXF^O>4r%)uFA%NS)BIiOON-TNJU?G!^tEnzdey#vB+~MjDLInEu5`9W8TDGz&W?mu=ujL;+7kj&4_b
z<20BEZ;Fy*7QtDH;R=&TiX0pUpJGbPi{zOLr+Pqe}W|6dYOo4fCjx
zfdp3c&9Ab+Z1>x?p<^!v336#Lfk8f`8&3#EsOwhW`fD8#RA>lvd(7O-K`uBRCnT}q
z`QuAuf?2VsXZ&&k=Hl_b1Pz5E5OhFt?X}yzfyg?%#_;T2?-Cxe_JoS)60l8Bz1=Qz
z=oIla9*rUr*oge&2vBW9Wn}d{_EDj=%jSd2`@kJPN6whFtnD^~6ClVu3}%+D(uD2h
z3v2T-)XG`tjz>V};Pku%!9m`Y*l-}BNmv9JWa3e*N`MQa2e!M<<3av{B1FFn5CL^O
zRuuDnt>aUW_fAgQ!O6)Jl=^73gl_iDi+-ucexn`0GY2$+^H>zFg;hu66vw2E*7_he
zgrMtj7|jZg6`L66HOlEAi8Q*fj`ryD$t5;MMcP)w+O|S-2gB0&6)6aJFB+ju2+paO
zY8NbCIc7_o3$rCv(>!+tSHw|X{{i5rt3j(1wyygfk?JoDBMST{3!^e)JlOAX;uKz&
znJ=&1rpOxYHoJnc`i0N|wLXB&AW<8Cgsr}Yo-}u4f28sNTH(3ek)vMY;?l(WGu?H#F36EdH6m^S2iq+?>vQ9umQmad!Wc+6$SIHUg7N
z0(L6RhOA%;=ZG0E5?m`51w6dyeg^en?|)1$P{ffAN!vlK8jbX`fnvEcu?0iUnu|4V#6^88=_qN!T$hIdITuF
z_VwR~0KT63-|wT(KOy$jRm{sYpM7Q);O6tc4oAOw{@2$Yt$scKU*gm2v8~ReX9WSY
zo6SKAATK`Av!FGG%}npeiIws-y?@`8XxHIHUv-+#Xf{56O__24a8bQww1|dkUq{MsQ3mRSrpUT5=
z@60|JMMLMopcnRr`g_=mdL#J(kKz9V{cFIlPPyy+a*DR&qqKS%{MTP$RnRq$YDuvk
zbtAKqXD?8=e@x*LeT>^Ali^8K!Xb~!S5lVrHzL&O!Ls^z_+ho8Dad?gk4u?KzEr$r};O~5xJ8Fn8@3NTg
zWO$euR)f1}Ixya`XJy&S)UAtDvcY$qg>ly7_r|vWe4i5iNA+4&W>V)u9DMx|~`tVQWx{YQU6Qz!l5`dZ>Ei+l729>aIKh4mJbZeC1Cqq?FL
z!Vp^LivaUfO(Tr+dny~aAZaeoQ%DI@ffqH#QnFI$5O!IL+G9`04vx_TG$%zf+k)|y
z9h%?@<9>q99fOEsHu#=#em%oD7E4(g;I>+=+Y%CdO45THlNS)A*Kj_iv|bb^Bs@#^
z=f#XX_zkr;QuY$I2oe1&Bh_p9zSpy9Cz>bC4<{c!oU~d1sc5wQX04&`Yll;ciVc_o
z1ca{5yoFb@8|s1KPVi$(Y5$8G3u!ZMT`AaTDfs<=tp91Tk?3tOQ78A&G#R6YAEQgS
z6Mdi>f5L02xMb>^7{4dc$^C|&6=JRuKJxQfaFI~LF^t`K+~|gl4ov=3F9ZgEeyqc_
zGIX;sj$!SICMA`-2gg$-qwH_Fc&9wbFnc>LHI5%
zO076SJ`5RYb{UcB0K*S
zE!*TCl@5n!OG=O>C8ys)0`6OLG5ZucwOuD??MP6$;fzgV00pEx<;v`h6u&2Tr>&5j
zVw0Q~8~7%HL-#%!`&HvKJM2E9SAC#y@S6m6CB1lZ2=
z4yz-KFO%Uf^x{fQo4ZOo5#InopSTI1wgsQC0Sn8&WbtbSpqWK)u75H|G|_!n)MQs|
z`5v0-;X4;aR8Rl~9<)uHXsKqL!zh|<{|$(8)0XNuSfE1~9fvGw^>C-_g*Z2
z;+IKSC)rjLoh@A2y%F>LpMo2AH;j|m-L_&;F{9@pWCDoM**rPVEnkvLmLPs!!^t4^
zf>IBF=IV)F>rjqNSJ7thbdC;lO$y|6rzV-!`C&S{Y*H$F2^dxC$-t5A=oRzC}WaQg%nxQ2hwbku!f2ks-O(0`d^^ak=E
z>_z5FZ*+Gef)jZhbdY&`d<)ZYhYiK`WLmk%l>F)(HqFW(Jr_RAf)9JYV~>{N6OME84q|X5W
zI3dHYzhMFe`inV$x@y(`R{lQvdS3o7er`GcyF1Ue_YSxJzCLb~{}=WD-+s6DmHvN`
z&$D!P3EcZxq1IPd|JZ?l*H-D@HMN7u9@PK%>M%T;%%cKhwPUT}Bk+(DeV~UMd%2FZs;F`$1G!CT(at9c{F~X23?4GG#k*jDrFFY7MMdqV8Dsd7{
z=0hDI7`=}s@iZ3CQ1Tt+DRa1yj)&vya7Y*Y8_~38^T8y>xJ$zrn;y(D`g@KK!zjr)
z5`2$>u1zM`0!5}7l}0CFmr`6C=wFtI`n@<~7kJOi1S3^qFh&fbE(v!iko8<5Bs3ZTlC;45S{(U_gIVcal~?&$z@d_P!8kY@KGM^{e24?i
z6M-SrF}^%SF)dFMgUJ23tq%8J9KG2(*j77->ea#ifA2isey$o@hwvMi!Z$ldKkmOi
zQc&VxYwzf1wf{nG?ftC&y0iDZ<8A-+>R|iuQ0*V6otLk6cebB*)Xv_s-Pg}|_I^-L
zq2Av9k=osPxpM@K9_=gaP@3A=KJ=imm)i%=euSS}Pj_~Aj(+Z_7duCL*yampbW6S3
zIyl;S_Ih{gK)rf>@M`~X8@hk)?d|XF?7cXEUbbIu?;ZKjD|Dr{{~NQq9{#wsyGvbd
zy@t^r(D>pKrs%r`s^Ht*5)&+!ajf+3wcP%Z_@!^>XWnZK|~o
ztsGD>9-a5*$8CB9{cXX2&yIHX_i#;~?e85Oz|RiM^5Do+eY118-BDWyJBKh-@5RAB
zG>^*(HTJ0ysJFMxt>99t(vCn8{Qde6zj{2D=i6Jm(AFVVb_@TjMZEd?aX+`ve;Wq%
z?@$QbNdF&ge7m~H|F5rq)&GBy4{y02z2k&ck8R*v&+fg=w(!_yUSJEs?!jwU-k)Ui
zE(IbZnY$ygxSdvg{RF*xp>lI=b(M?$u)F{C@CeYYTiSW}vhLy4HPuB;!69BKxjheO
z@8smeNw0_Lv3m{grf3A5#1}XfjD|PS*B?_8no{++2&aHfB-^uu?dx%bIF6);b3KU0
z+wW#gp6lNz*)9z0_?O;kLlV@s)#LA}zhsVyaWu=s!T+clU9x&CwI8TP`-!RB=Kh9h
zp%YpH+2jzJcPk;I56z4w
z9|uQ^ZdVb+oq`g%^B3nL2ESOyp~e%@=>#5|Zy!D|?v$}QQwMRVLy3{}8t7XvdJtG-
zp&L)qBHPV0G^4h>$#jfUg>}`JdGP*ABd4Wk*o^I2dO=9^2nugIX`5|K?Tmxc69#96
zFXjEF3rnU?y9`%4D|mhebT6D2n>yUKq%76>DE&$!z6q6$S5$T6-b&KndyUH4>GajQ
z*G7W_(CUjxjfQli8{{h}k~P?Znoc8ZHx5_jrQMn8G7D_ZTBW^t>FmwYjag~R{l*&D
zuT1aeK?oO6+LPaH*Oe2^ti0H=HNe3@!)XRgu@Fa2HX^B8fwm&G$MCFiIG+R=W&nvf
zhnk5kOOaGL)M;jNnvz^C*%L6xEJohNu(M;}y(lLJU^a}tR1}v0i+O<=3K?^+y&W7wmr
zh<6U{Pbx;~dxzW-4qvYhWfo~wLMFMJ)4`TY1K2?G%Z0j-6m$V6kHKtXc%q_!Ql3Q`
zXXH^P%0rY?x}s*&0IS2T%f1Q{JK{u8l2}T>*hmAF+*b|q%hJgxWt@@GDH{1p;y0r=
zmx%#k(tKE~XL46a&Ku&vdu8FKvevHxmPYZ`YuW83ub?u`euC^lpFDB+U4CFRz2wQMy=yH=V;u$lm<2Skx`z$O}s`BFbP36oKM+
z)T2qa#{Ou;Ltd@UT<5)U5i_2MrZV9SfuDBR4RZ$>KYxM}e}>nKh%^>Ny3%lpK6s$_
z$V_B~zc~+OtPdWz{E+*qwS<=Xq`l9|Wfy3v^vA;B({gr0=cOH-%1V+z28x*kNl
zWyVp*(=M}BSv2B&3nW^FS@7O4V4FfO4aj${MxF8X+4Z}t%g+02-72U0)rAM|F3q?1
z@+};ihw#-rjqKCNJRO~xr|?xj9owg4d@6lg8Y9VV*kdfq3w3gZ=12)73o^_oekXj#
zVcJ2qW;U6J=S-W@(TMGwn6uVqLqmhC!!uA)9cXD5VFFQ3+-3)G5m8!q&6qB-pd?UU
zHJnW@QzjFl8s8Gppg~*lL2&quC|q!N883?ftz!*%g>
z0BuEomzUQZbjhiE)j*13%DL^BnO2ZKOwkNc0p^H;bKShh|LUa#rj6sQB@`SOGZq1Z
zcLlqW8Qmuz9NHY|?=i1!Dfmw%tf;hS0;$xTJ2MXUP>d&K)Yq&`DsGijR5n^G)U5Dv
zg&mnzxA4dxC=Rc3Z4eGAK`pyZ=@JPp(aT%T=8fkV6$OOiFoBVXmmNVl*bZ_0sAIsi
zX+=CSiU?xg;7DK?D?B22xbDN^i}*q%<|BLjB)uO=uR1GQpbZ++2-V=6zzQ3|h(1{Q
z>RJtg@Z7MOaKbx^(~_a;9Hfs>_C1%pPthet+KYlbrX=IIH&BonrV9T3<>ac_eR90o
z{p0C_)`@@OJHG%eYi@;VFFn2ueh)$le>q+|HDA|H?Wu;fxEVgA&g!Wg=_)92YSa2-u|O
zomw#Gw{PI+q^RT8nWAagr3j%jSY{Ch^_%(J4wkHF$J(-9!^rSw=oM4
z=O_b>Qs5jUNn0t=s>rpKtXnRuAO?n+8?A(7|F(xwr>_L^r-$b@Cs`P+A`iD9BWqp_badpXs>iG;rUJpmHnW_LAXCST*vOT`;Y)7DMD#K{%{4HPKPv}t8|Nukxd>ms*gx20Xro!)nO@#ZEf+g(kbNSQpjH&2dx6L;ECWB5i9KN2
z3OgHm4)xTPb6`nOGakA}fixpnYtuX`bz`(sQd}}$dzA*RO7(fqAGl#GogrQ|?$F?M
zO@rO8^9Uha9P03NNH|}n;7oPsWB?V(A_|D<)WyXWxCG+8T8=ZQEIc(^rRNyJm)Jev
z^vto0x^%Ki1>^GEh3P;@a-LpS14@>cDJ(rV%jDFrE?K9vWaT-o-EEE^F{bMqH&Fs9
zZ`MLUSXWgK0Fej*s%L&jVq;?4cUEFl?qi}$&W*@98=|4*JIb7c!HjjRRA!i-HVDpb
z-dfv~<=-pr5H7aVfMAz?#Rad$Bd@ht*jK~Y5=;Eta3zlZE;1DtCsH`%Ni4WrpgQA(0m7Zov9Xfimvf4C>lGy`{
zMu@n(UaSD<2z2CAVdgXgky);wPCT{STT5y`ofphBpMY#`d|G7(DLAKR=9sZomrzJX
z=^0TG4YVaO!MpWl(~z;`mLy4@r7@
z;#xm7`LPp|j2f*wi#f8VQGljtZ0f>ENDggrMq9OzC7aDom>L$xdl9pb`=ywtupt;g
zo6->fZ63c3Cb<7>z+(Bc4RwF0+Ks4uLaCM2;1XLg8N^AI79_}K>+H+u+oiMCS{0x+
zM)}Jv3)-eYo=biI(H?L
z*#zA)v`~WCRi8m^`5hq)pV+v@^*<{@#@rPXDf4?;xH@+B8u~A!F)i5bG
zQp$*&TuSonI;F6|L@;*i7JD3?x;)2%q`p+LFNsGon!`6cN6&uTtdr*%M0yU05KD>%
zn<65;cA*`pJB&U`6Oamr94xSgzXh2!uHvQ7r>?x`qQ~v#+^7}89-z(14ZUDMl
z08jn7+qE-U4u;O=%Op`LN*rg3Y$!mKsN~44F&+|YibTga!iavaw3g%xa=a3p5|5Ot
z<6q!woCq6MnxD|S#f%K0w&f$h$MnOYb`MbTfG4=|i>vhE6L5nXpgLXX=^`=!sa=Wm
z!ilN}eavtk+ts51K(f9DNw2X-jm!finJUrU^1qQqOkCh%iy1anS~N3hV_B
zI{L%-Ei=AFT7?XUFDjp&s!bXO-C91~>KW#<)2y7GTG$dK*{szS)&L2yqP29`LAZBr
zjd?$|SX5|Wm1Sh{@roP>HeIH1`9~}-n{Zyb#%{8_Fpv>04)EzF6mTc!0G1p_=Cc~i
zL~S-|LC~c%%XX_(zgv_cV__}C)>Oaxt-G%NGWJDUa@sNbw74tj3=o1y6?SE?ZC
z8s4f(3G6eV7BZNH{Wj!*v9v~5h)INpjCl7K>HvumJ+T}@lZ=j%d^rns^M-x+_zlB%
zbryAJ^FFwnw}e@_ZC|A^r>S}(FC=YBzT(Pc=`)mx#3sa*rFkEhxt{!j7#$Ye7g>JS
z)#G?M;=VJYDsywUt%MaH%RG-OC&tdYmBdPe1}Y01-U1k>(WTG^WD%;0Q1t<&qsa&+RcT%~Y?Vs5jHrwX!+BPs)8cL;?z(tM$5#rN
zqgVTfJ3smBYTf@&f5TrdZ$}NcK~Vm2`}B#xOb!H;BwA8P(?BepT!zHzGzWLt!v$Hb
z4#pX@>*Uuvf_Oa^ASD1AusI&PPII*}0;3&L_NP|wCFX@B3=z?JDLJ;~b%)f{C1VjqVf@5X#!$*i3(dRbMte%TvTKU;t0ERUKY
zOkArawWW)ZnswK55nhX~1f3j{bzyeA%3$Q2F7p+RJ``9pPK-D0HfnI0-C?C4
z@_w_aKIxA0)%ta--D;kkJZYVLz#M1RiuG$peX|DtU#}#Bb03%k_Cd>D<;9`fEantj
z{Qu3mDj}?EhsUYdklDit>8?)o*bc@)oPf}#!}b_tYcoJ1kS3AdHEt83#4Rd&OQ#!e
z%;!jb(LFWC+Z7$#lBuMHL%RcdOxi$!ouY99^GseM7Z3$SNnacz#fL88Z7{-%9Lyh3
ze#)S@p92VqE)@e6o%48nQ{OhUqk<`>QOrc)hKYAI2F#;~v5Kcj0I8wASp&G@f7TIk
zd@!a+4Qds23_{G~Km2x*z}r7y%tm~`|5|T5(wt97nh=8q%2yRN?q<2j6D(3y%x-E1
zM2dPWT}E)o1OQh
zX3t!FE1Yzi+S42cNv`F@gK2J1`uRdej^om*yK6L=hGdCnjmqonlE0-X-@3x%51-
zXn99vQV$8J@247Uo3YLe|t)gqQ24?GrQ@3^wcD>n+fM_a5r71zUaUduW*Q)*4hS)I$+z>Z%E^a?R
zp5tIpMvN3Irx<2+X!SJL-{
zIVmFw2$La7vsf|dW1)5nWr+E8?Hh^QIa4Zv$kvD&hn@-
z)ag{v#K^cCRxazJTNUb4sJ*f#o*t2*qC{$}KOnqcj&vomQ6;7kFX^bGQJ@L>P~osS
zRuODRr*wZ({3Xeqs2j#vIFA>u>kj~INfJhE>g(3|%3;T)vTh^>gPHVpngYZz+1F0Y
z8BS46zk3aKUy?a#K*?AWcdL7pTq-i5q*ynHB2+hzpjC?WYM3@XsS(rSGvCz1Z5GTO
zC&Tn2-!wcTl8hety{Qvn`VxuWrbENQ>YOmG;!DBJU^0l%cpnT~#G+ulc@hoHO~V_4
z%NaLIJOn!Q7GB~LJkx8qo}L9zCl)3!EcU<9|*>Kh#(lr`Dh(mN+L>In70-@2e0=?cDNk|Svbi~Y4
zXMEkDCs8ZJysjm65@AOmF-7{8&o(=-U!Y&74rIe8fz{$?s930{%rz&jx+K=m+xsN%
z+H&Xtc9b$g)vSHyqf*J3}+o;5nb!fpUii#GHe6LIfO1J;XRh)-Tx0*379sWg#jz=9?3Jdz_-KU3+-bO6{)QMHFhxKJqY3-s&G5%;V!iK1
zh$pDWPxHbvus2E5S;P9=3(~RUI-bS$4(doIb3B5FAJ?OWXrU1$A3xTL;AO|@EY0(H
zFcGI7Q9l7OV0tPnBe}+Cro}kja%jaxDbsP0vqh|g;JB-z%xHfFy*43`b}wYBj!>0S
z6!cBE{Gj))ciDU23*CqOXi$ELCGr22htAI<3c${R9t#~b3}YHpsVV7tZ9TH5v^$a?
zFA~e|!uau6WzjUiNIaQU0wZC?pzrzyE@S!fdtiV%(hX2KHihO5Vl>9c2+Cy#kQqol
z<5WHp;*mvYhXz`Ks_Ki;eB$4-id|@3
z%Y*8x`qovs2h)Do!uwxIdPCW}m%M8^E_I24c+X-jm=l8Hpq>Wfz|q4eh%$`t6EmQ6
z9%hq@=_SDJb+4VETuMxqwtj1XEP2Dd(R>!(lGeLcDsh%xpxb?iz)}bD(rxzqcNn1n
zpF{91zHOd7JsZ`k}EXncNTW=0{tXH
zzD4lDMMQc7z4mx(o-Rc2bG-y_e%#sJ-qiL-04t_TcK*a2a~n})NJM7+4(NV>EnwHD
z`fzc97M90Usa>Cn^F;O9Cp0mP3JQ3$(r{Hf?EghGb(YnIk)MA0NtwK62!d!I0)F_v
z<|J$7F7HXv&B~se6-FI70=6)XvX*nzlWn#9h&jK7IW0L@b(>#%uee9Cy*1jdVr}jH
z;x-Ih?p&+B_R4?RM};X1y86=i9DM$xIqr$!?5F84>Ij(OC7a;V$Q4P11SiO1APwmD
zUnV`|EDQilF)iD&Y;E!(5zgGjY%cR`TFHVE$M3$XfXhkp{rBIS$(b#ABD^Bzp>bLD
ztX10a5#!ZSMVC=tyoK}ZFh$1f<^+guxAkjdS+Vgp2{~&!ttcJY9;_>wqNIA}>QrG6
zRc9r;DN3y_gLgVz$5cvgOQCEn)mM1m_A^;$H8sPC7NF52FBCD515IE4WP
zFkNa|lAbe{!;{snIRb1sZ42FmwxM(7>`FqGxNq!g^rS{$7r-)*TnX2fshG}Mn5
z&q5F)7~wS(J9tq_l9LEJGU|%#iXa4>?fTTZSQd?Y2x)ljj`Z3Zq_m5|pl1+0wfO>C
z;b1@xWOPw0gX)MfTiYnb&2XzbSjvj{^Mt@X5E*|VEaOEndeR_EG-}esr%WV%x;%8!
z=J~a)O~f_oFXpT;erYEsO}~YnXpL{yqzlu8YxcXLiR13^$?4T~v(un`I`Uf
zOMFT@{O^4f{w+VBcK#>v!2Y^5HQdeT|Jz589#+r)#`@R%uV3c#C&!lps3-{ylN4Pm
zi5b6$25JB#KgTHj9{Hyncot;W=`cwk%Tu0l7{M+_v^NLbliz`MQ@%I9
z+jZPV>T~7j;&rKV^gE{|kL5^{dxav=uw!z5hOA;-MAICVqcAXPuY;LtcG)$Ci;|y=
zDQXPz@C02UvM7v_(5Z{oFRB*dT^o9hzUb&WCxxcA5MDB@Eu)QqT+F(|n?dM{YQkB9
zIEwBb?7o5%z0iR<&0?U#MKb1~SM{l~*v!%}HN-f2MKHGM*k`A^u4GR;HQlAlZ(1t?
zKZz~okvCfS5}y4TC)A?1YA3-PY=zZ9+mSwo60S;$tv6$eEl=5X*(70WbkLc9DmMy)
zi{P@P7&k`ujwFM@#UULp8SSm#Vp&@$d+@z-AX9Q0xhjedX%^h&aZZ<=lP(-ls9%z7
zHmx%3+NN+QErb5<-L`3y@{Zjwq{&B)2d`NcCG)&>RM$s}e!2%oKQOl7sq@_ex0a2Z
z+!aoCCCq0;@ip8KJAj2a-4P=b3y7TLBr2NcoI}Zs@E9JPnrw7BMV*5@ObmR_?yB`J
zxq|AOb}XCpFPzZ%Cd}!19Sqq_*LXP%P!hcppDTNY+;n6z&X%8_
z#iPQ)iRqY7VJR$?4D!#9T_{_+bhB(u{y6oW>Kj3N;;KY#>s`E>HHDW%`7dpe*%KPl
zaeR7;qfw{jvI~WrV?0^Jq_V1J56Rya8KBQ8rhOX!zq|8nd+%`jj(yzB{{yT4ZI%CD
zTi^J~|9_FsvvhXJ+{pUs>K{As@7gN;yQX&XAPCj}`06k`1N?_~BfYK3gqaug3IU&UbbygP=}sjuII~Mu
zfTPJa9TgXp$`}o(d7g$bc?A#CaE=O8O7P~Pax+JRZKN89QlrtLPKFVP8Nz|;SNf)?
zhn^7Vr%y@Z;v}5R$wZTvxCx4Ji@;bO=||6UYrAmH;l3A
z!Mwom_V^IEW{!bkjbeRlevUzQK-Vp`?Ck{0<0Fa~w3{^s(;-wzDtMZCNUkoCT7mPC;
z<0CyCjCPDyQ`k8}B7`nX6Cmd3$8B}E|KjM)*1@*gIaIF>_Wyh5`Sx?w*gAyYjgES=
zbM)i>>mvmv4z~7=epdT0)Yjh5>aRO{&pY1sPp=NP4-eJ;f!cZbYIkS*c}MN+J==Z#
zd}r?m^%Uyu?H{S#otHaD(CE>=!VaaWo$W&p8hg2Y@a#wUx%G5scjxHmj(V|kw1;iJ
zfJV2}tF42hooBCiw+_^+*9Wim54WNF=ic7_-p<~O1L$S@<@Vl@54}QHYWu(8hdTUm
zYj>Br+IkJ6KcMkF+kf@*!Ojmq9(h0R?>^s#hflX*WLr;nx4A2r)U(~KotGW;eCy@b
z58G60A6hw}VmvzU&5zsk2>RQC|DGM~?C;^4Jlo$pI)I-YnB~EdsrqK;aJ!?n4t5S<
zsNRc%eP|w+6Kd>JBT#Q|n_I!9R;3+*BKZ6DA%69EEYG*McA>39tn3!{ziJKtg+HH0
z|D~4y8=$A#`?#6@uhF}T{{P|n+E@DjB|gZ`Hqa#8*aX>JZ!n4HjiDr02ON%Xw?vMX
zte+fsmpEHC@{=N|tCd7uR1&co$mizJNGnaP`K$gzej3by;%4xW8EZ3^Meaj%1`TwB
zonVppFY)`_hvLW_Q)4{d+_r`{5wK_x%;;8vEsDOhH0O#cn@5#Jvow#3G`n=#x}4F-
z#&~sxAI-CgJlX8^;NP?P0Pk1zVjO%AQh$BL%^Vf6MnWjBkG5NM~gImjODB-W_K7&FAFYuQ`oR5)7-r6?$=~CeGS~*;;wL8ci4$v<6#M+{^D%;
zurhD1!45gWq`E5n7DOVD{hVdO>0l3Sr%x_)IG-Ws5#az(Ph_U_-!1^I`QJ9=4YcIE
zdb{d>=RfMsGC2jmN=T`EHw}j6Mv>FABLeOypHH!g|6Kj!@IN=c8;v#|{xJxn;JZiB
zMle_p{y2)(*S`(I^)Ogp-&p&jGxSlGPRk4Szr14qU<381;s1l}t>-Vd{ps*decWRI
z{dRR@LH>KV`qlpXMLzdPaYMW6d7qBqJ@0i6V6@B3Ala_6wdU{LEN&(M1IC^nNnjdmEi<43M
zZ_d7c+WALP+8YFU9NudD{eQRGf7kJ>SNOk2U;RJ6%!gu3M0w$y)%(DK+;92#)Wc(1
zKFRlezjwdodHA|{pJ3p=8!W7_w;%QS7MmxYaqs5STlmmRFlkoL@NI>XU>IO2qa$7{
zr>_dGExvNBb38)Q3I&KgLAne#X;iYWO9Pg@V|bFae(;y0fIf(4@km0*;VV4$Y{f^L-Y`pN663IPXrQCDRWy*64JvFlfj8Q4
zVizTrOv6hK=qjDTTUwJW;R`E>$<$))PO&)BjiudgEbH`--`%Cp(w<#coBHE_e%b^)
z<1ub7Gr5iuB_;O4EzZT+H_oETByIdt?v?dAhAEB5ltDzr(`htBwZHg{{9oA9#RE-_
z3!h9x?}%BcljicRTwQ!sM;GseuV{NFik|I_sg6(Uq1pWwZ@?F%ERkbGoivPdF=bkc~jrw2yA;FMQNSqh(BrYy}&pTc@emakVKgU?t^Zb;EK@LbG
z$M7(bkJm8cyc@=TnvEB6?kpL%PBj6!xh&>>*0E>I6;RGbHpqgE{m=2@dz2*@;gyo@
z&89HM?YB{ane8#^tBzNq?<8VN&(t6`SFgz+h6D*J?&x^$0|^hCQA$}NS2iz8&bRIV
z3y9Vs(FWD$K$?AzRpzJ&zjsfOF5L6_{l2AdlxWsW(N&FGn%Q#VX#&?AWd0~kk&$oH
zmlnqjoJT85i@^^rM)%HFa0oP_7g2ygzjMzM>lm$UF3Igi@^;hn+KMt;E6U!DbA~6|
z?}~_$HCyV94n=L%(Aj$%s^4z+?JMsvim0*v6eE_L_4U_5;QZK7qlBuMCl3`+Aih{g
zTk{f9ly`0jn2o6O2u
zMSyW9F+Vh}0CeOziA^}J=HyaG6<{y0vtWkqev{}e#zU4+d>5G02m7KJDmV~56`6w$
zFlaLN*^D6P5p>)PK0Iypp{JMgNfE=ERR^Q*sOtcMHqrpcA)-l_7I7*qc@)x_7dfb1
z;{LsUX<2Diq9I4%lnAM`UY?10RF2mR<9-dtr+;#{gFXQ)|@lPlKVVaq{
zP_TE>*=?KWv(KL7CZXuhW~un_QItoqd?Ok&X|rt{280zIJX$)G@Jx2UZ5y$^my;xX
zr)Q84DV-RC(ZD1Y=YR<`goUKyIh^Hq6o+*F_;QTkbS&yEq!h;h{ha?o;boCN$8_tv
z0s8wKqHWNp*a99E(IqfYzf$e?p2>Bt@9GcZ(FleK;ERTK8o@;%%1&85&q+U*%?FqF
zbJc$y!#Z)ss&<>f$>f4HaD@}rVP~S;*J@k~3OqOr{rSR;z1gqy*E~L@EAZ{$r?2|dF~PEOu%z#(|M57(9hc4AT;M&)KI>})HZqAtE5fSBXD
zmN1Cde*hTdYS8M0t?Rx{+Hj#0HkjzM2wnuNupIBvJs0!K^=-$F|!{*d;z8
zj)Gbrz{22e(vPs!*L#aOXiRV(K<$9FqVQZ2%TTY#G69N?6Jk8Dd~uNy6_E5;hNqDn
zsfewRP@B-BigG)7-ZR2X7(vRs5%WC2DPK4=VBqYsX-C6v9Xs;!7Iwt8;@vr3w{RTy
zlwACZk?8>PE+zX4(O82?VLot4T{B0V)wAb9f<#*FzQhHfFNIh!>0t;R#o@adn6*;T(QZ
zSHqP9y3_)iQ8YS*)=(!~Z2
zs_^WRGY9u|K9Mkxc1wK%MBekh
zdL|@avW@mRIR-GXFw7JgtS!AM(wftfXPcFySI!sU&{L3AMsvD{fT6W=A(xtiLRC12
zDlajEuvQq}PnJs9-R1`2Y2;pB?7jT;)z;CEedw3aEy22G1CKBab6nmqhjqlFhU-{T
z)-XX-nxe!kE!%GpO~5n%lGZL4&Y#!whX~jQ&8eBypGI_3I3Af0a+e*+mq|2;ua&)V
zsS=4a{h9$txLKcjrQrXlkA&4au(6ADJre@S4MOciTS7ArcH~CD*6r$GfLn+JOfWg~
zi{sb9I$``*VB&jyP6qEO(<{%Bu^iAwn-E8Xn!z=+R3L=`o8K33@{|B4P8!vgz->{)
zERlriatdcVAsUyWm~0dCNNO@@h2&>S3x|Z3awJBQ57WeU58kHnkfLPe7=n{#kYDDA
z$?%pu5)`})#8jMzl^Zb!M2UN@Hv0ygZK8T8yBDIN2!ju(jY|2o9$ho*au-_>Z7I$T{T69b$bVDaX)
zq4k#UkOUnyqdpK0H*hY15{pjbkBi(hv^>bhbC$sJdCbAJ5YBP28UN9dlfLkDY%`hM
ztecqqnM<_OT_=;s!pO~pEAzc>)y|jKYDuOO+$uNkm5zmgVa(_}bxc(3;&MWlv=wSW!vSr#hm#tbl+(?3cl7^eft
zgKh;g^Wofp8W=9lXc;su$yuYJuPq`9h_W~xN=(Ff
za++kF)H=%bHrXK6qqylhMyrIvC0Ws`3N*&oXNcr#YKr*ZQutTAl&VvVYMLszz;n@y
zPP*VQ>}R5Rk(uPWodq31@MO?m2t7{?^n33v-}lzmH+n%RnipC|eh$y@owH5MJF>Dl
z(?LMBDvO4eaXM5-QC6quLxS%aPE_wIXCS#ly$hME{>JkiO%NG^&lwc{X21OrNOi%GK2WA<0|IKTGHRdJ#BM2mde
zIT+bf)=HKmiSR2fMw%~E-S>M^oudytR>z9n@W|_qyp@OekEFb}A9^OQBC0?&2H|j_
zTu2Ru^nOTHhS5Ty!7$Ex4gDp=;WSQZWE7l**eb^|lS9oS
z&RR6ep7Y6_GC*DfrU!#u9Z@Xpbr!&su!D(nfsr2IHWZA1H^^c*)aLJ4+MX5T0e@O+yPwgRaC|f!M
z36=<#AIv})BkJZ*GA3sremRtmo-ZM7%Qt-*B(&-H7yiFi{()`!Yb8i%+h{_b&L8=p23XvDVh2~8Y#
z;W?A0u3NfWri_EIdt0HC6XAp068Ot$2T}Uz?Zu9{%&}YKxK!e{_q-6t6;M1b!D1tj
z5W1*;JqPxcBa=%=kH-wNZp_g_v*s)fF#%$!18sC1VCXeIb^vQ8}
zerg)lW&XbQwRrmRN&;HaG@-Pgz2YJzcVsi(t6sa)zFxw%^@4mjvj4dAyy`uXdQeJ)
zh)N5tNJK??gB~&TJxUF$E7^S?rihu&jE5TQ*u_+W$|aaXG_+&aB%#M$U+W);{(3$!
z`aXSe*3XyRn$=`o#+Fbv8Tm#pKqe!K#l9?gXQw8xHhzo_3`b>#XTF7Y#5fVD=13Zt
z_FE=eDGQT^K`aQZ@Y+_X>G77zpSqPp585_GO-yFiQ
z*p#$M$pPLb)NM7KuM6q`1KbvTtvMBops5z9I%bm4*|3b_OfpPY97R#(g^9UH#qup3
z1#wz7>9o$A1efd`T=Lf!Z?!1M<0#Yl>h#_W*~c(WiaDDbNsY})>q%6&UVujLhl;_F
z@1&qlmC0GJ{Xo!puHqq|n^Q?v(1QHFouIdv5Ozui;an#Wuq*S^Pd{xUq(HlX75V-H
zS^a_m+7LMkzvQ1A#qYG29cCZENHDP8@M?;UkI@-vKI7EXMLIKWq5FHKR~Axvsc}^7
zy}qem3m9m3-A47d#c!-(p?_KTGt_8Vl$RHgPP9$luTe1Ebb+qaMY*-xIx!m^QI1$n
z3TWJQ#LI%%#*pI~vt6>MO^IY+%q^*05%n^BaWFXjfO>5PN2pBFuvKmntFJsjfPmYl7Ey
z^TUT0doXu|&D-!^@zZI27U
zAL(^Txp%nIA3|#N9eK0Y7IR8*p+$nthpZD`x9l0St7cD|!%PJonm}i0&t+w5JY##(
z1O0_wx7=UDR<+?PX!+V{TJ*gQqu7=GbR`a&1Ptl6X}B%9U`I&eZIH$3Jm)JOK(&yu
zGVpe+7JI4Lg_{KUy><
zCvR6+&Mz&=mO{?8q>D97ZjaEUDq1XU5>{(xM_(WhI?w~d#gA2Zyy{+U0DcTT
z3Gw;bsh_7=(F`WDv!Laa(cCx|T0g}kP=-zs;L(c>dO_gkyeop)KKo0C$KUFz@*cS=
zH_tQ_`v2`+&2oY;5Pp>mCmfL}RCK1Dw$no|J+ud(Kq{!yQXT8TpW)q`-Ay1_KphX-
z=@1V}vI&92WRraRt>IQ2ZEBQBO_uqGtZK>(D*$F{M5Rtpb?_0yFv1t+uF#j&>*)Q>
zr&l;nx9*DP(0?8#mvXuC?Uq(3W;*#q%HW%&D6<1h)zGc-aY#3Y#7v`1>wvVBfrPCl
z)Hegtp}AZWK)+COy5%eL9JCsHUUkQg=n6-Ex;&jq>eLpcI{>9Bhg3x;bpYJ`07beb
zBtrCPA&&RUeyX%{1>!<^%lk^kpF}y*0J|W$=M>IVn{X`FOENOpFc{pS!ahl{a=EYs
z`opAB%c3IK7Q!8SBDUP^jm4vg#ghoco#+KEN5~%xC*l7LnUj
zjRhy@<(sWzOui7Yvm-txTtvojnWQ3JKnO0<`AR`Rz86GU~{V0#^2MJ<3
zW5}*=*>#^?M{(sgajp)0@fRi5DQD|KJ43w*loCT_G8CpYXiOMwO&uj<9-AJiLMHCA
zqjSm$1IZ4Nd{9o`3-A$3?&1?<_J!2Q3-?=Wh|
z|4W2M8UQ)WTa0*|44`(>C?^FvyEvq=W?4|76Tn@KP9YFB3uu;6Sqi|jLl69K;L+Mt
zajT$7CRveT4s}ifoYqPbIE~Q?7qwBS;-T^1_WtpYFsyt96Z3Ik)D&4mXMK=md236
zEV)cA&0*FpzJqLD-QGgo4Sox+$aI8@JoqBOcr}KL%d4_;=Sr_IF0amAo=Yn=Emvyp
z(p<^vWu>rk)0IV!&it{Nux8@C#(9lG;kwryc2|n@GzA)PxiLOvU@$veK7#@1A^_{D
BICB61
literal 0
HcmV?d00001
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/ad/a1/cd9122e6beb95de36bb8dc10c255a6a7d7b8bfbe21b72843ab6db402ee8cb8bde5fb2d050a7eb96ea330e8be1a394c4c7c444c8b541f8e180b7f12506fe8 b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/ad/a1/cd9122e6beb95de36bb8dc10c255a6a7d7b8bfbe21b72843ab6db402ee8cb8bde5fb2d050a7eb96ea330e8be1a394c4c7c444c8b541f8e180b7f12506fe8
new file mode 100644
index 00000000000000..009905080b2b41
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/ad/a1/cd9122e6beb95de36bb8dc10c255a6a7d7b8bfbe21b72843ab6db402ee8cb8bde5fb2d050a7eb96ea330e8be1a394c4c7c444c8b541f8e180b7f12506fe8
@@ -0,0 +1 @@
+{"_id":"async","_rev":"223-5df87f4f3d2584f561ccb40c7361a399","name":"async","description":"Higher-order functions and common patterns for asynchronous code","dist-tags":{"latest":"0.2.10"},"versions":{"0.1.0":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.0","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.0","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.0.tgz","shasum":"ab8ece0c40627e4e8f0e09c8fcf7c19ed0c4241c"},"directories":{}},"0.1.1":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.1","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.1","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.1.tgz","shasum":"fb965e70dbea44c8a4b8a948472dee7d27279d5e"},"directories":{}},"0.1.2":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.2","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.2","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.2.tgz","shasum":"be761882a64d3dc81a669f9ee3d5c28497382691"},"directories":{}},"0.1.3":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.3","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.3","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.3.tgz","shasum":"629ca2357112d90cafc33872366b14f2695a1fbc"},"directories":{}},"0.1.4":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.4","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.4","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.4.tgz","shasum":"29de4b98712ab8858411d8d8e3361a986c3b2c18"},"directories":{}},"0.1.5":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.5","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.5","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.5.tgz","shasum":"9d83e3d4adb9c962fc4a30e7dd04bf1206c28ea5"},"directories":{}},"0.1.6":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.6","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.6","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.7-2","_nodeVersion":"v0.3.1-pre","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.6.tgz","shasum":"2dfb4fa1915f86056060c2e2f35a7fb8549907cc"},"directories":{}},"0.1.7":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.7","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.7","engines":{"node":"*"},"_nodeSupported":true,"_npmVersion":"0.2.4-1","_nodeVersion":"v0.2.5","dist":{"tarball":"http://localhost:1337/async/-/async-0.1.7.tgz","shasum":"e9268d0d8cd8dcfe0db0895b27dcc4bcc5c739a5"},"directories":{}},"0.1.8":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.8","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"web":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_id":"async@0.1.8","engines":{"node":"*"},"_nodeSupported":true,"dist":{"tarball":"http://localhost:1337/async/-/async-0.1.8.tgz","shasum":"52f2df6c0aa6a7f8333e1fbac0fbd93670cf6758"},"directories":{}},"0.1.9":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.9","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"dependencies":{},"devDependencies":{},"_id":"async@0.1.9","engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.1rc7","_nodeVersion":"v0.4.7","_defaultsLoaded":true,"dist":{"shasum":"f984d0739b5382c949cc3bea702d21d0dbd52040","tarball":"http://localhost:1337/async/-/async-0.1.9.tgz"},"directories":{}},"0.1.10":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.10","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmJsonOpts":{"file":"/home/caolan/.npm/async/0.1.10/package/package.json","wscript":false,"contributors":false,"serverjs":false},"_id":"async@0.1.10","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.27","_nodeVersion":"v0.4.11","_defaultsLoaded":true,"dist":{"shasum":"12b32bf098fa7fc51ae3ac51441b8ba15f437cf1","tarball":"http://localhost:1337/async/-/async-0.1.10.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.11":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.11","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmJsonOpts":{"file":"/home/caolan/.npm/async/0.1.11/package/package.json","wscript":false,"contributors":false,"serverjs":false},"_id":"async@0.1.11","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.27","_nodeVersion":"v0.4.12","_defaultsLoaded":true,"dist":{"shasum":"a397a69c6febae232d20a76a5b10d8742e2b8215","tarball":"http://localhost:1337/async/-/async-0.1.11.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.12":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.12","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmJsonOpts":{"file":"/home/caolan/.npm/async/0.1.12/package/package.json","wscript":false,"contributors":false,"serverjs":false},"_id":"async@0.1.12","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.27","_nodeVersion":"v0.4.12","_defaultsLoaded":true,"dist":{"shasum":"ab36be6611dc63d91657128e1d65102b959d4afe","tarball":"http://localhost:1337/async/-/async-0.1.12.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.13":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.13","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.13","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.101","_nodeVersion":"v0.4.9","_defaultsLoaded":true,"dist":{"shasum":"f1e53ad69dab282d8e75cbec5e2c5524b6195eab","tarball":"http://localhost:1337/async/-/async-0.1.13.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.14":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.14","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.14","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.101","_nodeVersion":"v0.4.9","_defaultsLoaded":true,"dist":{"shasum":"0fcfaf089229fc657798203d1a4544102f7d26dc","tarball":"http://localhost:1337/async/-/async-0.1.14.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.15":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.15","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.15","dependencies":{},"devDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.0.101","_nodeVersion":"v0.4.9","_defaultsLoaded":true,"dist":{"shasum":"2180eaca2cf2a6ca5280d41c0585bec9b3e49bd3","tarball":"http://localhost:1337/async/-/async-0.1.15.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.16":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.16","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.16","dependencies":{},"devDependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.0-3","_nodeVersion":"v0.6.10","_defaultsLoaded":true,"dist":{"shasum":"b3a61fdc1a9193d4f64755c7600126e254223186","tarball":"http://localhost:1337/async/-/async-0.1.16.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.17":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.17","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"dependencies":{"uglify-js":"1.2.x"},"devDependencies":{"nodeunit":">0.0.0","nodelint":">0.0.0"},"scripts":{"preinstall":"make clean","install":"make build","test":"make test"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.17","optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.1","_nodeVersion":"v0.6.11","_defaultsLoaded":true,"dist":{"shasum":"03524a379e974dc9ee5c811c6ee3815d7bc54f6e","tarball":"http://localhost:1337/async/-/async-0.1.17.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.18":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.18","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.18","dependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.1","_nodeVersion":"v0.6.11","_defaultsLoaded":true,"dist":{"shasum":"c59c923920b76d5bf23248c04433920c4d45086a","tarball":"http://localhost:1337/async/-/async-0.1.18.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.19":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.19","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.19","dependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.21","_nodeVersion":"v0.6.18","_defaultsLoaded":true,"dist":{"shasum":"4fd6125a70f841fb10b14aeec6e23cf1479c71a7","tarball":"http://localhost:1337/async/-/async-0.1.19.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.20":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.20","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.20","dependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.21","_nodeVersion":"v0.6.18","_defaultsLoaded":true,"dist":{"shasum":"ba0e47b08ae972e04b5215de28539b313482ede5","tarball":"http://localhost:1337/async/-/async-0.1.20.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.21":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan McMahon"},"version":"0.1.21","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.21","dependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.21","_nodeVersion":"v0.6.18","_defaultsLoaded":true,"dist":{"shasum":"b5b12e985f09ab72c202fa00f623cd9d997e9464","tarball":"http://localhost:1337/async/-/async-0.1.21.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.1.22":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./index","author":{"name":"Caolan 
McMahon"},"version":"0.1.22","repository":{"type":"git","url":"git://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_npmUser":{"name":"caolan","email":"caolan@caolanmcmahon.com"},"_id":"async@0.1.22","dependencies":{},"optionalDependencies":{},"engines":{"node":"*"},"_engineSupported":true,"_npmVersion":"1.1.21","_nodeVersion":"v0.6.18","_defaultsLoaded":true,"dist":{"shasum":"0fc1aaa088a0e3ef0ebe2d8831bab0dcf8845061","tarball":"http://localhost:1337/async/-/async-0.1.22.tgz"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.0":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.0","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_id":"async@0.2.0","dist":{"shasum":"db1c645337bab79d0ca93d95f5c72d9605be0fce","tarball":"http://localhost:1337/async/-/async-0.2.0.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.1":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan 
McMahon"},"version":"0.2.1","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"_id":"async@0.2.1","dist":{"shasum":"4e37d08391132f79657a99ca73aa4eb471a6f771","tarball":"http://localhost:1337/async/-/async-0.2.1.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.2":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.2","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"_id":"async@0.2.2","dist":{"shasum":"8414ee47da7548126b4d3d923850d54e68a72b28","tarball":"http://localhost:1337/async/-/async-0.2.2.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.3":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan 
McMahon"},"version":"0.2.3","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"_id":"async@0.2.3","dist":{"shasum":"79bf601d723a2e8c3e91cb6bb08f152dca309fb3","tarball":"http://localhost:1337/async/-/async-0.2.3.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.4":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.4","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"_id":"async@0.2.4","dist":{"shasum":"0550e510cf43b83e2fcf1cb96399f03f1efd50eb","tarball":"http://localhost:1337/async/-/async-0.2.4.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.5":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan 
McMahon"},"version":"0.2.5","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"_id":"async@0.2.5","dist":{"shasum":"45f05da480749ba4c1dcd8cd3a3747ae7b36fe52","tarball":"http://localhost:1337/async/-/async-0.2.5.tgz"},"_npmVersion":"1.2.0","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.6":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.6","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"scripts":{"test":"nodeunit test/test-async.js"},"_id":"async@0.2.6","dist":{"shasum":"ad3f373d9249ae324881565582bc90e152abbd68","tarball":"http://localhost:1337/async/-/async-0.2.6.tgz"},"_from":".","_npmVersion":"1.2.11","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.7":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan 
McMahon"},"version":"0.2.7","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"scripts":{"test":"nodeunit test/test-async.js"},"_id":"async@0.2.7","dist":{"shasum":"44c5ee151aece6c4bf5364cfc7c28fe4e58f18df","tarball":"http://localhost:1337/async/-/async-0.2.7.tgz"},"_from":".","_npmVersion":"1.2.11","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.8":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.8","repository":{"type":"git","url":"http://github.com/caolan/async.git"},"bugs":{"url":"http://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"http://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"scripts":{"test":"nodeunit test/test-async.js"},"_id":"async@0.2.8","dist":{"shasum":"ba1b3ffd1e6cdb1e999aca76ef6ecee8e7f55f53","tarball":"http://localhost:1337/async/-/async-0.2.8.tgz"},"_from":".","_npmVersion":"1.2.11","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.9":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan 
McMahon"},"version":"0.2.9","repository":{"type":"git","url":"https://github.com/caolan/async.git"},"bugs":{"url":"https://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"https://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"scripts":{"test":"nodeunit test/test-async.js"},"_id":"async@0.2.9","dist":{"shasum":"df63060fbf3d33286a76aaf6d55a2986d9ff8619","tarball":"http://localhost:1337/async/-/async-0.2.9.tgz"},"_from":".","_npmVersion":"1.2.23","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}},"0.2.10":{"name":"async","description":"Higher-order functions and common patterns for asynchronous code","main":"./lib/async","author":{"name":"Caolan McMahon"},"version":"0.2.10","repository":{"type":"git","url":"https://github.com/caolan/async.git"},"bugs":{"url":"https://github.com/caolan/async/issues"},"licenses":[{"type":"MIT","url":"https://github.com/caolan/async/raw/master/LICENSE"}],"devDependencies":{"nodeunit":">0.0.0","uglify-js":"1.2.x","nodelint":">0.0.0"},"jam":{"main":"lib/async.js","include":["lib/async.js","README.md","LICENSE"]},"scripts":{"test":"nodeunit test/test-async.js"},"_id":"async@0.2.10","dist":{"shasum":"b6bbe0b0674b9d719708ca38de8c237cb526c3d1","tarball":"http://localhost:1337/async/-/async-0.2.10.tgz"},"_from":".","_npmVersion":"1.3.2","_npmUser":{"name":"caolan","email":"caolan.mcmahon@gmail.com"},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"directories":{}}},"maintainers":[{"name":"caolan","email":"caolan@caolanmcmahon.com"}],"author":{"name":"Caolan 
McMahon"},"repository":{"type":"git","url":"https://github.com/caolan/async.git"},"time":{"modified":"2014-03-13T17:04:44.914Z","created":"2010-12-19T16:41:51.765Z","0.1.0":"2010-12-19T16:41:51.765Z","0.1.1":"2010-12-19T16:41:51.765Z","0.1.2":"2010-12-19T16:41:51.765Z","0.1.3":"2010-12-19T16:41:51.765Z","0.1.4":"2010-12-19T16:41:51.765Z","0.1.5":"2010-12-19T16:41:51.765Z","0.1.6":"2010-12-19T16:41:51.765Z","0.1.7":"2010-12-19T16:41:51.765Z","0.1.8":"2011-01-18T09:56:53.975Z","0.1.9":"2011-04-27T20:48:08.634Z","0.1.10":"2011-09-19T04:40:01.573Z","0.1.11":"2011-10-14T17:07:28.752Z","0.1.12":"2011-10-14T17:19:19.452Z","0.1.13":"2011-10-29T22:33:52.448Z","0.1.14":"2011-10-29T22:40:14.486Z","0.1.15":"2011-11-01T23:05:01.415Z","0.1.16":"2012-02-13T04:56:23.926Z","0.1.17":"2012-02-27T02:40:58.997Z","0.1.18":"2012-02-27T16:51:02.109Z","0.1.19":"2012-05-24T06:51:06.109Z","0.1.20":"2012-05-24T06:53:39.997Z","0.1.21":"2012-05-24T07:16:16.753Z","0.1.22":"2012-05-30T18:26:44.821Z","0.1.23":"2012-10-04T13:52:08.947Z","0.2.0":"2013-02-04T11:38:08.943Z","0.2.1":"2013-02-04T11:52:34.110Z","0.2.2":"2013-02-05T15:55:23.202Z","0.2.3":"2013-02-06T12:48:37.415Z","0.2.4":"2013-02-07T17:26:22.236Z","0.2.5":"2013-02-10T22:42:00.162Z","0.2.6":"2013-03-03T11:29:52.674Z","0.2.7":"2013-04-09T20:50:04.712Z","0.2.8":"2013-05-01T10:04:07.430Z","0.2.9":"2013-05-28T07:50:48.795Z","0.2.10":"2014-01-23T16:23:57.271Z"},"users":{"thejh":true,"avianflu":true,"dylang":true,"ragingwind":true,"mvolkmann":true,"mikl":true,"linus":true,"pvorb":true,"dodo":true,"danielr":true,"suor":true,"dolphin278":true,"kurijov":true,"langpavel":true,"alexindigo":true,"fgribreau":true,"hughsk":true,"pid":true,"tylerstalder":true,"gillesruppert":true,"coiscir":true,"xenomuta":true,"jgoodall":true,"jswartwood":true,"drudge":true,"cpsubrian":true,"joeferner":true,"bencevans":true,"Scryptonite":true,"damonoehlman":true,"glukki":true,"tivac":true,"shama":true,"gimenete":true,"bryanburgers":true,"hij1nx":true,"sandeepmistry":true
,"minddiaper":true,"fiws":true,"ljharb":true,"popeindustries":true,"charmander":true,"dbrockman":true,"eknkc":true,"booyaa":true,"afc163":true,"maxmaximov":true,"meryn":true,"hfcorriez":true,"hyqhyq_3":true,"zonetti":true,"cmilhench":true,"cparker15":true,"jfromaniello":true,"ExxKA":true,"devoidfury":true,"cedrickchee":true,"niftymonkey":true,"paulj":true,"leesei":true,"jamesmgreene":true,"igorissen":true,"zaphod1984":true,"moonpyk":true,"joliva":true,"netroy":true,"chrisweb":true,"cuprobot":true,"tmaximini":true,"lupomontero":true,"john.pinch":true,"everywhere.js":true,"frankblizzard":true,"alanshaw":true,"forivall":true,"kubakubula":true,"doliveira":true,"dstokes":true,"pana":true,"irae":true,"mhaidarh":true,"feross":true,"tetsu3a":true,"qubyte":true,"darosh":true,"pragmadash":true,"denisix":true,"samuelrn":true,"tigefa":true,"tcrowe":true,"tpwk":true,"eins78":true,"sierrasoftworks":true,"yoavf":true,"irakli":true,"hypergeometric":true,"gammasoft":true,"youxiachai":true,"kahboom":true,"elisee":true,"soroush":true,"thomas-so":true,"shenaor":true,"dannynemer":true,"paulomcnally":true,"timur.shemsedinov":true,"slianfeng":true,"ettalea":true,"mananvaghasiya":true,"daniel7912":true,"themiddleman":true,"jacques":true,"kerimdzhanov":true,"jorgemsrs":true,"ivandimanov":true,"vegera":true,"aselzer":true,"kentcdodds":true,"putaoshu":true,"imdsm":true,"cilindrox":true},"readme":"# Async.js\n\nAsync is a utility module which provides straight-forward, powerful functions\nfor working with asynchronous JavaScript. Although originally designed for\nuse with [node.js](http://nodejs.org), it can also be used directly in the\nbrowser. Also supports [component](https://github.com/component/component).\n\nAsync provides around 20 functions that include the usual 'functional'\nsuspects (map, reduce, filter, each…) as well as some common patterns\nfor asynchronous control flow (parallel, series, waterfall…). 
All these\nfunctions assume you follow the node.js convention of providing a single\ncallback as the last argument of your async function.\n\n\n## Quick Examples\n\n```javascript\nasync.map(['file1','file2','file3'], fs.stat, function(err, results){\n // results is now an array of stats for each file\n});\n\nasync.filter(['file1','file2','file3'], fs.exists, function(results){\n // results now equals an array of the existing files\n});\n\nasync.parallel([\n function(){ ... },\n function(){ ... }\n], callback);\n\nasync.series([\n function(){ ... },\n function(){ ... }\n]);\n```\n\nThere are many more functions available so take a look at the docs below for a\nfull list. This module aims to be comprehensive, so if you feel anything is\nmissing please create a GitHub issue for it.\n\n## Common Pitfalls\n\n### Binding a context to an iterator\n\nThis section is really about bind, not about async. If you are wondering how to\nmake async execute your iterators in a given context, or are confused as to why\na method of another library isn't working as an iterator, study this example:\n\n```js\n// Here is a simple object with an (unnecessarily roundabout) squaring method\nvar AsyncSquaringLibrary = {\n squareExponent: 2,\n square: function(number, callback){ \n var result = Math.pow(number, this.squareExponent);\n setTimeout(function(){\n callback(null, result);\n }, 200);\n }\n};\n\nasync.map([1, 2, 3], AsyncSquaringLibrary.square, function(err, result){\n // result is [NaN, NaN, NaN]\n // This fails because the `this.squareExponent` expression in the square\n // function is not evaluated in the context of AsyncSquaringLibrary, and is\n // therefore undefined.\n});\n\nasync.map([1, 2, 3], AsyncSquaringLibrary.square.bind(AsyncSquaringLibrary), function(err, result){\n // result is [1, 4, 9]\n // With the help of bind we can attach a context to the iterator before\n // passing it to async. 
Now the square function will be executed in its \n // 'home' AsyncSquaringLibrary context and the value of `this.squareExponent`\n // will be as expected.\n});\n```\n\n## Download\n\nThe source is available for download from\n[GitHub](http://github.com/caolan/async).\nAlternatively, you can install using Node Package Manager (npm):\n\n npm install async\n\n__Development:__ [async.js](https://github.com/caolan/async/raw/master/lib/async.js) - 29.6kb Uncompressed\n\n## In the Browser\n\nSo far it's been tested in IE6, IE7, IE8, FF3.6 and Chrome 5. Usage:\n\n```html\n\n\n```\n\n## Documentation\n\n### Collections\n\n* [each](#each)\n* [eachSeries](#eachSeries)\n* [eachLimit](#eachLimit)\n* [map](#map)\n* [mapSeries](#mapSeries)\n* [mapLimit](#mapLimit)\n* [filter](#filter)\n* [filterSeries](#filterSeries)\n* [reject](#reject)\n* [rejectSeries](#rejectSeries)\n* [reduce](#reduce)\n* [reduceRight](#reduceRight)\n* [detect](#detect)\n* [detectSeries](#detectSeries)\n* [sortBy](#sortBy)\n* [some](#some)\n* [every](#every)\n* [concat](#concat)\n* [concatSeries](#concatSeries)\n\n### Control Flow\n\n* [series](#series)\n* [parallel](#parallel)\n* [parallelLimit](#parallellimittasks-limit-callback)\n* [whilst](#whilst)\n* [doWhilst](#doWhilst)\n* [until](#until)\n* [doUntil](#doUntil)\n* [forever](#forever)\n* [waterfall](#waterfall)\n* [compose](#compose)\n* [applyEach](#applyEach)\n* [applyEachSeries](#applyEachSeries)\n* [queue](#queue)\n* [cargo](#cargo)\n* [auto](#auto)\n* [iterator](#iterator)\n* [apply](#apply)\n* [nextTick](#nextTick)\n* [times](#times)\n* [timesSeries](#timesSeries)\n\n### Utils\n\n* [memoize](#memoize)\n* [unmemoize](#unmemoize)\n* [log](#log)\n* [dir](#dir)\n* [noConflict](#noConflict)\n\n\n## Collections\n\n\n\n### each(arr, iterator, callback)\n\nApplies an iterator function to each item in an array, in parallel.\nThe iterator is called with an item from the list and a callback for when it\nhas finished. 
If the iterator passes an error to this callback, the main\ncallback for the each function is immediately called with the error.\n\nNote, that since this function applies the iterator to each item in parallel\nthere is no guarantee that the iterator functions will complete in order.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err) which must be called once it has \n completed. If no error has occured, the callback should be run without \n arguments or with an explicit null argument.\n* callback(err) - A callback which is called after all the iterator functions\n have finished, or an error has occurred.\n\n__Example__\n\n```js\n// assuming openFiles is an array of file names and saveFile is a function\n// to save the modified contents of that file:\n\nasync.each(openFiles, saveFile, function(err){\n // if any of the saves produced an error, err would equal that error\n});\n```\n\n---------------------------------------\n\n\n\n### eachSeries(arr, iterator, callback)\n\nThe same as each only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. This means the iterator functions will complete in order.\n\n\n---------------------------------------\n\n\n\n### eachLimit(arr, limit, iterator, callback)\n\nThe same as each only no more than \"limit\" iterators will be simultaneously \nrunning at any time.\n\nNote that the items are not processed in batches, so there is no guarantee that\n the first \"limit\" iterator functions will complete before any others are \nstarted.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* limit - The maximum number of iterators to run at any time.\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err) which must be called once it has \n completed. 
If no error has occured, the callback should be run without \n arguments or with an explicit null argument.\n* callback(err) - A callback which is called after all the iterator functions\n have finished, or an error has occurred.\n\n__Example__\n\n```js\n// Assume documents is an array of JSON objects and requestApi is a\n// function that interacts with a rate-limited REST api.\n\nasync.eachLimit(documents, 20, requestApi, function(err){\n // if any of the saves produced an error, err would equal that error\n});\n```\n\n---------------------------------------\n\n\n### map(arr, iterator, callback)\n\nProduces a new array of values by mapping each value in the given array through\nthe iterator function. The iterator is called with an item from the array and a\ncallback for when it has finished processing. The callback takes 2 arguments, \nan error and the transformed item from the array. If the iterator passes an\nerror to this callback, the main callback for the map function is immediately\ncalled with the error.\n\nNote, that since this function applies the iterator to each item in parallel\nthere is no guarantee that the iterator functions will complete in order, however\nthe results array will be in the same order as the original array.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err, transformed) which must be called once \n it has completed with an error (which can be null) and a transformed item.\n* callback(err, results) - A callback which is called after all the iterator\n functions have finished, or an error has occurred. 
Results is an array of the\n transformed items from the original array.\n\n__Example__\n\n```js\nasync.map(['file1','file2','file3'], fs.stat, function(err, results){\n // results is now an array of stats for each file\n});\n```\n\n---------------------------------------\n\n\n### mapSeries(arr, iterator, callback)\n\nThe same as map only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n\n---------------------------------------\n\n\n### mapLimit(arr, limit, iterator, callback)\n\nThe same as map only no more than \"limit\" iterators will be simultaneously \nrunning at any time.\n\nNote that the items are not processed in batches, so there is no guarantee that\n the first \"limit\" iterator functions will complete before any others are \nstarted.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* limit - The maximum number of iterators to run at any time.\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err, transformed) which must be called once \n it has completed with an error (which can be null) and a transformed item.\n* callback(err, results) - A callback which is called after all the iterator\n functions have finished, or an error has occurred. 
Results is an array of the\n transformed items from the original array.\n\n__Example__\n\n```js\nasync.mapLimit(['file1','file2','file3'], 1, fs.stat, function(err, results){\n // results is now an array of stats for each file\n});\n```\n\n---------------------------------------\n\n\n### filter(arr, iterator, callback)\n\n__Alias:__ select\n\nReturns a new array of all the values which pass an async truth test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists. This operation is\nperformed in parallel, but the results array will be in the same order as the\noriginal.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n The iterator is passed a callback(truthValue) which must be called with a \n boolean argument once it has completed.\n* callback(results) - A callback which is called after all the iterator\n functions have finished.\n\n__Example__\n\n```js\nasync.filter(['file1','file2','file3'], fs.exists, function(results){\n // results now equals an array of the existing files\n});\n```\n\n---------------------------------------\n\n\n### filterSeries(arr, iterator, callback)\n\n__alias:__ selectSeries\n\nThe same as filter only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n---------------------------------------\n\n\n### reject(arr, iterator, callback)\n\nThe opposite of filter. 
Removes values that pass an async truth test.\n\n---------------------------------------\n\n\n### rejectSeries(arr, iterator, callback)\n\nThe same as reject, only the iterator is applied to each item in the array\nin series.\n\n\n---------------------------------------\n\n\n### reduce(arr, memo, iterator, callback)\n\n__aliases:__ inject, foldl\n\nReduces a list of values into a single value using an async iterator to return\neach successive step. Memo is the initial state of the reduction. This\nfunction only operates in series. For performance reasons, it may make sense to\nsplit a call to this function into a parallel map, then use the normal\nArray.prototype.reduce on the results. This function is for situations where\neach step in the reduction needs to be async, if you can get the data before\nreducing it then it's probably a good idea to do so.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* memo - The initial state of the reduction.\n* iterator(memo, item, callback) - A function applied to each item in the\n array to produce the next step in the reduction. The iterator is passed a\n callback(err, reduction) which accepts an optional error as its first \n argument, and the state of the reduction as the second. If an error is \n passed to the callback, the reduction is stopped and the main callback is \n immediately called with the error.\n* callback(err, result) - A callback which is called after all the iterator\n functions have finished. 
Result is the reduced value.\n\n__Example__\n\n```js\nasync.reduce([1,2,3], 0, function(memo, item, callback){\n // pointless async:\n process.nextTick(function(){\n callback(null, memo + item)\n });\n}, function(err, result){\n // result is now equal to the last value of memo, which is 6\n});\n```\n\n---------------------------------------\n\n\n### reduceRight(arr, memo, iterator, callback)\n\n__Alias:__ foldr\n\nSame as reduce, only operates on the items in the array in reverse order.\n\n\n---------------------------------------\n\n\n### detect(arr, iterator, callback)\n\nReturns the first value in a list that passes an async truth test. The\niterator is applied in parallel, meaning the first iterator to return true will\nfire the detect callback with that result. That means the result might not be\nthe first item in the original array (in terms of order) that passes the test.\n\nIf order within the original array is important then look at detectSeries.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n The iterator is passed a callback(truthValue) which must be called with a \n boolean argument once it has completed.\n* callback(result) - A callback which is called as soon as any iterator returns\n true, or after all the iterator functions have finished. Result will be\n the first item in the array that passes the truth test (iterator) or the\n value undefined if none passed.\n\n__Example__\n\n```js\nasync.detect(['file1','file2','file3'], fs.exists, function(result){\n // result now equals the first file in the list that exists\n});\n```\n\n---------------------------------------\n\n\n### detectSeries(arr, iterator, callback)\n\nThe same as detect, only the iterator is applied to each item in the array\nin series. 
This means the result is always the first in the original array (in\nterms of array order) that passes the truth test.\n\n\n---------------------------------------\n\n\n### sortBy(arr, iterator, callback)\n\nSorts a list by the results of running each value through an async iterator.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err, sortValue) which must be called once it\n has completed with an error (which can be null) and a value to use as the sort\n criteria.\n* callback(err, results) - A callback which is called after all the iterator\n functions have finished, or an error has occurred. Results is the items from\n the original array sorted by the values returned by the iterator calls.\n\n__Example__\n\n```js\nasync.sortBy(['file1','file2','file3'], function(file, callback){\n fs.stat(file, function(err, stats){\n callback(err, stats.mtime);\n });\n}, function(err, results){\n // results is now the original array of files sorted by\n // modified date\n});\n```\n\n---------------------------------------\n\n\n### some(arr, iterator, callback)\n\n__Alias:__ any\n\nReturns true if at least one element in the array satisfies an async test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists. Once any iterator\ncall returns true, the main callback is immediately called.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n The iterator is passed a callback(truthValue) which must be called with a \n boolean argument once it has completed.\n* callback(result) - A callback which is called as soon as any iterator returns\n true, or after all the iterator functions have finished. 
Result will be\n either true or false depending on the values of the async tests.\n\n__Example__\n\n```js\nasync.some(['file1','file2','file3'], fs.exists, function(result){\n // if result is true then at least one of the files exists\n});\n```\n\n---------------------------------------\n\n\n### every(arr, iterator, callback)\n\n__Alias:__ all\n\nReturns true if every element in the array satisfies an async test.\n_The callback for each iterator call only accepts a single argument of true or\nfalse, it does not accept an error argument first!_ This is in-line with the\nway node libraries work with truth tests like fs.exists.\n\n__Arguments__\n\n* arr - An array to iterate over.\n* iterator(item, callback) - A truth test to apply to each item in the array.\n The iterator is passed a callback(truthValue) which must be called with a \n boolean argument once it has completed.\n* callback(result) - A callback which is called after all the iterator\n functions have finished. Result will be either true or false depending on\n the values of the async tests.\n\n__Example__\n\n```js\nasync.every(['file1','file2','file3'], fs.exists, function(result){\n // if result is true then every file exists\n});\n```\n\n---------------------------------------\n\n\n### concat(arr, iterator, callback)\n\nApplies an iterator to each item in a list, concatenating the results. Returns the\nconcatenated list. The iterators are called in parallel, and the results are\nconcatenated as they return. 
There is no guarantee that the results array will\nbe returned in the original order of the arguments passed to the iterator function.\n\n__Arguments__\n\n* arr - An array to iterate over\n* iterator(item, callback) - A function to apply to each item in the array.\n The iterator is passed a callback(err, results) which must be called once it \n has completed with an error (which can be null) and an array of results.\n* callback(err, results) - A callback which is called after all the iterator\n functions have finished, or an error has occurred. Results is an array containing\n the concatenated results of the iterator function.\n\n__Example__\n\n```js\nasync.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files){\n // files is now a list of filenames that exist in the 3 directories\n});\n```\n\n---------------------------------------\n\n\n### concatSeries(arr, iterator, callback)\n\nSame as async.concat, but executes in series instead of parallel.\n\n\n## Control Flow\n\n\n### series(tasks, [callback])\n\nRun an array of functions in series, each one running once the previous\nfunction has completed. If any functions in the series pass an error to its\ncallback, no more functions are run and the callback for the series is\nimmediately called with the value of the error. Once the tasks have completed,\nthe results are passed to the final callback as an array.\n\nIt is also possible to use an object instead of an array. Each property will be\nrun as a function and the results will be passed to the final callback as an object\ninstead of an array. This can be a more readable way of handling results from\nasync.series.\n\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed\n a callback(err, result) it must call on completion with an error (which can\n be null) and an optional result value.\n* callback(err, results) - An optional callback to run once all the functions\n have completed. 
This function gets a results array (or object) containing all \n the result arguments passed to the task callbacks.\n\n__Example__\n\n```js\nasync.series([\n function(callback){\n // do some stuff ...\n callback(null, 'one');\n },\n function(callback){\n // do some more stuff ...\n callback(null, 'two');\n }\n],\n// optional callback\nfunction(err, results){\n // results is now equal to ['one', 'two']\n});\n\n\n// an example using an object instead of an array\nasync.series({\n one: function(callback){\n setTimeout(function(){\n callback(null, 1);\n }, 200);\n },\n two: function(callback){\n setTimeout(function(){\n callback(null, 2);\n }, 100);\n }\n},\nfunction(err, results) {\n // results is now equal to: {one: 1, two: 2}\n});\n```\n\n---------------------------------------\n\n\n### parallel(tasks, [callback])\n\nRun an array of functions in parallel, without waiting until the previous\nfunction has completed. If any of the functions pass an error to its\ncallback, the main callback is immediately called with the value of the error.\nOnce the tasks have completed, the results are passed to the final callback as an\narray.\n\nIt is also possible to use an object instead of an array. Each property will be\nrun as a function and the results will be passed to the final callback as an object\ninstead of an array. This can be a more readable way of handling results from\nasync.parallel.\n\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed \n a callback(err, result) it must call on completion with an error (which can\n be null) and an optional result value.\n* callback(err, results) - An optional callback to run once all the functions\n have completed. 
This function gets a results array (or object) containing all \n the result arguments passed to the task callbacks.\n\n__Example__\n\n```js\nasync.parallel([\n function(callback){\n setTimeout(function(){\n callback(null, 'one');\n }, 200);\n },\n function(callback){\n setTimeout(function(){\n callback(null, 'two');\n }, 100);\n }\n],\n// optional callback\nfunction(err, results){\n // the results array will equal ['one','two'] even though\n // the second function had a shorter timeout.\n});\n\n\n// an example using an object instead of an array\nasync.parallel({\n one: function(callback){\n setTimeout(function(){\n callback(null, 1);\n }, 200);\n },\n two: function(callback){\n setTimeout(function(){\n callback(null, 2);\n }, 100);\n }\n},\nfunction(err, results) {\n // results is now equals to: {one: 1, two: 2}\n});\n```\n\n---------------------------------------\n\n\n### parallelLimit(tasks, limit, [callback])\n\nThe same as parallel only the tasks are executed in parallel with a maximum of \"limit\" \ntasks executing at any time.\n\nNote that the tasks are not executed in batches, so there is no guarantee that \nthe first \"limit\" tasks will complete before any others are started.\n\n__Arguments__\n\n* tasks - An array or object containing functions to run, each function is passed \n a callback(err, result) it must call on completion with an error (which can\n be null) and an optional result value.\n* limit - The maximum number of tasks to run at any time.\n* callback(err, results) - An optional callback to run once all the functions\n have completed. This function gets a results array (or object) containing all \n the result arguments passed to the task callbacks.\n\n---------------------------------------\n\n\n### whilst(test, fn, callback)\n\nRepeatedly call fn, while test returns true. 
Calls the callback when stopped,\nor an error occurs.\n\n__Arguments__\n\n* test() - synchronous truth test to perform before each execution of fn.\n* fn(callback) - A function to call each time the test passes. The function is\n passed a callback(err) which must be called once it has completed with an \n optional error argument.\n* callback(err) - A callback which is called after the test fails and repeated\n execution of fn has stopped.\n\n__Example__\n\n```js\nvar count = 0;\n\nasync.whilst(\n function () { return count < 5; },\n function (callback) {\n count++;\n setTimeout(callback, 1000);\n },\n function (err) {\n // 5 seconds have passed\n }\n);\n```\n\n---------------------------------------\n\n\n### doWhilst(fn, test, callback)\n\nThe post check version of whilst. To reflect the difference in the order of operations `test` and `fn` arguments are switched. `doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript.\n\n---------------------------------------\n\n\n### until(test, fn, callback)\n\nRepeatedly call fn, until test returns true. Calls the callback when stopped,\nor an error occurs.\n\nThe inverse of async.whilst.\n\n---------------------------------------\n\n\n### doUntil(fn, test, callback)\n\nLike doWhilst except the test is inverted. Note the argument ordering differs from `until`.\n\n---------------------------------------\n\n\n### forever(fn, callback)\n\nCalls the asynchronous function 'fn' repeatedly, in series, indefinitely.\nIf an error is passed to fn's callback then 'callback' is called with the\nerror, otherwise it will never be called.\n\n---------------------------------------\n\n\n### waterfall(tasks, [callback])\n\nRuns an array of functions in series, each passing their results to the next in\nthe array. 
However, if any of the functions pass an error to the callback, the\nnext function is not executed and the main callback is immediately called with\nthe error.\n\n__Arguments__\n\n* tasks - An array of functions to run, each function is passed a \n callback(err, result1, result2, ...) it must call on completion. The first\n argument is an error (which can be null) and any further arguments will be \n passed as arguments in order to the next task.\n* callback(err, [results]) - An optional callback to run once all the functions\n have completed. This will be passed the results of the last task's callback.\n\n\n\n__Example__\n\n```js\nasync.waterfall([\n function(callback){\n callback(null, 'one', 'two');\n },\n function(arg1, arg2, callback){\n callback(null, 'three');\n },\n function(arg1, callback){\n // arg1 now equals 'three'\n callback(null, 'done');\n }\n], function (err, result) {\n // result now equals 'done' \n});\n```\n\n---------------------------------------\n\n### compose(fn1, fn2...)\n\nCreates a function which is a composition of the passed asynchronous\nfunctions. Each function consumes the return value of the function that\nfollows. Composing functions f(), g() and h() would produce the result of\nf(g(h())), only this version uses callbacks to obtain the return values.\n\nEach function is executed with the `this` binding of the composed function.\n\n__Arguments__\n\n* functions... 
- the asynchronous functions to compose\n\n\n__Example__\n\n```js\nfunction add1(n, callback) {\n setTimeout(function () {\n callback(null, n + 1);\n }, 10);\n}\n\nfunction mul3(n, callback) {\n setTimeout(function () {\n callback(null, n * 3);\n }, 10);\n}\n\nvar add1mul3 = async.compose(mul3, add1);\n\nadd1mul3(4, function (err, result) {\n // result now equals 15\n});\n```\n\n---------------------------------------\n\n### applyEach(fns, args..., callback)\n\nApplies the provided arguments to each function in the array, calling the\ncallback after all functions have completed. If you only provide the first\nargument then it will return a function which lets you pass in the\narguments as if it were a single function call.\n\n__Arguments__\n\n* fns - the asynchronous functions to all call with the same arguments\n* args... - any number of separate arguments to pass to the function\n* callback - the final argument should be the callback, called when all\n functions have completed processing\n\n\n__Example__\n\n```js\nasync.applyEach([enableSearch, updateSchema], 'bucket', callback);\n\n// partial application example:\nasync.each(\n buckets,\n async.applyEach([enableSearch, updateSchema]),\n callback\n);\n```\n\n---------------------------------------\n\n\n### applyEachSeries(arr, iterator, callback)\n\nThe same as applyEach only the functions are applied in series.\n\n---------------------------------------\n\n\n### queue(worker, concurrency)\n\nCreates a queue object with the specified concurrency. Tasks added to the\nqueue will be processed in parallel (up to the concurrency limit). If all\nworkers are in progress, the task is queued until one is available. 
Once\na worker has completed a task, the task's callback is called.\n\n__Arguments__\n\n* worker(task, callback) - An asynchronous function for processing a queued\n task, which must call its callback(err) argument when finished, with an \n optional error as an argument.\n* concurrency - An integer for determining how many worker functions should be\n run in parallel.\n\n__Queue objects__\n\nThe queue object returned by this function has the following properties and\nmethods:\n\n* length() - a function returning the number of items waiting to be processed.\n* concurrency - an integer for determining how many worker functions should be\n run in parallel. This property can be changed after a queue is created to\n alter the concurrency on-the-fly.\n* push(task, [callback]) - add a new task to the queue, the callback is called\n once the worker has finished processing the task.\n instead of a single task, an array of tasks can be submitted. the respective callback is used for every task in the list.\n* unshift(task, [callback]) - add a new task to the front of the queue.\n* saturated - a callback that is called when the queue length hits the concurrency and further tasks will be queued\n* empty - a callback that is called when the last item from the queue is given to a worker\n* drain - a callback that is called when the last item from the queue has returned from the worker\n\n__Example__\n\n```js\n// create a queue object with concurrency 2\n\nvar q = async.queue(function (task, callback) {\n console.log('hello ' + task.name);\n callback();\n}, 2);\n\n\n// assign a callback\nq.drain = function() {\n console.log('all items have been processed');\n}\n\n// add some items to the queue\n\nq.push({name: 'foo'}, function (err) {\n console.log('finished processing foo');\n});\nq.push({name: 'bar'}, function (err) {\n console.log('finished processing bar');\n});\n\n// add some items to the queue (batch-wise)\n\nq.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function (err) 
{\n console.log('finished processing bar');\n});\n\n// add some items to the front of the queue\n\nq.unshift({name: 'bar'}, function (err) {\n console.log('finished processing bar');\n});\n```\n\n---------------------------------------\n\n\n### cargo(worker, [payload])\n\nCreates a cargo object with the specified payload. Tasks added to the\ncargo will be processed altogether (up to the payload limit). If the\nworker is in progress, the task is queued until it is available. Once\nthe worker has completed some tasks, each callback of those tasks is called.\n\n__Arguments__\n\n* worker(tasks, callback) - An asynchronous function for processing an array of\n queued tasks, which must call its callback(err) argument when finished, with \n an optional error as an argument.\n* payload - An optional integer for determining how many tasks should be\n processed per round; if omitted, the default is unlimited.\n\n__Cargo objects__\n\nThe cargo object returned by this function has the following properties and\nmethods:\n\n* length() - a function returning the number of items waiting to be processed.\n* payload - an integer for determining how many tasks should be\n process per round. This property can be changed after a cargo is created to\n alter the payload on-the-fly.\n* push(task, [callback]) - add a new task to the queue, the callback is called\n once the worker has finished processing the task.\n instead of a single task, an array of tasks can be submitted. 
the respective callback is used for every task in the list.\n* saturated - a callback that is called when the queue length hits the concurrency and further tasks will be queued\n* empty - a callback that is called when the last item from the queue is given to a worker\n* drain - a callback that is called when the last item from the queue has returned from the worker\n\n__Example__\n\n```js\n// create a cargo object with payload 2\n\nvar cargo = async.cargo(function (tasks, callback) {\n for(var i=0; i \n### auto(tasks, [callback])\n\nDetermines the best order for running functions based on their requirements.\nEach function can optionally depend on other functions being completed first,\nand each function is run as soon as its requirements are satisfied. If any of\nthe functions pass an error to their callback, that function will not complete\n(so any other functions depending on it will not run) and the main callback\nwill be called immediately with the error. Functions also receive an object\ncontaining the results of functions which have completed so far.\n\nNote, all functions are called with a results object as a second argument, \nso it is unsafe to pass functions in the tasks object which cannot handle the\nextra argument. For example, this snippet of code:\n\n```js\nasync.auto({\n readData: async.apply(fs.readFile, 'data.txt', 'utf-8')\n}, callback);\n```\n\nwill have the effect of calling readFile with the results object as the last\nargument, which will fail:\n\n```js\nfs.readFile('data.txt', 'utf-8', cb, {});\n```\n\nInstead, wrap the call to readFile in a function which does not forward the \nresults object:\n\n```js\nasync.auto({\n readData: function(cb, results){\n fs.readFile('data.txt', 'utf-8', cb);\n }\n}, callback);\n```\n\n__Arguments__\n\n* tasks - An object literal containing named functions or an array of\n requirements, with the function itself the last item in the array. 
The key\n used for each function or array is used when specifying requirements. The \n function receives two arguments: (1) a callback(err, result) which must be \n called when finished, passing an error (which can be null) and the result of \n the function's execution, and (2) a results object, containing the results of\n the previously executed functions.\n* callback(err, results) - An optional callback which is called when all the\n tasks have been completed. The callback will receive an error as an argument\n if any tasks pass an error to their callback. Results will always be passed\n\tbut if an error occurred, no other tasks will be performed, and the results\n\tobject will only contain partial results.\n \n\n__Example__\n\n```js\nasync.auto({\n get_data: function(callback){\n // async code to get some data\n },\n make_folder: function(callback){\n // async code to create a directory to store a file in\n // this is run at the same time as getting the data\n },\n write_file: ['get_data', 'make_folder', function(callback){\n // once there is some data and the directory exists,\n // write the data to a file in the directory\n callback(null, filename);\n }],\n email_link: ['write_file', function(callback, results){\n // once the file is written let's email a link to it...\n // results.write_file contains the filename returned by write_file.\n }]\n});\n```\n\nThis is a fairly trivial example, but to do this using the basic parallel and\nseries functions would look like this:\n\n```js\nasync.parallel([\n function(callback){\n // async code to get some data\n },\n function(callback){\n // async code to create a directory to store a file in\n // this is run at the same time as getting the data\n }\n],\nfunction(err, results){\n async.series([\n function(callback){\n // once there is some data and the directory exists,\n // write the data to a file in the directory\n },\n function(callback){\n // once the file is written let's email a link to it...\n }\n 
]);\n});\n```\n\nFor a complicated series of async tasks using the auto function makes adding\nnew tasks much easier and makes the code more readable.\n\n\n---------------------------------------\n\n\n### iterator(tasks)\n\nCreates an iterator function which calls the next function in the array,\nreturning a continuation to call the next one after that. It's also possible to\n'peek' the next iterator by doing iterator.next().\n\nThis function is used internally by the async module but can be useful when\nyou want to manually control the flow of functions in series.\n\n__Arguments__\n\n* tasks - An array of functions to run.\n\n__Example__\n\n```js\nvar iterator = async.iterator([\n function(){ sys.p('one'); },\n function(){ sys.p('two'); },\n function(){ sys.p('three'); }\n]);\n\nnode> var iterator2 = iterator();\n'one'\nnode> var iterator3 = iterator2();\n'two'\nnode> iterator3();\n'three'\nnode> var nextfn = iterator2.next();\nnode> nextfn();\n'three'\n```\n\n---------------------------------------\n\n\n### apply(function, arguments..)\n\nCreates a continuation function with some arguments already applied, a useful\nshorthand when combined with other control flow functions. Any arguments\npassed to the returned function are added to the arguments originally passed\nto apply.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... 
- Any number of arguments to automatically apply when the\n continuation is called.\n\n__Example__\n\n```js\n// using apply\n\nasync.parallel([\n async.apply(fs.writeFile, 'testfile1', 'test1'),\n async.apply(fs.writeFile, 'testfile2', 'test2'),\n]);\n\n\n// the same process without using apply\n\nasync.parallel([\n function(callback){\n fs.writeFile('testfile1', 'test1', callback);\n },\n function(callback){\n fs.writeFile('testfile2', 'test2', callback);\n }\n]);\n```\n\nIt's possible to pass any number of additional arguments when calling the\ncontinuation:\n\n```js\nnode> var fn = async.apply(sys.puts, 'one');\nnode> fn('two', 'three');\none\ntwo\nthree\n```\n\n---------------------------------------\n\n\n### nextTick(callback)\n\nCalls the callback on a later loop around the event loop. In node.js this just\ncalls process.nextTick, in the browser it falls back to setImmediate(callback)\nif available, otherwise setTimeout(callback, 0), which means other higher priority\nevents may precede the execution of the callback.\n\nThis is used internally for browser-compatibility purposes.\n\n__Arguments__\n\n* callback - The function to call on a later loop around the event loop.\n\n__Example__\n\n```js\nvar call_order = [];\nasync.nextTick(function(){\n call_order.push('two');\n // call_order now equals ['one','two']\n});\ncall_order.push('one')\n```\n\n\n### times(n, callback)\n\nCalls the callback n times and accumulates results in the same manner\nyou would use with async.map.\n\n__Arguments__\n\n* n - The number of times to run the function.\n* callback - The function to call n times.\n\n__Example__\n\n```js\n// Pretend this is some complicated async factory\nvar createUser = function(id, callback) {\n callback(null, {\n id: 'user' + id\n })\n}\n// generate 5 users\nasync.times(5, function(n, next){\n createUser(n, function(err, user) {\n next(err, user)\n })\n}, function(err, users) {\n // we should now have 5 users\n});\n```\n\n\n### timesSeries(n, 
callback)\n\nThe same as times only the iterator is applied to each item in the array in\nseries. The next iterator is only called once the current one has completed\nprocessing. The results array will be in the same order as the original.\n\n\n## Utils\n\n\n### memoize(fn, [hasher])\n\nCaches the results of an async function. When creating a hash to store function\nresults against, the callback is omitted from the hash and an optional hash\nfunction can be used.\n\nThe cache of results is exposed as the `memo` property of the function returned\nby `memoize`.\n\n__Arguments__\n\n* fn - the function you to proxy and cache results from.\n* hasher - an optional function for generating a custom hash for storing\n results, it has all the arguments applied to it apart from the callback, and\n must be synchronous.\n\n__Example__\n\n```js\nvar slow_fn = function (name, callback) {\n // do something\n callback(null, result);\n};\nvar fn = async.memoize(slow_fn);\n\n// fn can now be used as if it were slow_fn\nfn('some name', function () {\n // callback\n});\n```\n\n\n### unmemoize(fn)\n\nUndoes a memoized function, reverting it to the original, unmemoized\nform. Comes handy in tests.\n\n__Arguments__\n\n* fn - the memoized function\n\n\n### log(function, arguments)\n\nLogs the result of an async function to the console. Only works in node.js or\nin browsers that support console.log and console.error (such as FF and Chrome).\nIf multiple arguments are returned from the async function, console.log is\ncalled on each argument in order.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... 
- Any number of arguments to apply to the function.\n\n__Example__\n\n```js\nvar hello = function(name, callback){\n setTimeout(function(){\n callback(null, 'hello ' + name);\n }, 1000);\n};\n```\n```js\nnode> async.log(hello, 'world');\n'hello world'\n```\n\n---------------------------------------\n\n\n### dir(function, arguments)\n\nLogs the result of an async function to the console using console.dir to\ndisplay the properties of the resulting object. Only works in node.js or\nin browsers that support console.dir and console.error (such as FF and Chrome).\nIf multiple arguments are returned from the async function, console.dir is\ncalled on each argument in order.\n\n__Arguments__\n\n* function - The function you want to eventually apply all arguments to.\n* arguments... - Any number of arguments to apply to the function.\n\n__Example__\n\n```js\nvar hello = function(name, callback){\n setTimeout(function(){\n callback(null, {hello: name});\n }, 1000);\n};\n```\n```js\nnode> async.dir(hello, 'world');\n{hello: 'world'}\n```\n\n---------------------------------------\n\n\n### noConflict()\n\nChanges the value of async back to its original value, returning a reference to the\nasync object.\n","readmeFilename":"README.md","bugs":{"url":"https://github.com/caolan/async/issues"},"_attachments":{}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/b4/8a/454c55f95fa0351cca479761f5ff792f8f7ab4448f2b1399a3ac3778a60a293f71feeda29678ce15b71712b0803f9866e92c0cbc4549b4807435dcf7a767 b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/b4/8a/454c55f95fa0351cca479761f5ff792f8f7ab4448f2b1399a3ac3778a60a293f71feeda29678ce15b71712b0803f9866e92c0cbc4549b4807435dcf7a767
new file mode 100644
index 0000000000000000000000000000000000000000..0866e68dc09d2e2e7d587bd6aaaf23ee40764ec5
GIT binary patch
literal 143
zcmb2|=3oE=VQmkebACSlSM?h%R7D>BS;-T^1_WtpYFsyt96TVRk)D&4mXMK=md236
zEV)c=nZvAGdsn-Wt58a%ko)-1Hk{$sgRHqy
z)FY4b?K{gZ;$8p51WsQJ?BNg%#p>RB9|(4EGb35=_(GOgA}7?1HbjE`1i@3Zjz4Y~
mxWRP27lre|6e7=sdDti<%=52TOOhl>K6nE8r0AXi2mk=&S6NE{
literal 0
HcmV?d00001
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/cd/eb/0f5065be03e89547a33e064d911969953c45eb05df664ca4d537b970dc9f768123463a6f75ce6b836d50ee73c18ac7a25e763f2b9869612cdbf427195d4b b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/cd/eb/0f5065be03e89547a33e064d911969953c45eb05df664ca4d537b970dc9f768123463a6f75ce6b836d50ee73c18ac7a25e763f2b9869612cdbf427195d4b
new file mode 100644
index 00000000000000..df235e9e79ec2e
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/cd/eb/0f5065be03e89547a33e064d911969953c45eb05df664ca4d537b970dc9f768123463a6f75ce6b836d50ee73c18ac7a25e763f2b9869612cdbf427195d4b
@@ -0,0 +1 @@
+{"_id":"test-repo-url-ssh","_rev":"2-cc990259d480838e6847fb520d305a9e","name":"test-repo-url-ssh","description":"Test repo with non-github ssh repository url","dist-tags":{"latest":"0.0.1"},"versions":{"0.0.1":{"name":"test-repo-url-ssh","version":"0.0.1","description":"Test repo with non-github ssh repository url","main":"index.js","scripts":{"test":"echo \"Error: no test specified\" && exit 1"},"repository":{"type":"git","url":"git@gitlab.com:evanlucas/test-repo-url-ssh.git"},"author":{"name":"Evan Lucas","email":"evanlucas@me.com"},"license":"ISC","_id":"test-repo-url-ssh@0.0.1","dist":{"shasum":"2a77307e108bfb57107c4c334abb5ef5395dc68a","tarball":"http://localhost:1337/test-repo-url-ssh/-/test-repo-url-ssh-0.0.1.tgz"},"_from":".","_npmVersion":"1.4.2","_npmUser":{"name":"evanlucas","email":"evanlucas@me.com"},"maintainers":[{"name":"evanlucas","email":"evanlucas@me.com"}],"directories":{}}},"readme":"ERROR: No README data found!","maintainers":[{"name":"evanlucas","email":"evanlucas@me.com"}],"time":{"0.0.1":"2014-02-16T18:50:00.142Z"},"readmeFilename":"","_attachments":{}}
\ No newline at end of file
diff --git a/deps/npm/test/npm_cache/_cacache/content-v2/sha512/d4/45/ed72e65ed0b9fec5a6a41794caadda951ba79a0541648e259c8021b3fc96487d2caedf869ac142b4b0f31998c436f171d98a9a1740e3ac8eebb5c1103c53 b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/d4/45/ed72e65ed0b9fec5a6a41794caadda951ba79a0541648e259c8021b3fc96487d2caedf869ac142b4b0f31998c436f171d98a9a1740e3ac8eebb5c1103c53
new file mode 100644
index 00000000000000..11b928e5cddd9e
--- /dev/null
+++ b/deps/npm/test/npm_cache/_cacache/content-v2/sha512/d4/45/ed72e65ed0b9fec5a6a41794caadda951ba79a0541648e259c8021b3fc96487d2caedf869ac142b4b0f31998c436f171d98a9a1740e3ac8eebb5c1103c53
@@ -0,0 +1 @@
+{"_id":"checker","_rev":"23-39ff9491581c529b8b828651a196c7a3","name":"checker","description":"Checker is the collection of common abstract methods for validatiors and setters.","dist-tags":{"latest":"0.5.2"},"versions":{"0.0.0":{"name":"checker","version":"0.0.0","description":"","main":"index.js","scripts":{"test":"echo \"Error: no test specified\" && exit 1"},"author":"","license":"MIT","readme":"ERROR: No README data found!","_id":"checker@0.0.0","dist":{"shasum":"6a7a3977bbe770560d4fcc86eb3a32a52c9b368d","tarball":"http://localhost:1337/checker/-/checker-0.0.0.tgz"},"_from":".","_npmVersion":"1.3.11","_npmUser":{"name":"kael","email":"i@kael.me"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"0.2.1":{"name":"checker","version":"0.2.1","description":"Checker is the collection of common abstract methods for validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git://github.com/kaelzhang/node-checker.git"},"keywords":["checker","validator","validate","setter"],"author":{"name":"kael"},"license":"MIT","bugs":{"url":"https://github.com/kaelzhang/node-checker/issues"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"dependencies":{"async":"~0.2.9"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-checker.png?branch=master)](https://travis-ci.org/kaelzhang/node-checker)\n\n(THIS DOCUMENTAION IS NOT FINISHED YET.)\n\n# checker\n\nChecker is the collection of common abstract node.js methods for validatiors and setters.\n\t\n# Usage\n```sh\nnpm install checker --save\n```\n\n```js\nvar checker = require('checker');\n```\n\n# Synopsis\n\n```js\nchecker(schema, options).check(data, callback);\n```\n\n# Validation, Error Messages\n\n## Simple synchronous validators\n\n```js\nvar schema = {\n\tusername: {\n\t\tvalidator: function(value){\n\t\t\treturn /^[a-zA-Z0-9]{6,}$/.test(value);\n\t\t},\n\t\tmessage: 'Username must only contain letters, numbers; Username must contain at 
least 6 charactors'\n\t}\n};\n\nvar c = checker(schema);\n\nc.check({\n\tusername: 'a'\n}, function(err){\n\tif(err){\n\t\tconsole.log(err); // Then, `schema.username.message` will be displayed.\n\t}\n});\n```\n\n## Regular expressions as validators\n\nThe error hint of the example above is bad, because we want to know the very certain reason why we are wrong.\n\nThe `schema` below is equivalent to the one of the previous section:\n\n```js\n{\n\tvalidator: [\n\t\tfunction(value){\n\t\t\treturn value && value.length > 5;\n\t\t}, \n\t\t/^[a-zA-Z0-9]+$/\n\t],\n\tmessage: [\n\t\t'Username must contain at least 6 charactors', \n\t\t'Username must only contain letters and numbers'\n\t];\n}\n```\n\n## Asynchronous validators\n\n```js\n{\n\tvalidator: function(value){\n\t\tvar done = this.async();\n\t\t// this is an async method, and takes sooooo long...\n\t\tremote_check(value, function(err){\n\t\t\tdone(err); // `err` will pass to the `callback`\n\t\t});\n\t}\n}\n```\n\n\n# Programmatical Details\n\n## Options\n\n#### options.default_message `String`\n\nDefault error message\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n#### options.check_all `Boolean=false`\n\n\n\n## Schema Structures \n\n```js\n{\n\t: \n}\n```\n\n\nWhere `rule` might contains (all properties are optional):\n\n#### validator \n\n- `RegExp` The regular exp that input must matches against\n- `Function` Validation function. If `arguments.length === 3`, it will be considered as an async methods\n- `Array.` Group of validations. Asks will check each validator one by one. 
If validation fails, the rest validators will be skipped.\n- See sections above for details\n\t\n#### setter `Function|Array.`\n\nSee sections above for details.\n\n#### message `String`\n\nDefault error message\n\n#### default: `String`\n","_id":"checker@0.2.1","dist":{"shasum":"f25a07a1429cd9cee4a668f19fa99fa7e380deda","tarball":"http://localhost:1337/checker/-/checker-0.2.1.tgz"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"0.3.1":{"name":"checker","version":"0.3.1","description":"Checker is the collection of common abstract methods for validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git://github.com/kaelzhang/node-checker.git"},"keywords":["checker","validator","validate","setter"],"author":{"name":"kael"},"license":"MIT","bugs":{"url":"https://github.com/kaelzhang/node-checker/issues"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"dependencies":{"async":"~0.2.9"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-checker.png?branch=master)](https://travis-ci.org/kaelzhang/node-checker)\n\n(THIS DOCUMENTAION IS NOT FINISHED YET.)\n\n# checker\n\nChecker is the collection of common abstract node.js methods for validatiors and setters.\n\t\n# Usage\n```sh\nnpm install checker --save\n```\n\n```js\nvar checker = require('checker');\n```\n\n# Synopsis\n\n```js\nchecker(schema, options).check(data, callback);\n```\n\n# Validation, Error Messages\n\n## Simple synchronous validators\n\n```js\nvar schema = {\n\tusername: {\n\t\tvalidator: function(value){\n\t\t\treturn /^[a-zA-Z0-9]{6,}$/.test(value);\n\t\t},\n\t\tmessage: 'Username must only contain letters, numbers; Username must contain at least 6 charactors'\n\t}\n};\n\nvar c = checker(schema);\n\nc.check({\n\tusername: 'a'\n}, function(err){\n\tif(err){\n\t\tconsole.log(err); // Then, `schema.username.message` will be displayed.\n\t}\n});\n```\n\n## Regular expressions as validators\n\nThe error hint of the 
example above is bad, because we want to know the very certain reason why we are wrong.\n\nThe `schema` below is equivalent to the one of the previous section:\n\n```js\n{\n\tvalidator: [\n\t\tfunction(value){\n\t\t\treturn value && value.length > 5;\n\t\t}, \n\t\t/^[a-zA-Z0-9]+$/\n\t],\n\tmessage: [\n\t\t'Username must contain at least 6 charactors', \n\t\t'Username must only contain letters and numbers'\n\t];\n}\n```\n\n## Asynchronous validators\n\n```js\n{\n\tvalidator: function(value){\n\t\tvar done = this.async();\n\t\t// this is an async method, and takes sooooo long...\n\t\tremote_check(value, function(err){\n\t\t\tdone(err); // `err` will pass to the `callback`\n\t\t});\n\t}\n}\n```\n\n\n# Programmatical Details\n\n## Options\n\n#### options.default_message `String`\n\nDefault error message\n\n#### options.parallel `Boolean=false`\n\n#### options.limit `Boolean=false`\n\n#### options.check_all `Boolean=false`\n\n\n\n## Schema Structures \n\n```js\n{\n\t: \n}\n```\n\n\nWhere `rule` might contains (all properties are optional):\n\n#### validator \n\n- `RegExp` The regular exp that input must matches against\n- `Function` Validation function. If `arguments.length === 3`, it will be considered as an async methods\n- `Array.` Group of validations. Asks will check each validator one by one. 
If validation fails, the rest validators will be skipped.\n- See sections above for details\n\t\n#### setter `Function|Array.`\n\nSee sections above for details.\n\n#### message `String`\n\nDefault error message\n\n#### default: `String`\n","_id":"checker@0.3.1","dist":{"shasum":"c285c3f3c29c4186156d9e94945ad3892e64c739","tarball":"http://localhost:1337/checker/-/checker-0.3.1.tgz"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"0.3.2":{"name":"checker","version":"0.3.2","description":"Checker is the collection of common abstract methods for validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git://github.com/kaelzhang/node-checker.git"},"keywords":["checker","validator","validate","setter"],"author":{"name":"kael"},"license":"MIT","bugs":{"url":"https://github.com/kaelzhang/node-checker/issues"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"dependencies":{"async":"~0.2.9"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-checker.png?branch=master)](https://travis-ci.org/kaelzhang/node-checker)\n\n# checker\n\nChecker is the collection of common abstract node.js methods for validatiors and setters.\n\t\n# Usage\n```sh\nnpm install checker --save\n```\n\n```js\nvar checker = require('checker');\n```\n\n# Synopsis\n\n```js\nchecker(schema, options).check(data, function(err, value, details){\n});\n```\n\n### err `mixed`\n\n### parsed `Object`\n\nThe cleaned and parsed `data`.\n\n### details `Object`\n\n```\n{\n\t: \n}\n```\n\n- `detail.value` `mixed` the parsed value\n- `detail.is_default` `Boolean` if the current property is defined in `schema`, but the input data doesn't have it, then the value will be `true`\n- `detail.is_cooked` `Boolean` if there're any setters, it will be `true`\n- `detail.origin` the origin value of the property\n\n\n# Validation, Error Messages\n\n## Simple synchronous validators\n\n```js\nvar schema = {\n\tusername: {\n\t\tvalidator: 
function(value){\n\t\t\treturn /^[a-zA-Z0-9]{6,}$/.test(value);\n\t\t},\n\t\tmessage: 'Username must only contain letters, numbers; ' \n\t\t\t+ 'Username must contain at least 6 charactors'\n\t}\n};\n\nvar c = checker(schema);\n\nc.check({\n\tusername: 'a'\n}, function(err){\n\tif(err){\n\t\tconsole.log(err); // Then, `schema.username.message` will be displayed.\n\t}\n});\n```\n\n## Regular expressions as validators\n\nThe error hint of the example above is bad, because we want to know the very certain reason why we are wrong.\n\nThe `schema` below is equivalent to the one of the previous section:\n\n```js\n{\n\tvalidator: [\n\t\tfunction(value){\n\t\t\treturn value && value.length > 5;\n\t\t}, \n\t\t/^[a-zA-Z0-9]+$/\n\t],\n\tmessage: [\n\t\t'Username must contain at least 6 charactors', \n\t\t'Username must only contain letters and numbers'\n\t];\n}\n```\n\n## Asynchronous validators\n\n```js\n{\n\tvalidator: function(value){\n\t\tvar done = this.async();\n\t\t// this is an async method, and takes sooooo long...\n\t\tremote_check(value, function(err){\n\t\t\tdone(err); // `err` will pass to the `callback`\n\t\t});\n\t}\n}\n```\n\n\n# Programmatical Details\n\n## Options\n\n#### options.default_message `String`\n\nDefault error message\n\n#### options.parallel `Boolean=false`\n\nBy default, `checker` will check each properties in series, \n\n#### options.limit `Boolean=false`\n\nIf `options.limit` is `true` and a certain property of the input data is not defined in the `schema`, the property will be removed.\n\nDefault to `false`.\n\n#### options.check_all `Boolean=false`\n\nNot implemented yet.\n\n#### options.context `Object`\n\nSee sections below.\n\n## Schema Structures \n\n```js\n{\n\t: \n}\n```\n\n\nWhere `rule` might contains (all properties are optional):\n\n#### validator \n\n- `RegExp` The regular exp that input must matches against\n- `Function` Validation function. 
If `arguments.length === 3`, it will be considered as an async methods\n- `Array.` Group of validations. Asks will check each validator one by one. If validation fails, the rest validators will be skipped.\n- See sections above for details\n\t\n#### setter `Function|Array.`\n\nSee sections above for details.\n\n#### message `String`\n\nDefault error message\n\n#### default: `String`\n\n\n## `this` object inside validators and setters\n\nInside validators(`rule.validator`) and setters(`rule.setter`), there're several opaque methods\n\n### this.async()\n\nGenerate the `done` function to make the validator or setter become an async method.\n\n\tvar done = this.async();\n\t\nFor details, see the demos above.\n\n### this.get(name)\n\nThe value of the input object by name\n\n### this.set(name, value)\n\nChange the value of the specified property of the input object.\n\n```\n{\n\tusername: {\n\t},\n\t\n\tpassword: {\n\t\tvalidator: function(value){\n\t\t\tvar username = this.get('username');\n\t\t\t\n\t\t\t// Guests are welcome even without passwords\n\t\t\treturn value || username === 'guest';\n\t\t}\n\t}\n}\n```\n\nNotice that you'd better use `this.get` and `this.set` with the `options.parallel` setting as `false`(the default value). 
Otherwise, it might encounter unexpected situations, because the value of the object is ever changing due to the setter.\n\nSo, use them wisely.\n\n### this.context `Object`\n\nThe `options.context` itself.\n\n\n\n","_id":"checker@0.3.2","dist":{"shasum":"bc4b84036a5699c609e3c627923cb87d8058a79d","tarball":"http://localhost:1337/checker/-/checker-0.3.2.tgz"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"0.4.2":{"name":"checker","version":"0.4.2","description":"Checker is the collection of common abstract methods for validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git://github.com/kaelzhang/node-checker.git"},"keywords":["checker","validator","validate","setter"],"author":{"name":"kael"},"license":"MIT","bugs":{"url":"https://github.com/kaelzhang/node-checker/issues"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"dependencies":{"async":"~0.2.9"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-checker.png?branch=master)](https://travis-ci.org/kaelzhang/node-checker)\n\n# checker\n\nChecker is the collection of common abstract node.js methods for validatiors and setters.\n\t\n# Usage\n```sh\nnpm install checker --save\n```\n\n```js\nvar checker = require('checker');\n```\n\n# Synopsis\n\n```js\nchecker(schema, options).check(data, function(err, value, details){\n});\n```\n\n### err `mixed`\n\n### results `Object`\n\nThe parsed object.\n\n### details `Object`\n\n```\n{\n\t: \n}\n```\n\n- `detail.value` `mixed` the parsed value\n- `detail.is_default` `Boolean` if the current property is defined in `schema`, but the input data doesn't have it, then the value will be `true`\n- `detail.is_cooked` `Boolean` if there're any setters, it will be `true`\n- `detail.origin` the origin value of the property\n- `detail.error` the error belongs to the current property. 
If not exists, it will be `null`\n\n\n# Validation, Error Messages\n\n## Simple synchronous validators\n\n```js\nvar schema = {\n\tusername: {\n\t\tvalidator: function(value){\n\t\t\treturn /^[a-zA-Z0-9]{6,}$/.test(value);\n\t\t},\n\t\tmessage: 'Username must only contain letters, numbers; ' \n\t\t\t+ 'Username must contain at least 6 charactors'\n\t}\n};\n\nvar c = checker(schema);\n\nc.check({\n\tusername: 'a'\n}, function(err){\n\tif(err){\n\t\tconsole.log(err); // Then, `schema.username.message` will be displayed.\n\t}\n});\n```\n\n## Regular expressions as validators\n\nThe error hint of the example above is bad, because we want to know the very certain reason why we are wrong.\n\nThe `schema` below is equivalent to the one of the previous section:\n\n```js\n{\n\tvalidator: [\n\t\tfunction(value){\n\t\t\treturn value && value.length > 5;\n\t\t}, \n\t\t/^[a-zA-Z0-9]+$/\n\t],\n\tmessage: [\n\t\t'Username must contain at least 6 charactors', \n\t\t'Username must only contain letters and numbers'\n\t];\n}\n```\n\n## Asynchronous validators\n\n```js\n{\n\tvalidator: function(value){\n\t\tvar done = this.async();\n\t\t// this is an async method, and takes sooooo long...\n\t\tremote_check(value, function(err){\n\t\t\tdone(err); // `err` will pass to the `callback`\n\t\t});\n\t}\n}\n```\n\n\n# Programmatical Details\n\n## Options\n\n#### options.default_message `String`\n\nDefault error message\n\n#### options.parallel `Boolean=false`\n\nBy default, `checker` will check each properties in series, \n\n#### options.limit `Boolean=false`\n\nIf `options.limit` is `true` and a certain property of the input data is not defined in the `schema`, the property will be removed.\n\nDefault to `false`.\n\n#### options.check_all `Boolean=false`\n\nBy default, `checker` will exit immediately at the first error. 
But if `options.check_all` is `true`, it will parse all the properties, and collect every possible error.\n\n#### options.context `Object`\n\nSee sections below.\n\n## Schema Structures \n\n```js\n{\n\t: \n}\n```\n\n\nWhere `rule` might contains (all properties are optional):\n\n#### validator \n\n- `RegExp` The regular exp that input must matches against\n- `Function` Validation function. If `arguments.length === 3`, it will be considered as an async methods\n- `Array.` Group of validations. Asks will check each validator one by one. If validation fails, the rest validators will be skipped.\n- See sections above for details\n\t\n#### setter `Function|Array.`\n\nSee sections above for details.\n\n#### message `String`\n\nDefault error message\n\n#### default: `String`\n\n\n## `this` object inside validators and setters\n\nInside validators(`rule.validator`) and setters(`rule.setter`), there're several opaque methods\n\n### this.async()\n\nGenerate the `done` function to make the validator or setter become an async method.\n\n\tvar done = this.async();\n\t\nFor details, see the demos above.\n\n### this.get(name)\n\nThe value of the input object by name\n\n### this.set(name, value)\n\nChange the value of the specified property of the input object.\n\n```\n{\n\tusername: {\n\t},\n\t\n\tpassword: {\n\t\tvalidator: function(value){\n\t\t\tvar username = this.get('username');\n\t\t\t\n\t\t\t// Guests are welcome even without passwords\n\t\t\treturn value || username === 'guest';\n\t\t}\n\t}\n}\n```\n\nNotice that you'd better use `this.get` and `this.set` with the `options.parallel` setting as `false`(the default value). 
Otherwise, it might encounter unexpected situations, because the value of the object is ever changing due to the setter.\n\nSo, use them wisely.\n\n### this.context `Object`\n\nThe `options.context` itself.\n\n\n\n","_id":"checker@0.4.2","dist":{"shasum":"7b033fdad0f000f88302ff1f5a8e59d8f466580e","tarball":"http://localhost:1337/checker/-/checker-0.4.2.tgz"},"maintainers":[{"name":"kael","email":"i@kael.me"}],"directories":{}},"0.5.1":{"name":"checker","version":"0.5.1","description":"Checker is the collection of common abstract methods for validatiors and setters.","main":"index.js","scripts":{"test":"make test"},"repository":{"type":"git","url":"git://github.com/kaelzhang/node-checker.git"},"keywords":["checker","validator","validate","setter"],"author":{"name":"kael"},"license":"MIT","bugs":{"url":"https://github.com/kaelzhang/node-checker/issues"},"devDependencies":{"mocha":"~1.13.0","chai":"~1.8.0"},"dependencies":{"async":"~0.2.9"},"readme":"[![Build Status](https://travis-ci.org/kaelzhang/node-checker.png?branch=master)](https://travis-ci.org/kaelzhang/node-checker)\n\n# checker\n\nChecker is the collection of common abstract node.js methods for validatiors and setters.\n\t\n# Usage\n```sh\nnpm install checker --save\n```\n\n```js\nvar checker = require('checker');\n```\n\n# Synopsis\n\n```js\nchecker(schema, options).check(data, function(err, value, details){\n});\n```\n\n### err `mixed`\n\n### results `Object`\n\nThe parsed object.\n\n### details `Object`\n\n```\n{\n\t: \n}\n```\n\n- `detail.value` `mixed` the parsed value\n- `detail.is_default` `Boolean` if the current property is defined in `schema`, but the input data doesn't have it, then the value will be `true`\n- `detail.is_cooked` `Boolean` if there're any setters, it will be `true`\n- `detail.origin` the origin value of the property\n- `detail.error` the error belongs to the current property. 
If not exists, it will be `null`\n\n\n# Validation, Error Messages\n\n## Simple synchronous validators\n\n```js\nvar schema = {\n\tusername: {\n\t\tvalidator: function(value){\n\t\t\treturn /^[a-zA-Z0-9]{6,}$/.test(value);\n\t\t},\n\t\tmessage: 'Username must only contain letters, numbers; ' \n\t\t\t+ 'Username must contain at least 6 charactors'\n\t}\n};\n\nvar c = checker(schema);\n\nc.check({\n\tusername: 'a'\n}, function(err){\n\tif(err){\n\t\tconsole.log(err); // Then, `schema.username.message` will be displayed.\n\t}\n});\n```\n\n## Regular expressions as validators\n\nThe error hint of the example above is bad, because we want to know the very certain reason why we are wrong.\n\nThe `schema` below is equivalent to the one of the previous section:\n\n```js\n{\n\tvalidator: [\n\t\tfunction(value){\n\t\t\treturn value && value.length > 5;\n\t\t}, \n\t\t/^[a-zA-Z0-9]+$/\n\t],\n\tmessage: [\n\t\t'Username must contain at least 6 charactors', \n\t\t'Username must only contain letters and numbers'\n\t];\n}\n```\n\n## Asynchronous validators\n\n```js\n{\n\tvalidator: function(value){\n\t\tvar done = this.async();\n\t\t// this is an async method, and takes sooooo long...\n\t\tremote_check(value, function(err){\n\t\t\tdone(err); // `err` will pass to the `callback`\n\t\t});\n\t}\n}\n```\n\n\n# Programmatical Details\n\n## Options\n\n#### options.default_message `String`\n\nDefault error message\n\n#### options.parallel `Boolean=false`\n\nBy default, `checker` will check each properties in series, \n\n#### options.limit `Boolean=false`\n\nIf `options.limit` is `true` and a certain property of the input data is not defined in the `schema`, the property will be removed.\n\nDefault to `false`.\n\n#### options.check_all `Boolean=false`\n\nBy default, `checker` will exit immediately at the first error. 
But if `options.check_all` is `true`, it will parse all the properties, and collect every possible error.\n\n#### options.context `Object`\n\nSee sections below.\n\n## Schema Structures \n\n```js\n{\n\t: