
feat: script to compare fresh install time #103

Open
wants to merge 6 commits into base: master

Conversation

olizilla
Member

A script to repeatably compare the first install time of a module via ipfs-npm against npm.

We know ipfs-npm is likely to be slower on first install time, but I want to do whatever we can to bring that time down, as it's the first experience folks will have with it.

We create a Docker image with ipfs-npm in it, then time how long the install takes there, then run the same install again in a stock node Docker image via npm. Both runs start with empty caches.
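
In outline, the comparison works something like this (a simplified sketch rather than the exact script; the Dockerfile path and the package being installed are illustrative):

# Build an image that has ipfs-npm preinstalled (Dockerfile path is illustrative).
docker build -t ipfs-npm -f test/perf/Dockerfile .

echo '---- ipfs-npm flavour ----'
# First install via ipfs-npm in a fresh container, so all caches are cold.
time docker run --rm ipfs-npm ipfs-npm install -g iim

echo '---- npm flavour ----'
# The same install via plain npm in a stock node image, also with cold caches.
time docker run --rm node npm install -g iim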

Example output

$ ./test/perf/docker-race.sh 
found ipfs-npm Docker image

---- ipfs-npm flavour ----
👿 Spawning an in-process IPFS node using repo at /root/.jsipfs
...
/usr/local/bin/iim -> /usr/local/lib/node_modules/iim/src/bin.js
+ iim@0.6.1
added 415 packages from 782 contributors in 115.216s
🎁 /usr/local/bin/npm exited with code 0
🔏 Updating package-lock.json

real	2m0.062s
user	0m0.033s
sys	0m0.024s

---- npm flavour ----
/usr/local/bin/iim -> /usr/local/lib/node_modules/iim/src/bin.js
+ iim@0.6.1
added 415 packages from 782 contributors in 8.459s

real	0m10.811s
user	0m0.032s
sys	0m0.014s

License: MIT
Signed-off-by: Oli Evans <oli@tableflip.io>

A quick test to compare the first install time of a module via
ipfs-npm against npm

License: MIT
Signed-off-by: Oli Evans <oli@tableflip.io>
@ghost assigned olizilla May 14, 2019
@ghost added the status/in-progress label May 14, 2019
@olizilla
Member Author

Today it's 40s rather than 2 mins for ipfs-npm

$ ./docker-race.sh 
found ipfs-npm Docker image

---- ipfs-npm flavour ----
👿 Spawning an in-process IPFS node using repo at /root/.jsipfs
...
+ iim@0.6.1
added 415 packages from 782 contributors in 36.717s
🎁 /usr/local/bin/npm exited with code 0
🔏 Updating package-lock.json

real	0m40.804s
user	0m0.033s
sys	0m0.018s

---- npm flavour ----
/usr/local/bin/iim -> /usr/local/lib/node_modules/iim/src/bin.js
+ iim@0.6.1
added 415 packages from 782 contributors in 8.375s

real	0m10.934s
user	0m0.031s
sys	0m0.028s

License: MIT
Signed-off-by: Oli Evans <oli@tableflip.io>
@achingbrain
Member

Nice. I wonder if the time of day affects it?

@achingbrain self-requested a review May 15, 2019 08:41
Member

@lidel left a comment


Cool! I've run it multiple times and consistently get around 45s for ipfs-npm.

Small ask: we should run tests in ephemeral containers (see notes below).

olizilla and others added 4 commits May 15, 2019 11:04
prefix relative path to docker build step

Co-Authored-By: Marcin Rataj <lidel@lidel.org>
No need to keep the container around after the test, as it will slowly eat up disk space. If you've already run it multiple times:

    see the last 10 containers via docker ps -a | head -10
    remove the image and all stale containers that use it via docker rmi ipfs-npm -f

It also makes sense to disable NAT and just use the host network interfaces directly, bringing the test closer to reality.

This should run in an ephemeral container with no NAT (a sketch follows the commit list below):

Co-Authored-By: Marcin Rataj <lidel@lidel.org>
License: MIT
Signed-off-by: Oli Evans <oli@tableflip.io>
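
To illustrate the suggested change, an ephemeral, host-networked invocation might look roughly like this (the image name comes from the review notes above; the install command being timed is an assumption based on the example output):

# --rm discards the container when it exits, so repeated runs don't eat disk space.
# --network host skips Docker's NAT and uses the host's network interfaces directly.
docker run --rm --network host ipfs-npm sh -c 'time ipfs-npm install -g iim'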
@olizilla mentioned this pull request May 15, 2019
@victorb
Member

victorb commented May 15, 2019

This might be relevant for you:

https://github.com/open-services/public-registry-benchmarks

Basically, the project above creates a bunch of different Dockerfiles for every combination of a list of CLIs and a list of public registries: https://github.com/open-services/public-registry-benchmarks/tree/master/tests

Then it runs them all a couple of times and logs the time for each one. The average and mean get recorded as well, and you get a report.md at the end with all the times.
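
As a rough sketch of that flow (hypothetical, not code from that repository; the directory layout and image names are assumptions):

# Build each CLI x registry Dockerfile, run it a few times, and time every run.
for test in tests/*/; do
  name=$(basename "$test")
  docker build -q -t "bench-$name" "$test" > /dev/null
  for i in 1 2 3; do
    echo "== $name run $i =="
    # Every run starts from a fresh container, so install caches are always cold.
    time docker run --rm "bench-$name"
  done
done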

npm-on-ipfs (the deployed version) is already there and gets run with all the rest of the registries.

You might be able to reuse it somehow to run it within your own infrastructure. Otherwise I could try to set aside some time to run it nightly, and you can just use the results as-is for figuring out improvements.

@olizilla
Member Author

That is very cool @victorb. I'm assuming you currently run it ad-hoc when you want to update the stats? Running it nightly or even weekly would be rad, then we could see any improvements in published releases. I'll take a look and see what I can use to improve this test too.

@victorb
Member

victorb commented May 15, 2019

@olizilla indeed. I just opened up an issue about running it each night: open-services/public-registry-benchmarks#1

@victorb
Member

victorb commented May 23, 2019

Update: open-services/public-registry-benchmarks now runs the benchmarks once a day, updates the README with the latest results, and stores each report in a directory in the git repository.

If you can't figure out how to add a new test-case for the benchmarks, I'd be happy to help you
