Add a benchmark suite to test against our builds (e.g. the pnpm benchmarks).
Create a build pipeline/process, separate from the CLI builds, to run the benchmarks and generate a report (use GitHub Actions if possible).
Requirements
- Implement a benchmark/ folder with a script to run benchmarking commands [R]
  - create the execution pattern (see the runner sketch after this list)
  - create the scenarios paradigm
- Create a GitHub Action to run the benchmark script [R]
  - link the current commit of the repo into the global node_modules/ [R]
  - run the benchmark script, which will run the scenarios [R]
  - output the benchmark results to the console [R]
- Make the output from a benchmark run more usable [R]
  - write the output of the scenarios to a results.yml file [R]
  - write the benchmark output to a comment on the pull-request [R]
  - update the same comment on the pull-request [S]
- Integrate the benchmarking run into the release process [R]
  - write the benchmark results to a file and commit it [R]
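As a starting point, the execution pattern could be a loop that times each scenario and collects the results. This is a minimal sketch only; the module layout and the `name`/`run()` fields are assumptions, not the actual API.

```js
// benchmark/index.js — illustrative runner; module layout and field names are assumptions.
const scenarios = require('./scenarios');

async function main() {
  const results = {};
  for (const scenario of scenarios) {
    // Each scenario is expected to strip out whatever it doesn't need
    // (cache, node_modules, lockfile) before timing its command.
    const start = Date.now();
    await scenario.run();
    results[scenario.name] = { ms: Date.now() - start };
  }
  // For now the results only go to the console (see the Todos below).
  console.log(results);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```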
Legend
[R]: required [S]: stretch goal
Todos
update comments in scenarios.js [DOCS]
The information about ordering is incorrect: the order doesn't matter, because the result of each run is a full cache, a full node_modules, and a full lockfile. Each subsequent scenario begins with "everything" and must remove the things it doesn't need before it can run.
A description of what a "scenario" is would be helpful for adding new scenarios in the future. The commented-out code was a previous iteration and can be removed.
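For context, a scenario entry might look roughly like the sketch below. The field names, file names, and install command are placeholders, not the real shape of scenarios.js.

```js
// scenarios.js — illustrative shape of one scenario; names and commands are placeholders.
const { rmSync } = require('fs');
const { execSync } = require('child_process');

module.exports = [
  {
    name: 'install with cache, without node_modules or lockfile',
    async run() {
      // Start from "everything" (full cache, node_modules, lockfile) and
      // remove only what this scenario doesn't need before timing the install.
      rmSync('node_modules', { recursive: true, force: true });
      rmSync('package-lock.json', { force: true });
      execSync('npm install', { stdio: 'inherit' });
    },
  },
];
```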
collect results from each scenario into an object in index.js; write this object into a YAML file in benchmarks/results [FEAT]
Currently the results are only written to stdout; we need to keep track of them so they can be compared between releases.
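A sketch of what writing that object out could look like, assuming a js-yaml dependency and a benchmarks/results output directory (both assumptions):

```js
// Illustrative only — the exact file name and directory are assumptions.
const { writeFileSync, mkdirSync } = require('fs');
const { join } = require('path');
const yaml = require('js-yaml');

function writeResults(results) {
  const dir = join(__dirname, 'results');
  mkdirSync(dir, { recursive: true });
  writeFileSync(join(dir, 'results.yml'), yaml.dump(results), 'utf8');
}

module.exports = { writeResults };
```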
update the GitHub Action workflow to only run on pull-request [FIX]
Currently the workflow runs on push, which covers everything: branches and pull-requests. We need to narrow the scope to just pull-requests so that we can reliably write the results to a comment on that pull-request.
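The trigger change might look something like this; the workflow file name and the branch filter are assumptions, and the jobs are omitted.

```yaml
# .github/workflows/benchmark.yml — only the trigger is shown; jobs are omitted.
name: benchmark
on:
  pull_request:
    branches: [main]
```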
write "results.yml" into a comment on a pull-request [FEAT]
Using actions/...@v1 we should be able to write a comment on the pull-request this action ran for.
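One possible shape for that step, using the @actions/github toolkit rather than any specific marketplace action; the package choice and the results path are assumptions.

```js
// Illustrative: post the benchmark results as a comment on the current pull-request.
const { readFileSync } = require('fs');
const core = require('@actions/core');
const github = require('@actions/github');

async function postComment() {
  const octokit = github.getOctokit(process.env.GITHUB_TOKEN);
  const { owner, repo } = github.context.repo;
  const results = readFileSync('benchmarks/results/results.yml', 'utf8'); // assumed path
  const body = `### Benchmark results\n\n${results}`;
  await octokit.rest.issues.createComment({
    owner,
    repo,
    issue_number: github.context.issue.number,
    body,
  });
}

postComment().catch((err) => core.setFailed(err.message));
```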
update the release process to include creating "results.yml" and committing it to the repository [FEAT]
A final "results.yml" file should be generated at release time so that the "released commit" carries a record of its own benchmark.
Stretch Goal Tasks
if a pull-request comment exists, update that comment
We want to avoid spamming pull-requests with results from every commit pushed to a branch with an open pull-request. If a comment already exists, we should update it; otherwise, write a new comment.
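Building on the commenting sketch above, an upsert could look like this; the marker string is a made-up convention and the octokit instance is the one created earlier.

```js
// Illustrative: update an existing benchmark comment instead of creating a new one.
const MARKER = '### Benchmark results';

async function upsertComment(octokit, { owner, repo, issue_number }, body) {
  const { data: comments } = await octokit.rest.issues.listComments({ owner, repo, issue_number });
  const existing = comments.find((c) => c.body && c.body.startsWith(MARKER));
  if (existing) {
    await octokit.rest.issues.updateComment({ owner, repo, comment_id: existing.id, body });
  } else {
    await octokit.rest.issues.createComment({ owner, repo, issue_number, body });
  }
}
```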
compare results from current run with those inside benchmarks/results
Need to handle the edge case where that file doesn't exist.
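A hedged sketch of the comparison, including the missing-file edge case; the path, field names, and delta format are assumptions.

```js
// Illustrative comparison against the last committed results, if any exist.
const { existsSync, readFileSync } = require('fs');
const yaml = require('js-yaml');

function loadPreviousResults(path = 'benchmarks/results/results.yml') {
  // Edge case: no results have been committed yet.
  if (!existsSync(path)) return null;
  return yaml.load(readFileSync(path, 'utf8'));
}

function compare(current, previous) {
  if (!previous) return 'No previous results to compare against.';
  return Object.keys(current)
    .map((name) => {
      const before = previous[name] ? previous[name].ms : 0;
      const delta = current[name].ms - before;
      return `${name}: ${delta >= 0 ? '+' : ''}${delta}ms vs previous`;
    })
    .join('\n');
}
```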