Add report summary CLI flag #1168
Conversation
Codecov Report

@@            Coverage Diff             @@
##           master    #1168      +/-   ##
==========================================
+ Coverage   75.34%   75.34%    +<.01%
==========================================
  Files         148      148
  Lines       10763    10832       +69
==========================================
+ Hits         8109     8161       +52
- Misses       2189     2204       +15
- Partials      465      467        +2

Continue to review the full report at Codecov.
The implementation LGTM, but this needs some tests.
And would this PR also close #1120?
As I mentioned in the inline comment, since this option is for making a JSON report of the end-of-test summary, in my opinion it's better to call it SummaryReport, not the current ReportSummary (which to me implies that we're making a summary of some report). Though now that I think about it, "report" itself is the ambiguous part here, maybe we can replace it? So... --summary-export? @cuonglm, @mstoykov, @imiric - thoughts?
/me +1 for
I don't like "summary" to be honest :).
"[End of test] summary" might not be the best name, but we're stuck with it anyway. It's referred to that way in the docs and the options.
To add another variation to consider 😄, how about
👍, to me it seems slightly better than
As I mentioned in the previous comment, I feel like we should have
ui/summary_test.go
Outdated
"extra": {
    "min": 1,
    "max": 1
},
I still think that we can move the data under extra to the level above, as there is zero value added from the nesting, and it will also probably mean less code ;)
Fixed
ui/summary.go
Outdated
min := sink.Min
max := sink.Max
data["min"] = min
data["max"] = max
I don't think you need either min or max, nor probably most of the variables in the other cases of this switch.
Fixed.
There is still no support for thresholds, and by the looks of it, it's going to be pretty simple to add. You need to check whether the metric has Thresholds and, if it does, just add them, possibly as a map of Threshold.Source to Threshold.LastFailed, to the metric itself... Like:
"http_req_duration{url:http://httpbin.org/post}": {
    "avg": 117.039762276,
    "max": 228.920154,
    "med": 116.05152949999999,
    "min": 113.477511,
    "p(90)": 117.4600813,
    "p(95)": 118.38552544999999,
    "thresholds": {
        "max<1000": false
    }
},
Although maybe it would be good idea to have a separate place for all the thresholds either in addition or just there ... cc @na--, @imiric ?
Fixed
LGTM, just remember to cleanup/squash before merge.
LGTM, though we might want to do some corner case tests when we merge it (different and nested groups, custom metrics, tagged thresholds, custom trend percentile columns, etc.). And yeah, squash the commits before merging.
Add --summary-export flag for "k6 run"; the parameter is the output file to write the report summary to, in JSON format. An empty string means the output is ignored. Close #355
LGTM, although I think a few more tests would be a good idea, but definitely a lot of manual testing with different scripts is a must before release.
Also IMO we should be using Metric.Summary, possibly after changing it a bit.
Thanks, I didn't know about it. Scratching the surface looking,