
Instrument and emit stats on program wall clock duration #76

Closed
cratelyn opened this issue Sep 24, 2021 · 8 comments
Labels
feature-prod (Related to matching the production C@E environment), limits (Relates to resource limits), logging (Logging and diagnostics)

Comments

cratelyn (Contributor) commented Sep 24, 2021

Fastly's UI provides stats on the total wall clock of a Compute@Edge program running in production. Viceroy should emit information about the wall clock duration of a single instance execution, so that users can reason about the behavior of their program in production.

cratelyn added the feature-prod, logging, and limits labels Sep 24, 2021

cratelyn (Contributor, Author) commented Jan 25, 2022

cc: @djKooks, i was guessing based on your activity in #118 and #63 that you were looking for good starter issues in this repository. this issue might be a good one to pick up, if you are interested!

you can see where we currently log how much heap space was used by a Wasm program, here: https://github.com/fastly/Viceroy/blob/main/lib/src/execute.rs#L279-L282

we could use Instant::now and its duration_since method to find the time elapsed before/after instantiating and running the program. then, a call to tracing::info! should let us emit a log message showing the total wall clock duration. 🙂
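
for reference, a rough standalone sketch of that pattern (using println! here rather than Viceroy's tracing::info!, and a sleep as a stand-in for actually instantiating and running the Wasm program):

use std::time::{Duration, Instant};

fn main() {
    // record the time just before instantiating and running the Wasm program
    let start = Instant::now();

    // stand-in for actually instantiating and running the guest program
    std::thread::sleep(Duration::from_millis(5));

    // duration_since gives the wall-clock time elapsed since `start`
    let elapsed = Instant::now().duration_since(start);
    println!("request completed in {}ms", elapsed.as_millis());
}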

feel free to ping me if you have any further questions 🙇‍♀️

kination (Contributor):

@cratelyn great, thanks! I'll look into it 🙏

kination (Contributor) commented Jan 27, 2022

@cratelyn sorry, I'm in the middle of researching this, so I want to ask a basic question.

This task seems to be about emitting the CPU/RAM usage. Before going on, could you let me know how I can visualize (through the CLI) the stats that are currently supported?
(Which command should I use with cargo run -- in the cli directory?)

kination (Contributor):

@cratelyn for now I've seen

Jan 28 21:42:57.618  INFO Listening on http://127.0.0.1:7878
Jan 28 21:43:04.845  INFO request{id=0}: handling request GET http://127.0.0.1:7878/
Jan 28 21:43:04.855  INFO request{id=0}: request completed using 1.4 MB of WebAssembly heap

by building and running the Default starter for Rust.

So the purpose of this task is to display RAM usage and CPU time in this log?

cratelyn (Contributor, Author) commented Jan 28, 2022

> @cratelyn for now I've seen
>
> Jan 28 21:42:57.618  INFO Listening on http://127.0.0.1:7878
> Jan 28 21:43:04.845  INFO request{id=0}: handling request GET http://127.0.0.1:7878/
> Jan 28 21:43:04.855  INFO request{id=0}: request completed using 1.4 MB of WebAssembly heap
>
> by building and running the Default starter for Rust.
>
> So the purpose of this task is to display RAM usage and CPU time in this log?

hi @djKooks, thank you for your patience, it looks like you found the right test to run. for reference, you can also use --nocapture to show logs when running tests, like so:

; cargo test -p viceroy request_works -- --nocapture

that logs at the TRACE level, so you'll see many more events, but the INFO events shown above will be included. this should be a helpful way to see any new logs you add.

> So the purpose of this task is to display RAM usage and CPU time in this log?

#78 and #79 cover RAM and CPU time, respectively. this issue is specifically focused on showing how long it took for the Wasm binary to finish running. RAM usage or CPU time telemetry can be added separately from this, and showing the total wall-clock time will be a wonderful addition.

using the logs you showed above as a starting point, i imagine we would want something like...

Jan 28 21:42:57.618  INFO Listening on http://127.0.0.1:7878
Jan 28 21:43:04.845  INFO request{id=0}: handling request GET http://127.0.0.1:7878/
Jan 28 21:43:04.855  INFO request{id=0}: request completed using 1.4 MB of WebAssembly heap
Jan 28 21:43:04.865  INFO request{id=0}: request completed in 120ms

in terms of code, the logic for this will be shaped like...

let start_time = Instant::now();
// instantiate and run the Wasm binary here
let wall_clock_time = Instant::now().duration_since(start_time);
tracing::info!("WebAssembly program completed in {}ms", wall_clock_time.as_millis());

i hope that's helpful! my comment above links to where we'll want to add this logic.

finally, thank you so much for taking this on @djKooks 💌

kination (Contributor) commented Jan 29, 2022

Hello @cratelyn, thanks for the guide.
I've opened PR #121 for this, so could you take a look when you have a chance? 🙏

...
Jan 29 19:03:31.162  INFO Listening on http://127.0.0.1:7878
Jan 29 19:03:33.158  INFO request{id=0}: handling request GET http://127.0.0.1:7878/
Jan 29 19:03:33.165  INFO request{id=0}: request completed using 1.4 MB of WebAssembly heap
Jan 29 19:03:33.165  INFO request{id=0}: request completed in 6.078166ms

(would it be better to round the ms value?)
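
for context, a small standalone sketch of two formatting options (whole milliseconds via as_millis, versus keeping a fixed number of decimals), assuming the 6.078166ms elapsed time from the log above:

use std::time::Duration;

fn main() {
    // e.g. the 6.078166ms elapsed time shown in the log above
    let elapsed = Duration::from_nanos(6_078_166);

    // as_millis truncates to whole milliseconds
    println!("request completed in {}ms", elapsed.as_millis());            // 6ms

    // formatting the fractional value keeps a fixed number of decimals
    println!("request completed in {:.2}ms", elapsed.as_secs_f64() * 1e3); // 6.08ms
}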

kination (Contributor) commented Feb 8, 2022

@cratelyn please close this if it looks okay. Also, if there are any other lower-priority issues, please suggest them 🙏

cratelyn (Contributor, Author) commented Feb 9, 2022

of course! i just tagged you over in #23, which may be a good issue to tackle next. 💟

with that, i'm going to close this issue!

cratelyn closed this as completed Feb 9, 2022