
Backtraces kill the performance #129

Closed
golddranks opened this issue Feb 17, 2017 · 11 comments

Comments

@golddranks
Contributor

golddranks commented Feb 17, 2017

I'm not sure whether this affected me before (I think not) or whether it's because of recent changes upstream, but...

DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 256ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 258ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 255ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 257ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 290ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 292ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 270ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 272ms

...seriously, this isn't an acceptable level of performance. My web app is wide open to DoSing if it takes a third of a second just to construct an Error. For reference, here are the figures with backtraces off:

DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 0ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 2ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 0ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 3ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 0ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 2ms
DEBUG:ganbare::helpers: src/helpers.rs:365 time_it try_auth_user took 0s 0ms!
DEBUG:ganbare: Request http://localhost:8081/login took 0s 2ms

This is on macOS. Are others experiencing this level of performance drop?

@golddranks
Contributor Author

golddranks commented Feb 17, 2017

Okay, I've got to admit, with --release it shrinks to ~22 ms, but still. I'm not sure whether it used to be this performance-heavy before.

@Yamakaky
Contributor

What's your code?

@golddranks
Contributor Author

https://github.com/golddranks/ganbare/blob/08251563f4dea94029e538cef4b8d93bc4fc3ce9/ganbare_backend/src/session.rs#L106 Just a simple bail!(), no loops or anything fancy, and the stack isn't too deep. The code runs in ~0 ms when not on the error path, but takes hundreds of ms when an error happens. If I disable backtraces, both the error and non-error paths take close to zero time, so it's quite clearly the backtraces.
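
(For reference, a minimal sketch of the pattern in question – the error_chain! setup and the names below are illustrative, not the actual ganbare code. Constructing the Error via bail!() is where the backtrace gets captured:)

#[macro_use]
extern crate error_chain;

// Illustrative only: a bare error_chain! setup and a function that bails,
// roughly mirroring the linked try_auth_user path.
error_chain! {}

fn try_auth_user(password_ok: bool) -> Result<()> {
    if !password_ok {
        // If the backtrace feature is enabled and RUST_BACKTRACE is set,
        // the backtrace is captured here, at Error construction time.
        bail!("authentication failed");
    }
    Ok(())
}

fn main() {
    if let Err(e) = try_auth_user(false) {
        println!("error: {}", e);
    }
}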

@Yamakaky
Contributor

Yamakaky commented Feb 17, 2017

Could you try disabling the backtrace feature to see if that's the problem? Also, could you try using a profiler to see what's so slow?
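
(For reference, error-chain's backtrace capture sits behind a default backtrace cargo feature, so disabling it might look roughly like this in Cargo.toml – the version number is illustrative:)

[dependencies.error-chain]
version = "0.9"
default-features = false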

@golddranks
Contributor Author

Disabling the backtrace feature solves the problem.
It'd be interesting to profile this, but I've never used a profiler with Rust - yet. Let's see what I can do.

@golddranks
Contributor Author

golddranks commented Feb 21, 2017

I did some tests. First of all, I made a new test project. I was surprised by the... inconsistent behaviour I got.

At first, I repeatedly got times comparable to my first report – something in the range of 150–250 ms.

After running the tests multiple times, without any changes to my environment, I witnessed a sudden improvement: now creating the error took 15 ms.

After running the tests a bit more, I witnessed another change: the format of the backtrace changed from this:

Generating error!
Hello, world! Error backtrace: Some(stack backtrace:
   0:        0x10051b371 - backtrace::backtrace::trace::he6a49035fdfadc16
   1:        0x10051c7d4 - backtrace::capture::Backtrace::new::h8ca3ad60a3bf61a1
   2:        0x10051224c - error_chain::make_backtrace::hb4a3c519ab713709
   3:        0x100512307 - _$LT$error_chain..State$u20$as$u20$core..default..Default$GT$::default::h6349a4c6dd5f12f0
   4:        0x100510bd0 - error_perf::error::Error::from_kind::hd938a2456ebbcd5d
   5:        0x1005107cb - error_perf::test::h76fa5d3e6f062fbd
   6:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
   7:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
   8:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
   9:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  10:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  11:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  12:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  13:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  14:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  15:        0x10051081b - error_perf::test::h76fa5d3e6f062fbd
  16:        0x10051094e - error_perf::main::h7c38c82b380a6a55
  17:        0x10052c4ea - __rust_maybe_catch_panic
  18:        0x10052bcf6 - std::rt::lang_start::he28fbf70e98a2b18
  19:        0x100510cb9 - main), Time: 12

to this:

Generating error!
Hello, world! Error backtrace: Some(stack backtrace:
   0:        0x105919371 - backtrace::backtrace::trace<closure>
                        at /Users/drasa/.cargo/registry/src/gh.neting.cc-1ecc6299db9ec823/backtrace-0.3.0/src/backtrace/libunwind.rs:53
   1:        0x10591a7d4 - backtrace::capture::Backtrace::new::h8ca3ad60a3bf61a1
                        at /Users/drasa/repo/error_perf/target/debug/build/backtrace-17fa5380aeaf9437/out/capture.rs:79
   2:        0x10591024c - error_chain::make_backtrace::hb4a3c519ab713709
                        at /Users/drasa/.cargo/registry/src/gh.neting.cc-1ecc6299db9ec823/error-chain-0.9.0/src/lib.rs:413
   3:        0x105910307 - _$LT$error_chain..State$u20$as$u20$core..default..Default$GT$::default::h6349a4c6dd5f12f0
                        at /Users/drasa/.cargo/registry/src/gh.neting.cc-1ecc6299db9ec823/error-chain-0.9.0/src/lib.rs:494
   4:        0x10590ebd0 - error_perf::error::{{impl}}::from_kind
                        at /Users/drasa/repo/error_perf/<error_chain_processed macros>:49
   5:        0x10590e7cb - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:15
   6:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
   7:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
   8:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
   9:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  10:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  11:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  12:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  13:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  14:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  15:        0x10590e81b - error_perf::test
                        at /Users/drasa/repo/error_perf/src/main.rs:17
  16:        0x10590e94e - error_perf::main
                        at /Users/drasa/repo/error_perf/src/main.rs:25
  17:        0x10592a4ea - __rust_maybe_catch_panic
                        at /Users/travis/build/rust-lang/rust/src/libpanic_unwind/lib.rs:98
  18:        0x105929cf6 - std::rt::lang_start::he28fbf70e98a2b18
                        at /Users/travis/build/rust-lang/rust/src/libstd/panicking.rs:429
  19:        0x10590ecb9 - main), Time: 13

It changes back to the top format every time I make a change and recompile, and after a while it starts printing like the bottom one again.

This makes me think there is something very dynamic going on. I wonder which libraries the backtrace crate hooks into? I'm running this on macOS.
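
(For context, a minimal sketch of the kind of timing test described above – the function name, recursion depth, and output format are guesses at the test project, not its actual code:)

#[macro_use]
extern crate error_chain;

use std::time::Instant;

error_chain! {}

// Recurse a few frames so the captured backtrace has some depth.
fn test(depth: u32) -> Result<()> {
    if depth == 0 {
        println!("Generating error!");
        bail!("boom");
    }
    test(depth - 1)
}

fn main() {
    let start = Instant::now();
    let err = test(10).unwrap_err();                 // Error construction captures the backtrace
    let rendered = format!("{:?}", err.backtrace()); // ...which then gets Debug-formatted for printing
    let elapsed = start.elapsed();
    let ms = elapsed.as_secs() * 1000 + (elapsed.subsec_nanos() / 1_000_000) as u64;
    println!("Hello, world! Error backtrace: {}, Time: {}", rendered, ms);
}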

@Yamakaky
Contributor

Hmm, strange...

@golddranks
Contributor Author

On Linux, the backtraces behave consistently, always looking the same and taking 7–8 ms in debug and 5 ms in release. (This is virtualized, so the times between macOS and Linux may not be directly comparable.)

The backtrace crate uses something called CoreSymbolication on macOS, whereas on Linux it uses GCC's libbacktrace. Maybe it's just about how CoreSymbolication behaves...

@Yamakaky
Contributor

Oh, so it's only a problem on macOS? If so, try to find out whether the API used on macOS is known to be slow.

@mitsuhiko

mitsuhiko commented Feb 23, 2017

CoreSymbolication is slow when it renders backtraces. There is nothing you can do about that, because it uses Spotlight and some other things to find debug symbols.

@golddranks
Contributor Author

Well, that explains it. If lazy backtraces with deferred rendering help, that would be great. Anyway, at least this behaviour is now explained. I'm going to close this.
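
(For reference, one way deferred rendering could look with the backtrace crate, assuming a version that exposes Backtrace::new_unresolved and resolve – this is a sketch, not error-chain's actual implementation:)

extern crate backtrace;

use backtrace::Backtrace;

struct LazyBacktrace(Backtrace);

impl LazyBacktrace {
    // Walk the stack but skip symbolication, which is the slow part on
    // macOS (CoreSymbolication).
    fn capture() -> LazyBacktrace {
        LazyBacktrace(Backtrace::new_unresolved())
    }

    // Pay the symbolication cost only when the backtrace is actually printed.
    fn render(&mut self) -> String {
        self.0.resolve();
        format!("{:?}", self.0)
    }
}

fn main() {
    let mut bt = LazyBacktrace::capture(); // cheap
    println!("{}", bt.render());           // expensive, but only on demand
}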
