Significant slowdown since version 0.1.10 #96

Closed
martinstein opened this issue Feb 27, 2019 · 3 comments

martinstein commented Feb 27, 2019

When I was running py-spy 0.1.8, there was no noticeable slowdown of the program I was profiling (rate=700). In my case this was a web request in Pyramid that returns in 6 seconds.

After upgrading to py-spy 0.1.10 (still using rate=700), the performance of the profiled program drops significantly: the same web request now takes 18-20 seconds! When I use --nonblocking, performance returns to normal.

So it seems that between 0.1.8 and 0.1.10, a significant slowdown of the profiled program was introduced in the default blocking mode.

System: Windows
Python: 3.7

But anyway, thanks a lot for such a great tool! Py-spy has been immensely helpful.

benfred (Owner) commented Feb 28, 2019

Hmm - that doesn't seem ideal. When I tested this, I was seeing a much, much smaller slowdown - on the order of ~2%.

How many threads are in your target process? There is a potential performance optimization we could do here: instead of locking the whole process, it might be possible to lock just one thread at a time, which might help alleviate this.

Also - it seems like you should be able to drop your sampling rate down to have less of an impact.

martinstein (Author)

In my local dev setup where I ran the profiling, I used the waitress server, which is the default for Pyramid. I think waitress spawns 4 threads on startup by default (https://docs.pylonsproject.org/projects/waitress/en/stable/arguments.html), but I only sent 1 request, so only 1 thread was actively running.
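
For reference, a minimal sketch of that kind of setup (illustrative only - the real app is a full Pyramid project, and the route/view names here are made up):

from pyramid.config import Configurator
from pyramid.response import Response
from waitress import serve

def slow_view(request):
    # stand-in for the real endpoint that takes ~6 seconds
    return Response("done")

with Configurator() as config:
    config.add_route("slow", "/slow")
    config.add_view(slow_view, route_name="slow")
    app = config.make_wsgi_app()

# waitress defaults to 4 worker threads; only one is busy per request here
serve(app, host="127.0.0.1", port=6543, threads=4)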

I just ran a quick test where I reduced the rate from 700 to 100. The slowdown of the request went down from 300% to roughly 10% (much better, but still seems higher than previously).

martinstein (Author)

On a completely unrelated note, just a quick thought: Py-spy is amazing and it gives me the best results out of all the Python profilers that I've used so far. Right now, my profiling workflow for specific backend functionality is:

  1. Prepare the console command that executes py-spy (figure out the correct PID first), but don't run it yet!
  2. Fire off an HTTP request to my web application in Postman.
  3. Quickly switch to the console window and hit Enter to start py-spy.
  4. Wait until the request has returned in Postman, then quickly hit Ctrl-C to stop py-spy.

If py-spy also had a programmatic API so it could be used from Python directly, that would make this process way easier. Something along these lines, maybe (similar to cProfile and others):

from pyspy import Profiler

profiler = Profiler(rate=..., non_blocking=False)
profiler.start()

# Run whatever code you care about
...

profiler.stop()
profiler.write_results(format='flamegraph', output='my_profile.svg')
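
In the meantime, the closest workaround I can think of is shelling out to the py-spy CLI from Python itself. A rough sketch (assuming a py-spy version with the record subcommand and its --pid/--rate/--output/--nonblocking flags; 0.1.x used the top-level --flame flag instead, and the SIGINT-based stop below would need a different mechanism on Windows):

import os
import signal
import subprocess

class PySpyRecorder:
    """Rough stand-in for the proposed API: attach the py-spy CLI
    to the current process and stop it when the code of interest is done."""

    def __init__(self, rate=100, output="my_profile.svg", nonblocking=False):
        self.rate = rate
        self.output = output
        self.nonblocking = nonblocking
        self.proc = None

    def start(self):
        cmd = [
            "py-spy", "record",
            "--pid", str(os.getpid()),
            "--rate", str(self.rate),
            "--output", self.output,
        ]
        if self.nonblocking:
            cmd.append("--nonblocking")
        self.proc = subprocess.Popen(cmd)

    def stop(self):
        # py-spy writes the flamegraph when it receives Ctrl-C / SIGINT;
        # on Windows this would need a console control event instead (not shown).
        self.proc.send_signal(signal.SIGINT)
        self.proc.wait()

recorder = PySpyRecorder(rate=100)
recorder.start()

# Run whatever code you care about
...

recorder.stop()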
