
Kibana terribly slow on /apm/traces #176073

Closed
clyfish opened this issue Feb 1, 2024 · 1 comment · Fixed by #175177 or #178177
Labels
bug Fixes for quality problems that affect the customer experience Team:obs-ux-infra_services Observability Infrastructure & Services User Experience Team

Comments

@clyfish
Contributor

clyfish commented Feb 1, 2024

Kibana version: 7.17.3

Elasticsearch version: 7.17.3

Server OS version: CentOS 7

Browser version: Chrome 100+

Browser OS version: Windows 11

Original install method (e.g. download page, yum, from source, etc.): Helm

Describe the bug:
Quote from https://discuss.elastic.co/t/kibana-terribly-slow-on-apm-traces/312664

I found that Kibana's "Traces" page under APM is extremely slow to open when there are over 8,000 trace keys and the time range is set to "Last 7 days", with no other KQL query, e.g. on the first click.
From what I can see on the Kibana backend, one Node.js thread consumes 100% CPU the whole time that page is loading, but the Elasticsearch monitoring data looks fine. What's worse, all other requests are blocked until that thread returns to normal.
I took some CPU profiles and found that the root cause may be at "x-pack/plugins/apm/common/utils/join_by_key/index.ts#L61", where lodash's "isEqual" is called. This function performs badly here.
Maybe we could try some other tool?
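The quadratic pattern the quote describes can be sketched as follows. This is an illustrative assumption, not the actual Kibana source: `deepEqual` is a crude stand-in for lodash's `isEqual`, and all names are hypothetical.

```typescript
// Illustrative sketch (NOT the actual Kibana code) of the quadratic
// join pattern the CPU profile points at: every item scans all
// previously joined items and deep-compares their keys.

type Key = Record<string, unknown>;

// Crude stand-in for lodash's isEqual, so the sketch is self-contained.
function deepEqual(a: unknown, b: unknown): boolean {
  return JSON.stringify(a) === JSON.stringify(b);
}

interface Item {
  key: Key;
  [field: string]: unknown;
}

function joinByDeepEqual(items: Item[]): Item[] {
  const joined: Item[] = [];
  for (const item of items) {
    // An O(n) scan per item, over n items, gives O(n^2) deep
    // comparisons; with ~8,000 trace keys this can pin a Node.js
    // thread at 100% CPU.
    const existing = joined.find((j) => deepEqual(j.key, item.key));
    if (existing) {
      Object.assign(existing, item);
    } else {
      joined.push({ ...item });
    }
  }
  return joined;
}
```

Because Node.js runs this on the event loop, the long-running join also explains why all other requests are blocked until it finishes.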

Steps to reproduce:

  1. prepare some APM data, e.g. 20 services and 8,000 trace keys
  2. open kibana, navigate to "traces"
  3. boom!

Expected behavior:
Traces page can be opened normally.

Screenshots (if relevant):

Errors in browser console (if relevant):

Provide logs and/or server output (if relevant):

Any additional context:
I've created a PR to fix it.
#175177

@clyfish clyfish added the bug Fixes for quality problems that affect the customer experience label Feb 1, 2024
@botelastic botelastic bot added the needs-team Issues missing a team label label Feb 1, 2024
@jsanz jsanz added the Team:obs-ux-infra_services Observability Infrastructure & Services User Experience Team label Feb 2, 2024
@elasticmachine
Contributor

Pinging @elastic/obs-ux-infra_services-team (Team:obs-ux-infra_services)

@botelastic botelastic bot removed the needs-team Issues missing a team label label Feb 2, 2024
cauemarcondes added a commit that referenced this issue Mar 6, 2024
## Summary

Improve performance of join_by_key from O(n^2) to O(n).
closes #176073
Fixes https://discuss.elastic.co/t/kibana-terribly-slow-on-apm-traces/312664

---------

Co-authored-by: Cauê Marcondes <55978943+cauemarcondes@users.noreply.github.com>
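The O(n) approach named in the commit summary can be sketched like this; the function shape and field names are assumptions for illustration, not the actual Kibana implementation. The idea is to serialize each item's join-key fields into a string and bucket items in a Map, so each item is touched once instead of being compared pairwise.

```typescript
// Hypothetical sketch of an O(n) join_by_key: bucket items in a Map
// keyed on a stable serialization of the join-key fields, instead of
// pairwise deep-equality checks.
function joinByKey(
  items: Array<Record<string, any>>,
  keyFields: string[]
): Array<Record<string, any>> {
  const buckets = new Map<string, Record<string, any>>();
  for (const item of items) {
    // Serialize the key fields in a fixed order so equal keys always
    // produce the same string; each item is processed once => O(n).
    const hash = JSON.stringify(keyFields.map((f) => item[f]));
    const existing = buckets.get(hash);
    // Merge later items into the first item seen with the same key.
    buckets.set(hash, existing ? { ...existing, ...item } : item);
  }
  return Array.from(buckets.values());
}
```

For example, `joinByKey([{ serviceName: 'a', latency: 1 }, { serviceName: 'a', throughput: 2 }], ['serviceName'])` merges both entries into one object, and `Map` lookups replace the deep comparisons entirely.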
cauemarcondes added a commit that referenced this issue Mar 7, 2024
## Summary

Improve performance of join_by_key from O(n^2) to O(n).
closes #176073
Fixes https://discuss.elastic.co/t/kibana-terribly-slow-on-apm-traces/312664

@cauemarcondes 
After solving the unit-testing problem, the performance degraded back to O(n^2).
This PR fixes the performance problem; please review and test again.

---------

Co-authored-by: Cauê Marcondes <55978943+cauemarcondes@users.noreply.github.com>