Performance issue with a lot of rows (20k+) #1125
Comments
This might be related to #1053.
Please measure CPU/network usage too. My expectation is that you have the compression option enabled in your MySQL CLI.
Thanks both for your answers 😃 @julienschmidt, you say all results are transmitted, and that […]
If all results are "downloaded" as soon as the […] (Note that my example code is just for tests; under the hood I definitely want to use all 20k rows.) @methane, since requesting with the "raw" MySQL docker image also takes around […], I didn't specify any […]. I didn't see abnormal CPU/memory consumption in the metrics charts, but I will dig into it with […].
Just to follow up, and sorry for the late response, I was working on other things. In the meantime I upgraded the CPU/memory of the database, and also of the "client" service, which made it more suitable for a production product. I can't say for certain that it's solved on my side; that would need more testing. But since our use case is multiple frontend charts, we do have a solution: we're going to apply the Douglas-Peucker algorithm to remove meaningless chart points in the database (it keeps extremes), while keeping the full data points in another dedicated database better suited for this. Thank you for your answers. I'm marking this as closed since the problem probably does not come from your library.
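For reference, the Douglas-Peucker downsampling mentioned above can be sketched in a few lines of Go. This is a generic illustration, not code from this project or from the driver; `Point`, `simplify`, and the `eps` tolerance are made-up names:

```go
package main

import (
	"fmt"
	"math"
)

// Point is a single chart sample.
type Point struct{ X, Y float64 }

// perpDist returns the perpendicular distance from p to the line through a and b.
func perpDist(p, a, b Point) float64 {
	dx, dy := b.X-a.X, b.Y-a.Y
	if dx == 0 && dy == 0 {
		return math.Hypot(p.X-a.X, p.Y-a.Y)
	}
	return math.Abs(dy*p.X-dx*p.Y+b.X*a.Y-b.Y*a.X) / math.Hypot(dx, dy)
}

// simplify applies Ramer-Douglas-Peucker: points farther than eps from the
// chord between the endpoints are kept, so extremes survive the reduction.
func simplify(pts []Point, eps float64) []Point {
	if len(pts) < 3 {
		return pts
	}
	// Find the point with the maximum distance from the end-to-end chord.
	idx, maxD := 0, 0.0
	for i := 1; i < len(pts)-1; i++ {
		if d := perpDist(pts[i], pts[0], pts[len(pts)-1]); d > maxD {
			idx, maxD = i, d
		}
	}
	if maxD <= eps {
		// Everything is close to the chord: keep only the endpoints.
		return []Point{pts[0], pts[len(pts)-1]}
	}
	left := simplify(pts[:idx+1], eps)
	right := simplify(pts[idx:], eps)
	return append(left[:len(left)-1], right...)
}

func main() {
	pts := []Point{{0, 0}, {1, 0.1}, {2, -0.1}, {3, 5}, {4, 6}, {5, 7}, {6, 8.1}, {7, 9}, {8, 9}, {9, 9}}
	out := simplify(pts, 1.0)
	fmt.Println(len(out), "points kept")
}
```

Running it on each chart series before insertion, `eps` controls how aggressively near-collinear points are dropped, while peaks and the two endpoints are always preserved.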
Issue description
Hi,
I know the issue section is not for free support; I'm presenting my problem because I think I might be missing a concept when using the driver, and I'm interested in your thoughts (I tried some Slack workspaces but without much success).
Context: I need to query 20k+ rows from my database to draw multiple charts, but doing it with this library takes around 6 seconds to get all the results (while a raw MySQL client takes only 80 ms).
I ran the different tests listed below; if you have an idea of what could improve performance, I would appreciate it.
Note that when I mention the "cluster" below, I mean where the client is hosted; the DB server is hosted on Google Cloud SQL. Both are in the same region/zone, so network latency "should not be" part of the problem. I also made sure the client program has enough CPU/memory (same for the database).
Test 1
As mentioned above:
Duration:
6 seconds
Test 2
Duration:
0.08 second
Test 3
Duration:
0.4 second
Test 4
The weird thing compared to the Go test above is that if I do:
Duration:
0.08 second
In this case, without iterating with .Next(), it matches the raw MySQL client query duration from Test 2.
Test 5
Duration:
6 seconds
In this case, using .Exec() should not return any result, yet it takes the same time as iterating over all the rows with .Next().
"Conclusion"
I'm a bit lost: I can't figure out what could cause such a difference between the raw MySQL client and Go, nor the difference between .Query() and .Exec().
(I have only 9 columns; if I reduce the SELECT to just 1 column, the duration does indeed drop.)
If you have any advice, I would really appreciate it 👍
Thank you,
Configuration
Driver version (or git SHA): v1.5.0
Go version: go version go1.14.3 darwin/amd64
Server version: MySQL 5.7
Server OS: (hosted by Google Cloud SQL)