Not enough resources for query planning - BigQuery connection breaks with too many tests #232
Comments
We're running into this as well.
Same here. Is there any fix coming?
We also have this issue, with the upload_models step as well. If anyone knows a workaround besides skipping this step, I'll take it 😀
++
Hi @alberto-maurel. Thanks for raising this, and I'm really sorry for how long it has taken for someone to get back to you. We will look into this and get a fix in place. Thank you for your suggested approach - this is really helpful and will definitely speed up a resolution 🤞
Thanks for your patience with this. We have just released 2.4.0, which should include a fix for this situation. Please let us know if it hasn't fixed it, and feel free to reopen the issue.
Hi,
We are facing the following problem when running
dbt_artifacts.upload_results(results)
in the on-run-end hook. When checking that Job ID, the query executed is the following one:
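(The actual query was attached to the original report and is not reproduced here. As a rough illustration only - the table and column names below are invented, not the package's real schema - the generated statement stacks one SELECT per test result with UNION ALL, so 1050+ tests means 1050+ branches for BigQuery to plan:)

```sql
-- Illustrative sketch, NOT the exact SQL dbt_artifacts generates:
-- every test result becomes its own SELECT, all unioned together.
insert into `my-project.dbt_artifacts.test_executions`
select 'test_not_null_orders_id' as node_id, 'pass' as status, 0.5 as execution_time
union all
select 'test_unique_orders_id', 'fail', 1.2
union all
select 'test_accepted_values_status', 'pass', 0.8
-- ... one UNION ALL branch per test, 1050+ in our case ...
```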
As we have 1050+ tests, BigQuery is not able to process the query and spits out that error. I've performed some manual tests, and it seems to start complaining somewhere between 850 and 900 tests.
Would it be possible to modify the way the data is inserted into the tables in BigQuery to a more efficient one? As an alternative, I've thought about substituting this for something like the following:
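(The proposed query was also an attachment. As an illustrative sketch with the same invented names as above, the idea is a single VALUES list instead of stacked UNION ALLs, which gives the query planner one simple row constructor rather than hundreds of subqueries to resolve:)

```sql
-- Illustrative sketch of the proposed alternative: one INSERT ... VALUES
-- with one row per test, instead of one SELECT ... UNION ALL per test.
insert into `my-project.dbt_artifacts.test_executions` (node_id, status, execution_time)
values
  ('test_not_null_orders_id', 'pass', 0.5),
  ('test_unique_orders_id', 'fail', 1.2),
  ('test_accepted_values_status', 'pass', 0.8)
  -- ... one row per test ...
```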
This second approach handles the 1050 tests seamlessly and, as a side effect, runs 7 times faster (current on the left, proposed on the right):
Thanks a lot!