Listing dbt models... never completes with high model count #58
Comments
Under the hood it executes `dbt ls`, and it should wait until that run completes. You can disable this behavior in the settings (see #38); even so, the listing should still complete, as there is no limit set in the extension. It could of course be some limit in VS Code that I'm not aware of. Can you include the logs? Maybe I can deduce something else from them.
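As a quick check, the listing can be timed outside VS Code to see whether `dbt ls` itself finishes on a large project. A minimal sketch, assuming dbt is on the PATH and the script is run from the dbt project root; the 600-second timeout is an arbitrary illustrative choice:

```python
import subprocess
import time

# Time `dbt ls` outside VS Code to check whether the listing itself
# completes on a large project. Run this from the dbt project root.
start = time.monotonic()
result = subprocess.run(
    ["dbt", "ls"],
    capture_output=True,
    text=True,
    timeout=600,  # give up after ten minutes rather than spinning forever
)
print(f"dbt ls exited with code {result.returncode} "
      f"after {time.monotonic() - start:.1f}s")
print(f"lines of output (roughly one per model): "
      f"{len(result.stdout.splitlines())}")
```

If this completes in a reasonable time, the bottleneck is more likely in how the extension schedules or consumes the listing than in dbt itself.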
Thanks for the response. I've included the VS Code developer logs as requested. I am able to …
OK, I see. What actually happens is that every time you change a file and save it, the extension schedules a dbt list. For most users this takes a few seconds at most, but not for you. You should disable this behavior as mentioned in #38.
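For illustration, the save-triggered scheduling described above can be sketched as a debounce. This is a hypothetical Python rendering, not the extension's actual code (the extension is written in TypeScript); the class and method names are invented:

```python
import subprocess
import threading
from typing import Optional

# Conceptual sketch only: each file save schedules a fresh model
# listing, debounced so that rapid saves collapse into one `dbt ls` run.
class ModelListScheduler:
    def __init__(self, delay_seconds: float = 2.0) -> None:
        self.delay_seconds = delay_seconds
        self._timer: Optional[threading.Timer] = None

    def on_file_saved(self) -> None:
        # Cancel any pending listing and restart the delay window.
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.delay_seconds, self._list_models)
        self._timer.start()

    def _list_models(self) -> None:
        # On a 1,100+ model project this call alone can take a long time,
        # which surfaces as the "Listing dbt models..." spinner.
        subprocess.run(["dbt", "ls"], capture_output=True, text=True)
```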
Now every project uses a single Python process, so I would expect this issue to be resolved. If not, please reopen.
I love this extension, but I've found a flaw. My dbt project has 1,100+ models. Once I crossed some model-count threshold, the part of the extension that lists (discovers) models started to hang. As a result, the extension doesn't register newly added models (those above the threshold?), and the parent/child DAG relationship features don't work for these newer models.
On the VS Code status bar it shows "Listing dbt models..." with an infinitely spinning wait cursor; the operation never completes. I've looked for a config setting to control this, but there isn't one.
Is this a known bug? Is there a solution or a work-around?
Thanks. Again, this is a really helpful extension!
Running:
- Windows 10
- VS Code 1.54.3
- dbt 0.19.0
- Python 3.8.8 x64
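One way to test whether the hang really scales with model count is to generate throwaway models and time the listing. A hedged repro sketch, assuming it is run from the root of an existing dbt project; the `models/generated` path, file names, and count of 1,200 are all illustrative:

```python
import pathlib
import subprocess
import time

# Hypothetical repro: generate many trivial models inside an existing
# dbt project, then time `dbt ls` to see how listing scales with count.
models_dir = pathlib.Path("models/generated")  # illustrative path
models_dir.mkdir(parents=True, exist_ok=True)

for i in range(1200):  # above the ~1,100 models reported here
    (models_dir / f"gen_model_{i}.sql").write_text("select 1 as id\n")

start = time.monotonic()
subprocess.run(["dbt", "ls"], capture_output=True, text=True, check=True)
print(f"dbt ls over ~1,200 extra models took {time.monotonic() - start:.1f}s")
```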