fix(ollama): resolve model list loading issue and add Pytest for component testing #3575
Conversation
The issue was that the models endpoint URL (`api/tags`) was not formed correctly: it contained a double slash (`//`), so `urlencode` was used to build it properly. The corrected approach works only if the `base_url` is a valid Ollama URL; for DS LF this must be a public Ollama server URL.
Changed `get_model` to take the base URL; the function now builds the expected model-list URL itself. This makes the function cleaner than providing the model URL as a parameter. Added Pytest with 7 tests; 1 test (`test_build_model_failure`) is excluded for future implementation. `make lint` and `make format` touched multiple files.
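The double-slash problem described above comes from naively concatenating a base URL that ends in `/` with an endpoint path. A minimal sketch of URL joining that avoids it (the helper name is hypothetical, not from the PR):

```python
from urllib.parse import urljoin

def build_tags_url(base_url: str) -> str:
    # Normalize the trailing slash first, then join: this prevents
    # "http://host//api/tags" whether or not base_url ends with "/".
    return urljoin(base_url.rstrip("/") + "/", "api/tags")

print(build_tags_url("http://localhost:11434/"))  # http://localhost:11434/api/tags
print(build_tags_url("http://localhost:11434"))   # http://localhost:11434/api/tags
```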
Pull Request Validation Report (automatically generated by Conventional PR)
Whitelist Report — Result: Pull request does not satisfy any enabled whitelist criteria. Pull request will be validated.
Validation Report — Result: Pull request is invalid.
Last modified at 27 Aug 24 15:11 UTC
This pull request is automatically being deployed by Amplify Hosting.
Thanks for this fix! Could you provide a more descriptive PR title that follows Conventional Commits?
Title changed to: fix(ollama): resolve model list loading issue and add Pytest for component testing
```diff
@@ -55,8 +55,9 @@ def update_build_config(self, build_config: dict, field_value: Any, field_name:

         return build_config

-    def get_model(self, url: str) -> list[str]:
+    def get_model(self, base_url_value: str) -> list[str]:
```
nit: could just call this base_url
I changed it so that the models endpoint URL is formed inside the function.
Earlier, the input to the function was the URL of the tags/models endpoint, which is OK, but this way it is clearer that the models are loaded from the base URL, and the endpoint URL is formed without error inside `get_model`. Should I revert to the previous method and build the model endpoint URL before calling the function?
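The design discussed here — `get_model` receiving only the base URL and forming the endpoint internally — could look roughly like this sketch (the function and helper names, and the assumption that Ollama's `/api/tags` response has a `{"models": [{"name": ...}]}` shape, are illustrative, not the actual Langflow component code):

```python
import json
from urllib.parse import urljoin
from urllib.request import urlopen

def build_tags_url(base_url_value: str) -> str:
    # Form the endpoint inside the function so callers only supply the base URL.
    return urljoin(base_url_value.rstrip("/") + "/", "api/tags")

def parse_model_names(payload: dict) -> list[str]:
    # Assumed response shape: {"models": [{"name": "llama3"}, ...]}
    return [m["name"] for m in payload.get("models", [])]

def get_model(base_url_value: str) -> list[str]:
    # Fetch and parse the model list from a running Ollama server.
    with urlopen(build_tags_url(base_url_value), timeout=10) as resp:
        return parse_model_names(json.load(resp))
```

Keeping URL construction inside `get_model` means a caller can never pass a malformed endpoint URL; the only input that can be wrong is the base URL itself.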
Nah, that's fine
Removed unwanted print statements. `make format` also reformatted a lot of .tsx files.
fix(ollama): resolve model list loading issue and add Pytest for component testing (langflow-ai#3575)
* Commit to solve model-not-loading issue: the models endpoint URL (`api/tags`) was not formed correctly; it contained a double slash (`//`), so `urlencode` was used to build it properly. The corrected approach works only if the `base_url` is a valid Ollama URL; for DS LF this must be a public Ollama server URL.
* Updated the Ollama component: changed `get_model` to take the base URL; the function now builds the expected model-list URL itself, which is cleaner than providing the model URL as a parameter. Added Pytest with 7 tests; 1 test (`test_build_model_failure`) is excluded for future implementation. `make lint` and `make format` touched multiple files.
* Removed unwanted print statements; `make format` also reformatted a lot of .tsx files.
* Removed skipped tests
* [autofix.ci] apply automated fixes
* [autofix.ci] apply automated fixes (attempt 2/3)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
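The component tests mentioned in the commit log could be sketched as a small pytest module like the following (test names and the `build_tags_url` helper are assumptions for illustration, not the actual test file from the PR):

```python
from urllib.parse import urljoin

def build_tags_url(base_url: str) -> str:
    # Same URL-joining approach as in the fix: no double slash either way.
    return urljoin(base_url.rstrip("/") + "/", "api/tags")

def test_tags_url_has_no_double_slash():
    assert "//api" not in build_tags_url("http://localhost:11434/")

def test_tags_url_with_and_without_trailing_slash():
    expected = "http://localhost:11434/api/tags"
    assert build_tags_url("http://localhost:11434") == expected
    assert build_tags_url("http://localhost:11434/") == expected
```

Running `pytest` on such a module exercises the URL formation without needing a live Ollama server.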
Fixes #2885
@jordanrfrazier requesting a review