feat: add LLM output for the AI ML component to use ChatOpenAI base model #3230
Conversation
…del from langchain
LGTM
name="stop_tokens", | ||
display_name="Stop Tokens", | ||
info="Comma-separated list of tokens to signal the model to stop generating text.", | ||
DictInput(name="model_kwargs", display_name="Model Kwargs", advanced=True), |
Hi @vasconceloscezar, thanks for the improvements. Sorry, I was out of town last week or would have taken a look then.
You may be more familiar with the AI/ML API than I am, so I just wanted to ask some clarifying questions here.
IIUC, the AI/ML API only supports certain parameters that are common to all supported models (https://docs.aimlapi.com/api-reference/parameters). Does allowing `model_kwargs` here give the user the false(?) impression that model-specific args are allowed?
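For context, a minimal sketch of the pass-through behavior in question (base URL, key, and model id below are placeholders): langchain's `ChatOpenAI` merges `model_kwargs` into the request body verbatim, so anything a user enters here is sent to the AI/ML endpoint whether or not that endpoint documents it.

```python
from langchain_openai import ChatOpenAI

# Sketch: model_kwargs are merged into the /chat/completions request body
# as-is, so a model-specific argument reaches the AI/ML API unchecked.
model = ChatOpenAI(
    base_url="https://api.aimlapi.com",  # AI/ML API base
    api_key="sk-...",  # placeholder key
    model="gpt-4o",  # placeholder model id
    model_kwargs={"logit_bias": {"50256": -100}},  # OpenAI-specific arg
)
# If the endpoint rejects logit_bias for this model, the user only finds
# out at request time.
```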
info="Comma-separated list of tokens to signal the model to stop generating text.", | ||
DictInput(name="model_kwargs", display_name="Model Kwargs", advanced=True), | ||
BoolInput( | ||
name="json_mode", |
I see this error using this change out of the box:
/Users/jordan.frazier/Documents/Langflow/langflow/.venv/lib/python3.11/site-packages/langchain_core/utils/utils.py:234: UserWarning: WARNING! json_mode is not default parameter.
json_mode was transferred to model_kwargs.
Please confirm that json_mode is what you intended.
warnings.warn(
[08/12/24 09:44:42] ERROR 2024-08-12 09:44:42 - ERROR - base - Completions.create() got an unexpected keyword argument 'json_mode' base.py:687
- I believe `json_mode` is set differently using `ChatOpenAI`.
- Did you test this and it worked for you? Just want to make sure I didn't accidentally change anything prior to testing.
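For reference, a sketch of how JSON mode is usually enabled with `ChatOpenAI`: it is not a constructor kwarg (hence the warning above, where it falls through to `model_kwargs`) but a request option bound onto the model. Base URL, key, and model id are placeholders.

```python
from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    base_url="https://api.aimlapi.com",  # placeholder base
    api_key="sk-...",  # placeholder key
    model="gpt-4o",  # placeholder model id
)
# JSON mode is a request-level option on the OpenAI chat completions API,
# enabled by binding response_format rather than passing json_mode=...
json_model = model.bind(response_format={"type": "json_object"})
# json_model.invoke("Return a JSON object with a 'greeting' key.")
```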
advanced=True,
info="The base URL of the OpenAI API. Defaults to https://api.aimlapi.com. You can change this to use other APIs like JinaChat, LocalAI, and Prem.",
I believe the intent of this component was to only hit the AI/ML API, so would lean towards removing this parameter. If users want to use a different base, we can add another component for that or use the generic OpenAI component. (I'm glad the `base_url` here works with the aiml base - I tried with the `OpenAIEmbeddings` class and had issues.)
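If the parameter were removed, the component could pin the base internally - a minimal sketch, assuming the rest of the build method stays as in this PR:

```python
from langchain_openai import ChatOpenAI

# Sketch: drop the user-facing base_url input and hardcode the endpoint so
# the component always targets the AI/ML API.
AIML_API_BASE = "https://api.aimlapi.com"

model = ChatOpenAI(
    base_url=AIML_API_BASE,
    api_key="sk-...",  # placeholder; would come from the api_key input
    model="gpt-4o",  # placeholder model id
)
```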
SecretStrInput(
    name="api_key",
    display_name="AIML API Key",
    info="The AIML API Key to use for the OpenAI model.",
Shouldn't be specific to the OpenAI model
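One possible rewording, mirroring the hunk above (the exact phrasing is just a suggestion):

```python
SecretStrInput(
    name="api_key",
    display_name="AIML API Key",
    info="The AIML API Key to use for the AIML model.",
),
```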
max_tokens = self.max_tokens
model_kwargs = self.model_kwargs or {}
aiml_api_base = self.aiml_api_base or "https://api.aimlapi.com"
json_mode = bool(output_schema_dict) or self.json_mode
Seems like `output_schema` exists only to determine whether to set `json_mode`. I do have the concern that even if fixed, the AI/ML API wouldn't support this (or would only support it for specific models, which may be confusing for users).
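If structured output is the real goal, binding the schema directly may be clearer than using `output_schema` as a `json_mode` toggle - a sketch, assuming the AI/ML endpoint honors OpenAI-style `response_format` (which, per the concern above, may hold only for some models); the schema is hypothetical:

```python
from langchain_openai import ChatOpenAI
from pydantic import BaseModel

class Answer(BaseModel):
    # hypothetical schema for illustration
    summary: str
    confidence: float

model = ChatOpenAI(
    base_url="https://api.aimlapi.com",  # placeholder base
    api_key="sk-...",  # placeholder key
    model="gpt-4o",  # placeholder model id
)
# method="json_mode" requests a JSON object from the endpoint and parses it
# into the schema; whether a given AI/ML model supports this is the open
# question raised above.
structured = model.with_structured_output(Answer, method="json_mode")
```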
else:
    raise TypeError(f"Expected Message type, saw: {type(self.input_value)}")
if isinstance(aiml_api_key, SecretStr):
    openai_api_key = aiml_api_key.get_secret_value()
nit: var name should stay `aiml_api_key`
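i.e., keeping the original name rather than introducing `openai_api_key` - a sketch of the suggested rename:

```python
if isinstance(aiml_api_key, SecretStr):
    aiml_api_key = aiml_api_key.get_secret_value()
```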
base_url=aiml_api_base,
max_tokens=max_tokens or None,
seed=seed,
json_mode=json_mode,
as noted - `json_mode` isn't an explicit parameter to `ChatOpenAI`
"repeat_penalty": self.repeat_penalty or 1.1, | ||
"stop_tokens": self.stop_tokens or None, | ||
} | ||
model = ChatOpenAI( |
Was curious about this usage - does using `OpenAI`'s implementation have any compatibility issues with other models?
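For what it's worth, a quick hedged smoke test of that question: `ChatOpenAI` only assumes an OpenAI-compatible /chat/completions wire format and passes the model name through opaquely, so a non-OpenAI model id should work against the AI/ML base (the model id below is hypothetical).

```python
from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    base_url="https://api.aimlapi.com",  # AI/ML API base
    api_key="sk-...",  # placeholder key
    model="meta-llama/Llama-3-70b-chat-hf",  # hypothetical model id
)
# Incompatibilities with a given model would surface here as HTTP errors
# from the provider rather than client-side failures.
print(model.invoke("ping").content)
```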
The current AI ML component only outputs `Text`, while all other AI model components output both `Text` and `Language Model`.
With this change, we can use models from AI ML in `Agents` and other components that take a `Language Model` as input.
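For illustration, exposing both outputs follows the pattern the other model components use - a sketch; the import path and method names are assumptions, not the exact code in this PR:

```python
from langflow.io import Output  # import path assumed

# Sketch: two outputs, matching the other AI model components, so the
# component can feed Agents as well as plain-text consumers.
outputs = [
    Output(display_name="Text", name="text_output", method="text_response"),
    Output(display_name="Language Model", name="model_output", method="build_model"),
]
```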