support nested dicts #26414
Conversation
    k: _update_token_usage(overall_token_usage.get(k, 0), v)
    for k, v in new_usage.items()
I would suggest inspecting the value to ensure it's an integer. The OpenAI API may add additional fields within `usage` that are not integers in the future.
I believe this actually handles what you're asking for! It recursively calls the same function, and the addition only happens if the value is an int; otherwise it hits the warning `else` case at the bottom.
Agreed with @efriis. The best way to handle this is by programmatically checking the instance types for `int` and `dict`; the `else` case will then take care of warnings.
Makes sense, thanks! I don't think you need to warn in the `else` condition, though: we may add non-integer fields in `usage` in the future, and that's not a breaking change or unexpected.
Released in langchain-openai==0.1.24. The community release will wait until tomorrow: GitHub is having some issues with Actions, and there are some dependency failures in experimental that we'll debug before releasing.
Needed for thinking tokens. Co-authored-by: Erick Friis <erick@langchain.dev>