[ML] add support for question_answering NLP tasks #457
Conversation
if isinstance(response, tuple):
    return torch.stack(list(response), dim=0)
Our internal BERT inference logic takes only the first element of the returned value when it is a tuple. That doesn't work for question-answering models, whose output is a tuple of the start- and end-token scores.
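A minimal sketch of the shape problem being described. The names here (`fake_qa_forward`, `normalize`) are illustrative stand-ins, not eland's API; plain lists stand in for tensors so the example runs without torch. The real code stacks the tuple with `torch.stack(list(response), dim=0)` so downstream consumers see a single tensor instead of a tuple.

```python
def fake_qa_forward(num_tokens):
    # Question-answering models emit two score vectors per input:
    # start-token scores and end-token scores, returned as a tuple.
    start_scores = [0.1 * i for i in range(num_tokens)]
    end_scores = [0.2 * i for i in range(num_tokens)]
    return (start_scores, end_scores)

def normalize(response):
    # Mirrors the handling in the diff: if the traced model returned a
    # tuple, combine its members along a new leading axis (the real code
    # does torch.stack(list(response), dim=0)); otherwise pass through.
    if isinstance(response, tuple):
        return [list(part) for part in response]
    return response

result = normalize(fake_qa_forward(4))
# result[0] holds the start-token scores, result[1] the end-token scores,
# so taking only the first element would silently drop the end scores.
```

Taking `response[0]` here, as the pre-existing BERT path effectively did, would discard the end-token scores entirely, which is why the tuple case needs its own branch.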
LGTM
@benwtrent Looks like there's a merge conflict; can you fix this and then I'll merge?
I got it, @sethmlarson. I don't want to merge until the ES side is all approved and merged. Thanks!
Adds support for question_answering NLP models within the PyTorch model uploader.

Related: elastic/elasticsearch#85958
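A hedged sketch of how the uploader would be invoked once this lands, using eland's `eland_import_hub_model` CLI. The model id shown is just an example of a Hugging Face extractive-QA checkpoint, and the URL/flags assume a locally reachable Elasticsearch cluster; treat the exact flag set as an assumption, not a definitive invocation.

```shell
# Upload a Hugging Face question-answering model into Elasticsearch
# (example model id and cluster URL; adjust for your environment).
eland_import_hub_model \
  --url http://localhost:9200 \
  --hub-model-id distilbert-base-cased-distilled-squad \
  --task-type question_answering \
  --start
```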