Confidence scores for each word? #284

Answered by jianfch
evanbrociner asked this question in Q&A
You can add something like this to the update() method of the decoder:

token_logits = torch.stack([logits[k, next_tokens[k]] for k in range(next_tokens.shape[0])], dim=0)
# or use logprobs, the log softmax of the logits
# return it along with tokens and completed

This works as-is for the GreedyDecoder, but you will probably need to do a bit more for the BeamSearchDecoder.
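To make the idea above concrete, here is a minimal, self-contained sketch of a greedy decoding step that also returns the log-probability of each chosen token; exponentiating those gives a per-token confidence in [0, 1]. The function name and shapes here are illustrative, not Whisper's actual API: a real integration would live inside the decoder's update() method and return the values alongside tokens and completed, as the answer suggests.

```python
import torch
import torch.nn.functional as F


def greedy_step_with_logprobs(logits: torch.Tensor):
    """logits: (batch, vocab) scores for the next token.

    Returns (next_tokens, token_logprobs): the greedy token choice per batch
    element, and the log-probability assigned to that choice.
    """
    next_tokens = logits.argmax(dim=-1)               # greedy token choice
    logprobs = F.log_softmax(logits.float(), dim=-1)  # normalize to log-probs
    # Pick out the log-prob of each chosen token; this is a vectorized form
    # of the torch.stack([...]) comprehension shown in the answer above.
    token_logprobs = logprobs[torch.arange(logits.shape[0]), next_tokens]
    return next_tokens, token_logprobs


# usage: exp() turns log-probabilities into confidence scores in [0, 1]
logits = torch.tensor([[2.0, 0.5, 0.1], [0.1, 3.0, 0.2]])
tokens, lp = greedy_step_with_logprobs(logits)
confidences = lp.exp()
```

Note these are token-level confidences; mapping them to word-level scores additionally requires grouping the tokens that make up each word.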

Replies: 7 comments 20 replies

Answer selected by jongwook