Replies: 1 comment
You can take a look at the gpt2 model tracing script in #2509's PR description. This should solve the problem.
-
How can I pass keyword arguments to `torch.jit.trace_module(model, ...)`? I can pass positional arguments in the same way that the documentation and examples demonstrate.
For example, my model calls `generate(inputs, options, ..., **kwargs)` instead of `forward`. `inputs` is a tensor of token ids obtained from `Embedding`. There are some keyword arguments, such as `max_length` and `temperature`, which significantly improve inference performance. However, I couldn't find a way to pass keyword arguments to my model via `jit.trace_module`.
I tried the following, but it won't work.
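(The snippet that was tried is not shown on this page. As a rough sketch of one common workaround, and not the gpt2 tracing script referenced in the reply above: since `torch.jit.trace_module` only feeds positional tensor inputs to the traced method, keyword arguments such as `max_length` and `temperature` can be fixed as attributes of a small wrapper module whose `forward` calls `generate`. The `GenerateWrapper` name, the default values, and the assumption that the model exposes a `generate` method are all illustrative.)

```python
import torch
from torch import nn


class GenerateWrapper(nn.Module):
    """Hypothetical wrapper: exposes `generate` through `forward` so it can be traced."""

    def __init__(self, model, max_length: int = 50, temperature: float = 1.0):
        super().__init__()
        self.model = model
        # The keyword arguments are stored on the wrapper; they become
        # constants of the traced graph rather than runtime inputs.
        self.max_length = max_length
        self.temperature = temperature

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # Only tensors cross the tracing boundary; the kwargs are fixed here.
        return self.model.generate(
            input_ids,
            max_length=self.max_length,
            temperature=self.temperature,
        )


# Usage sketch (example_input_ids is a hypothetical tensor of token ids):
# wrapper = GenerateWrapper(model, max_length=128, temperature=0.7)
# traced = torch.jit.trace_module(wrapper, {"forward": (example_input_ids,)})
```

Note that values baked in this way are frozen into the trace; changing them afterwards requires re-tracing with a new wrapper.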