
adding inference_mode() causes "Inference tensors do not track version counter" #1210

Open
Slyne opened this issue Jun 12, 2023 · 2 comments

Comments

@Slyne

Slyne commented Jun 12, 2023

Hi K2 experts,

I'm trying to add the k2 fast beam decoder to my acoustic model. It works great when torch.inference_mode() is set to False, but it raises the error below when I set it to True.

File "/home/slyne/projects/zoom/zoom_asr_transducer/src/k2_beam_search.py", line 152, in fast_beam_search
    lattice = decoding_streams.format_output(encoder_out_lens.tolist())
  File "/home/slyne/anaconda3/envs/zoom_asr/lib/python3.8/site-packages/k2/rnnt_decode.py", line 186, in format_output
    fsa = Fsa(ragged_arcs)
  File "/home/slyne/anaconda3/envs/zoom_asr/lib/python3.8/site-packages/k2/fsa.py", line 230, in __init__
    self.labels_version = self._tensor_attr["labels"]._version
RuntimeError: Inference tensors do not track version counter.

This flag matters in my use case because it significantly speeds up acoustic model inference. Does this flag have to be set to False when using k2?

Any suggestion is appreciated! Thanks!
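The behavior is easy to reproduce outside of k2. The sketch below (a minimal illustration, not k2 code) shows that tensors created under torch.inference_mode() refuse to expose the private _version attribute that k2's Fsa constructor reads:

```python
import torch

# Tensors created inside inference_mode() are "inference tensors":
# PyTorch skips version-counter bookkeeping for them, so reading
# the private _version attribute raises a RuntimeError.
with torch.inference_mode():
    t = torch.ones(3)

print(t.is_inference())  # True

try:
    _ = t._version
except RuntimeError as e:
    print(e)  # Inference tensors do not track version counter.
```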

@csukuangfj
Collaborator

From https://pytorch.org/docs/stable/generated/torch.inference_mode.html

Code run under this mode gets better performance by disabling view tracking and version counter bumps

Looks like we need an alternative to reading _version, since it is unavailable in inference mode.

I suggest that you use torch.no_grad() as a workaround for now.
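For comparison, the same attribute access works under torch.no_grad(), which disables gradient tracking but keeps the version counter. A minimal sketch (not k2 code) of why the suggested workaround avoids the error:

```python
import torch

x = torch.randn(2, 3)

# no_grad() disables autograd recording but, unlike inference_mode(),
# keeps version-counter bookkeeping, so _version stays accessible.
with torch.no_grad():
    y = x * 2

print(y.is_inference())  # False
print(y._version)        # 0

y.add_(1.0)              # an in-place op bumps the version counter
print(y._version)        # 1
```

This is exactly the bookkeeping inference_mode() skips for performance, which is why k2's check on _version fails there but succeeds under no_grad().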

@Slyne
Author

Slyne commented Jun 13, 2023

> I suggest that you use torch.no_grad() as a workaround for now.

Sure.
