absolute positional encoding [abs_pos] #8457
poyupaulchen started this conversation in General
Hi all,
First, I'd like to thank the team for providing such a great tool.
Currently I would like to run some experiments on positional encoding for the Conformer. From what I can see in the code (see below), if I configure absolute positional encoding using

self_attention_model: abs_pos

it will call 'MultiHeadAttention':
NeMo/nemo/collections/asr/parts/submodules/conformer_modules.py, lines 124 to 127 at commit df5a395
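As I read it, the dispatch at those lines does roughly the following. This is a paraphrased sketch, not the verbatim code at df5a395; 'build_self_attn' is a hypothetical helper of mine that just summarizes the branch, and the constructor arguments are abbreviated:

```python
# Paraphrased sketch of the attention-module dispatch in ConformerLayer
# (not the verbatim lines at df5a395); 'build_self_attn' is my own helper.
from nemo.collections.asr.parts.submodules.multi_head_attention import (
    MultiHeadAttention,
    RelPositionMultiHeadAttention,
)

def build_self_attn(self_attention_model, n_heads, d_model, dropout_att):
    if self_attention_model == 'rel_pos':
        # Transformer-XL style attention whose forward() consumes pos_emb.
        return RelPositionMultiHeadAttention(
            n_head=n_heads, n_feat=d_model, dropout_rate=dropout_att,
            pos_bias_u=None, pos_bias_v=None,  # None -> the class creates its own biases, as I understand it
        )
    if self_attention_model == 'abs_pos':
        # Plain scaled dot-product attention.
        return MultiHeadAttention(n_head=n_heads, n_feat=d_model, dropout_rate=dropout_att)
    raise ValueError(f"unknown self_attention_model: {self_attention_model!r}")
```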
And it seems that 'pos_emb' is never applied in the self-attention itself:
NeMo/nemo/collections/asr/parts/submodules/multi_head_attention.py, lines 121 to 147 at commit df5a395
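Condensed for illustration, the behavior I am describing is essentially plain scaled dot-product attention. The stand-in class below is mine, not NeMo's actual implementation (which also handles caching, etc.); it only mirrors the structure of that forward pass, with 'pos_emb' accepted but ignored:

```python
import math
import torch
import torch.nn as nn

class VanillaMHA(nn.Module):
    """Hypothetical, condensed stand-in for NeMo's MultiHeadAttention."""

    def __init__(self, n_head, n_feat, dropout_rate=0.0):
        super().__init__()
        assert n_feat % n_head == 0
        self.h, self.d_k = n_head, n_feat // n_head
        self.linear_q = nn.Linear(n_feat, n_feat)
        self.linear_k = nn.Linear(n_feat, n_feat)
        self.linear_v = nn.Linear(n_feat, n_feat)
        self.linear_out = nn.Linear(n_feat, n_feat)
        self.dropout = nn.Dropout(dropout_rate)

    def forward(self, query, key, value, mask=None, pos_emb=None):
        # 'pos_emb' exists only for interface parity with the relative-position
        # variant -- nothing below reads it, which is exactly my question.
        b = query.size(0)
        q = self.linear_q(query).view(b, -1, self.h, self.d_k).transpose(1, 2)
        k = self.linear_k(key).view(b, -1, self.h, self.d_k).transpose(1, 2)
        v = self.linear_v(value).view(b, -1, self.h, self.d_k).transpose(1, 2)
        scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(self.d_k)
        if mask is not None:
            # in this sketch, mask == 0 marks padded positions
            scores = scores.masked_fill(mask.unsqueeze(1) == 0, float('-inf'))
        attn = self.dropout(torch.softmax(scores, dim=-1))
        out = torch.matmul(attn, v).transpose(1, 2).reshape(b, -1, self.h * self.d_k)
        return self.linear_out(out)
```

With a forward of this shape, whatever the encoder passes in as 'pos_emb' has no effect on the attention scores.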
Does this mean that 'abs_pos' implies 'no positional encoding', or am I missing something?
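In case it helps, a quick way to double-check my reading (assuming a local NeMo install) is to print the method source:

```python
import inspect
from nemo.collections.asr.parts.submodules.multi_head_attention import MultiHeadAttention

# Print the forward() body and check whether 'pos_emb' ever appears
# in any computation.
print(inspect.getsource(MultiHeadAttention.forward))
```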
Thank you,
Paul