Hi, great work, thank you!

I found an orphaned `nn.Linear` module that was presumably replaced by the `PatchEmbed` layer in the classification model (`TSLANet/Classification/TSLANet_classification.py`, line 158 at commit ca0e884). I believe this line should be removed.

Slightly related: section 3.3 of the paper states that Pᵢ → P′ᵢ ∈ ℝ^(C×p′), with C being the number of channels. As far as I understand, p′ is the embedding size (`--emb_dim` in the code), and it is used to embed all dimensions into one, so the result should be P′ᵢ ∈ ℝ^(p′), dropping the channel dimension. I apologize if I missed something here that proves my remark wrong.
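To illustrate the shape question, here is a minimal NumPy sketch (not the actual TSLANet code; the sizes and names are assumptions) of what a patch-embedding projection typically does: each patch of shape (C, p) is flattened and projected to a single p′-dimensional vector, so the channel dimension is absorbed into the embedding.

```python
import numpy as np

C, L = 3, 128        # channels, series length (assumed toy sizes)
p, p_prime = 16, 64  # patch length and embedding size (analogue of --emb_dim)

x = np.random.randn(C, L)        # one multivariate time series
n = L // p                       # number of patches
patches = x.reshape(C, n, p)     # each patch P_i has shape (C, p)

# Linear projection over the flattened (C * p) patch, as an nn.Linear would do
W = np.random.randn(C * p, p_prime)
flat = patches.transpose(1, 0, 2).reshape(n, C * p)  # (n, C * p)
embedded = flat @ W                                   # (n, p_prime)

print(embedded.shape)  # (8, 64): each embedded patch lies in R^{p'}, no channel dim
```

Under this reading, each embedded patch P′ᵢ is a vector in ℝ^(p′), consistent with the remark above.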
Good question!
Have you tried running experiments to compare the results with and without the `nn.Linear` module? Is there any difference?