skip the projection layer for poolformer
blefaudeux committed Jul 3, 2022
1 parent 67ba72f commit b2c3e40
Showing 1 changed file with 1 addition and 0 deletions: xformers/components/attention/pooling.py
@@ -58,6 +58,7 @@ def __init__(

         # This "attention" (token mixing) skips the multihead attention altogether
         self.requires_skip_multi_head = True
+        self.requires_input_projection = False

         # This operator does not really handle q,k,v
         self.requires_same_k_q_dimensions = True
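For context, the change makes sense because a PoolFormer-style block mixes tokens with average pooling rather than scaled dot-product attention, so the q/k/v input projections would be wasted compute. Below is an illustrative sketch (not the actual xformers implementation; class and parameter names are hypothetical) of such a pooling token mixer, with the same capability flags the commit touches:

```python
import torch
import torch.nn as nn


class PoolingTokenMixer(nn.Module):
    """Hypothetical PoolFormer-style token mixer: average pooling minus identity."""

    def __init__(self, pool_size: int = 3):
        super().__init__()
        # Flags mirroring the commit: no multi-head wrapping, no input projection
        self.requires_skip_multi_head = True
        self.requires_input_projection = False
        # This operator does not really handle q, k, v
        self.requires_same_k_q_dimensions = True
        # Same-length output: stride 1 with symmetric padding
        self.pool = nn.AvgPool1d(
            pool_size, stride=1, padding=pool_size // 2, count_include_pad=False
        )

    def forward(self, q: torch.Tensor, *_, **__) -> torch.Tensor:
        # q: (batch, sequence, embedding); k and v are accepted but ignored
        x = q.transpose(1, 2)  # pool over the sequence axis
        # Subtracting the input follows the PoolFormer formulation, where the
        # residual branch is handled outside the mixer
        return (self.pool(x) - x).transpose(1, 2)
```

Because `requires_input_projection` is False, a framework consuming this operator can feed the raw token embeddings straight in, skipping the linear q/k/v projections entirely.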
