According to the code in detection\denseclip\models.py, lines 525 to 532 (and line 322):
```python
x = x.permute(1, 0, 2)  # NLD -> LND
features = []
for i, blk in enumerate(self.transformer.resblocks):
    x = blk(x)
    if i in self.out_indices:
        xp = x[:, 1:, :].permute(0, 2, 1).reshape(B, -1, H, W)
        features.append(xp.contiguous())
```
`batch_first` is not set when `self.attn` is defined in class `ResidualAttentionBlock`, so it defaults to `False`.

Question: Why would the shape of `x` be `(B, H*W+1, C)`, as required for `xp` to have shape `(B, C, H, W)`?
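A minimal sketch of the shape question, using made-up sizes (the variable names `B`, `H`, `W`, `C` follow the snippet; the concrete values are illustrative, not from the repo). With `batch_first=False`, the permuted tensor is `(L, B, C)`, so indexing `x[:, 1:, :]` would slice the batch axis rather than drop the class token; the reshape to `(B, C, H, W)` only works on a batch-first tensor:

```python
import torch

# Hypothetical sizes for illustration only.
B, H, W, C = 2, 7, 7, 64
L = H * W + 1  # one class token + H*W patch tokens

# NLD -> LND, as in the snippet; with batch_first=False this is (L, B, C).
x = torch.randn(B, L, C).permute(1, 0, 2)
print(x.shape)  # torch.Size([50, 2, 64])

# The snippet's x[:, 1:, :].permute(0, 2, 1).reshape(B, -1, H, W)
# assumes x is (B, H*W+1, C). On an (L, B, C) tensor, that slice would
# instead cut the batch dimension. Permuting back to batch-first first
# makes the reshape consistent:
xb = x.permute(1, 0, 2)                              # (B, L, C)
xp = xb[:, 1:, :].permute(0, 2, 1).reshape(B, C, H, W)
print(xp.shape)  # torch.Size([2, 64, 7, 7])
```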