Fix issue #654 (#655)
Avoid two softmax calls in LogisticRegression

Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
pierrenodet and mergify[bot] authored Jun 16, 2021
1 parent 50969a8 commit b087892
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion pl_bolts/models/regression/logistic_regression.py
@@ -53,7 +53,7 @@ def training_step(self, batch, batch_idx):
         # flatten any input
         x = x.view(x.size(0), -1)

-        y_hat = self(x)
+        y_hat = self.linear(x)

         # PyTorch cross_entropy function combines log_softmax and nll_loss in single function
         loss = F.cross_entropy(y_hat, y, reduction='sum')
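
For context on the fix: the model's forward pass applies a softmax to the linear output, while F.cross_entropy already applies log_softmax internally, so calling self(x) in training_step ran the logits through softmax twice. A minimal sketch of the pattern, assuming a forward that returns softmax probabilities (the TinyLogisticRegression class, dimensions, and tensors below are illustrative, not the library's code):

import torch
from torch import nn
import torch.nn.functional as F

class TinyLogisticRegression(nn.Module):
    def __init__(self, input_dim: int, num_classes: int):
        super().__init__()
        self.linear = nn.Linear(input_dim, num_classes)

    def forward(self, x):
        # forward returns probabilities: softmax is applied here
        return F.softmax(self.linear(x), dim=-1)

x = torch.randn(4, 8)
y = torch.randint(0, 3, (4,))
model = TinyLogisticRegression(8, 3)

# Before the fix: self(x) already applies softmax, and F.cross_entropy
# applies log_softmax internally, so softmax effectively runs twice.
loss_double = F.cross_entropy(model(x), y, reduction='sum')

# After the fix: pass the raw logits from self.linear(x) to cross_entropy,
# which applies log_softmax exactly once.
loss_single = F.cross_entropy(model.linear(x), y, reduction='sum')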
