use linear layer for all models (should not hurt addition)
hoedt committed Jan 14, 2021
1 parent a05f10c · commit e97a7a3
Showing 1 changed file with 3 additions and 3 deletions.
stable_nalu/network/simple_function_recurrent.py (6 changes: 3 additions & 3 deletions)
@@ -32,9 +32,9 @@ def __init__(self, unit_name, input_size=10, writer=None, **kwargs):
                                                 name='recurrent_layer',
                                                 **kwargs)
         self.output_layer = GeneralizedLayer(self.hidden_size, 1,
-                                             'linear'
-                                             if unit_name in {'GRU', 'LSTM', 'MCLSTM', 'RNN-tanh', 'RNN-ReLU'}
-                                             else unit_name,
+                                             'linear',
+                                             # if unit_name in {'GRU', 'LSTM', 'MCLSTM', 'RNN-tanh', 'RNN-ReLU'}
+                                             # else unit_name,
                                              writer=self.writer,
                                              name='output_layer',
                                              **kwargs)
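In effect, the output head is now a plain affine projection from the hidden state to a scalar for every recurrent unit, while the recurrent layer keeps its unit-specific type. The sketch below illustrates that model shape; it is a hypothetical stand-alone equivalent, assuming GeneralizedLayer with 'linear' reduces to an ordinary torch.nn.Linear, with an nn.LSTM standing in for the unit-specific recurrent layer (SimpleRecurrentSketch is not part of the repository).

```python
# Minimal sketch of the post-commit architecture (not the repository's GeneralizedLayer):
# the recurrent part stays unit-specific, the output head is always a linear projection.
import torch
import torch.nn as nn

class SimpleRecurrentSketch(nn.Module):
    def __init__(self, input_size=10, hidden_size=32):
        super().__init__()
        # stand-in for the unit-specific recurrent layer (GRU, LSTM, MCLSTM, ...)
        self.recurrent_layer = nn.LSTM(input_size, hidden_size, batch_first=True)
        # after this commit the output layer is always 'linear'
        self.output_layer = nn.Linear(hidden_size, 1)

    def forward(self, x):
        hidden_states, _ = self.recurrent_layer(x)
        last_hidden = hidden_states[:, -1]       # hidden state at the final time step
        return self.output_layer(last_hidden)    # scalar prediction per sequence

model = SimpleRecurrentSketch()
y = model(torch.randn(4, 7, 10))  # batch of 4 sequences, 7 steps, 10 features
print(y.shape)                    # torch.Size([4, 1])
```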

