
Commit

fix typo
patrickvonplaten committed May 7, 2020
1 parent 67f02c0 commit 4e7252a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/configuration_reformer.py
@@ -85,7 +85,7 @@ class ReformerConfig(PretrainedConfig):
             The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
         is_decoder (:obj:`bool`, optional, defaults to False):
             If `is_decoder` is True, a causal mask is used in addition to `attention_mask`.
-            When using the Reformer for casaul language modeling, `is_decoder` is set to `True`.
+            When using the Reformer for causal language modeling, `is_decoder` is set to `True`.
         layer_norm_eps (:obj:`float`, optional, defaults to 1e-12):
             The epsilon used by the layer normalization layers.
         local_chunk_length (:obj:`int`, optional, defaults to 64):
