Fix in Reformer Config documentation (#5138)

Erick Rocha Fonseca 2020-06-19 14:41:31 +01:00 committed by GitHub
Parent 84be482f66
Commit e33929ef1e
No known key found for this signature
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 1 addition and 1 deletion


@@ -97,7 +97,7 @@ class ReformerConfig(PretrainedConfig):
             Number of following neighbouring chunks to attend to in LocalSelfAttention layer in addition to itself.
         local_attention_probs_dropout_prob (:obj:`float`, optional, defaults to 0.1):
             The dropout ratio for the attention probabilities in LocalSelfAttention.
-        lsh_chunk_length (:obj:`int`, optional, defaults to 64):
+        lsh_attn_chunk_length (:obj:`int`, optional, defaults to 64):
             Length of chunk which attends to itself in LSHSelfAttention. Chunking reduces memory complexity from sequence length x sequence length (self attention) to chunk length x chunk length x sequence length / chunk length (chunked self attention).
         lsh_num_chunks_before (:obj:`int`, optional, defaults to 1):
             Number of previous neighbouring chunks to attend to in LSHSelfAttention layer to itself.
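As a quick sanity check on the rename, here is a minimal sketch of the corrected parameter name in use; it assumes a `transformers` release in which `ReformerConfig` exposes `lsh_attn_chunk_length` and `lsh_num_chunks_before` as constructor arguments (the values chosen are illustrative):

from transformers import ReformerConfig

# `lsh_attn_chunk_length` is the actual config attribute; the old docstring
# referred to it as `lsh_chunk_length`, which does not exist on the config.
config = ReformerConfig(
    lsh_attn_chunk_length=64,  # chunk size used by LSHSelfAttention
    lsh_num_chunks_before=1,   # previous neighbouring chunks attended to
)

# Illustrative memory arithmetic from the docstring above: for sequence
# length n = 16384 and chunk length c = 64, full self-attention scores take
# n * n = 268,435,456 entries, while chunked attention takes
# c * c * (n / c) = 64 * 64 * 256 = 1,048,576 entries.
print(config.lsh_attn_chunk_length)  # -> 64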