fix deprecated ref to `tokenizer.max_len` (#10220)

This fixes the deprecated reference to `tokenizer.max_len` by replacing it with `tokenizer.model_max_length`, similar to [issue 8739](https://github.com/huggingface/transformers/issues/8739) and [PR 8604](https://github.com/huggingface/transformers/pull/8604).
An example is [here](https://colab.research.google.com/gist/poedator/f8776349e5c625ce287fc6fcd312fa1e/tokenizer-max_len-error-in-transformers_glue.ipynb). The error occurs when `glue_convert_examples_to_features` is called without the `max_length` parameter: in that case line 119, which still holds the old reference, is executed. This one-line fix resolves it.
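For illustration, a minimal sketch of the attribute change (hypothetical snippet; assumes a recent `transformers` release where `max_len` has been removed, and `bert-base-uncased` is just an example checkpoint):

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Old, deprecated attribute: raises AttributeError on recent releases.
# max_length = tokenizer.max_len

# Current attribute that the fixed fallback reads instead.
max_length = tokenizer.model_max_length
print(max_length)  # 512 for bert-base-uncased
```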
This commit is contained in:
Poedator 2021-02-24 17:01:28 +03:00 committed by GitHub
Parent cdcdd5f03a
Commit 5f2a3d721c
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 1 addition and 1 deletion


@@ -116,7 +116,7 @@ def _glue_convert_examples_to_features(
     output_mode=None,
 ):
     if max_length is None:
-        max_length = tokenizer.max_len
+        max_length = tokenizer.model_max_length
     if task is not None:
         processor = glue_processors[task]()
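For context, a hedged usage sketch of the call path described above (the example sentence and label are made up, and SST-2 is just one GLUE task used for illustration):

```python
from transformers import BertTokenizer, glue_convert_examples_to_features
from transformers.data.processors.utils import InputExample

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
examples = [InputExample(guid="0", text_a="The movie was great.", label="1")]

# max_length is not passed, so the function falls back to the tokenizer's
# own limit; before this fix that fallback read the removed `max_len` attribute.
features = glue_convert_examples_to_features(examples, tokenizer, task="sst-2")
```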