Dayiheng Liu (刘大一恒) 2020-11-30 11:53:47 +08:00 committed by GitHub
Parent 02278a69a6
Commit 1d36bc5c4f
No key matching this signature was found
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 1 addition and 1 deletion

@@ -2,7 +2,7 @@
 This repo provides the code for reproducing the experiments in [*ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training*](https://arxiv.org/pdf/2001.04063). In the paper, we propose a new pre-trained language model called ProphetNet for sequence-to-sequence learning with a novel self-supervised objective called future n-gram prediction.
-We have released the ProphetNet baselines for [GLGE](https://arxiv.org/abs/2011.11928) benchmark (A New General Language Generation Evaluation Benchmark) in [here](./GLGE_baselines). Have a try! :)
+We have released the ProphetNet baselines for [GLGE](https://github.com/microsoft/glge) benchmark ([A New General Language Generation Evaluation Benchmark](https://arxiv.org/abs/2011.11928)) in [here](./GLGE_baselines). Have a try! :)
 ## Dependency
 - pip install torch==1.3.0