update the download link for pre-trained model weights.

This commit is contained in:
SivilTaram 2021-08-30 13:48:11 +08:00
Parent 751d84604f
Commit baf858e970
2 changed files: 5 additions and 5 deletions

View file

@@ -92,7 +92,7 @@ Once `tapex` is successfully installed, you could go into [examples](examples) t
## Pre-training Corpus
-Our synthetic pre-training corpus which includes nearly **5,000,000** tuples of (*SQL queries*, *flattened tables*, *SQL execution results*) can be downloaded from [here](https://github.com/microsoft/Table-Pretraining/releases/download/v1.1/tapex_pretrain.tar.gz). You can use it for research purpose, but you should be careful about the [data license](LICENSE-Data).
+Our synthetic pre-training corpus which includes nearly **5,000,000** tuples of (*SQL queries*, *flattened tables*, *SQL execution results*) can be downloaded from [here](https://github.com/microsoft/Table-Pretraining/releases/download/pretraining-corpus/tapex_pretrain.tar.gz). You can use it for research purpose, but you should be careful about the [data license](LICENSE-Data).
Below is an example from the pre-training corpus:
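The actual corpus example is truncated out of this diff hunk. Purely as an illustration of the tuple shape named above (SQL query, flattened table, execution result) — with hypothetical data, not taken from the corpus, and a flattening scheme that follows the TAPEX paper's `col : ... row 1 : ...` convention, which may differ from the exact corpus formatting:

```python
def flatten_table(headers, rows):
    """Flatten a table into one sequence in the TAPEX-style format.

    Illustrative sketch only; the real corpus preprocessing may differ.
    """
    parts = ["col : " + " | ".join(headers)]
    for i, row in enumerate(rows, start=1):
        parts.append(f"row {i} : " + " | ".join(str(v) for v in row))
    return " ".join(parts)

# Hypothetical example tuple (not from the released corpus):
example = {
    "sql": "select city from t where population > 1000000",
    "table": flatten_table(["city", "population"],
                           [["london", 8900000], ["oslo", 700000]]),
    "answer": "london",
}
# example["table"] ->
# "col : city | population row 1 : london | 8900000 row 2 : oslo | 700000"
```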
@@ -114,8 +114,8 @@ The pre-trained models trained on the above pre-training corpus.
Model | Description | # Params | Download
---|---|---|---
-`tapex.base` | 6 encoder and decoder layers | 140M | [tapex.base.tar.gz](https://github.com/microsoft/Table-Pretraining/releases/download/v1.0/tapex.base.tar.gz)
-`tapex.large` | 12 encoder and decoder layers | 400M | [tapex.large.tar.gz](https://github.com/microsoft/Table-Pretraining/releases/download/v1.0/tapex.large.tar.gz)
+`tapex.base` | 6 encoder and decoder layers | 140M | [tapex.base.tar.gz](https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.base.tar.gz)
+`tapex.large` | 12 encoder and decoder layers | 400M | [tapex.large.tar.gz](https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.large.tar.gz)
## Fine-tuned Models

View file

@@ -13,9 +13,9 @@ logger = logging.getLogger(__name__)
# Resources are obtained and modified from https://github.com/pytorch/fairseq/tree/master/examples/bart
RESOURCE_DICT = {
"bart.large": "https://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz",
-    "tapex.large": "https://github.com/microsoft/Table-Pretraining/releases/download/v1.0/tapex.large.tar.gz",
+    "tapex.large": "https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.large.tar.gz",
     "bart.base": "https://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz",
-    "tapex.base": "https://github.com/microsoft/Table-Pretraining/releases/download/v1.0/tapex.base.tar.gz"
+    "tapex.base": "https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.base.tar.gz"
}
DEFAULT_ENCODER_JSON = "https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json"
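One hazard when editing `RESOURCE_DICT` by hand is pointing a model name at the wrong archive (e.g. `tapex.large` at `tapex.base.tar.gz`). A small sanity-check sketch, assuming each key should resolve to the archive of the same name — `resolve_checkpoint_url` is a hypothetical helper, not part of the repo:

```python
# Mapping of model names to checkpoint URLs, mirroring RESOURCE_DICT above.
RESOURCE_DICT = {
    "bart.large": "https://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz",
    "tapex.large": "https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.large.tar.gz",
    "bart.base": "https://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz",
    "tapex.base": "https://github.com/microsoft/Table-Pretraining/releases/download/pretrained-model/tapex.base.tar.gz",
}

def resolve_checkpoint_url(model_name: str) -> str:
    """Return the download URL for a known model name; raise KeyError otherwise."""
    try:
        return RESOURCE_DICT[model_name]
    except KeyError:
        raise KeyError(
            f"unknown model {model_name!r}; choose from {sorted(RESOURCE_DICT)}"
        ) from None

# Guard against swapped entries: each name must end in its own archive.
for name, url in RESOURCE_DICT.items():
    assert url.endswith(name + ".tar.gz"), f"{name} points at the wrong archive: {url}"
```

Running the loop once at import time makes a copy-paste swap fail loudly instead of silently downloading the wrong checkpoint.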