From 79241dcd860e53aac10338079fbda5192afeb2f9 Mon Sep 17 00:00:00 2001
From: subhomj <35510654+subhomj@users.noreply.github.com>
Date: Mon, 4 Jan 2021 18:35:30 -0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 9dd9f0d..59c8a9a 100644
--- a/README.md
+++ b/README.md
@@ -15,7 +15,7 @@ Releasing [**XtremeDistilTransformers**] with Tensorflow 2.3 and [HuggingFace Tr
 *Install requirements*
 ```pip install -r requirements.txt```
 
-Initialize *XtremeDistilTransformer* with [MiniLM](https://github.com/microsoft/unilm/tree/master/minilm) ([6/384 pre-trained checkpoint](https://1drv.ms/u/s!AscVo8BbvciKgRqua1395a44gr23?e=2C3XcY)) or [TinyBERT] ([4/312 pre-trained checkpoint](https://huggingface.co/nreimers/TinyBERT_L-4_H-312_v2
+Initialize *XtremeDistilTransformer* with [MiniLM](https://github.com/microsoft/unilm/tree/master/minilm) (6/384 pre-trained checkpoint) or [TinyBERT] ([4/312 pre-trained checkpoint](https://huggingface.co/nreimers/TinyBERT_L-4_H-312_v2
 ))
 
 *Sample usages for distilling different pre-trained language models (tested with Python 3.6.9 and CUDA 10.2)*
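
For context, the README line touched by this patch refers to initializing the student with a pre-trained TinyBERT 4/312 checkpoint hosted on the HuggingFace Hub. The sketch below only illustrates loading that checkpoint with the standard HuggingFace Transformers API; it is not the project's own training entry point, and everything other than the model name (taken from the URL in the patch) is an illustrative assumption.

```python
# Minimal sketch (not the repository's training script): load the 4-layer,
# 312-hidden TinyBERT student checkpoint referenced in the README diff.
from transformers import AutoTokenizer, AutoModel

student_name = "nreimers/TinyBERT_L-4_H-312_v2"  # URL from the patched README line
tokenizer = AutoTokenizer.from_pretrained(student_name)
student = AutoModel.from_pretrained(student_name)

# Sanity check: encode one sentence and inspect the hidden-state shape.
inputs = tokenizer(
    "XtremeDistilTransformers distills large teachers into small students.",
    return_tensors="pt",
)
outputs = student(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 312)
```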