DeBERTa/experiments/glue
Pengcheng He c794b711ed 1. Add code for DeBERTaV3 pre-training; 2. Fix error in torch 1.11; 3. Add code for ONNX export 2023-03-18 23:14:19 -07:00

README.md

GLUE fine-tuning task

To run an experiment:

- run `./mnli.sh` to fine-tune an MNLI model (base or large),
- run `./cola.sh` to fine-tune a CoLA large model,
- run `./sst2.sh` to fine-tune an SST-2 large model,
- run `./stsb.sh` to fine-tune an STS-B large model,
- run `./rte.sh` to fine-tune an RTE large model,
- run `./qqp.sh` to fine-tune a QQP large model,
- run `./qnli.sh` to fine-tune a QNLI large model,
- run `./mrpc.sh` to fine-tune an MRPC large model.

ONNX export and quantization

To export the model to ONNX format during evaluation, pass `--export_ort_model True`. To export a quantized model, pass `--fp16 False --export_ort_model True`. The exported model is written to the output folder: its name ends with `<prefix>__onnx_fp16.bin` when fp16 is True; otherwise the outputs end with `<prefix>__onnx_fp32.bin` and `<prefix>__onnx_qt.bin`.
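The file-naming rule above can be sketched as a small shell helper (hypothetical, for illustration only; `onnx_outputs` is not part of this repo):

```shell
# Given a filename prefix and the fp16 setting, print the ONNX
# file name(s) the export step produces in the output folder.
onnx_outputs() {
  local prefix="$1" fp16="$2"
  if [ "$fp16" = "True" ]; then
    # fp16 export yields a single half-precision model file
    echo "${prefix}__onnx_fp16.bin"
  else
    # fp32 export yields the full-precision model plus a quantized one
    echo "${prefix}__onnx_fp32.bin"
    echo "${prefix}__onnx_qt.bin"
  fi
}

onnx_outputs model True   # prints model__onnx_fp16.bin
```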

Please check the ONNX documentation for more details.