# NeuronBlocks AUTOTEST

1. Please download GloVe first via the following commands.

    ```bash
    cd PROJECT_ROOT/dataset
    sh get_glove.sh
    ```
2. Please download the 20 Newsgroups dataset by running the following download and preprocessing script.

    ```bash
    cd PROJECT_ROOT/dataset
    python get_20_newsgroups.py
    ```
3. Please run the autotest script.

    ```bash
    sh autotest.sh A B
    ```

    Parameter A indicates whether to run a single process or multiple processes; the default is a single process (N). When A is Y, multiple processes are used. Parameter B indicates whether to test on GPU or CPU; the default is CPU. When B is not empty, it specifies which GPUs to use. Choose any one of the following commands according to your needs.

    ```bash
    # Using multiple processes with GPUs 0 and 1
    sh autotest.sh Y 0,1
    # Using multiple processes with GPU 0
    sh autotest.sh Y 0
    # Using multiple processes with CPU
    sh autotest.sh Y
    # Using a single process with GPU 0
    sh autotest.sh N 0
    # Using a single process with CPU
    sh autotest.sh N
    ```
4. Finally, you can find contrast_results.txt in PROJECT_ROOT/autotest, which stores the results of your model. Compare the {new accuracy/AUC} column against the {old accuracy/AUC} column; if there is a significant metric regression, you need to check your pull request. A small checking sketch follows the table below.
    | Task                   | GPU/CPU | Old accuracy/AUC | New accuracy/AUC   |
    |------------------------|---------|------------------|--------------------|
    | english_text_matching  | GPU     | 0.96655          | 0.97375            |
    | english_text_matching  | CPU     | 0.96655          | 0.97375            |
    | chinese_text_matching  | GPU     | 0.70001          | 0.7                |
    | chinese_text_matching  | CPU     | 0.70001          | 0.7                |
    | quora_question_pairs   | GPU     | 0.72596          | 0.727864           |
    | quora_question_pairs   | CPU     | 0.72596          | 0.727864           |
    | knowledge_distillation | CPU     | 0.66329          | 0.6695541666666667 |
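
The snippet below is a minimal sketch of such a comparison, assuming contrast_results.txt is a whitespace-separated file with one header row and the four columns shown above; the file path and the 2% tolerance are illustrative assumptions, not values defined by NeuronBlocks.

```python
# Minimal regression check over contrast_results.txt.
# Assumptions (not guaranteed by NeuronBlocks): whitespace-separated columns
# "task  GPU/CPU  old  new" with one header row; the 2% tolerance and the
# file path below are illustrative examples only.
from pathlib import Path

TOLERANCE = 0.02  # hypothetical relative drop treated as a "significant" regression


def check_regressions(path="PROJECT_ROOT/autotest/contrast_results.txt"):
    regressions = []
    for line in Path(path).read_text().splitlines()[1:]:  # skip the header row
        fields = line.split()
        if len(fields) < 4:
            continue  # skip blank or malformed lines
        task, device = fields[0], fields[1]
        old_metric, new_metric = float(fields[2]), float(fields[3])
        if new_metric < old_metric * (1 - TOLERANCE):
            regressions.append((task, device, old_metric, new_metric))
            print(f"Possible regression: {task} ({device}) {old_metric:.4f} -> {new_metric:.4f}")
    return regressions


if __name__ == "__main__":
    check_regressions()
```

Any task this prints is a candidate for closer inspection of your pull request before merging.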