updated readmes and torch version

Subhro Roy 2022-06-23 15:17:28 -07:00
Parent af8f8c9ff0
Commit 68a937f248
3 changed files with 9 additions and 3 deletions

View file

@@ -7,7 +7,7 @@ parsing datasets, with splits of different data sizes, as well as grammars which
used to constrain decoding of semantic parses.
### BenchClamp Datasets
-The current benchmark supports 4 datasets:
+The current benchmark supports 6 datasets:
1. CalFlowV2
2. TreeDST (in LispressV2 format)
3. MTOP (all languages)
@@ -22,7 +22,7 @@ split (`all`).
### Fine-tune a Language Model
1. You can edit `benchclamp_config.py` to add your LM to the `TRAIN_MODEL_CONFIGS`
-list. In the committed file, we have only a T5-base model
+list. We have already added some popular LMs to the list.
```
TRAIN_MODEL_CONFIGS: List[ClampModelConfig] = [
T5ModelConfig(

View file

@@ -46,9 +46,11 @@ export TRAINED_MODEL_DIR=trained_models/
for domain in "basketball" "blocks" "calendar" "housing" "publications" "recipes" "restaurants" "socialnetwork"; do
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names overnight_${domain}_utterance
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names overnight_${domain}_meaningRepresentation
done
```
@@ -150,9 +152,11 @@ export PRETRAINED_MODEL_DIR=facebook/bart-large
export TRAINED_MODEL_DIR=trained_models/
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names break_nested
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names break_QDMR
```
@@ -238,9 +242,11 @@ export PRETRAINED_MODEL_DIR=facebook/bart-large
export TRAINED_MODEL_DIR=trained_models/
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names calflow_canonicalUtterance
python -m semantic_parsing_with_constrained_lm.finetune.lm_finetune \
--config-name semantic_parsing_with_constrained_lm.finetune.configs.emnlp_train_config \
--exp-names calflow_lispress
```

View file

@@ -11,7 +11,7 @@ url = "https://smpypi.z5.web.core.windows.net/"
[tool.poetry.dependencies]
python = "^3.7"
matplotlib = "^3.1.0"
torch = "1.6.0"
torch = "1.10.2"
pydantic = "^1.4"
lark-parser = "^0.8.2"
requests = "^2.20.1"