Project Petridish: Efficient Forward Architecture Search

Code for the neural architecture search methods in the paper Efficient Forward Neural Architecture Search.

Conduct and Privacy

Petridishnn has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments. Read Microsoft's statement on Privacy & Cookies.

Installation on development machine

We have developed and tested Petridish on Ubuntu 16.04 LTS (64-bit) with the Anaconda Python distribution and TensorFlow.

Installing the software

  1. Install the Anaconda Python distribution for Ubuntu.
  2. Create a Python 3.6 environment: conda create python=3.6 -n py36
  3. Follow the TensorFlow (TF) instructions to install a recent version; 1.12 is tested.
  4. Clone the repo: git clone petridishnn
  5. Install the dependency packages: python -m pip install -r <path_to_petridishnn>/requirements.txt
  6. Petridish needs three environment variables, which should be set to appropriate values in your bashrc (e.g. export GLOBAL_MODEL_DIR="/home/dedey/data"):
     - GLOBAL_LOG_DIR: directory where jobs running locally write their logs.
     - GLOBAL_MODEL_DIR: directory where jobs running locally write their models.
     - GLOBAL_DATA_DIR: directory from which local jobs read data.
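For instance, the three variables could look like this in your ~/.bashrc (the directory paths below are illustrative, not prescribed by Petridish; pick locations that exist on your machine):

```shell
# Illustrative ~/.bashrc entries -- adjust the paths to your machine.
export GLOBAL_LOG_DIR="$HOME/petridish/logs"     # logs written by local jobs
export GLOBAL_MODEL_DIR="$HOME/petridish/models" # models written by local jobs
export GLOBAL_DATA_DIR="$HOME/petridish/data"    # datasets read by local jobs
```

Remember to open a new shell (or source your bashrc) so the variables are visible to subsequently launched jobs.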

Getting the data

The Petridish code assumes datasets are in a certain format (e.g., we transform the raw ImageNet data to LMDB format). You can always download the raw data of the standard datasets and convert it with the relevant scripts in petridishnn/petridish/data. Alternatively, Debadeepta Dey (dedey@microsoft.com) maintains an Azure blob with all the data in the converted format; (for Microsoft employees only) please email him for access.

Running a sample search job on cifar

Before doing a full-scale search on Azure, it is common to check that everything runs on a local machine. An example job script is at petridishnn/scripts/test_distributed.sh. Make sure you have set all the environment variables used in this script. Run it from the root folder of petridishnn with bash scripts/test_distributed.sh. This will print some output to stdout and write models and logs to the corresponding folders. If this succeeds, you have a working installation. Yay!
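Put together, a minimal local smoke test might look like the following (a sketch assuming the environment variables from the installation section and a clone of the repo in the current directory):

```shell
# Run from the root of the cloned petridishnn repo.
cd petridishnn

# Sanity-check that the required environment variables are set;
# ${var:?message} aborts with the message if var is unset or empty.
: "${GLOBAL_LOG_DIR:?set GLOBAL_LOG_DIR first}"
: "${GLOBAL_MODEL_DIR:?set GLOBAL_MODEL_DIR first}"
: "${GLOBAL_DATA_DIR:?set GLOBAL_DATA_DIR first}"

# Kick off the sample search job on cifar.
bash scripts/test_distributed.sh
```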

Post-search Analysis

We provide a number of scripts in the directory petridish/analysis to analyze and post-process the search results, as well as a script to generate training scripts for the found models. We list them below in order of usage; please refer to the header of each linked file for usage details.

  1. Inspect the search log
  2. Generate scripts to train the found models
  3. Check the performance of model training
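At its core, inspecting the search log means extracting per-model metrics and ranking the found models. A toy sketch of that idea, assuming a hypothetical one-line-per-model log format (the real format is whatever the scripts in petridish/analysis expect):

```python
import re

# Hypothetical log lines -- the actual Petridish log format is defined
# by the search code and parsed by the scripts in petridish/analysis.
sample_log = """\
model_0001 val_err=0.0834 multi_add=120.5M
model_0002 val_err=0.0712 multi_add=210.0M
model_0003 val_err=0.0695 multi_add=305.2M
"""

LINE_RE = re.compile(
    r"(?P<name>\S+)\s+val_err=(?P<err>[\d.]+)\s+multi_add=(?P<cost>[\d.]+)M")

def parse_results(text):
    """Return a list of (name, val_err, multi_adds_in_millions) tuples."""
    return [(m["name"], float(m["err"]), float(m["cost"]))
            for m in LINE_RE.finditer(text)]

# Rank found models by validation error, e.g. to decide which to retrain.
results = sorted(parse_results(sample_log), key=lambda r: r[1])
print(results[0][0])  # best model by val_err -> model_0003
```

The same parse-then-rank pattern extends naturally to plotting the error-vs-cost frontier of the found models.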

Contacts

Contributing

Please read the contributing policy.