# Archai: Platform for Neural Architecture Search
Archai is a platform for Neural Architecture Search (NAS) that allows you to generate efficient deep networks for your applications. It offers the following advantages:
- 🔬 Easy mix-and-match between different algorithms;
- 📈 Self-documented hyper-parameters and fair comparison;
- ⚡ Extensible and modular to allow rapid experimentation;
- 📂 Powerful configuration system and easy-to-use tools.
Please refer to the documentation for more information.
Archai is compatible with: Python 3.6+ and PyTorch 1.2+.
## Quickstart

### Installation
There are several ways to install Archai, ranging from PyPI to Docker. Whichever you choose, we recommend installing it within a virtual environment, such as conda or pyenv.
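The virtual-environment recommendation above can be sketched with Python's built-in `venv` module (conda and pyenv follow the same pattern; the `.venv` directory name here is purely illustrative):

```shell
# Create an isolated environment for Archai (the ".venv" name is illustrative)
python3 -m venv .venv

# Activate it (on Windows: .venv\Scripts\activate.bat)
. .venv/bin/activate

# Archai and its dependencies can then be installed inside this environment,
# keeping them separate from system-wide packages.
```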
Archai is compatible with Windows, Linux, and macOS.
### PyPI

PyPI provides ready-to-go packages and is the easiest way to install Archai:

```bash
pip install archai
```
### Source (development)

Alternatively, you can clone this repository and install the bleeding-edge version:

```bash
git clone https://github.com/microsoft/archai.git
cd archai
./install.sh  # on Windows, use install.bat
```
Please refer to the installation guide for more information.
### Running Experiments

To run a specific NAS algorithm, specify it with the `--algos` switch:

```bash
python scripts/main.py --algos darts --full
```
Please refer to running algorithms for more information on available switches and algorithms.
### Tutorials

The best way to familiarize yourself with Archai is to take our quick 30-minute tutorial. Additionally, you can dive into the Petridish tutorial, developed at Microsoft Research and available in Archai.
### Visual Studio Code
We highly recommend Visual Studio Code to take advantage of predefined run configurations and interactive debugging.
From the archai directory, launch Visual Studio Code. Select the Run button (Ctrl+Shift+D), choose the run configuration you want, and click on the Play icon.
### AzureML Experiments
To run NAS experiments at scale, you can use Archai on Azure.
## Support
Join Archai on Facebook to stay up-to-date or ask questions.
## Contributions
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
## Team
Archai has been created and maintained by Shital Shah, Debadeepta Dey, Gustavo de Rosa, Caio Cesar Teodoro Mendes, Piero Kauffmann in the Reinforcement Learning Group at Microsoft Research, Redmond, USA. Archai has benefited immensely from discussions with Ofer Dekel, John Langford, Rich Caruana, Eric Horvitz and Alekh Agarwal.
## Credits

Archai builds on several open source codebases. These include: Fast AutoAugment, pt.darts, DARTS-PyTorch, DARTS, petridishnn, PyTorch CIFAR-10 Models, NVidia DeepLearning Examples, PyTorch Warmup Scheduler, NAS Evaluation is Frustratingly Hard, NASBench-PyTorch.
Please see the `install_requires` section in setup.py for an up-to-date list of dependencies. If you feel credit to any material is missing, please let us know by filing an issue.
## License

This project is released under the MIT License. Please review the LICENSE file for more details.
## Trademark

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.