Accelerate your Neural Architecture Search (NAS) through fast, reproducible and modular research.
Archai: Platform for Neural Architecture Search

Archai is a Neural Architecture Search (NAS) platform that allows you to generate efficient deep networks for your applications. It offers the following advantages:

  • 🔬 Easy mix-and-match between different algorithms;
  • 📈 Self-documented hyper-parameters and fair comparison;
  • Extensible and modular to allow rapid experimentation;
  • 📂 Powerful configuration system and easy-to-use tools.

Please refer to the documentation for more information.

Package compatibility: Python 3.7+ and PyTorch 1.2.0+.

OS compatibility: Windows, Linux and MacOS.

Quickstart

Installation

There are several ways to install Archai; whichever you choose, we recommend installing it inside a virtual environment, such as conda or pyenv.
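For example, an isolated environment can be created with Python's built-in `venv` module (conda and pyenv work equally well; the environment name below is arbitrary):

```shell
# Create an isolated environment (on Windows, use "python" instead of "python3")
python3 -m venv archai-env

# Activate it (on Windows: archai-env\Scripts\activate)
. archai-env/bin/activate
```

Any `pip install` run afterwards stays scoped to this environment.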

PyPI

Installing the latest release from PyPI is the quickest way to get started:

pip install archai

Source (development)

Alternatively, one can clone this repository and install the bleeding-edge version:

git clone https://github.com/microsoft/archai.git
cd archai
bash install.sh  # on Windows, use install.bat

Please refer to the installation guide for more information.

Running an Algorithm

To run a specific NAS algorithm, pass its name via the --algos switch:

python scripts/main.py --algos darts --full

Please refer to running algorithms for more information on available switches and algorithms.

Tutorials

The best way to familiarize yourself with Archai is to take a quick tour through our 30-minute tutorial. Additionally, one can dive into the Petridish tutorial, developed at Microsoft Research and included in Archai.

We highly recommend Visual Studio Code to take advantage of the predefined run configurations and interactive debugging. From the archai directory, launch Visual Studio Code, select Run (Ctrl+Shift+D), choose a configuration, and click Play.
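Such run configurations live in `.vscode/launch.json`. A minimal sketch of an entry that launches the DARTS run above (the `name` and exact field values here are illustrative; the repository's `.vscode` folder ships its own predefined entries):

```json
{
    "name": "main-darts-full",
    "type": "python",
    "request": "launch",
    "program": "${workspaceFolder}/scripts/main.py",
    "args": ["--algos", "darts", "--full"],
    "console": "integratedTerminal"
}
```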

Alternatively, you can use Archai on Azure to run NAS experiments at scale.

Support

Contributions

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Team

Archai has been created and maintained by Shital Shah, Debadeepta Dey, Gustavo de Rosa, Caio Mendes, Piero Kauffmann, and Ofer Dekel at Microsoft Research.

Credits

Archai builds on several open-source codebases. These include: Fast AutoAugment, pt.darts, DARTS-PyTorch, DARTS, petridishnn, PyTorch CIFAR-10 Models, NVidia DeepLearning Examples, PyTorch Warmup Scheduler, NAS Evaluation is Frustratingly Hard, NASBench-PyTorch.

Please see the install_requires section in setup.py for an up-to-date list of dependencies. If you feel credit to any material is missing, please let us know by filing an issue.

License

This project is released under the MIT License. Please review the LICENSE file for more details.

Trademark

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.