An open source AutoML toolkit that automates the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyperparameter tuning.

NNI automates feature engineering, neural architecture search, hyperparameter tuning, and model compression for deep learning. Find the latest features, APIs, examples and tutorials in our official documentation (click here for the Simplified Chinese version).

What's NEW!  

Installation

See the NNI installation guide to install from pip, or build from source.

To install the current release:

$ pip install nni

To update NNI to the latest version, add the --upgrade flag to the above command.
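For example, to upgrade an existing installation:

$ pip install --upgrade nni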

NNI capabilities at a glance

NNI provides algorithms for Hyperparameter Tuning, Neural Architecture Search, and Model Compression; see the official documentation for the full list of algorithms, training services, and tutorials.

Supported frameworks and libraries:
  • PyTorch
  • TensorFlow
  • Scikit-learn
  • XGBoost
  • LightGBM
  • MXNet
  • Caffe2
  • More...
(Screenshot of the NNI web UI.)
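As a rough illustration of how hyperparameter tuning plugs into code written with one of these frameworks, here is a minimal sketch based on NNI's Python experiment API; train.py is a hypothetical trial script and the reported metric is a placeholder. See the official documentation for the authoritative quick-start.

The trial script asks NNI for a set of hyperparameters and reports its result:

    import nni

    # Ask the tuner for the next set of hyperparameters to try,
    # e.g. {'lr': 0.01, 'momentum': 0.9}.
    params = nni.get_next_parameter()

    # ... build and train a model with `params` using your framework of choice ...
    accuracy = 0.9  # placeholder for the real validation metric

    # Report the final metric so the tuner can propose better parameters.
    nni.report_final_result(accuracy)

A launcher script defines the search space and starts the experiment locally:

    from nni.experiment import Experiment

    search_space = {
        'lr':       {'_type': 'loguniform', '_value': [1e-5, 1e-1]},
        'momentum': {'_type': 'uniform',    '_value': [0.5, 0.99]},
    }

    experiment = Experiment('local')                      # run trials on the local machine
    experiment.config.trial_command = 'python train.py'   # the hypothetical trial script above
    experiment.config.trial_code_directory = '.'
    experiment.config.search_space = search_space
    experiment.config.tuner.name = 'TPE'                  # one of the built-in tuners
    experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
    experiment.config.max_trial_number = 20
    experiment.config.trial_concurrency = 2

    experiment.run(8080)                                  # the web UI is served on this port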

Resources

Contribution guidelines

If you want to contribute to NNI, be sure to review the contribution guidelines, which include instructions for submitting feedback, coding best practices, and the code of conduct.

We use GitHub issues to track feature requests and bugs. Please use NNI Discussion for general questions and new ideas. For questions about specific use cases, please go to Stack Overflow.

Participating in discussions via the following IM groups is also welcome.

  • Gitter
  • WeChat

Over the past few years, NNI has received thousands of pieces of feedback through GitHub issues and pull requests from hundreds of contributors. We appreciate all contributions from the community that help NNI thrive.

Test status

Essentials

  • Fast test
  • Full test - HPO
  • Full test - NAS
  • Full test - compression

Training services

  • Local - Linux
  • Local - Windows
  • Remote - Linux to Linux
  • Remote - Windows to Windows
  • OpenPAI
  • FrameworkController
  • Kubeflow
  • Hybrid
  • AzureML

Targeting openness and advancing state-of-the-art technology, Microsoft Research (MSR) has also released a few other open source projects.

  • OpenPAI: an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premises, cloud, and hybrid environments at various scales.
  • FrameworkController: an open source general-purpose Kubernetes Pod controller that orchestrates all kinds of applications on Kubernetes with a single controller.
  • MMdnn: a comprehensive, cross-framework solution to convert, visualize, and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
  • SPTAG: Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search over vectors.
  • nn-Meter: an accurate inference latency predictor for DNN models on diverse edge devices.

We encourage researchers and students to leverage these projects to accelerate AI development and research.

License

The entire codebase is under the MIT license.