Edge Machine Learning

This repository provides code for machine learning algorithms for edge devices developed at Microsoft Research India.

Machine learning models for edge devices need to have a small footprint in terms of storage, prediction latency, and energy. A ubiquitous real-world setting where such models are needed is resource-scarce devices and sensors in the Internet of Things (IoT). Making real-time predictions locally on IoT devices, without connecting to the cloud, requires models that fit in a few kilobytes.
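To give a back-of-the-envelope sense of scale (the dimensions below are illustrative and not taken from any EdgeML paper), compare the parameter footprint of a small fully-connected network with that of a compact prototype-style model:

```python
def model_kib(n_params, bytes_per_param=4):
    """Size of a model's parameters in KiB (float32 by default)."""
    return n_params * bytes_per_param / 1024

# A modest 784-input, 128-hidden, 10-class MLP already needs ~400 KiB:
mlp = model_kib(784 * 128 + 128 + 128 * 10 + 10)

# A hypothetical prototype-based model that projects 784 features down to
# 10 dimensions and keeps 20 labeled prototypes stores far fewer parameters:
protonn_like = model_kib(784 * 10 + 20 * 10 + 20 * 10)

print(round(mlp, 1), round(protonn_like, 1))  # ~397.5 vs ~32.2 KiB
```

Even this toy comparison shows an order-of-magnitude gap, which is the regime the algorithms below are designed for.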

This repository contains algorithms that shine in this setting in terms of both model size and compute, namely:

  • Bonsai: Strong and shallow non-linear tree based classifier.
  • ProtoNN: Prototype based k-nearest neighbors (kNN) classifier.
  • EMI-RNN: Training routine to recover the critical signature from time series data for faster and more accurate RNN predictions.
  • FastRNN & FastGRNN (FastCells): Fast, Accurate, Stable and Tiny (gated) RNN cells.
  • SeeDot: Floating-point to fixed-point quantization tool.
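To make one of these concrete, here is a minimal NumPy sketch of the FastGRNN update rule from the NeurIPS'18 paper (this is not the EdgeML API; dimensions and initialization are made up for illustration). The gate z_t and the candidate state reuse the same W and U matrices, which is what keeps the cell's parameter count tiny relative to a GRU-style cell:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fastgrnn_step(x, h, W, U, b_z, b_h, zeta, nu):
    """One FastGRNN update: the gate and the candidate state share the
    same W and U, so only the bias vectors are duplicated."""
    pre = W @ x + U @ h                 # shared pre-activation
    z = sigmoid(pre + b_z)              # gate
    h_tilde = np.tanh(pre + b_h)        # candidate state
    # h_t = (zeta * (1 - z_t) + nu) * h_tilde + z_t * h_{t-1}
    return (zeta * (1.0 - z) + nu) * h_tilde + z * h

# Toy dimensions: 8-dim input, 16-dim hidden state.
rng = np.random.default_rng(0)
d_in, d_hid = 8, 16
W = rng.standard_normal((d_hid, d_in)) * 0.1
U = rng.standard_normal((d_hid, d_hid)) * 0.1
b_z, b_h = np.zeros(d_hid), np.zeros(d_hid)

h = np.zeros(d_hid)
for t in range(5):                      # run over a short random sequence
    h = fastgrnn_step(rng.standard_normal(d_in), h, W, U, b_z, b_h,
                      zeta=1.0, nu=1e-4)
print(h.shape)  # (16,)
```

With zeta near 1 and nu near 0, each step is close to a convex combination of the candidate and the previous state, which the paper relates to the cell's stability.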

These algorithms can train models for classical supervised learning problems with memory requirements that are orders of magnitude lower than other modern ML algorithms. The trained models can be loaded onto edge devices such as IoT devices/sensors, and used to make fast and accurate predictions completely offline.

The tf directory contains code, examples, and scripts for all these algorithms in TensorFlow. The pytorch directory contains code, examples, and scripts for all these algorithms in PyTorch. The cpp directory has training and inference code for the Bonsai and ProtoNN algorithms in C++. Please see the install/run instructions in the README pages within these directories. The applications directory has code and demonstrations of applications of the EdgeML algorithms. The Tools/SeeDot directory has the quantization tool to generate fixed-point inference code.

For details, please see our wiki page, our ICML'17 publications on the Bonsai and ProtoNN algorithms, our NeurIPS'18 publications on EMI-RNN and FastGRNN, and our PLDI'19 publication on SeeDot.

Core Contributors:

See the list of people who have contributed to this project.

We welcome contributions, comments, and criticism. For questions, please email us.

If you use the EdgeML library in your projects or publications, please do cite us:

 D. Dennis, S. Gopinath, C. Gupta, A. Kumar, A. Kusupati, S. Patil and H.V. Simhadri.
 EdgeML: Machine Learning for resource-constrained Edge devices.
 https://github.com/Microsoft/EdgeML

Alternatively, the following BibTeX entry can be used.

@online{edgemlcode,
   title = {{EdgeML: Machine Learning for resource-constrained Edge devices}},
   author = {Dennis, Don and Gopinath, Sridhar and Gupta, Chirag and Kumar, Ashish and Kusupati, Aditya and Patil, Shishir and Simhadri, Harsha Vardhan},
   year = {2017},
   url = {https://github.com/Microsoft/EdgeML}
}

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.