Edge Machine Learning

This repository provides code for machine learning algorithms for edge devices developed at Microsoft Research India.

Machine learning models for edge devices need to have a small footprint in terms of storage, prediction latency and energy. One ubiquitous real-world setting where such models are desirable is resource-scarce devices and sensors in the Internet of Things (IoT). Making real-time predictions locally on IoT devices, without connecting to the cloud, requires models that fit in a few kilobytes.

This repository contains two such algorithms, Bonsai and ProtoNN, that shine in this setting. These algorithms can train models for classical supervised learning problems with memory requirements that are orders of magnitude lower than those of other modern ML algorithms. The trained models can be loaded onto edge devices such as IoT devices/sensors and used to make fast and accurate predictions completely offline.

For details, please see the ICML'17 publications on Bonsai and ProtoNN algorithms.

Initial Code Contributors: Chirag Gupta, Aditya Kusupati, Ashish Kumar, and Harsha Simhadri.

We welcome contributions, comments and criticism. For questions, please email Harsha.

See the full list of people who have contributed to this project.

Requirements

  • Linux. We developed the code on Ubuntu 16.04 LTS. For Windows 10 Anniversary Update or later, one can also use the Windows Subsystem for Linux. The code can also be compiled on Windows with Visual Studio, but this release does not yet include the necessary makefiles.
  • gcc version 5.4. Other gcc versions above 5.0 could also work.
  • An implementation of BLAS, sparse BLAS and vector math calls. We link against the implementation provided by the Intel(R) Math Kernel Library (MKL). Please use a recent version of MKL (2017 update 3 or later) where possible. The code can be made to work with other math libraries with a few modifications. A quick way to check your setup is given after this list.
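
To sanity-check the toolchain before building, you can run the following standard commands; the MKL path shown assumes the default installation location used in the Building section below:

gcc --version                       # expect gcc 5.x, ideally 5.4
ls /opt/intel/mkl/lib/intel64_lin/  # MKL libraries should appear here on a default install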

Building

After cloning this repository, set the compiler and compilation flags appropriately in config.mk. Then execute the following in bash:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<MKL_PATH>:<EDGEML_ROOT>
make -j

Typically, MKL_PATH = /opt/intel/mkl/lib/intel64_lin/, and EDGEML_ROOT is '.' (the repository root).
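
For example, with these typical paths the commands above expand to the following; this assumes a default MKL installation, so adjust the paths to match your system:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/intel/mkl/lib/intel64_lin/:.
make -j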

This will build two executables, Bonsai and ProtoNN. Sample data to try these executables is not included in this repository, but instructions for downloading a sample dataset are given below.

Download a sample dataset

Follow the bash commands given below to download a sample dataset, USPS10, to the repository. Bonsai and ProtoNN come with sample scripts to run on the usps10 dataset. EDGEML_ROOT is defined in the previous section.

cd <EDGEML_ROOT>
mkdir usps10
cd usps10
wget http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.bz2
wget http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.t.bz2
bzip2 -d usps.bz2
bzip2 -d usps.t.bz2
mv usps train.txt
mv usps.t test.txt
mkdir ProtoNNResults
cd <EDGEML_ROOT>

This will create sample train and test datasets on which you can train and test the Bonsai and ProtoNN algorithms. Note that we create the output folder for ProtoNN here; Bonsai, on the other hand, creates its own output folder. For instructions on actually running the algorithms, see the Bonsai Readme and ProtoNN Readme.
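
For reference, the downloaded USPS files are in the LIBSVM sparse format: each line holds a label followed by index:value pairs with 1-based feature indices, which matches the default 1-based I/O (see the ZERO_BASED_IO flag below). You can inspect the data like this; the line in the comment is a placeholder showing the format, not actual values:

head -n 1 train.txt
# each line looks like: <label> <index>:<value> <index>:<value> ...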

Makefile flags

You can change the behavior of the code by setting these flags in config.mk and rebuilding with make -Bj; an illustrative sketch of this is given after the flag lists below. The following flags are currently supported by both ProtoNN and Bonsai.

SINGLE/DOUBLE:  Single/Double precision floating-point. Single is most often sufficient. Double might help with reproducibility.
ZERO_BASED_IO:  Read datasets with 0-based labels and indices instead of the default 1-based. 
TIMER:          Timer logs. Print running time of various calls.
CONCISE:        Use together with TIMER to print only time deltas above a threshold.

The following flags currently change only the behavior of ProtoNN, though corresponding code can be written for Bonsai.

LOGGER:         Debugging logs. Currently prints min, max and norm of matrices.
LIGHT_LOGGER:   Less verbose version of LOGGER. Can be used to track call flow. 
XML:            Enable training with large sparse datasets with many labels. This is in beta.
VERBOSE:        Print additional informative output to stdout.
DUMP:           Dump models after each optimization iteration instead of just in the end.
VERIFY:         Legacy verification code for comparison with Matlab version.
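
As an illustration only: flags of this kind are typically passed to the compiler as preprocessor defines. The variable name below is an assumption, so use whichever variable config.mk already defines for this purpose:

# hypothetical config.mk fragment: single precision, timer logs, concise output
CFLAGS += -DSINGLE -DTIMER -DCONCISE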

Additionally, exactly one of the following two flags has to be set in the Makefile:

MKL_PAR_LDFLAGS: Linking with parallel version of MKL.
MKL_SEQ_LDFLAGS: Linking with sequential version of MKL.
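
For reference, typical MKL link lines look roughly like the following. This is a sketch based on standard MKL linking conventions, not necessarily the exact values used in this repository's Makefile:

MKL_SEQ_LDFLAGS = -L$(MKL_PATH) -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lpthread -lm -ldl
MKL_PAR_LDFLAGS = -L$(MKL_PATH) -lmkl_intel_lp64 -lmkl_core -lmkl_intel_thread -liomp5 -lpthread -lm -ldl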

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.