Merge branch 'ksaur/readmeedits' of https://github.com/microsoft/hummingbird into mainterl-updatedoc

Matteo Interlandi 2020-05-05 13:46:33 -07:00
Parent b842c4389e bf7b31c06f
Commit ab8e6d8230
1 changed file with 7 additions and 7 deletions

View file

@@ -7,10 +7,10 @@
[![Gitter](https://badges.gitter.im/hummingbird-ml/community.svg)](https://gitter.im/hummingbird-ml/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
## Introduction
-*Hummingbird* converts trained traditional Machine Learning models into [PyTorch](https://pytorch.org/). Once in the PyTorch format, <!--you can further convert to [ONNX](https://github.com/onnx/onnx) or [TorchScript](https://pytorch.org/docs/stable/jit.html), and --> you can run the models on GPU for high performance native scoring. For full details, see [our papers](#documentation).
+*Hummingbird* is a library for accelerating inference (scoring/prediction) in traditional machine learning (ML) models. It compiles traditional ML pipelines into tensor computations, which allows users to seamlessly leverage hardware acceleration without having to re-engineer their models.
-Currently we support [these](https://github.com/microsoft/hummingbird/blob/develop/hummingbird/_supported_operators.py#L26) tree-based classifiers and regressors. These models include
-[scikit-learn](https://scikit-learn.org/stable/) models such as Decision Trees and Random Forest, and also [LightGBM](https://github.com/Microsoft/LightGBM) and [XGBoost](https://github.com/dmlc/xgboost) Classifiers/Regressors.
+You can use *Hummingbird* to convert your trained traditional ML models into [PyTorch](https://pytorch.org/). Currently we support a variety of tree-based classifiers and regressors (list [here](https://github.com/microsoft/hummingbird/wiki/Supported-Operators)). These models include
+[scikit-learn](https://scikit-learn.org/stable/) Decision Trees and Random Forest, and also [LightGBM](https://github.com/Microsoft/LightGBM) and [XGBoost](https://github.com/dmlc/xgboost) Classifiers/Regressors.
## Installation
@@ -44,11 +44,11 @@ X = np.array(np.random.rand(100000, 28), dtype=np.float32)
y = np.random.randint(num_classes, size=100000)
# Create and train a model (LightGBM in this case)
-model = lgb.LGBMClassifier()
-model.fit(X, y)
+lgb_model = lgb.LGBMClassifier()
+lgb_model.fit(X, y)
-# Use Hummingbird to convert the model to pytorch
-model = model.to('pytorch')
+# Use Hummingbird to convert the model to PyTorch
+model = lgb_model.to('pytorch')
# Run predictions on CPU
model.predict(X)
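
For context, a self-contained version of the README example edited in the second hunk might look like the sketch below. The imports, the `num_classes` value, and the GPU lines are assumptions not visible in this diff; the conversion step uses `hummingbird.ml.convert(model, 'pytorch')`, the documented entry point, swapped in for the `.to('pytorch')` call shown in the added lines.

```python
# A minimal, runnable sketch of the README example edited above.
# num_classes and the imports are assumptions filled in around the diff context;
# the conversion uses hummingbird.ml.convert() in place of the .to('pytorch')
# call that the added lines of this diff show.
import numpy as np
import lightgbm as lgb
from hummingbird.ml import convert

# Create some random data for binary classification (assumed setup)
num_classes = 2
X = np.array(np.random.rand(100000, 28), dtype=np.float32)
y = np.random.randint(num_classes, size=100000)

# Create and train a model (LightGBM in this case)
lgb_model = lgb.LGBMClassifier()
lgb_model.fit(X, y)

# Use Hummingbird to convert the trained model to PyTorch
model = convert(lgb_model, 'pytorch')

# Run predictions on CPU
model.predict(X)

# Optionally move the converted model to GPU and score there, mirroring the
# "run the models on GPU" claim in the introduction (requires a CUDA device):
# model.to('cuda')
# model.predict(X)
```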