hummingbird/setup.py

from setuptools import find_packages, setup
import os
this = os.path.dirname(__file__)
packages = find_packages()
assert packages
# read version from the package file.
with open(os.path.join(this, "hummingbird/__init__.py"), "r") as f:
    line = [_ for _ in [_.strip("\r\n ") for _ in f.readlines()] if _.startswith("__version__")]
    if len(line) > 0:
        version_str = line[0].split("=")[1].strip('" ')

README = os.path.join(os.getcwd(), "README.md")
with open(README) as f:
    long_description = f.read()
start_pos = long_description.find("## Introduction")
if start_pos >= 0:
    long_description = long_description[start_pos:]
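
# Core runtime dependencies installed with the base package.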
install_requires = [
"numpy>=1.15",
"onnxconverter-common>=1.6.0",
"scipy",
"scikit-learn",
"torch>1.7.0",
"psutil",
"dill",
"protobuf>=3.20.2",
]
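
# Optional dependencies for the ONNX backends, exposed through the "onnx" extra below.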
onnx_requires = [
"onnxruntime>=1.0.0",
"onnxmltools>=1.6.0,<=1.11.0",
"skl2onnx>=1.7.0",
]
extra_requires = [
    # Whether each of these is needed depends on which libraries you plan to convert from.
"xgboost>=0.90,<2.0.0",
"lightgbm>=2.2,<=3.3.5",
"holidays==0.24",
"prophet==1.1",
]
setup(
    name="hummingbird-ml",
    version=version_str,
    description="Convert trained traditional machine learning models into tensor computations",
    long_description=long_description,
    long_description_content_type="text/markdown",
    license="MIT License",
    author="Microsoft Corporation",
    author_email="hummingbird-dev@microsoft.com",
    url="https://github.com/microsoft/hummingbird",
    packages=packages,
    include_package_data=True,
    install_requires=install_requires,
    extras_require={
        "tests": ["flake8", "pytest", "coverage", "pre-commit"],
        "sparkml": ["pyspark>=2.4.4", "pyarrow>1.0"],
        "onnx": onnx_requires,
        "extra": extra_requires,
        "benchmark": onnx_requires + extra_requires + ["memory-profiler", "psutil"],
    },
    classifiers=[
        "Environment :: Console",
        "Intended Audience :: Developers",
        "Programming Language :: Python",
"Operating System :: OS Independent",
"License :: OSI Approved :: MIT License",
],
python_requires=">=3.8",
)
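
# Installation sketch (assuming a standard pip workflow; the extras defined in
# extras_require above can be pulled in with pip's "[extra]" syntax):
#   pip install hummingbird-ml              # core dependencies only
#   pip install "hummingbird-ml[onnx]"      # adds onnxruntime, onnxmltools, skl2onnx
#   pip install "hummingbird-ml[extra]"     # adds xgboost, lightgbm, holidays, prophet
#   pip install -e ".[tests]"               # from a source checkout, with test/lint tooling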