hummingbird/tests/tree_utils.py

"""
Collection of utils for testing tree converters.
"""
gbdt_implementation_map = {
    "tree_trav": "<class 'hummingbird.ml.operator_converters._tree_implementations.TreeTraversalGBDTImpl'>",
    "perf_tree_trav": "<class 'hummingbird.ml.operator_converters._tree_implementations.PerfectTreeTraversalGBDTImpl'>",
    "gemm": "<class 'hummingbird.ml.operator_converters._tree_implementations.GEMMGBDTImpl'>",
}
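
# Expected implementation classes for decision-tree-based models (e.g. decision tree,
# random forest), keyed by the same strategy names.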
dt_implementation_map = {
    "tree_trav": "<class 'hummingbird.ml.operator_converters._tree_implementations.TreeTraversalDecisionTreeImpl'>",
    "perf_tree_trav": "<class 'hummingbird.ml.operator_converters._tree_implementations.PerfectTreeTraversalDecisionTreeImpl'>",
    "gemm": "<class 'hummingbird.ml.operator_converters._tree_implementations.GEMMDecisionTreeImpl'>",
}
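
# Expected implementation classes for scikit-learn IsolationForest models.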
iforest_implementation_map = {
    "tree_trav": "<class 'hummingbird.ml.operator_converters.sklearn.iforest.TreeTraversalIsolationForestImpl'>",
    "perf_tree_trav": "<class 'hummingbird.ml.operator_converters.sklearn.iforest.PerfectTreeTraversalIsolationForestImpl'>",
    "gemm": "<class 'hummingbird.ml.operator_converters.sklearn.iforest.GEMMIsolationForestImpl'>",
}
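

# ---------------------------------------------------------------------------
# Hypothetical usage sketch (not part of the original maps above).  Each map
# pairs a tree-conversion strategy name with the string form of the operator
# class Hummingbird is expected to instantiate, so a test can force a strategy
# and compare str(type(...)) of the picked operator against the map entry.
# The convert() call, the "tree_implementation" extra_config key, and the way
# the picked operator is retrieved in the commented example are assumptions
# for illustration; consult the actual converter API and container internals.
#
#     import hummingbird.ml
#     hb_model = hummingbird.ml.convert(
#         skl_model, "torch", extra_config={"tree_implementation": "gemm"}
#     )
#     picked = ...  # first operator instantiated inside hb_model (internals vary)
#     assert_implementation(picked, "gemm", dt_implementation_map)


def assert_implementation(picked_operator, impl_name, implementation_map):
    """Check that the operator instantiated for `impl_name` matches the expected class string."""
    expected = implementation_map[impl_name]
    actual = str(type(picked_operator))
    assert actual == expected, "expected {} for '{}', got {}".format(expected, impl_name, actual)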