MLOS/mlos_core
Sergiy Matusevych 5daaa26399
Support for special tunable values outside of the range (#617)
Enables special values outside of the range (e.g., `-1` with a range of
`[0, 100]`).

To do so, we make use of "conditionals" in ConfigSpace to constrain the
space. This has a number of implementation implications, addressed
below:

* [x] Add support for special values to the `Tunable` class
* [x] Add unit tests for assigning special values outside of the range
to the `Tunable` objects
* [x] Add special values outside of the range to the unit tests for
`ConfigSpace` conversion
* [x] Implement proper `TunableGroups` to `ConfigSpace` conversion for
tunables with special values
* [x] Update `mlos_core` optimizers to support conditionals and special
values in `ConfigSpace`
* [x] Add more unit tests to check the conversion
* [x] Make LlamaTune adapter support conditionals in `ConfigSpace`

---------

Co-authored-by: Brian Kroth <bpkroth@users.noreply.github.com>
2024-01-17 13:29:53 -08:00


# mlos-core

This directory contains the code for the mlos-core optimizer package.

It's available via `pip install mlos-core` from the PyPI repository.

## Description

mlos-core is an optimizer package that wraps other libraries such as FLAML and SMAC, using techniques like Bayesian optimization to identify, sample, and propose optimal values for tunable configuration parameters through a consistent API: `suggest` and `register`.

These proposals can be evaluated by mlos-bench, which generates and tracks experiment results (proposed parameters, benchmark results, and telemetry) to update the optimization loop, or they can be used independently.
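
The suggest/register pattern can be sketched with a toy random-search optimizer. This is purely illustrative; `RandomSearchOptimizer` and the toy benchmark below are hypothetical, not part of mlos-core's API:

```python
import random

random.seed(0)  # reproducible toy run

class RandomSearchOptimizer:
    """Minimal optimizer exposing a suggest/register interface."""

    def __init__(self, ranges):
        self.ranges = ranges        # {param_name: (low, high)}
        self.observations = []      # list of (config, score)

    def suggest(self):
        """Propose a configuration to try next."""
        return {p: random.uniform(lo, hi) for p, (lo, hi) in self.ranges.items()}

    def register(self, config, score):
        """Feed a measured result back into the optimizer."""
        self.observations.append((config, score))

    def best(self):
        """Return the best (config, score) seen so far (minimization)."""
        return min(self.observations, key=lambda obs: obs[1])

opt = RandomSearchOptimizer({"cache_mb": (64, 1024)})
for _ in range(20):
    cfg = opt.suggest()                     # optimizer proposes parameters
    score = (cfg["cache_mb"] - 512) ** 2    # toy benchmark: best at 512
    opt.register(cfg, score)                # result flows back to the optimizer

best_cfg, best_score = opt.best()
```

A Bayesian optimizer replaces the uniform sampling in `suggest()` with a model fitted to the registered observations, but the calling loop looks the same.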

## Features

Since the tunable parameter search space is often extremely large, mlos-core automates the following steps to efficiently generate optimal task-specific kernel and application configurations.

1. **Reduce the search space** by identifying a promising set of tunable parameters.
   - *Map out the configuration search space:* Automatically track and manage the discovery of new Linux kernel parameters and their default values across versions. Filter out non-tunable parameters (e.g., those that are not writable) and track which kernel parameters exist for a given kernel version.
   - *Leverage parameter knowledge for optimization:* Information on ranges, sampling intervals, parameter correlations, and workload-type sensitivities of tunable parameters is tracked and currently manually curated. In the future, this could be maintained automatically by scraping documentation pages on kernel parameters.
   - *Tailor to the application:* Consider prior knowledge of a parameter's impact and an application's workload profile (e.g., network heavy, disk heavy, CPU bound, multi-threaded, latency sensitive, throughput oriented, etc.) to identify the candidate tunable parameters most likely to matter for a particular application.
2. **Sample to warm-start optimization** in a high-dimensional search space.
3. **Produce optimal configurations** through Bayesian optimization.
   - Support for various optimizer algorithms (the default Bayesian optimizer, FLAML, SMAC, and random search for baseline comparison) that handle multiple types of constraints, including cost-aware optimization that considers experiment costs given the current tunable parameters.
   - Integration with mlos-bench, so that proposed configurations are logged and evaluated.
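
Step 2 above (sampling to warm-start optimization) can be illustrated with a small Latin-hypercube-style sampler. This is a hypothetical sketch of the general technique, not mlos-core's implementation:

```python
import random

def warm_start_samples(ranges, n, seed=0):
    """Draw n stratified (Latin-hypercube-style) samples: each parameter's
    range is split into n equal strata and each stratum is sampled exactly
    once, spreading coverage better than plain uniform sampling."""
    rng = random.Random(seed)
    columns = {}
    for param, (lo, hi) in ranges.items():
        strata = list(range(n))
        rng.shuffle(strata)  # decorrelate the parameters' stratum orders
        width = (hi - lo) / n
        columns[param] = [lo + (s + rng.random()) * width for s in strata]
    # Zip the per-parameter columns into n configurations.
    return [{p: columns[p][i] for p in ranges} for i in range(n)]

samples = warm_start_samples({"cache_mb": (64, 1024), "threads": (1, 16)}, n=8)
```

Each resulting configuration would be evaluated once and registered with the optimizer before the Bayesian optimization loop proper begins, so the surrogate model starts with broad coverage of the space.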