mlrose is a Python package for applying some of the most common randomized optimization and search algorithms to a range of different optimization problems, over both discrete- and continuous-valued parameter spaces.
mlrose was initially developed to support students of Georgia Tech's OMSCS/OMSA offering of CS 7641: Machine Learning.
It includes implementations of all randomized optimization algorithms taught in this course, as well as functionality to apply these algorithms to integer-string optimization problems, such as N-Queens and the Knapsack problem; continuous-valued optimization problems, such as the neural network weight problem; and tour optimization problems, such as the Travelling Salesperson problem. It also has the flexibility to solve user-defined optimization problems.
At the time of development, no single Python package brought all of this functionality together in one location.
- Implementations of: hill climbing, randomized hill climbing, simulated annealing, genetic algorithm and (discrete) MIMIC;
- Solve both maximization and minimization problems;
- Define the algorithm's initial state or start from a random state;
- Define your own simulated annealing decay schedule or use one of three pre-defined, customizable decay schedules: geometric decay, arithmetic decay or exponential decay;
- Solve discrete-value (bit-string and integer-string), continuous-value and tour optimization (travelling salesperson) problems;
- Define your own fitness function for optimization or use a pre-defined function;
- Pre-defined fitness functions are provided for the One Max, Flip Flop, Four Peaks, Six Peaks, Continuous Peaks, Knapsack, Travelling Salesperson, N-Queens and Max-K Color optimization problems (see the example following this list);
- Optimize the weights of neural networks, linear regression models and logistic regression models using randomized hill climbing, simulated annealing, the genetic algorithm or gradient descent;
- Supports classification and regression neural networks.
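As a quick illustration, the sketch below solves the 8-Queens problem with the pre-defined Queens fitness function and simulated annealing, then wraps a user-defined fitness function. It is a minimal sketch based on the documented mlrose API; the `head_count` function is invented for illustration, and exact argument names may differ slightly between releases.

```python
import numpy as np
import mlrose_hiive

# 8-Queens with the pre-defined Queens fitness function, which counts
# attacking pairs, so this is a minimization problem over integer strings.
fitness = mlrose_hiive.Queens()
problem = mlrose_hiive.DiscreteOpt(length=8, fitness_fn=fitness,
                                   maximize=False, max_val=8)

# Exponential decay schedule for simulated annealing, starting from an
# explicit initial state (the queen in column i is placed on row i).
schedule = mlrose_hiive.ExpDecay()
init_state = np.arange(8)

best_state, best_fitness, _ = mlrose_hiive.simulated_annealing(
    problem, schedule=schedule, max_attempts=100, max_iters=1000,
    init_state=init_state, curve=True, random_state=1)

print(best_state, best_fitness)  # best_fitness == 0.0 means no queens attack

# A user-defined fitness function only needs to map a state vector to a
# float; wrap it with CustomFitness to use it in the same way.
def head_count(state):
    return float(np.sum(state))  # count of 1s in a bit string

custom_problem = mlrose_hiive.DiscreteOpt(
    length=10, fitness_fn=mlrose_hiive.CustomFitness(head_count),
    maximize=True, max_val=2)
```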
mlrose was written in Python 3 and requires NumPy, SciPy and Scikit-Learn (sklearn).
The latest version can be installed using pip:
pip install mlrose-hiive
Once it is installed, simply import it like so:
import mlrose_hiive
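As noted in the feature list above, mlrose can also tune the weights of neural network models using randomized optimization instead of back-propagation. The sketch below is a minimal illustration on a toy data set invented here; the NeuralNetwork parameters follow the documented API, but names and defaults may vary between releases.

```python
import numpy as np
import mlrose_hiive

# Toy binary classification data (invented for illustration): the label is 1
# when the feature sum exceeds a threshold.
rng = np.random.default_rng(3)
X = rng.random((200, 4))
y = (X.sum(axis=1) > 2.0).astype(int)

# Classification network whose weights are tuned with simulated annealing
# rather than gradient descent.
nn = mlrose_hiive.NeuralNetwork(hidden_nodes=[10], activation='relu',
                                algorithm='simulated_annealing',
                                max_iters=1000, learning_rate=0.1,
                                early_stopping=True, max_attempts=100,
                                random_state=3)
nn.fit(X, y)

# Training accuracy; predictions come back as a column of 0/1 labels.
y_pred = np.asarray(nn.predict(X)).flatten()
print(np.mean(y_pred == y))
```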
The official mlrose documentation can be found here.
A Jupyter notebook containing the examples used in the documentation is also available here.
mlrose was written by Genevieve Hayes and is distributed under the 3-Clause BSD license.
You can cite mlrose in research publications and reports as follows:
- Rollings, A. (2020). mlrose: Machine Learning, Randomized Optimization and SEarch package for Python, hiive extended remix. https://github.com/hiive/mlrose. Accessed: day month year.
Please also keep the original author's citation:
- Hayes, G. (2019). mlrose: Machine Learning, Randomized Optimization and SEarch package for Python. https://github.com/gkhayes/mlrose. Accessed: day month year.
When citing this fork, please be sure to also reference the original work. Thanks to David S. Park for the MIMIC enhancements (from https://github.com/parkds/mlrose).
BibTeX entries:
@misc{Hayes19,
author = {Hayes, G.},
title = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python}},
year = 2019,
howpublished = {\url{https://github.com/gkhayes/mlrose}},
note = {Accessed: day month year}
}
@misc{Rollings20,
author = {Rollings, A.},
title = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python, hiive extended remix}},
year = 2020,
howpublished = {\url{https://github.com/hiive/mlrose}},
note = {Accessed: day month year}
}