http://rmcantin.github.io/bayesopt/html/bopttheory.html
(Disclaimer) By the label “(Deprecated)” in the subtitles, I mean that the package looks like it has not been updated or maintained for a long time; I did not check or verify this. These are very subjective judgments meant to help me clear my mind. You are welcome to send me an email for any discussion.
Package | Language | Active? | Open-source | Version | Last Commit [note-1] | Surrogate | GP | Type-II ML optim. | MCMC | Acquisition Func. | Batch of points | Multi-Objective |
---|---|---|---|---|---|---|---|---|---|---|---|---|
pyGPGO | python | low | yes | v0.4.0.dev1 (11/02/2017) | 11/23/2020 | GP, tSP, RF, GBM | Native | MML, | pyMC3 | PI, EI, UCB, Entropy | ||
* scikit-optimize | python | yes | yes | v0.8.1 (09/04/2020) | 09/29/2020 | GP, RF, GBM | scikit-learn | MML, | No | PI, EI, UCB | ||
BayesianOptimization (bayes_opt) | python | yes | yes | v1.2.0 (05/16/2020) | 07/13/2020 | GP | scikit-learn | No | No | PI, EI, UCB | ||
* GPyOpt | python, GPy | ended | yes | v1.2.6 (03/19/2020) | 11/05/2020 | GP, RF, WGP | GPy | MML, | Yes | PI, EI, UCB | ||
Trieste | python, gpflow 2 | very | yes | v0.5.0 (06/10/2021) | 06/10/2021 | GPR, SGPR, SVGP | gpflow 2 | ? | No | EI, LCB, PF, | ||
BayesOpt | C++; Python, MATLAB | yes | yes | 0.9 (08/28/2018) | 05/14/2020 | GP, STP | Native | MML, ML, MAP, etc. | No | EI, LCB, MI, PI, Aopt, etc. | ||
Emukit (Amazon) | python, GPy | yes | yes | v0.4.7 (03/13/2019) | 11/30/2020 | GPy model wrapper | GPy | ? | No, almost | EI, LCB, PF, PI, etc. | ||
Spearmint | python, MongoDB | ended | yes | |||||||||
Whetlab (Twitter) ← Spearmint | no? | NO ||||||||||
pyBO | python | ended | yes | v0.1 (09/18/2015) | 09/20/2015 | |||||||
MOE (Yelp) | python | ended | ||||||||||
SigOpt ← MOE | NO |||||||||||
GPflowOpt | python, gpflow 1 | ended | ||||||||||
BoTorch (Facebook) | python, GPyTorch | yes, very | yes | v0.3.3 (12/08/2020) | 12/18/2020 | GPyTorch model wrapper | GPyTorch | Yes | | EI, UCB, PI, | yes ||
Not finished yet; a lot is still missing.
TODO: go over MLE, MLL, MAP, …
[note-1] As of the last time I updated this data.
What does “Type II ML optimization” mean in the original table from pyGPGO?
It should mean how the hyperparameters of the surrogate model are trained. “Type II ML optimization” indicates an empirical-Bayes capability (maximizing the marginal likelihood over the hyperparameters), while a fully Bayesian approach generally requires MCMC.
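To make the empirical-Bayes idea concrete, here is a minimal sketch of type-II ML for a GP: pick the hyperparameter that maximizes the log marginal likelihood of the data. All names are illustrative, the kernel is a 1-D RBF, and a grid search stands in for the gradient-based optimizers real packages use.

```python
import numpy as np

def log_marginal_likelihood(X, y, lengthscale, noise=1e-2):
    """GP log marginal likelihood with a 1-D RBF kernel
    (Rasmussen & Williams, GPML, Eq. 2.30)."""
    sq_dist = (X[:, None, 0] - X[None, :, 0]) ** 2
    K = np.exp(-0.5 * sq_dist / lengthscale ** 2) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # alpha = K^{-1} y
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()          # -1/2 log|K|
            - 0.5 * len(X) * np.log(2.0 * np.pi))

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)

# Type-II ML / empirical Bayes: choose the hyperparameter value that
# maximizes the marginal likelihood of the observed data.
grid = np.linspace(0.1, 3.0, 30)
best_ls = max(grid, key=lambda ls: log_marginal_likelihood(X, y, ls))
print(f"lengthscale chosen by type-II ML: {best_ls:.2f}")
```

A fully Bayesian treatment would instead keep a posterior over `lengthscale` (e.g., via MCMC) rather than committing to one value.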
What does “Integrated acq. function” mean in the original table?
It is an “integrated version” of the acquisition function used with MCMC. My current understanding: “integrated” means the acquisition function is averaged (integrated) over the MCMC posterior samples of the hyperparameters, instead of being evaluated at a single point estimate.
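A minimal sketch of that averaging, assuming a minimization problem and three hypothetical posterior predictions (the `(mu, sigma)` pairs stand in for what an MCMC sampler over hyperparameters would produce; none of these names come from any particular package):

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, best):
    """Analytic EI at one candidate point (minimization)."""
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    return (best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

# Hypothetical (mu, sigma) predictions at ONE candidate x, one pair per
# hyperparameter sample drawn by MCMC.
predictions = [(0.2, 0.5), (0.1, 0.8), (0.4, 0.3)]
best_so_far = 0.3

# Plug-in EI would use a single point estimate of the hyperparameters;
# the integrated EI averages the acquisition over the posterior samples.
integrated_ei = sum(expected_improvement(m, s, best_so_far)
                    for m, s in predictions) / len(predictions)
print(f"integrated EI: {integrated_ei:.3f}")
```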
https://github.com/SheffieldML/GPyOpt
https://github.com/fmfn/BayesianOptimization
scikit-optimize
https://github.com/josejimenezluna/pyGPGO (Python)
https://github.com/rmcantin/bayesopt (C++)
sGaussianProcess, sGaussianProcessML, sGaussianProcessNormal, sStudentTProcessJef, sStudentTProcessNIG
https://github.com/secondmind-labs/trieste
https://botorch.org/ (Python, PyTorch)
GPyTorchModel
provides a base class for conveniently wrapping GPyTorch models.
https://github.com/scikit-optimize/scikit-optimize (In my opinion a really bad name, but anyway, the package is named skopt. Maybe they intend to do something bigger than just BO.)
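The ask()/tell() pattern the package offers keeps you in control of each iteration. A framework-independent sketch of the idea (a hand-rolled stand-in that proposes random points, not skopt's actual implementation, which would maximize an acquisition function in `ask()`):

```python
import random

class RandomOptimizer:
    """Minimal stand-in for an optimizer exposing an ask()/tell() interface."""
    def __init__(self, bounds, seed=0):
        self.bounds = bounds              # [(low, high), ...] per dimension
        self.rng = random.Random(seed)
        self.X, self.y = [], []

    def ask(self):
        # A real BO optimizer would fit a surrogate to (X, y) and
        # return the maximizer of an acquisition function here.
        return [self.rng.uniform(lo, hi) for lo, hi in self.bounds]

    def tell(self, x, fx):
        self.X.append(x)
        self.y.append(fx)

def objective(x):
    return (x[0] - 1.0) ** 2

opt = RandomOptimizer([(-2.0, 2.0)])
for _ in range(30):
    x = opt.ask()              # we stay in control of each step...
    opt.tell(x, objective(x))  # ...and can log, batch, or abort in between
print(f"best value found: {min(opt.y):.4f}")
```

The point of the paradigm is exactly this loop inversion: instead of handing the whole budget to a `minimize(...)` call, the caller drives each iteration.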
Built on scikit-learn.
forest_minimize, gbrt_minimize (what is this?), and gp_minimize all use Optimizer under the hood, evaluating LCB, EI, and PI at each iteration.
skopt.learning.GaussianProcessRegressor (the implementation is based on Algorithm 2.1 of Gaussian Processes for Machine Learning (GPML) by Rasmussen and Williams).
The ask() and tell() paradigm allows control over the detailed steps.
Since it is based on scikit-learn, it feels simple and reliable. The default surrogate is a GP, with three other built-in choices, but other estimators are also accepted, and the detailed steps of the optimization process can be controlled. Worth considering.
https://github.com/EmuKit/emukit
EntropySearch, MultiInformationSourceEntropySearch, ExpectedImprovement, LocalPenalization, MaxValueEntropySearch, LowerConfidenceBound, ProbabilityOfFeasibility, ProbabilityOfImprovement, etc.
https://github.com/Yelp/MOE
MOE by Yelp is deployed by various companies in production settings. A downside, however, is that development stopped in 2017.
Any successor? Yes.
SigOpt was founded by the creator of the Metric Optimization Engine (MOE) and our research… (See: https://sigopt.com/solution/why-sigopt/)
Why was MOE abandoned? Or did it become closed-source? See above.
https://github.com/wujian16/Cornell-MOE
How does Cornell-MOE relate to the MOE BayesOpt package?
Cornell-MOE is based on the MOE BayesOpt package, developed at Yelp. MOE is extremely fast, but can be difficult to install and use. Cornell-MOE is designed to address these usability issues, focusing on ease of installation. Cornell-MOE also adds algorithmic improvements (e.g., Bayesian treatment of hyperparameters in GP regression, which improves robustness) and support for several new BayesOpt algorithms: an extension of the batch expected improvement (q-EI) to the setting where derivative information is available (d-EI, Wu et al., 2017); and batch knowledge gradient with (d-KG, Wu et al., 2017) and without (q-KG, Wu and Frazier, 2016) derivative information.
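The batch expected improvement q-EI mentioned above has no simple closed form for q > 1, but it can be estimated by Monte Carlo: sample joint values of the GP posterior at the q batch points and average the positive part of the batch's best improvement. A sketch under assumed, hypothetical posterior numbers (not code from MOE or Cornell-MOE):

```python
import numpy as np

def monte_carlo_qei(mu, cov, best, n_samples=10_000, seed=0):
    """Monte Carlo estimate of batch expected improvement q-EI
    (minimization): E[(best - min_j f(x_j))^+] under the joint GP
    posterior N(mu, cov) over the q batch points."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mu, cov, size=n_samples)  # (n, q)
    improvement = best - samples.min(axis=1)
    return np.maximum(improvement, 0.0).mean()

# Hypothetical joint posterior over a batch of q = 2 candidate points.
mu = np.array([0.0, 0.1])
cov = np.array([[0.25, 0.05],
                [0.05, 0.25]])
best_so_far = 0.2

qei = monte_carlo_qei(mu, cov, best_so_far)
print(f"q-EI estimate: {qei:.3f}")
```

With q = 1 and a diagonal covariance this reduces to ordinary EI; the off-diagonal terms are what make the batch points coordinate instead of clustering at the same optimum.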
(comment from BayesOpt's docs)
Spearmint (Python): A library based on [Snoek2012]. It is more oriented to cluster computing. It provides time based criteria. Implemented in Python with interface also to Matlab https://github.com/JasperSnoek/spearmint
Became Whetlab, was then acquired by Twitter, and then… disappeared? (source-1 and source-2) Set up a new subtitle if the eventual successor is found.
https://github.com/GPflow/GPflowOpt
Requires GPflow 1, which depends on TF 1.x, and thus its development has stopped.
Should be based on MOE.
(comment from BayesOpt's docs)
pybo (Python): A Python package for modular Bayesian optimization. https://github.com/mwhoffman/pybo
(comment from BayesOpt's docs)
DiceOptim (R): It adds optimization capabilities to DiceKriging. See [Roustant2012] and http://cran.r-project.org/web/packages/DiceOptim/index.html
https://github.com/resibots/limbo
Limbo (LIbrary for Model-Based Optimization) is an open-source C++11 library for Gaussian Processes and data-efficient optimization (e.g., Bayesian optimization) that is designed to be both highly flexible and very fast. It can be used as a state-of-the-art optimization library or to experiment with novel algorithms with “plugin” components.
Limbo is partly funded by the ResiBots ERC Project (http://www.resibots.eu).
* develop and study a novel family of such learning algorithms that make it possible for autonomous robots to quickly discover compensatory behaviors.
(comment from BayesOpt's docs)
Limbo (C++11): A lightweight framework for Bayesian and model-based optimisation of black-box functions (C++11). https://github.com/jbmouret/limbo
Some other packages that do not implement BO, but do something similar.
https://github.com/chrisstroemel/Simple