Table of Contents

Bayesian Optimization Packages

Online Tutorials

https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb, which mentions:

http://rmcantin.github.io/bayesopt/html/bopttheory.html

Online Discussions

https://www.reddit.com/r/datascience/comments/cpyer5/bayesian_optimization_libraries_python/

Packages (as of 12/18/2020)

(Disclaimer) By “(Deprecated)” in the subtitles, I mean that the package looks like it hasn't been updated or maintained for a long time. I didn't check or verify this. These are very subjective judgments meant to help me clear up my own mind. You are welcome to send me an email for any discussion.

Comparison table

| Package | Language | Active? | Open-source | Version | Last Commit [note-1] | Surrogate | GP backend | Type-II ML optim. | MCMC | Acquisition Func. | Batch of points | Multi-Objective |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| pyGPGO | python | low | yes | v0.4.0.dev1 (11/02/2017) | 11/23/2020 | GP, tSP, RF, GBM | Native | MML | pyMC3 | PI, EI, UCB, Entropy | | |
| * scikit-optimize | python | yes | yes | v0.8.1 (09/04/2020) | 09/29/2020 | GP, RF, GBM | scikit-learn | MML | No | PI, EI, UCB | | |
| BayesianOptimization (bayes_opt) | python | yes | yes | v1.2.0 (05/16/2020) | 07/13/2020 | GP | scikit-learn | No | No | PI, EI, UCB | | |
| * GPyOpt | python, GPy | ended | yes | v1.2.6 (03/19/2020) | 11/05/2020 | GP, RF, WGP | GPy | MML | Yes | PI, EI, UCB | | |
| Trieste | python, gpflow 2 | very | yes | v0.5.0 (06/10/2021) | 06/10/2021 | GPR, SGPR, SVGP | gpflow 2 | ? | No | EI, LCB, PF | | |
| BayesOpt | C++; Python, MATLAB | yes | yes | 0.9 (08/28/2018) | 05/14/2020 | GP, STP | Native | MML, ML, MAP, etc. | No | EI, LCB, MI, PI, Aopt, etc. | | |
| Emukit (Amazon) | python, GPy | yes | yes | v0.4.7 (13/03/2019) | 11/30/2020 | GPy model wrapper | GPy | ? | No, almost | EI, LCB, PF, PI, etc. | | |
| Spearmint | python, MongoDB | ended | yes | | | | | | | | | |
| Whetlab (Twitter) ← Spearmint | | no? | NO | | | | | | | | | |
| pyBO | python | ended | yes | v0.1 (09/18/2015) | 09/20/2015 | | | | | | | |
| MOE (Yelp) | python | ended | | | | | | | | | | |
| SigOpt ← MOE | | | NO | | | | | | | | | |
| GPflowOpt | python, gpflow 1 | ended | | | | | | | | | | |
| BoTorch (Facebook) | python, GPyTorch | yes, very | yes | v0.3.3 (12/08/2020) | 12/18/2020 | GPyTorch model wrapper | GPyTorch | Yes | | EI, UCB, PI | yes | |

Notes about the table

:TODO: Not finished yet. Missing a lot of things.
:todo: go over MLE, MLL, MAP, …

[note-1] As of the last time I updated this data.

What does “Type II ML optimization” mean in the original table from pyGPGO?
It should refer to how the hyperparameters of the surrogate model are trained. “Type II ML optimization” indicates an empirical Bayes capability (point estimates of the hyperparameters obtained by maximizing the marginal likelihood), whereas a fully Bayesian treatment generally requires MCMC.
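
As a concrete illustration (a minimal sketch, not tied to any package in the table): fitting scikit-learn's GaussianProcessRegressor maximizes the log marginal likelihood of the kernel hyperparameters, which is exactly type II ML / empirical Bayes.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# toy data
rng = np.random.default_rng(0)
X = rng.random((20, 1))
y = np.sin(6 * X).ravel() + 0.1 * rng.standard_normal(20)

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X, y)  # hyperparameters chosen by maximizing the log marginal likelihood

print(gpr.kernel_)                                     # optimized hyperparameters
print(gpr.log_marginal_likelihood(gpr.kernel_.theta))  # the objective that was maximized
```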

What does “Integrated acq. function” mean in the original table?
It should be the “integrated version” of the acquisition function used together with MCMC: instead of plugging a single point estimate of the hyperparameters into the acquisition function, the acquisition is averaged (integrated) over the MCMC samples of the hyperparameters. :todo: verify this reading.
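
Under that reading, a minimal sketch (the names `acquisition` and `hyperparameter_samples` are hypothetical placeholders, not from any particular package):

```python
import numpy as np

def integrated_acquisition(x, hyperparameter_samples, acquisition):
    """Monte Carlo average of a plain acquisition function (e.g. EI) over
    MCMC samples of the surrogate hyperparameters, instead of evaluating it
    at a single point estimate.

    `acquisition(x, theta)` and `hyperparameter_samples` are hypothetical
    placeholders used only for illustration.
    """
    values = [acquisition(x, theta) for theta in hyperparameter_samples]
    return np.mean(values)
```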


(Deprecated) GPyOpt (Python, GPy) (Sheffield)

https://github.com/SheffieldML/GPyOpt
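
A minimal usage sketch, assuming the GPyOpt 1.2.x API (toy objective, not from the docs):

```python
import numpy as np
import GPyOpt

def f(x):
    # GPyOpt passes a 2-D array of shape (n_points, n_dims)
    return np.sum((x - 0.3) ** 2, axis=1, keepdims=True)

domain = [{'name': 'x', 'type': 'continuous', 'domain': (0, 1)}]
bo = GPyOpt.methods.BayesianOptimization(f=f, domain=domain, acquisition_type='EI')
bo.run_optimization(max_iter=15)
print(bo.x_opt, bo.fx_opt)  # best input and best objective value found
```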

BayesianOptimization, bayes_opt (Python)

https://github.com/fmfn/BayesianOptimization
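
A minimal usage sketch, assuming the bayes_opt 1.2.0 API (toy objective to maximize):

```python
from bayes_opt import BayesianOptimization

def black_box(x, y):
    # bayes_opt maximizes the returned value
    return -(x - 1.0) ** 2 - (y + 2.0) ** 2

pbounds = {'x': (-3, 3), 'y': (-4, 1)}
optimizer = BayesianOptimization(f=black_box, pbounds=pbounds, random_state=1)
optimizer.maximize(init_points=3, n_iter=10)
print(optimizer.max)  # best parameters and target value seen so far
```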

pyGPGO (python; scikit-learn, pyMC3, theano)

https://github.com/josejimenezluna/pyGPGO (Python)

BayesOpt (C++, Python/MATLAB API)

https://github.com/rmcantin/bayesopt (C++)

trieste (python, GPFlow 2.x)

https://github.com/secondmind-labs/trieste

BoTorch (Python, PyTorch) (Facebook)

https://botorch.org/ (Python, PyTorch)
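
A sketch of one BO iteration, assuming roughly the v0.3.x API (BoTorch exposes lower-level building blocks than the other packages; details may differ in later versions):

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# toy data on [0, 1]^2; BoTorch assumes maximization by default
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)

model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_model(mll)  # type-II ML fit of the GP hyperparameters

acq = ExpectedImprovement(model, best_f=train_Y.max())
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, acq_value = optimize_acqf(
    acq, bounds=bounds, q=1, num_restarts=5, raw_samples=64
)
print(candidate)  # next point to evaluate
```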

scikit-optimize (python, scikit-learn)

https://github.com/scikit-optimize/scikit-optimize (In my opinion, a really bad name, but anyway the package is imported as skopt. Maybe they intend to do something bigger than just BO.)
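
A minimal usage sketch, assuming the skopt 0.8.x API (gp_minimize minimizes the objective):

```python
from skopt import gp_minimize

def objective(params):
    x, y = params
    return (x - 1.0) ** 2 + (y + 2.0) ** 2  # skopt minimizes the returned value

result = gp_minimize(
    objective,
    dimensions=[(-3.0, 3.0), (-4.0, 1.0)],  # one (low, high) tuple per dimension
    acq_func="EI",
    n_calls=25,
    random_state=0,
)
print(result.x, result.fun)  # best point and best value found
```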

Emukit (Python, GPy) (Amazon backed?)

https://github.com/EmuKit/emukit

(Deprecated) MOE (python) (Yelp)

https://github.com/Yelp/MOE

MOE by Yelp is deployed by various companies in production settings. A downside, however, is that development stopped in 2017.

:TODO: Any successor? Yes.

SigOpt was founded by the creator of the Metric Optimization Engine (MOE) and our research… (See: https://sigopt.com/solution/why-sigopt/)

:TODO: Why was MOE abandoned? Or did it become closed source? See above.

(Deprecated) Cornell-MOE (python)

https://github.com/wujian16/Cornell-MOE

How does Cornell-MOE relate to the MOE BayesOpt package?
Cornell-MOE is based on the MOE BayesOpt package, developed at Yelp. MOE is extremely fast, but can be difficult to install and use. Cornell-MOE is designed to address these usability issues, focusing on ease of installation. Cornell-MOE also adds algorithmic improvements (e.g., Bayesian treatment of hyperparameters in GP regression, which improves robustness) and support for several new BayesOpt algorithms: an extension of the batch expected improvement (q-EI) to the setting where derivative information is available (d-EI, Wu et al, 2017); and batch knowledge gradient with (d-KG, Wu et al, 2017) and without (q-KG, Wu and Frazier, 2016) derivative information.

(Deprecated) Spearmint (Python 2.7, MongoDB)

(comment from BayesOpt's docs)
Spearmint (Python): A library based on [Snoek2012]. It is more oriented to cluster computing. It provides time based criteria. Implemented in Python with interface also to Matlab https://github.com/JasperSnoek/spearmint

It became Whetlab, which was then acquired by Twitter, and then… disappeared? (source-1 and source-2) :TODO: set up a new subtitle if I find the eventual successor.

(Deprecated) GPflowOpt (python, TF 1.x)

https://github.com/GPflow/GPflowOpt

It requires GPflow 1, which depends on TF 1.x, so its development has stopped.

See instead the GPflow 2 based Trieste: https://github.com/secondmind-labs/trieste

(Commercial) SigOpt ← MOE

Should be based on MOE.

Not skimmed yet

pybo (Python)

(comment from BayesOpt's docs)
pybo (Python): A Python package for modular Bayesian optimization. https://github.com/mwhoffman/pybo

DiceOptim (R)

(comment from BayesOpt's docs)
DiceOptim (R): It adds optimization capabilities to DiceKriging. See [Roustant2012] and http://cran.r-project.org/web/packages/DiceOptim/index.html

Limbo (C++)

https://github.com/resibots/limbo

Limbo (LIbrary for Model-Based Optimization) is an open-source C++11 library for Gaussian Processes and data-efficient optimization (e.g., Bayesian optimization) that is designed to be both highly flexible and very fast. It can be used as a state-of-the-art optimization library or to experiment with novel algorithms with “plugin” components.
Limbo is partly funded by the ResiBots ERC Project (http://www.resibots.eu).
* develop and study a novel family of such learning algorithms that make it possible for autonomous robots to quickly discover compensatory behaviors.
(comment from BayesOpt's docs)
Limbo (C++11): A lightweight framework for Bayesian and model-based optimisation of black-box functions (C++11). https://github.com/jbmouret/limbo

TS-EMO (MATLAB)

https://github.com/Eric-Bradford/TS-EMO

mlrMBO (R)

https://github.com/mlr-org/mlrMBO

BO Alternatives

Some other packages that don't implement BO, but do something similar.

Simple (Python)

https://github.com/chrisstroemel/Simple