Entropy-Search: Information-Efficient Global Optimization

Paper: [preprint] [BibTeX]
Hennig, P. & Schuler, C.: Entropy Search for Information-Efficient Global Optimization.
Journal of Machine Learning Research (JMLR), Vol. 13, 2012. Please cite this paper when you use the algorithm.
Code:
EntropySearch.zip. After downloading, refer to the README.txt file.
Note added in 2017: This code package is kept mostly for legacy purposes. An updated version, which runs significantly faster and has updated documentation, can be found on GitHub.

In numerical optimization, where the input space can have thousands or millions of dimensions, computational cost is the bottleneck. But in experimental design, also known as global optimization, the objective function itself is unknown and each evaluation often carries a physical cost, so sample cost dominates, and optimization algorithms should aim to extract as much information as possible from every evaluation. Entropy Search was designed specifically for this purpose. It is a global, black-box optimization algorithm that takes noisy evaluations as inputs and actively guides the experimental optimization process. It turns the physical problem of finding good experimental parameters into the numerical problem of maximizing information gain: each new evaluation point is chosen to maximally reduce the uncertainty about the location of the optimum. Entropy Search is not numerically cheap (choosing each new evaluation point takes several seconds on a contemporary machine), but it is highly sample-efficient, saving valuable experimentation time and resources.
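To make the idea concrete, below is a minimal Python sketch of a single Entropy Search step under strong simplifying assumptions: a one-dimensional input, a fixed squared-exponential Gaussian process surrogate, and brute-force Monte Carlo estimates of p_min, the belief over the location of the minimum. The helper names (gp_posterior, pmin_entropy, entropy_search_step) are illustrative, not the interface of the released package; the actual algorithm replaces the Monte Carlo steps with Expectation Propagation and analytic approximations, which is where much of its efficiency comes from.

# Illustrative sketch of the Entropy Search idea, not the released
# package's interface. Assumes a 1-D input, a fixed squared-exponential
# GP surrogate, and Monte Carlo estimates of the belief p_min over the
# minimizer's location.
import numpy as np

def gp_posterior(X, y, Xs, noise=1e-2, ls=0.3):
    """GP posterior mean and covariance at test points Xs."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)
    L = np.linalg.cholesky(k(X, X) + noise * np.eye(len(X)))
    Ks = k(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, k(Xs, Xs) - v.T @ v

def pmin_entropy(mu, cov, n_samples=500):
    """Entropy of p_min, estimated by sampling functions from the GP
    posterior on a grid and counting where each sample is smallest."""
    f = np.random.multivariate_normal(mu, cov + 1e-8 * np.eye(len(mu)), n_samples)
    p = np.bincount(f.argmin(axis=1), minlength=len(mu)) / n_samples
    p = np.maximum(p, 1e-12)
    return -(p * np.log(p)).sum()

def entropy_search_step(X, y, grid, candidates, noise=1e-2, n_fantasies=10):
    """Pick the candidate evaluation point with the largest expected
    reduction in the entropy of p_min (expected information gain)."""
    mu, cov = gp_posterior(X, y, grid, noise)
    h0 = pmin_entropy(mu, cov)
    best_x, best_gain = None, -np.inf
    for xc in candidates:
        m, s2 = gp_posterior(X, y, np.array([xc]), noise)
        gains = []
        for _ in range(n_fantasies):
            # fantasize a noisy observation at the candidate point ...
            yf = np.random.normal(m[0], np.sqrt(s2[0, 0] + noise))
            # ... and measure how much it would sharpen p_min
            mu2, cov2 = gp_posterior(np.append(X, xc), np.append(y, yf), grid, noise)
            gains.append(h0 - pmin_entropy(mu2, cov2))
        if np.mean(gains) > best_gain:
            best_x, best_gain = xc, np.mean(gains)
    return best_x

# Hypothetical usage with two prior observations on [0, 1]:
grid = np.linspace(0.0, 1.0, 50)
X, y = np.array([0.2, 0.8]), np.array([0.3, -0.1])
x_next = entropy_search_step(X, y, grid, candidates=grid)

An outer loop would then run the real experiment at x_next, append the result to (X, y), and repeat. The cost of reasoning about information gain at each step is exactly the per-evaluation overhead mentioned in the paragraph above, and the price paid for sample efficiency.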