HesBO

Category: Embedded / MCU / Hardware Programming
Development tool: Python
File size: 0KB
Downloads: 0
Upload date: 2019-06-08 07:38:30
Uploader: sh-1993
Description: The high-dimensional BayesOpt algorithms from "A Framework for Bayesian Optimization in Embedded Subspaces"

File list:
BLOSSOM/ (0, 2019-06-08)
BLOSSOM/blossom_run.py (1457, 2019-06-08)
BLOSSOM/embd_functions.py (7749, 2019-06-08)
Branin_D100_d4.jpg (43304, 2019-06-08)
KG/ (0, 2019-06-08)
KG/embd_functions.py (9935, 2019-06-08)
KG/examples/ (0, 2019-06-08)
KG/examples/bayesian_optimization.py (5910, 2019-06-08)
KG/examples/functions_vanilla.py (8462, 2019-06-08)
KG/moe_run.py (16381, 2019-06-08)
REMBO.py (6535, 2019-06-08)
count_sketch.py (5812, 2019-06-08)
experiments.py (13631, 2019-06-08)
functions.py (6859, 2019-06-08)
kernel_inputs.py (557, 2019-06-08)
projection_matrix.py (608, 2019-06-08)
projections.py (1801, 2019-06-08)
requirements.txt (38, 2019-06-08)

# A Framework for Bayesian Optimization in Embedded Subspaces

## What is high-dimensional Bayesian optimization?

Bayesian optimization (BO) has recently emerged as a powerful method for the global optimization of expensive-to-evaluate black-box functions. However, these methods are usually limited to about 15 input parameters (levers). In the paper "A Framework for Bayesian Optimization in Embedded Subspaces" (to appear at ICML '19), [Munteanu](https://www.statistik.tu-dortmund.de/munteanu.html "Alexander Munteanu"), Nayebi, and [Poloczek](http://www.sie.arizona.edu/poloczek "Matthias Poloczek") propose a non-adaptive probabilistic subspace embedding that can be combined with many BO algorithms to enable them to scale to higher-dimensional problems.

This repository provides Python implementations of several algorithms that extend BO to problems with high input dimensions:

* The HeSBO algorithm proposed by Munteanu, Nayebi, and Poloczek (ICML '19; see below for the citation), combined with
* The Knowledge Gradient (KG) algorithm of [Cornell-MOE](https://github.com/wujian16/Cornell-MOE "Cornell-MOE") (Wu & Frazier NIPS '16; Wu, Poloczek, Wilson, and Frazier NIPS '17)
* The [BLOSSOM algorithm](https://github.com/markm541374/gpbo "BLOSSOM") of McLeod, Osborne, and Roberts (ICML '18)
* Expected improvement, e.g., see Jones, Schonlau, and Welch (JGO '98)
* The REMBO method, using
  * the KX and Ky kernels of Wang et al. (JMLR '18), and
  * the Kψ kernel of Binois, Ginsbourger, and Roustant (LION '15).

## Installing the requirements

The code is written in Python 3.6, so it is recommended to use this version of Python to run the scripts. To install the requirements, simply run:

```bash
pip3 install -r requirements.txt
```

## Running different BO methods

HeSBO and three variants of REMBO are implemented in this code. The three REMBO variants are called Ky, KX, and Kψ. These algorithms can be run as follows.
```bash
python experiments.py [algorithm] [first_job_id] [last_job_id] [test_function] [num_of_steps] [low_dim] [high_dim] [num_of_initial_sample] [noise_variance] [REMBO_variant]
```

To choose the algorithm, pass `REMBO` or `HeSBO` to the Python script. If the REMBO algorithm is selected, the REMBO variant must be specified as `X`, `Y`, or `psi` in the last argument. If none of those variants is picked, all of them will be run. Here is an example of running HeSBO-EI on the noise-free 100-dimensional Branin function with 4 low dimensions:

```bash
python experiments.py HeSBO 1 1 Branin 80 4 100 10 0
```

To collect the output data, you must have a folder named "results". Here is a plot of running REMBO-Kψ and HeSBO-EI on the Branin function.
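The two families of methods above differ mainly in how they map a low-dimensional search point into the high-dimensional input box. The following is a minimal, illustrative sketch of the two embedding ideas, not the repository's code: the function names here are hypothetical, and the actual implementations live in `count_sketch.py` and `REMBO.py`.

```python
import random

def hesbo_embedding(d_low, D_high, seed=0):
    """HeSBO-style count-sketch embedding: each high dimension i is tied
    to one randomly chosen low dimension h(i) with a random sign s(i)."""
    rng = random.Random(seed)
    h = [rng.randrange(d_low) for _ in range(D_high)]
    s = [rng.choice((-1, 1)) for _ in range(D_high)]

    def embed(y):  # y in [-1, 1]^d_low  ->  x in [-1, 1]^D_high
        return [s[i] * y[h[i]] for i in range(D_high)]

    return embed

def rembo_embedding(d_low, D_high, seed=0):
    """REMBO-style embedding: x = clip(A y) with a dense Gaussian matrix A."""
    rng = random.Random(seed)
    A = [[rng.gauss(0.0, 1.0) for _ in range(d_low)] for _ in range(D_high)]

    def embed(y):
        x = [sum(a_ij * y_j for a_ij, y_j in zip(row, y)) for row in A]
        return [max(-1.0, min(1.0, x_i)) for x_i in x]  # clip back into the box

    return embed

y = [0.5, -0.25, 1.0, 0.0]            # a 4-dimensional "low" point
x_hesbo = hesbo_embedding(4, 100)(y)  # 100-dimensional image; stays in the box
x_rembo = rembo_embedding(4, 100)(y)  # 100-dimensional image after clipping
print(len(x_hesbo), len(x_rembo))     # 100 100
```

Note that the count-sketch image never leaves the box, so no clipping is needed, whereas REMBO's Gaussian projection can land outside the box and must be clipped, a known source of distortion that the Kψ variant addresses.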
## Citation

```bibtex
@inproceedings{HeSBO19,
  author    = {Alex Munteanu and Amin Nayebi and Matthias Poloczek},
  title     = {A Framework for Bayesian Optimization in Embedded Subspaces},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning, {(ICML)}},
  year      = {2019},
  note      = {Accepted for publication. The code is available at https://github.com/aminnayebi/HesBO.}
}
```
