Advancing Bayesian Optimization via Learning Correlated Latent Space

Seunghun Lee*, Jaewon Chu*, Sihyeon Kim*, Juyeon Ko, Hyunwoo J. Kim†
MLV Lab, Korea University
NeurIPS 2023

Bayesian optimization

  • Black-box function: function evaluation is available (O), but derivatives/gradients are not (X).
    (Figure: https://en.wikipedia.org/wiki/Black_box)
  • Example applications: automated design, drug discovery.
  • The feasible set of the optimization problem is often complex (high-dimensional, structured, or discrete), which motivates latent space Bayesian optimization.
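The Bayesian optimization loop described above can be sketched on a toy 1-D problem. Everything here is illustrative: `black_box` is a hypothetical objective, and a simple kernel-weighted surrogate with a crude uncertainty stands in for the Gaussian process a real BO system would use.

```python
import math
import random

def black_box(x):
    # Hypothetical expensive objective: evaluations only, no gradients.
    return -(x - 0.3) ** 2

def surrogate_mean_std(x, X, Y, length=0.2):
    # Toy surrogate: kernel-weighted mean of observed values, plus a
    # crude uncertainty that is high far from data and shrinks near it.
    ws = [math.exp(-((x - xi) / length) ** 2) for xi in X]
    s = sum(ws)
    mean = sum(w * y for w, y in zip(ws, Y)) / s
    std = math.exp(-s)
    return mean, std

def ucb(x, X, Y, beta=2.0):
    # Upper-confidence-bound acquisition: trade off mean vs. uncertainty.
    m, s = surrogate_mean_std(x, X, Y)
    return m + beta * s

random.seed(0)
X = [random.random() for _ in range(3)]   # initial design
Y = [black_box(x) for x in X]
for _ in range(20):
    # Optimize the acquisition over a candidate grid (no gradients needed).
    cand = max((i / 200 for i in range(201)), key=lambda x: ucb(x, X, Y))
    X.append(cand)
    Y.append(black_box(cand))
print(max(Y))
```

The acquisition first explores regions far from observed points (high uncertainty), then exploits near the best observed values, which is the usual BO behavior.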

Latent space Bayesian optimization

  • An encoder maps the dataset from the input space into a latent space, and a decoder maps latent vectors back to the input space.
  • A surrogate function is fit in the latent space; a candidate is selected (within a trust region) and decoded into new data, which is evaluated by the black-box objective function.
  • This pipeline introduces two gaps: Gap (1), between the latent space and the black-box objective function, and Gap (2), between the latent space and the input space. CoBO addresses both.
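The encode → propose → decode → evaluate loop above can be sketched with toy stand-ins. All components here (`encode`, `decode`, `objective`, `propose`) are hypothetical placeholders chosen to illustrate the data flow, not the models or acquisition used in the paper.

```python
import random

def encode(x):        # Encoder: input space -> latent space (toy, invertible)
    return [x[0] + x[1], x[0] - x[1]]

def decode(z):        # Decoder: latent space -> input space
    return [(z[0] + z[1]) / 2, (z[0] - z[1]) / 2]

def objective(x):     # Black-box objective, evaluated only in input space
    return -(x[0] - 1) ** 2 - (x[1] + 1) ** 2

def propose(Z, Y, radius=0.5):
    # Surrogate + acquisition stand-in: perturb the best latent point
    # inside a small trust region.
    z_best = Z[Y.index(max(Y))]
    return [c + random.uniform(-radius, radius) for c in z_best]

random.seed(0)
X = [[random.uniform(-2, 2), random.uniform(-2, 2)] for _ in range(5)]
Y = [objective(x) for x in X]
Z = [encode(x) for x in X]

for _ in range(50):
    z_new = propose(Z, Y)           # candidate in latent space
    x_new = decode(z_new)           # map back to input space
    y_new = objective(x_new)        # black-box evaluation
    X.append(x_new); Z.append(z_new); Y.append(y_new)  # new data
print(max(Y))
```

In real latent space BO the encoder/decoder form a generative model (e.g. a VAE) and the decoder is not an exact inverse of the encoder, which is precisely where Gap (2) comes from.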

Advancing Bayesian Optimization via Learning Correlated Latent Space
> Aligning the latent space with the objective function [Gap (1)]

  • We propose two regularizations:
    1. A Lipschitz regularization that improves the correlation between distances of the latent vectors z and differences of the objective function values y by encouraging L-Lipschitz continuity.
    2. A regularization that prevents the trivial solution in which the scale of the latent space merely grows under the Lipschitz regularization.
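One way to picture the first regularization is as a pairwise hinge penalty on violations of L-Lipschitz continuity, i.e. pairs with |y_i − y_j| > L·‖z_i − z_j‖. This is a minimal sketch of the idea only; the exact loss used in the paper may differ.

```python
def lipschitz_penalty(Z, Y, L=1.0):
    """Average hinge penalty over pairs (i, j) where the change in
    objective value exceeds L times the latent distance."""
    total = 0.0
    n = len(Z)
    for i in range(n):
        for j in range(i + 1, n):
            dist = sum((a - b) ** 2 for a, b in zip(Z[i], Z[j])) ** 0.5
            total += max(0.0, abs(Y[i] - Y[j]) - L * dist) ** 2
    return total / (n * (n - 1) / 2)

Y = [0.0, 1.0, 2.0]
# A latent space whose distances track objective differences incurs no penalty...
Z_good = [[0.0], [1.0], [2.0]]
# ...while one that packs very different values close together is penalized.
Z_bad = [[0.0], [0.1], [0.2]]
print(lipschitz_penalty(Z_good, Y), lipschitz_penalty(Z_bad, Y))
```

The second regularization is needed exactly because this penalty alone can be minimized by simply inflating all latent distances, i.e. growing the scale of the latent space.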

Advancing Bayesian Optimization via Learning Correlated Latent Space
> Aligning the latent space with the input space [Gap (2)]

  • We suggest a loss weighting scheme to align promising areas of the latent space with the input space.
  • The weighting function is based on the cumulative distribution function of the Gaussian distribution, where y_q is a specific quantile of the distribution of Y.
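The weighting idea can be sketched as follows, assuming each point's loss is weighted by the standard normal CDF of its objective value centered at the quantile y_q; the exact functional form and hyperparameters in the paper may differ.

```python
import math

def gaussian_cdf(t):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def loss_weights(Y, q=0.75, sigma=1.0):
    """Weight each point by the Gaussian CDF of its objective value,
    centered at the empirical q-quantile y_q, so points in promising
    (high-y) regions dominate the training losses.  The choice of
    q and sigma here is illustrative."""
    ys = sorted(Y)
    y_q = ys[int(q * (len(ys) - 1))]  # a simple empirical quantile
    return [gaussian_cdf((y - y_q) / sigma) for y in Y]

print(loss_weights([0.1, 0.4, 0.9, 2.0, 3.5]))
```

Points well below y_q get weights near 0 and points well above it get weights near 1, so the learned latent space concentrates its capacity on the promising region.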


Experiments

  • Molecule design tasks (Guacamol benchmark, TDC benchmark)
  • Arithmetic expression fitting task


Analysis

Correlation Improvement

  • Models trained with L_align show a consistently higher Pearson correlation between latent-space distances and objective-value differences.

Figure: the Pearson correlation with L_align (orange) and without L_align (blue).
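The quantity measured in this analysis, the Pearson correlation between pairwise latent distances and pairwise objective-value differences, can be computed with a short sketch (toy data, pure Python):

```python
import math

def pearson(a, b):
    # Standard Pearson correlation coefficient of two equal-length lists.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def distance_value_correlation(Z, Y):
    # Collect pairwise latent distances and objective differences.
    dists, diffs = [], []
    for i in range(len(Z)):
        for j in range(i + 1, len(Z)):
            dists.append(sum((a - b) ** 2 for a, b in zip(Z[i], Z[j])) ** 0.5)
            diffs.append(abs(Y[i] - Y[j]))
    return pearson(dists, diffs)

# In a well-aligned latent space the correlation is high.
Z = [[0.0], [1.0], [2.0], [3.0]]
Y = [0.0, 1.1, 1.9, 3.2]
print(distance_value_correlation(Z, Y))
```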


Analysis

Impact of L_align on Latent Space

  • L_align promotes a smoother landscape in the latent space, which encourages more efficient optimization.

Figure: latent vectors plotted with their objective values; the colorbar indicates the normalized objective value.


Pseudocode
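The pseudocode figure itself is not reproduced here; the following sketch reconstructs the overall CoBO loop from the components named on the preceding slides, so the step ordering and loss details may differ from the paper's actual algorithm.

```
Input: dataset D = {(x_i, y_i)}, encoder E, decoder G, budget T
for t = 1 .. T:
    # Train the latent space, weighting each point's loss toward promising regions
    update E, G by minimizing  w(y) * [ reconstruction loss
                                        + Lipschitz regularization      # Gap (1)
                                        + latent-scale regularization ]
    Z <- E(X)
    fit surrogate model on (Z, Y)
    z_new <- argmax of acquisition over a trust region in latent space
    x_new <- G(z_new);  y_new <- f(x_new)     # black-box evaluation
    D <- D ∪ {(x_new, y_new)}
Output: best (x, y) in D
```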


Conclusion

  • We propose Correlated latent space Bayesian Optimization (CoBO) to bridge the inherent gaps in latent space Bayesian optimization.
  • We introduce two regularizations to align the latent space with the black-box objective function, based on increasing the lower bound of the correlation.
  • We present a loss weighting scheme based on the objective values of input points, aiming to close the gap between the input space and the latent space, focused on promising areas.
  • We demonstrate extensive experimental results and analyses on nine tasks over three benchmark datasets for molecule design and arithmetic expression fitting, achieving state-of-the-art results on all nine tasks.