1 of 40

Bayesian Machine Learning

2 of 40

Bayesian Decision Theory


3 of 40

Linear Classification

  • When we first studied classification, we began by assuming that the decision boundary is linear
  • This assumption served as a starting point, allowing us to construct simple and interpretable models such as SVM and logistic regression

  • However, at that stage, the linearity of the boundary was merely a modeling choice, not something rigorously derived from the underlying probabilistic structure of the data

  • We will mathematically investigate when and why this assumption is valid


4 of 40

Binary Classification with Gaussian


5 of 40

Optimal Boundary for Classes


6 of 40

Minimum Error Rate Classification

  • Minimize the probability of classification error over the choice of decision boundary

  • We take derivatives of the error probability with respect to the boundary and set them to zero (a standard form is sketched below)
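
The equations referenced by these bullets are not preserved in the extracted text; the following is a standard sketch of the minimum-error-rate objective for two classes, assuming a single 1D threshold t as the decision boundary:

```latex
% Two classes \omega_1, \omega_2; decide \omega_1 for x < t and \omega_2 for x >= t.
% Probability of error as a function of the threshold t:
P(\text{error}) = P(\omega_1)\int_{t}^{\infty} p(x \mid \omega_1)\,dx
                + P(\omega_2)\int_{-\infty}^{t} p(x \mid \omega_2)\,dx

% Taking the derivative with respect to t and setting it to zero:
\frac{d}{dt}P(\text{error}) = -P(\omega_1)\,p(t \mid \omega_1) + P(\omega_2)\,p(t \mid \omega_2) = 0
\;\Longrightarrow\;
P(\omega_1)\,p(t \mid \omega_1) = P(\omega_2)\,p(t \mid \omega_2)
% i.e., the optimal boundary lies where the (unnormalized) posteriors are equal.
```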


7 of 40

Posterior Probabilities


8 of 40

Boundaries for Gaussian


9 of 40

Equal Covariance


10 of 40

Not Equal Covariance


11 of 40

Examples of Gaussian Decision Regions

  • When the covariances are all equal, the separating surfaces are hyperplanes

  • When the covariances are not equal, the separating surfaces are quadratic (hyperquadric) surfaces
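
A compact way to see both statements is through the Gaussian discriminant functions (a standard derivation, reproduced here since the slide's equations are not preserved):

```latex
% Log-posterior discriminant for class \omega_k with p(x|\omega_k) = N(x; \mu_k, \Sigma_k):
g_k(x) = -\tfrac{1}{2}(x - \mu_k)^{\top}\Sigma_k^{-1}(x - \mu_k)
         - \tfrac{1}{2}\ln\lvert\Sigma_k\rvert + \ln P(\omega_k)

% Equal covariances (\Sigma_k = \Sigma): the quadratic term x^{\top}\Sigma^{-1}x is the
% same for every class, so it cancels in g_i(x) - g_j(x) = 0, leaving a hyperplane:
w^{\top}x + b = 0, \qquad w = \Sigma^{-1}(\mu_i - \mu_j)

% Unequal covariances: the term x^{\top}(\Sigma_i^{-1} - \Sigma_j^{-1})x survives,
% so the decision surface is a quadric (quadratic) surface.
```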


12 of 40

Python Implementation in 1D

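The code shown on this slide is not preserved; the following is a minimal sketch of a 1D two-class Gaussian Bayes classifier in plain NumPy. All function and parameter names (`gaussian_pdf`, `posterior`, `classify`) are illustrative, not the original implementation.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian density N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def posterior(x, mu0, sigma0, prior0, mu1, sigma1, prior1):
    """Posterior P(class 1 | x) by Bayes' rule."""
    p0 = prior0 * gaussian_pdf(x, mu0, sigma0)
    p1 = prior1 * gaussian_pdf(x, mu1, sigma1)
    return p1 / (p0 + p1)

def classify(x, mu0, sigma0, prior0, mu1, sigma1, prior1):
    """Minimum-error-rate decision: pick the class with the larger posterior."""
    return (posterior(x, mu0, sigma0, prior0, mu1, sigma1, prior1) >= 0.5).astype(int)
```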

13 of 40

Bayesian Classifier for Four Scenarios

  • Equal variance and equal prior

  • Equal variance and not equal prior

  • Not equal variance and equal prior

  • Not equal variance and not equal prior (all four settings are sketched in the code below)
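
One possible way to inspect the four scenarios with the hypothetical helpers sketched on the previous slide (the parameter values below are arbitrary, chosen only for illustration):

```python
import numpy as np

x = np.linspace(-6, 8, 2000)
# (mu0, sigma0, prior0, mu1, sigma1, prior1)
scenarios = {
    "equal var, equal prior":     (0.0, 1.0, 0.5, 3.0, 1.0, 0.5),
    "equal var, unequal prior":   (0.0, 1.0, 0.8, 3.0, 1.0, 0.2),
    "unequal var, equal prior":   (0.0, 1.0, 0.5, 3.0, 2.0, 0.5),
    "unequal var, unequal prior": (0.0, 1.0, 0.8, 3.0, 2.0, 0.2),
}
for name, params in scenarios.items():
    labels = classify(x, *params)
    # The decision boundary is where the predicted label changes along the grid
    boundary = x[np.flatnonzero(np.diff(labels))]
    print(f"{name}: boundary near x = {np.round(boundary, 2)}")
```

In the unequal-variance settings the printed boundary has two values, matching the two-threshold behaviour discussed on the following slides.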


14 of 40

Case 1: Equal Variance and Equal Prior


The optimal decision boundary lies at the midpoint between the two means
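
In symbols, with shared variance σ² and equal priors, equating the two (unnormalized) posteriors gives the midpoint:

```latex
P(\omega_0)\,\mathcal{N}(x^{*}; \mu_0, \sigma^2) = P(\omega_1)\,\mathcal{N}(x^{*}; \mu_1, \sigma^2)
\;\Longrightarrow\;
x^{*} = \frac{\mu_0 + \mu_1}{2}
\qquad (\sigma_0 = \sigma_1 = \sigma,\; P(\omega_0) = P(\omega_1))
```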

15 of 40

Case 2: Equal Variance and Not Equal Prior


It shifts toward the less probable class (class 0), because we require more evidence to classify a sample as belonging to the less likely class
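
Solving the same equality with unequal priors (still with shared variance σ², and assuming μ₁ > μ₀ with class 1 the more probable one) shows the shift explicitly:

```latex
x^{*} = \frac{\mu_0 + \mu_1}{2}
      + \frac{\sigma^2}{\mu_1 - \mu_0}\,\ln\frac{P(\omega_0)}{P(\omega_1)}
% With P(\omega_1) > P(\omega_0) the log term is negative, so the threshold moves
% from the midpoint toward \mu_0, the mean of the less probable class.
```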

16 of 40

Case 1: Equal Variance and Equal Prior


17 of 40

Case 3: Not Equal Variance and Equal Prior


The classification boundary becomes quadratic in this case. In one dimension, this implies that the decision rule involves two thresholds
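
With equal priors but σ₀ ≠ σ₁, the boundary condition is quadratic in x, which is why two thresholds appear:

```latex
\ln\frac{p(x \mid \omega_1)}{p(x \mid \omega_0)}
= \ln\frac{\sigma_0}{\sigma_1}
  - \frac{(x - \mu_1)^2}{2\sigma_1^2}
  + \frac{(x - \mu_0)^2}{2\sigma_0^2}
= 0
% A quadratic equation in x: it generally has two roots, so the region assigned
% to one class is an interval bounded by two thresholds.
```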

18 of 40

Back to Logistic Regression

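The slide's derivation is not preserved; the standard link is that, for Gaussian class conditionals with a shared covariance Σ, the posterior takes exactly the logistic-regression (sigmoid of a linear function) form, which is what justifies the linear-boundary assumption from the beginning of the lecture:

```latex
P(\omega_1 \mid x)
= \frac{P(\omega_1)\,p(x \mid \omega_1)}
       {P(\omega_1)\,p(x \mid \omega_1) + P(\omega_0)\,p(x \mid \omega_0)}
= \frac{1}{1 + \exp\!\big(-(w^{\top}x + b)\big)}
= \sigma(w^{\top}x + b)

% With p(x|\omega_k) = \mathcal{N}(x; \mu_k, \Sigma):
w = \Sigma^{-1}(\mu_1 - \mu_0), \qquad
b = -\tfrac{1}{2}\mu_1^{\top}\Sigma^{-1}\mu_1
    + \tfrac{1}{2}\mu_0^{\top}\Sigma^{-1}\mu_0
    + \ln\frac{P(\omega_1)}{P(\omega_0)}
```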

19 of 40

Probability Density Estimation


20 of 40

Probability Density Estimation

  • Probability Density Estimation refers to the task of estimating an unknown probability distribution from observed data.
  • Instead of producing a single estimate (e.g., a mean), the goal is to recover the entire probability density function (pdf), allowing for a more complete representation of uncertainty

  • There are two major approaches:
    • Kernel Density Estimation
    • Bayesian Density Estimation


21 of 40

Kernel Density Estimation

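The slide's figure and formulas are not preserved; a minimal Gaussian-kernel KDE sketch in 1D (the bandwidth `h` is picked by hand here, and all names are illustrative):

```python
import numpy as np

def kde_gaussian(x_query, samples, h):
    """Kernel density estimate at x_query from 1D samples with bandwidth h:
    p_hat(x) = (1 / (n*h)) * sum_i K((x - x_i) / h), with K the standard normal pdf."""
    x_query = np.atleast_1d(x_query)[:, None]       # shape (m, 1)
    u = (x_query - samples[None, :]) / h            # shape (m, n)
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return kernel.mean(axis=1) / h

# Example: estimate the density of a bimodal sample on a grid
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(0, 1, 200), rng.normal(4, 0.5, 200)])
grid = np.linspace(-4, 7, 300)
density = kde_gaussian(grid, samples, h=0.3)
```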

22 of 40

Kernel Density Estimation


23 of 40

Bayesian Density Estimation

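No equations survive from this slide; in the usual formulation, Bayesian density estimation places a prior on the parameters θ of a density model, updates it with the data D, and reports the posterior-predictive density:

```latex
p(\theta \mid D) = \frac{p(D \mid \theta)\,p(\theta)}{\int p(D \mid \theta')\,p(\theta')\,d\theta'},
\qquad
p(x \mid D) = \int p(x \mid \theta)\,p(\theta \mid D)\,d\theta
```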

24 of 40

Hidden State


25 of 40

Likelihood


26 of 40

Posterior

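The hidden-state, likelihood, and posterior slides carry no text in the extraction; the relation they presumably illustrate is Bayes' rule for a hidden state h given evidence e:

```latex
\underbrace{P(h \mid e)}_{\text{posterior}}
= \frac{\overbrace{P(e \mid h)}^{\text{likelihood}}\;\overbrace{P(h)}^{\text{prior}}}
       {\underbrace{P(e)}_{\text{evidence}}},
\qquad
P(e) = \sum_{h'} P(e \mid h')\,P(h')
```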

27 of 40

Combining Multiple Evidences

  • Compute the posterior probability of the hidden state given several pieces of evidence
  • Assume the pieces of evidence are conditionally independent given the state, so that the likelihood factorizes (see below)
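
Under that conditional-independence assumption, the combination rule takes the standard product form:

```latex
P(h \mid e_1, \dots, e_n)
\;\propto\; P(h)\,P(e_1, \dots, e_n \mid h)
\;=\; P(h)\prod_{i=1}^{n} P(e_i \mid h)
```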


28 of 40

Recursive Bayesian Estimation


29 of 40

Recursive Bayesian Estimation

  • Let us define the sequence of posterior distributions:

  • Thus, each posterior can be recursively computed as:
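
The equations these bullets introduce are not preserved; the standard forms, assuming the observations x₁, ..., xₙ are conditionally independent given θ, are:

```latex
% Sequence of posteriors (p(\theta | x_{1:0}) \equiv p(\theta), the prior):
p(\theta \mid x_{1:n}), \qquad n = 0, 1, 2, \dots

% Recursive update: the previous posterior plays the role of the prior
p(\theta \mid x_{1:n})
= \frac{p(x_n \mid \theta)\,p(\theta \mid x_{1:n-1})}
       {\int p(x_n \mid \theta')\,p(\theta' \mid x_{1:n-1})\,d\theta'}
\;\propto\; p(x_n \mid \theta)\,p(\theta \mid x_{1:n-1})
```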


30 of 40

Recursive Bayesian Estimation

  • Core idea of recursive Bayesian estimation: 
    • "the current posterior becomes the prior for the next update."


31 of 40

Recursive Bayesian Estimation


(1) Prior

(2) Prior + data = posterior

(3) Posterior becomes new prior

(4) Prior + data = new posterior

(5) Posterior becomes new prior, ···

32 of 40

Example 1: Bernoulli Model


33 of 40

Bernoulli Model

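The worked example itself is not preserved in the extracted text; the sketch below shows recursive Bayesian estimation for a Bernoulli model, assuming the standard conjugate Beta prior (function and parameter names are illustrative):

```python
import numpy as np

def recursive_beta_update(data, alpha=1.0, beta=1.0):
    """Recursive Bayesian estimation for Bernoulli(theta) observations.

    Prior: theta ~ Beta(alpha, beta). After each observation x in {0, 1},
    the posterior Beta(alpha + x, beta + 1 - x) becomes the prior for the
    next step. Returns the posterior parameters after every observation."""
    history = []
    for x in data:
        alpha, beta = alpha + x, beta + 1 - x   # posterior becomes the new prior
        history.append((alpha, beta))
    return history

# Example: coin with true theta = 0.7
rng = np.random.default_rng(0)
tosses = rng.binomial(1, 0.7, size=50)
for n, (a, b) in enumerate(recursive_beta_update(tosses), start=1):
    if n % 10 == 0:
        print(f"after {n:2d} tosses: posterior mean = {a / (a + b):.3f}")
```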

34 of 40

Recursive Bayesian Estimation


35 of 40

Recursive Bayesian Estimation


36 of 40

Example 2: Gaussian Model


37 of 40

Posterior Probability

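Assuming, as is standard for this example, a Gaussian likelihood xᵢ ~ N(μ, σ²) with known variance σ² and a Gaussian prior μ ~ N(μ₀, σ₀²), the posterior after each observation is again Gaussian and can be updated recursively:

```latex
p(\mu \mid x_{1:n}) = \mathcal{N}(\mu;\ \mu_n,\ \sigma_n^2),
\qquad
\frac{1}{\sigma_n^2} = \frac{1}{\sigma_{n-1}^2} + \frac{1}{\sigma^2},
\qquad
\mu_n = \sigma_n^2\!\left(\frac{\mu_{n-1}}{\sigma_{n-1}^2} + \frac{x_n}{\sigma^2}\right)
```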

38 of 40

Recursive Bayesian Estimation


39 of 40

Recursive Bayesian Estimation


40 of 40

Summary

  • Bayesian Machine Learning

  • Bayesian Classifier

  • Bayesian Density Estimation

  • Bayes’ Rule
    • Prior
    • Likelihood
    • Posterior


Object Tracking in Computer Vision