Introduction to Bayesian Regression
AI4Fusion Summer School - W&M, 2024
Cristiano Fanelli
06/12/2024
Sample/Tune
One of the most immediate improvements you can make to Hamiltonian Monte Carlo (HMC) is to implement step size adaptation, which gives you fewer parameters to tune, and adds in the concept of “warmup” or “tuning” for your sampler.
https://colcarroll.github.io/hmc_tuning_talk/
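The warmup/tuning idea can be sketched with a toy 1D HMC sampler in NumPy: during warmup the step size is nudged toward a target acceptance rate (here with a simple multiplicative rule, not the dual-averaging scheme real samplers like Stan or PyMC use), then frozen for the sampling phase. The target density, constants, and adaptation rule below are all illustrative.

```python
import numpy as np

def logp(x):
    """Log-density of the target (standard normal, up to a constant)."""
    return -0.5 * x**2

def grad_logp(x):
    return -x

def hmc(n_warmup=500, n_samples=2000, L=10, target_accept=0.65, seed=0):
    rng = np.random.default_rng(seed)
    x, eps = 0.0, 0.1          # initial position and (deliberately bad) step size
    samples = []
    for it in range(n_warmup + n_samples):
        p = rng.normal()        # resample momentum
        x_new, p_new = x, p
        # leapfrog integration of the Hamiltonian dynamics (L position steps)
        p_new += 0.5 * eps * grad_logp(x_new)
        for _ in range(L - 1):
            x_new += eps * p_new
            p_new += eps * grad_logp(x_new)
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_logp(x_new)
        # Metropolis accept/reject on the joint (position, momentum) energy
        log_ratio = (logp(x_new) - 0.5 * p_new**2) - (logp(x) - 0.5 * p**2)
        accept = np.exp(min(0.0, log_ratio))
        if rng.uniform() < accept:
            x = x_new
        if it < n_warmup:
            # warmup/tuning: nudge step size toward the target acceptance rate
            eps *= np.exp(0.05 * (accept - target_accept))
        else:
            samples.append(x)   # keep only post-warmup draws
    return np.array(samples), eps
```

After warmup, `eps` has adapted itself, so the user no longer has to hand-tune it; the post-warmup draws should have mean near 0 and standard deviation near 1 for this target.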
Bayesian Linear Regression
Bayesian Linear Regression
[Slide equation: linear model with a disturbance (noise) term ε]
Bayesian Linear Regression
[Slide equation: linear model with a disturbance term ε]
μ expressed as "deterministic" (see code)
Preprocessing
Question for the class: what happens in this case?
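The slide's preprocessing code is not reproduced here; a typical step in this setting is standardizing the predictor so that priors on the intercept and slope are easier to set (the values below are hypothetical):

```python
import numpy as np

# Hypothetical raw predictor values
x = np.array([12.0, 15.0, 9.0, 21.0, 18.0])

# Standardize: subtract the mean, divide by the standard deviation,
# giving a predictor with zero mean and unit standard deviation.
x_std = (x - x.mean()) / x.std()
```

With standardized inputs, a weakly informative prior such as Normal(0, 1) on the slope is on a sensible scale regardless of the original units of `x`.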
Bayesian Linear Regression: FAQ
I am familiar with linear regression models already, and I know methods for fitting them, e.g., least squares. Why should I use Bayesian linear regression?
In problems where we have limited data or prior knowledge we want to encode in the model, Bayesian linear regression can both incorporate that prior information and quantify our uncertainty. It reflects the Bayesian framework: we form an initial estimate and refine it as we gather more data.
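This updating behavior can be shown with a tiny conjugate example: estimating a normal mean with known noise scale, where prior and likelihood combine in closed form. All numbers (prior, noise scale, data) are made up for illustration.

```python
import numpy as np

# Prior: theta ~ N(m0, s0^2); likelihood: y_i ~ N(theta, s^2), s known.
m0, s0 = 0.0, 1.0               # prior mean and sd (assumed)
s = 1.0                         # known observation noise sd (assumed)
y = np.array([2.0, 2.5, 1.8])   # hypothetical small dataset
n, ybar = len(y), y.mean()

# Standard conjugate update: precisions add, means are precision-weighted.
post_prec = 1.0 / s0**2 + n / s**2
post_mean = (m0 / s0**2 + n * ybar / s**2) / post_prec
post_sd = post_prec ** -0.5
```

The posterior mean lands between the prior mean and the sample mean (shrinkage toward the prior, strongest when data are scarce), while `post_sd` reports how much uncertainty remains, something a plain least-squares point estimate does not provide.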
Logistic (Sigmoid) Function
Hypothesis representation: h_θ(x) = σ(θᵀx), where x are the features
Sigmoid: σ(z) = 1 / (1 + e^(−z))
You can choose a decision threshold (e.g., everything above 0.5 is classified as a dog, anything below as a cat)
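The sigmoid-plus-threshold step is a few lines of NumPy (the scores below are hypothetical linear scores θᵀx):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

scores = np.array([-2.0, 0.0, 3.0])   # hypothetical linear scores theta^T x
probs = sigmoid(scores)               # probabilities in (0, 1)
is_dog = probs > 0.5                  # threshold at 0.5: True -> "dog"
```

Note that σ(0) = 0.5 exactly, so the 0.5 threshold corresponds to the decision boundary θᵀx = 0.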
Logistic vs Bayesian Logistic
Traditional logistic regression typically fits the parameters by minimizing a cost function with gradient descent; Bayesian logistic regression instead uses MCMC or similar methods to sample from the posterior distribution of the parameters, providing a probabilistic understanding of their uncertainty.
Logistic Regression:
Bayesian Logistic Regression:
Types of Logistic Regression
Spares