1 of 11

Bayesian Reasoning

In Data Science

Cristiano Fanelli

18/10/2022 - Lecture 13

2 of 11

Outline


  • Bayesian Logistic Regression — notebook mod2_part2_Bayesian_Logistic_Regression
  • Integrated notes on Logistic Regression

3 of 11

Logistic Regression


Credits: University of Toronto

Credits: references [1], [2], [3]

You are likely familiar with logistic regression, a machine learning classification algorithm used to assign observations to a discrete set of classes.

Example
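As a minimal sketch of what such a classifier looks like in practice (the toy data and settings below are illustrative assumptions, not the example shown on the slide), scikit-learn provides a ready-made implementation:

```python
# Minimal scikit-learn sketch (illustrative data; not the slide's original example).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy one-feature dataset with two classes (0 = cat, 1 = dog).
X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)
print(clf.predict([[2.0]]))         # predicted class
print(clf.predict_proba([[2.0]]))   # probability of each class
```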

4 of 11

Linear vs. Logistic Regression


Credits: references [1], [2], [3]
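In brief, the two models differ in their hypothesis: linear regression predicts an unbounded, real-valued output, while logistic regression passes the same linear combination of the features through the sigmoid, so the output lies in (0, 1) and can be read as a class probability:

Linear regression: h_\theta(x) = \theta^T x

Logistic regression: h_\theta(x) = \sigma(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}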

5 of 11

Logistic (Sigmoid) Function


Credits: references [1], [2], [3]

Hypothesis representation: h_\theta(x) = g(\theta^T x), where x is the vector of features and \theta the model parameters.

Sigmoid: g(z) = \frac{1}{1 + e^{-z}}, so that h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}} \in (0, 1).

You can choose a threshold to make a decision (e.g., everything above 0.5 is classified as a dog, anything below as a cat).
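A minimal sketch of the sigmoid and a 0.5 decision threshold (the parameter values, feature values, and class labels below are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real number to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

theta = np.array([0.5, -1.0])          # illustrative parameters
x = np.array([2.0, 0.3])               # illustrative feature vector
p = sigmoid(theta @ x)                 # h_theta(x): probability of the positive class
label = "dog" if p >= 0.5 else "cat"   # decision threshold at 0.5
print(p, label)
```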

6 of 11

Cost Function and Gradient Descent


Credits: references [1], [2], [3]

The cost function represents an optimization objective, i.e., we define a cost function and minimize it:
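For logistic regression, the standard choice is the cross-entropy (log) loss over the m training examples:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[ y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big]

Unlike the squared-error cost used in linear regression, this cost is convex in \theta when paired with the sigmoid hypothesis, so gradient descent converges to the global minimum.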

7 of 11

Cost Function and Gradient Descent


To minimize the cost function, we run gradient descent on each parameter, i.e.:
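The update rule, applied simultaneously to every parameter \theta_j with learning rate \alpha, is:

\theta_j := \theta_j - \alpha \, \frac{\partial J(\theta)}{\partial \theta_j} = \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big) \, x_j^{(i)}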

Gradient Descent Simplified | Image: Andrew Ng Course

Credits: references [1], [2], [3]

8 of 11

Now, let’s take a look at Bayesian Logistic Regression


Credits: references [1], [2], [3]
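A minimal sketch of the Bayesian treatment, assuming a PyMC-style probabilistic programming workflow (the course notebook mod2_part2_Bayesian_Logistic_Regression may use a different library or model specification): instead of a single point estimate of \theta, we place priors on the weights and sample the posterior.

```python
# Bayesian logistic regression sketch (assumes PyMC; illustrative data).
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                          # illustrative features
true_w = np.array([1.5, -2.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ true_w))))  # illustrative labels

with pm.Model() as model:
    # Priors on intercept and weights replace the point estimates
    # of ordinary (maximum-likelihood) logistic regression.
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    w = pm.Normal("w", mu=0, sigma=10, shape=2)
    # Likelihood: Bernoulli outcomes with a sigmoid link.
    p = pm.math.sigmoid(intercept + pm.math.dot(X, w))
    pm.Bernoulli("obs", p=p, observed=y)
    # Sample the posterior over the parameters.
    idata = pm.sample(1000, tune=1000, chains=2)
```

The samples in idata characterize the full posterior over the parameters, from which predictive probabilities and credible intervals follow, rather than the single decision boundary produced by the point-estimate fit above.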

9 of 11

Useful References


10 of 11

References of our course


https://cfteach.github.io/brds/referencesmd.html

11 of 11

Backup