Special Course in Machine Learning:
CS236 Deep Generative Models
Mikhail Papkov
11.02.2020
Slides available here: eid.ee/4yo
Agenda
Logistics
Delta-1022, Tuesday, 16:15 - 17:45 (we can stay longer or leave earlier)
Do we need to discuss changing time and place?
(Wednesday or Thursday evening are also fine)
Prerequisites
Resources
Organization
This is a flipped-classroom seminar! I am also a student and will present only today
How to pass?
Do you need more motivation?
Questions?
About me
About you
(this will probably help us define directions + I’ll try to remember you)
(it’s fine if you do not yet have a clear perspective or take this course for fun!)
Background
Summary
Data
Hereinafter, materials from CS236 (by Stefano Ermon and Aditya Grover, MIT License) are used
Generation
Tasks
Distribution
You query the black box, and it returns a value according to certain rules:
The distribution can be joint (e.g., over RGB channels)
Are all the pixels (parameters) independent? We can assume so, but most likely they are not
Conditional independence
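To make the payoff of independence assumptions concrete, here is a small sketch (not from the slides; the Markov-chain structure is an assumed example) counting how many parameters are needed to specify a distribution over n binary pixels under different assumptions:

```python
def n_params_joint(n):
    # Fully general joint distribution over n binary pixels:
    # one probability per configuration, minus one (they sum to 1).
    return 2 ** n - 1

def n_params_independent(n):
    # Fully independent pixels: one Bernoulli parameter each.
    return n

def n_params_markov(n):
    # Conditional independence: each pixel depends only on the previous one
    # (a chain-structured Bayes network): p(x1) plus p(x_i | x_{i-1}) tables.
    return 1 + 2 * (n - 1)

for n in (10, 784):  # 784 = a 28x28 binary image
    print(n, n_params_joint(n), n_params_independent(n), n_params_markov(n))
```

The full joint is hopeless already at 784 pixels (2^784 - 1 parameters), while the independence assumptions keep the count linear in n.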
Rules
Bayes rule terminology
P(A) — prior
P(A|B) — posterior
P(B|A) — likelihood
P(B) — evidence
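A minimal numeric sketch of the terminology above; the probabilities are assumed values chosen only for illustration:

```python
# Hypothetical numbers, just to exercise the Bayes rule terminology.
p_a = 0.01             # P(A): prior
p_b_given_a = 0.9      # P(B|A): likelihood
p_b_given_not_a = 0.1  # P(B|not A)

# P(B): evidence, via the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# P(A|B): posterior, via Bayes rule
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.0833
```

Even with a likelihood of 0.9, a small prior keeps the posterior low, which is why the prior term matters.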
Bayes network
We can assume that some events are conditionally independent
Figure from Meelis Kull: Fall 2019 MTAT.03.227 Machine Learning Lecture 11, slide 32
Naive Bayes
Assume features are conditionally independent given the label
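The assumption above can be sketched as a tiny Naive Bayes classifier; the data, Bernoulli features, and Laplace smoothing are assumed choices for illustration, not part of the slides:

```python
def fit(X, y):
    # Estimate prior P(y=c) and per-feature P(x_j = 1 | y = c)
    # with Laplace smoothing, for binary features.
    classes = sorted(set(y))
    prior = {c: sum(1 for t in y if t == c) / len(y) for c in classes}
    theta = {}
    for c in classes:
        rows = [x for x, t in zip(X, y) if t == c]
        theta[c] = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                    for j in range(len(X[0]))]
    return prior, theta

def predict(x, prior, theta):
    # argmax_c P(y=c) * prod_j P(x_j | y=c): the product factorizes
    # thanks to the conditional-independence assumption.
    def score(c):
        s = prior[c]
        for j, xj in enumerate(x):
            p = theta[c][j]
            s *= p if xj else (1 - p)
        return s
    return max(prior, key=score)

X = [[1, 1, 0], [1, 0, 0], [0, 1, 1], [0, 0, 1]]
y = [0, 0, 1, 1]
prior, theta = fit(X, y)
print(predict([1, 1, 0], prior, theta))  # 0
```

Note that the model estimates the joint p(x, y) = p(y) p(x|y), which is what makes it generative despite being used for classification.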
Generative vs Discriminative
Naive Bayes is a generative model (although we use it for classification)
Logistic regression (discriminative)
Parametrized with α
From logistic regression to neural network
Why use generative models?
Kahoot!