Introduction to Probability
Jacky Baltes
Educational Robotics Center
National Taiwan Normal University
jacky.baltes@ntnu.edu.tw
Probabilistic Robotics
Axioms of Probability Theory
(Kolmogorov 1933)
Axiom 1: 0 ≤ P(A) ≤ 1
Axiom 2: P(True) = 1, P(False) = 0
Axiom 3: P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
[Venn diagram: overlapping events A and B, illustrating Axiom 3]
Lemmas
If A and B are statistically independent, then P(B|A) = P(B), and hence P(A ∧ B) = P(A) P(B).
Discrete Random Variables
X denotes a random variable.
X can take on a countable number of values in {x1, x2, …, xn}.
P(X=xi), or P(xi), is the probability that the random variable X takes on value xi.
P(·) is called the probability mass function.
E.g. P(sum = s): the probability of obtaining sum s when rolling a pair of fair dice.
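A minimal Python sketch (not from the slides) that enumerates all 36 outcomes of two fair dice to tabulate this pmf:

from collections import Counter
from fractions import Fraction

# Count how often each sum occurs among the 36 equally likely outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

# P(sum = s) = (number of outcomes with that sum) / 36
pmf = {s: Fraction(c, 36) for s, c in sorted(counts.items())}
for s, p in pmf.items():
    print(f"P(sum = {s:2d}) = {p} ≈ {float(p):.3f}")

# A pmf sums to one over all values.
assert sum(pmf.values()) == 1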
Continuous Random Variables
X takes on values in the continuum.
p(x) is called a probability density function: P(x ∈ (a, b)) = ∫_a^b p(x) dx
[Figure: an example probability density function p(x)]
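A small sketch, assuming a standard Gaussian as the example density (this choice is not from the slides), showing that interval probabilities come from integrating p(x):

import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # Density of a Gaussian; used here only as an example p(x).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=10_000):
    # Trapezoidal rule: approximate the integral of f over [a, b].
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return total * h

print("P(-1 < x < 1)   ≈", integrate(gaussian_pdf, -1, 1))    # about 0.683
print("P(-10 < x < 10) ≈", integrate(gaussian_pdf, -10, 10))  # about 1.0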
Joint And Conditional Probabilities
P(X=x and Y=y) = P(x,y)
If X and Y are independent, then P(x,y) = P(x) P(y)
P(x | y) is the probability of x given y:
P(x | y) = P(x,y) / P(y)
P(x,y) = P(x | y) P(y)
If X and Y are independent, then P(x | y) = P(x)
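A short sketch over a made-up joint table (illustrative numbers, not from the lecture) showing P(x | y) = P(x,y) / P(y) and the independence test P(x,y) = P(x) P(y):

# Joint distribution P(X, Y) over two binary variables (made-up numbers).
P_xy = {
    ("x0", "y0"): 0.3, ("x0", "y1"): 0.2,
    ("x1", "y0"): 0.1, ("x1", "y1"): 0.4,
}

def marginal_x(x):
    return sum(p for (xx, y), p in P_xy.items() if xx == x)

def marginal_y(y):
    return sum(p for (x, yy), p in P_xy.items() if yy == y)

def conditional(x, y):
    # P(x | y) = P(x, y) / P(y)
    return P_xy[(x, y)] / marginal_y(y)

print("P(x0 | y1) =", conditional("x0", "y1"))

# Independence would require P(x, y) = P(x) P(y) for every pair.
independent = all(abs(P_xy[(x, y)] - marginal_x(x) * marginal_y(y)) < 1e-12
                  for (x, y) in P_xy)
print("X and Y independent?", independent)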
Law of Total Probability, Marginals
Discrete case: Σ_x P(x) = 1,  P(x) = Σ_y P(x, y) = Σ_y P(x | y) P(y)
Continuous case: ∫ p(x) dx = 1,  p(x) = ∫ p(x, y) dy = ∫ p(x | y) p(y) dy
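A sketch of the discrete case, P(x) = Σ_y P(x | y) P(y), with assumed numbers (e.g. x = "sensor reports open", y = actual door state):

# Prior over y and conditional P(x | y); the numbers are illustrative only.
P_y = {"open": 0.6, "closed": 0.4}
P_x_given_y = {"open": 0.7, "closed": 0.2}

# Law of total probability: marginalize y out.
P_x = sum(P_x_given_y[y] * P_y[y] for y in P_y)
print("P(x) =", P_x)   # 0.7*0.6 + 0.2*0.4 = 0.50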
Question
Monty's Dilemma
A contestant picks one of three doors, one of which hides a prize. The host, who knows where the prize is, opens one of the other two doors to reveal a goat and offers the contestant the chance to switch. Staying wins with probability 1/3; switching wins with probability 2/3.
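A quick Monte Carlo sketch (names and structure assumed, not from the slides) confirming the 1/3 vs 2/3 split:

import random

def play(switch, n_trials=100_000):
    # Estimate the win probability of the stay/switch strategies by simulation.
    wins = 0
    for _ in range(n_trials):
        prize = random.randrange(3)    # door hiding the prize
        choice = random.randrange(3)   # contestant's first pick
        # The host opens some door that is neither the pick nor the prize
        # (which valid door he picks does not affect the result).
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            # Switch to the remaining unopened door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / n_trials

print("P(win | stay)   ≈", play(switch=False))   # about 1/3
print("P(win | switch) ≈", play(switch=True))    # about 2/3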
Basic Probability and Bayes Theorem
A + B + C + D = 1.00
A + C = 0.75
A + D = 0.60
Bayes Formula
P(x | y) = P(y | x) P(x) / P(y) = likelihood × prior / evidence
Normalization
P(x | y) = η P(y | x) P(x),  η = 1 / P(y) = 1 / Σ_x P(y | x) P(x)
Algorithm:
for all x: aux(x) = P(y | x) P(x)
η = 1 / Σ_x aux(x)
for all x: P(x | y) = η aux(x)
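A sketch of this normalization algorithm for a discrete state space; the function name and the example numbers are assumed for illustration:

def bayes_posterior(prior, likelihood):
    # for all x: aux(x) = P(y | x) P(x)
    aux = {x: likelihood[x] * prior[x] for x in prior}
    # eta = 1 / sum_x aux(x)
    eta = 1.0 / sum(aux.values())
    # for all x: P(x | y) = eta * aux(x)
    return {x: eta * a for x, a in aux.items()}

prior = {"open": 0.5, "closed": 0.5}          # P(x), illustrative
likelihood = {"open": 0.6, "closed": 0.3}     # P(y | x) for the observed y, illustrative
print(bayes_posterior(prior, likelihood))     # open: 2/3, closed: 1/3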
Conditioning
Total probability: P(x) = ∫ P(x | z) P(z) dz
P(x | y) = ∫ P(x | y, z) P(z | y) dz
Simple Example of State Estimation
Suppose a robot obtains a measurement z. What is P(open | z)?
Causal vs Diagnostic Reasoning
P(open | z) is diagnostic; P(z | open) is causal.
Causal knowledge is often easier to obtain, e.g. count frequencies!
Bayes rule lets us use causal knowledge: P(open | z) = P(z | open) P(open) / P(z)
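A tiny sketch of the "count frequencies" idea: estimating the causal model P(z | open) from hypothetical labelled data (the counts are made up):

# Each record is (door_was_open, sensor_said_open); counts are invented.
data = ([(True, True)] * 60 + [(True, False)] * 40 +
        [(False, True)] * 30 + [(False, False)] * 70)

n_open = sum(1 for is_open, _ in data if is_open)
n_z_and_open = sum(1 for is_open, z in data if is_open and z)
print("P(z | open) ≈", n_z_and_open / n_open)   # 60 / 100 = 0.6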
Example
Markov assumption: zn is independent of z1, ..., zn-1 if we know x.
Recursive Bayesian updating: P(x | z1, ..., zn) = η P(zn | x) P(x | z1, ..., zn-1)
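A sketch of this recursive update with an assumed sensor model P(z | x) (the two-state door example and its numbers are illustrative):

def update(belief, likelihood):
    # P(x | z1..zn) = eta * P(zn | x) * P(x | z1..zn-1), using the Markov assumption.
    unnormalized = {x: likelihood[x] * belief[x] for x in belief}
    eta = 1.0 / sum(unnormalized.values())
    return {x: eta * p for x, p in unnormalized.items()}

sensor = {"open": 0.6, "closed": 0.3}       # P(z = "open reading" | x), assumed
belief = {"open": 0.5, "closed": 0.5}       # prior before any measurement
for n in range(2):                          # incorporate two "open" readings
    belief = update(belief, sensor)
    print(f"after measurement {n + 1}:", belief)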
Underlying Assumptions
Static world, independent noise, perfect model, no approximation errors
[Figure: dynamic Bayes network with states xt-1, xt, xt+1, controls ut-1, ut, and observations zt-1, zt, zt+1]
z = observation, u = action, x = state
Bel(xt) = P(xt | u1, z1, ..., ut, zt)
Bayes:        = η P(zt | xt, u1, z1, ..., ut) P(xt | u1, z1, ..., ut)
Markov:       = η P(zt | xt) P(xt | u1, z1, ..., ut)
Total prob.:  = η P(zt | xt) ∫ P(xt | u1, z1, ..., ut, xt-1) P(xt-1 | u1, z1, ..., ut) dxt-1
Markov:       = η P(zt | xt) ∫ P(xt | ut, xt-1) P(xt-1 | u1, z1, ..., zt-1) dxt-1
              = η P(zt | xt) ∫ P(xt | ut, xt-1) Bel(xt-1) dxt-1
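A discrete-state sketch of one step of the Bayes filter derived above; the state space, motion_model, and sensor_model below are assumed for illustration:

def bayes_filter_step(belief, u, z, motion_model, sensor_model, states):
    # Prediction: total probability over the previous state.
    predicted = {x: sum(motion_model(x, u, xp) * belief[xp] for xp in states)
                 for x in states}
    # Correction: weight by the measurement likelihood, then normalize.
    unnormalized = {x: sensor_model(z, x) * predicted[x] for x in states}
    eta = 1.0 / sum(unnormalized.values())
    return {x: eta * p for x, p in unnormalized.items()}

states = ("open", "closed")

def motion_model(x_next, u, x_prev):
    # P(xt | ut, xt-1): "push" opens a closed door with probability 0.8 (assumed);
    # "do_nothing" leaves the state unchanged.
    if u == "push":
        if x_prev == "open":
            return 1.0 if x_next == "open" else 0.0
        return 0.8 if x_next == "open" else 0.2
    return 1.0 if x_next == x_prev else 0.0

def sensor_model(z, x):
    # P(zt | xt): the sensor reports "open" with probability 0.6 if the door is
    # open and 0.2 if it is closed (assumed numbers).
    p_open = 0.6 if x == "open" else 0.2
    return p_open if z == "sense_open" else 1.0 - p_open

belief = {"open": 0.5, "closed": 0.5}
belief = bayes_filter_step(belief, "do_nothing", "sense_open", motion_model, sensor_model, states)
belief = bayes_filter_step(belief, "push", "sense_open", motion_model, sensor_model, states)
print(belief)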
References