CSC-343
Artificial Intelligence
Lecture 5.1.
Probability vs. Logic
Language | What exists in the world? | What does an agent believe about facts? |
Propositional logic | Facts | True / False / Unknown |
First-order logic | Facts, Objects, Relations | True / False / Unknown |
Probability theory | Facts | Degree of belief ∈ [0, 1] * |
* meaning a number between 0 and 1
Sample Space Ω
Coin Flip 1 | Coin Flip 2 |
H | H |
H | T |
T | H |
T | T |
Ω = {HH, HT, TH, TT}
Sample Space Ω and ω
Possible world | Coin Flip 1 | Coin Flip 2 |
ω1 | H | H |
ω2 | H | T |
ω3 | T | H |
ω4 | T | T |
Ω = {HH, HT, TH, TT}
Ω = {ω1, ω2, ω3, ω4}
ω (lowercase omega) refers to a particular possible world.
Probability Model P(ω)
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
0 ≤ P(ω) ≤ 1 for every ω ∈ Ω
∑ω∈Ω P(ω) = 1
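As a sketch (not from the slides; the dictionary name P is illustrative), a probability model can be written in Python as a mapping from possible worlds to numbers, and the two axioms above checked directly:

    # Probability model: each possible world maps to its probability P(ω).
    P = {
        ("H", "H"): 0.25,  # ω1
        ("H", "T"): 0.25,  # ω2
        ("T", "H"): 0.25,  # ω3
        ("T", "T"): 0.25,  # ω4
    }

    # Axiom 1: 0 ≤ P(ω) ≤ 1 for every ω.
    assert all(0 <= p <= 1 for p in P.values())

    # Axiom 2: probabilities over the whole sample space Ω sum to 1.
    assert abs(sum(P.values()) - 1.0) < 1e-9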
Probability Distribution as a Pie Chart
[Pie chart: one slice per outcome, labeled P(ω1) through P(ω5)]
0 (0%): a nonexistent slice, an impossible event
1 (100%): the whole pie, a certain event
All slices together must add up to 1, i.e. 100%.
Probability Distribution as a Histogram
[Histogram: bars for ω1 (HH), ω2 (HT), ω3 (TH), ω4 (TT); y-axis P(ωi) on a 0 to 1 scale, each bar at height 0.25]
Events φ
An event φ is a set of possible worlds, e.g. φ = "at least one flip is tails" = {ω2, ω3, ω4}.
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
Probability of an Event P(φ)
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
P(φ) = ∑ω∈φ P(ω). For example, with φ = "at least one flip is tails": P(φ) = P(ω2) + P(ω3) + P(ω4) = 0.75.
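In the same sketch, the probability of an event falls out as a sum over worlds (event_prob is an illustrative helper, not a standard library function):

    # P(φ) is the sum of P(ω) over the worlds ω where the event φ holds.
    P = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}

    def event_prob(phi):
        return sum(p for world, p in P.items() if phi(world))

    # Event: at least one flip is tails.
    print(event_prob(lambda w: "T" in w))  # 0.75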
Random Variables
A random variable is a function from possible worlds to values. Here CoinFlip1 and CoinFlip2 each map a world ωi to H or T.
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
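Continuing the sketch, a random variable is just a function of the world (the function names below are illustrative):

    # Random variables: functions from a possible world (flip1, flip2) to a value.
    def coin_flip_1(world):
        return world[0]

    def coin_flip_2(world):
        return world[1]

    # P(CoinFlip1 = T): sum over the worlds where the variable takes that value.
    P = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
    print(sum(p for w, p in P.items() if coin_flip_1(w) == "T"))  # 0.5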
Conditional Probability P(a|b)
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
P(CoinFlip2=H | CoinFlip1=T) = P(CoinFlip2=H ∧ CoinFlip1=T) / P(CoinFlip1=T) = 0.25 / 0.5 = 0.5
Conditional Probability and Product Rule
P(X=x1 | Y=y1) = P(X=x1 ∧ Y=y1) / P(Y=y1)
Product rule: P(X=x1 ∧ Y=y1) = P(X=x1 | Y=y1) * P(Y=y1)
Conditional Probability P(a|b)
[Venn diagram: two overlapping regions P(X=x1) and P(Y=y1); P(X=x1 | Y=y1) is the overlap P(X=x1 ∧ Y=y1) taken as a fraction of P(Y=y1)]
P(X=x1 | Y=y1) = P(X=x1 ∧ Y=y1) / P(Y=y1)
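As a minimal sketch of the definition (cond_prob and event_prob are illustrative helpers):

    P = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}

    def event_prob(phi):
        return sum(p for world, p in P.items() if phi(world))

    # P(a | b) = P(a ∧ b) / P(b)
    def cond_prob(a, b):
        return event_prob(lambda w: a(w) and b(w)) / event_prob(b)

    # P(CoinFlip2=H | CoinFlip1=T) = 0.25 / 0.5
    print(cond_prob(lambda w: w[1] == "H", lambda w: w[0] == "T"))  # 0.5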
Inclusion-Exclusion Principle P(a ∨ b)
Possible world | Coin Flip 1 | Coin Flip 2 | P( ωi ) |
ω1 | H | H | 0.25 |
ω2 | H | T | 0.25 |
ω3 | T | H | 0.25 |
ω4 | T | T | 0.25 |
P(CoinFlip1=H ∨ CoinFlip2=T) = P(CoinFlip1=H) + P(CoinFlip2=T) - P(CoinFlip1=H ∧ CoinFlip2=T) = 0.5 + 0.5 - 0.25 = 0.75
Inclusion-Exclusion Principle P(a ∨ b)
[Venn diagram: two overlapping regions P(X=x1) and P(Y=y1); adding the two areas counts the overlap P(X=x1 ∧ Y=y1) twice, so it is subtracted once]
P(X=x1 ∨ Y=y1) = P(X=x1) + P(Y=y1) - P(X=x1 ∧ Y=y1)
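A quick numeric check of the principle in the same sketch:

    P = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}

    def event_prob(phi):
        return sum(p for world, p in P.items() if phi(world))

    a = lambda w: w[0] == "H"  # CoinFlip1 = H
    b = lambda w: w[1] == "T"  # CoinFlip2 = T

    # P(a ∨ b) computed directly vs. by inclusion-exclusion.
    lhs = event_prob(lambda w: a(w) or b(w))
    rhs = event_prob(a) + event_prob(b) - event_prob(lambda w: a(w) and b(w))
    print(lhs, rhs)  # 0.75 0.75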
Librarian or Farmer?
Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Is Steve more likely to be a librarian or a farmer?
Librarian = 3
Farmer = 13
Librarian or Farmer?
Farmers outnumber librarians by roughly 20 to 1, a 20:1 ratio.
“Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.”
Of the people who fit the description, 4 are librarians and 20 are farmers:
P(Librarian | Description) = 4 / (4 + 20) ≈ 0.167
P(Farmer | Description) = 20 / (4 + 20) ≈ 0.833
Bayes Theorem
e.g. P(Librarian | Description) =
[P(Librarian) * P(Description | Librarian)]
/ ( [P(Librarian) * P(Description | Librarian)] + [P(¬Librarian) * P(Description | ¬Librarian)] )
P(Librarian) is called the prior.
P(Description | Librarian) is called the likelihood.
Bayes Theorem
P(A | B) = P(A) * P(B | A) / P(B)
The denominator P(B) is called the evidence: here it is P(Description), the total probability of the description.
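A sketch of the librarian calculation. The prior comes from the 20:1 ratio on the slides; the 40% and 10% likelihoods are assumptions chosen for illustration so that the 4-vs-20 counts above come out right, not numbers given on the slides:

    # Bayes theorem: P(L | D) = P(L) * P(D | L) / P(D)
    p_librarian = 1 / 21   # prior from the 20:1 farmer-to-librarian ratio
    p_farmer = 20 / 21

    p_desc_given_librarian = 0.40  # assumed likelihood (illustrative)
    p_desc_given_farmer = 0.10     # assumed likelihood (illustrative)

    # Evidence: total probability of the description.
    p_desc = (p_librarian * p_desc_given_librarian
              + p_farmer * p_desc_given_farmer)

    posterior = p_librarian * p_desc_given_librarian / p_desc
    print(round(posterior, 3))  # 0.167, i.e. 4 / (4 + 20)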