Discussion 12b
Attendance code: livingWage
Step-by-step walkthrough
How to use this walkthrough
Probabilistic Bounds
Prove or disprove, given E(X) = 2: P(X = 2) > 0.
In other words: does X always have to be able to take on its E(X) as a possible outcome?
No!
To prove P(X=2)>0 is not necessarily true, try to come up with a counterexample.
There are lots of possible solutions – in this example, we make 2 the midpoint of two equally likely outcomes, as written out below.
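Written out, one such counterexample (the values 1 and 3 are just one choice; anything symmetric around 2 with no mass at 2 works):

```latex
% X takes the values 1 and 3 with equal probability, so 2 is their midpoint.
P(X = 1) = P(X = 3) = \tfrac{1}{2}
\quad\Longrightarrow\quad
E(X) = \tfrac{1}{2}\cdot 1 + \tfrac{1}{2}\cdot 3 = 2,
\qquad P(X = 2) = 0.
```

So E(X) = 2 even though X can never actually equal 2.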
Probabilistic Bounds
Prove or disprove, given:
E(X²) looks familiar from the variance formula!
Let’s try to use that.
Plugging in what we know already
Solving for E(X²) shows us this has to be true.
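The specific given values aren't reproduced here, but the rearrangement of the variance formula is the same regardless:

```latex
\mathrm{Var}(X) = E(X^2) - \bigl(E(X)\bigr)^2
\quad\Longrightarrow\quad
E(X^2) = \mathrm{Var}(X) + \bigl(E(X)\bigr)^2 .
```

Plugging the given mean and variance into the right-hand side pins down E(X²); in particular, since Var(X) ≥ 0, we always have E(X²) ≥ (E(X))².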
Probabilistic Bounds
Prove or disprove, given:
Does the probability of being at least the mean have to equal the probability of being at most the mean? That is, must P(X ≥ E(X)) = P(X ≤ E(X))?
Hint: try to find a simple counterexample (i.e. one without a lot of different outcomes!).
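For instance (just one illustrative counterexample, not necessarily the intended one):

```latex
% Two outcomes with unequal probabilities already break the symmetry.
P(X = 0) = \tfrac{2}{3},\quad P(X = 3) = \tfrac{1}{3}
\;\Longrightarrow\; E(X) = 1,\qquad
P\bigl(X \ge E(X)\bigr) = \tfrac{1}{3} \;\ne\; \tfrac{2}{3} = P\bigl(X \le E(X)\bigr).
```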
Probabilistic Bounds
Prove or disprove, given:
It’s an inequality! We should probably see if this is what we get from any of our concentration inequalities (Markov & Chebyshev)
The inequality goes in the wrong direction compared to the question! How should we fix it?
Don’t forget that Markov’s inequality needs a non-negative random variable; having Y > 0 covers that.
So the inequality is true.
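The exact bound being proved isn't shown here, but the standard way to flip Markov's inequality into a lower bound is to take complements (assuming the claim has that form):

```latex
% Markov's inequality, for a non-negative random variable Y and a > 0:
P(Y \ge a) \le \frac{E(Y)}{a}
\quad\Longrightarrow\quad
P(Y < a) = 1 - P(Y \ge a) \ge 1 - \frac{E(Y)}{a}.
```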
Probabilistic Bounds
Prove or disprove, given:
Another inequality!
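The given bound isn't reproduced here, but since Markov handled the last part, a natural guess is that this one calls for Chebyshev's inequality, stated for reference:

```latex
% Chebyshev's inequality: deviation from the mean is controlled by the variance.
P\bigl(|X - E(X)| \ge a\bigr) \le \frac{\mathrm{Var}(X)}{a^2}
\qquad \text{for any } a > 0.
```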
Law of Large Numbers
The LLN says that the average value across our samples approaches the expected value of the random variable as we take more and more samples.
(this is the basis of basically all of statistics!)
What is the average of all the coin flips approaching?
½ (that is, 50% heads), of course! That's the expected value of a fair coin toss, counting heads as 1 and tails as 0.
So if we want the fraction of heads to be close to ½, which do we prefer: 100 tosses or 10?
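A quick simulation makes the comparison concrete (a sketch; the function name and repetition counts are just illustrative):

```python
import random

def fraction_heads(n_tosses):
    """Toss a fair coin n_tosses times and return the fraction of heads."""
    return sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses

# Compare how far the sample fraction tends to stray from 1/2
# for 10 tosses versus 100 tosses, averaged over many repetitions.
for n in (10, 100):
    deviations = [abs(fraction_heads(n) - 0.5) for _ in range(10_000)]
    print(f"n = {n:3d}: average |fraction - 1/2| = {sum(deviations) / len(deviations):.4f}")
```

With 100 tosses the fraction of heads sits much closer to ½ on average, which is exactly what the LLN promises.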
Law of Large Numbers
What is the problem if we try to use the same logic as before?
If the intuition is shaky, we’d better try to calculate it out exactly.
General formula for the probability of getting exactly 50% heads in n trials.
What happens when n gets bigger?
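For even n, the formula and its behavior as n grows:

```latex
P(\text{exactly } n/2 \text{ heads in } n \text{ tosses})
= \binom{n}{n/2}\left(\tfrac{1}{2}\right)^{n}
\;\approx\; \sqrt{\frac{2}{\pi n}}
\;\xrightarrow[\;n \to \infty\;]{}\; 0 .
```

So the probability of getting exactly 50% heads shrinks as n grows: for this question, 10 tosses beat 100, even though the fraction of heads concentrates around ½.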
Continuous Computations
What’s the expected value for a continuous random variable?
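For reference, if X is continuous with density f, the sum from the discrete case becomes an integral:

```latex
E(X) = \int_{-\infty}^{\infty} x\, f(x)\, dx,
\qquad
E\bigl(g(X)\bigr) = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx .
```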