Probability
External
- Mathematics for Computer Science, Eric Lehman and Tom Leighton, Chapters 18-25.
- https://www.coursera.org/learn/algorithms-divide-conquer/lecture/UXerT/probability-review-i
- https://www.coursera.org/learn/algorithms-divide-conquer/lecture/cPGDy/probability-review-ii
Internal
Overview
All concepts discussed on this page are discrete probability concepts.
Sample Space
A sample space is the collection of all things that could happen: the universe in which we discuss the probability of events. The sample space contains all possible outcomes and is denoted Ω (capital omega). In discrete probability, the sample space is a finite set.
Probability Space
Outcome
Each outcome i∈Ω has a probability p(i) ≥ 0.
The constraint on outcome probabilities is that they sum to 1 over the sample space:
∑_{i∈Ω} p(i) = 1
For two dice, the possible outcomes of a roll are the 36 pairs {(1,1), (1,2), ..., (6,6)}, where each pair is an outcome. For tossing a coin, there are only two outcomes (heads or tails). For well-crafted dice or fair coins, each outcome is equally likely: 1/36 and 1/2 respectively.
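A minimal sketch in Python (names chosen here for illustration, not taken from the sources above) that enumerates the two-dice sample space and checks that the outcome probabilities sum to 1:

```python
from fractions import Fraction
from itertools import product

# Sample space Ω for rolling two fair dice: all 36 ordered pairs.
omega = list(product(range(1, 7), repeat=2))

# Each outcome of a fair roll is equally likely: p(i) = 1/36.
p = {outcome: Fraction(1, 36) for outcome in omega}

assert len(omega) == 36
assert sum(p.values()) == 1   # probabilities sum to 1 over Ω
```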
Event
An event is a subset of all things that could happen, i.e. a subset of the sample space S ⊆ Ω. An event consists of distinct outcomes. An example of an event is the subset of outcomes in which the sum of the dice equals 7.
Probability of an Event
The probability of an event is the sum of the probabilities of all outcomes contained in that event:
P[S] = ∑_{i∈S} p(i)
The probability of the event that the sum of the dice is 7 is P((1,6)) + P((2,5)) + P((3,4)) + P((4,3)) + P((5,2)) + P((6,1)) = 6 · 1/36 = 1/6.
The probability of an event A is written P[A].
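A self-contained sketch, under the same two-fair-dice assumption, that computes the probability of this event by summing outcome probabilities:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # two fair dice
p = {o: Fraction(1, 36) for o in omega}        # uniform outcome probabilities

# Event S ⊆ Ω: the outcomes whose values sum to 7.
S = [o for o in omega if sum(o) == 7]

# P[S] = ∑_{i∈S} p(i)
assert sum(p[o] for o in S) == Fraction(1, 6)
```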
Random Variable
A random variable is a numerical measurement of what happens in a random outcome. It is a real-valued function from the sample space Ω to the real numbers: once you know which outcome occurred, you have a real number.
X:Ω → ℝ
If we're rolling dice, an example of a random variable is the sum of the dice values.
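As a sketch, a random variable is just a function from outcomes to numbers; the names here are illustrative:

```python
# Random variable X: Ω → ℝ, here the sum of the two dice values.
def X(outcome):
    return outcome[0] + outcome[1]

assert X((3, 4)) == 7
```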
Indicator Random Variable
A random variable that indicates whether an event happened or not; it can take only the values 0 and 1.
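A hypothetical indicator for the event "the sum of the dice is 7":

```python
# Indicator random variable: 1 if the outcome lies in the event, 0 otherwise.
def indicator_sum_is_7(outcome):
    return 1 if sum(outcome) == 7 else 0

assert indicator_sum_is_7((3, 4)) == 1
assert indicator_sum_is_7((2, 2)) == 0
```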
Expectation of a Random Variable
The expectation of a random variable is its average value, weighted by the probability of each outcome.
Let X:Ω → ℝ be a random variable. The expectation (or the expected value) of the random variable is:
E[X] = ∑_{i∈Ω} X(i) p(i)
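A sketch computing E[X] for the sum of two fair dice directly from this definition (illustrative names):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
p = {o: Fraction(1, 36) for o in omega}

# E[X] = ∑_{i∈Ω} X(i) p(i), with X the sum of the two dice.
expected_sum = sum((o[0] + o[1]) * p[o] for o in omega)
assert expected_sum == 7
```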
Linearity of Expectation
Linearity of expectation is a property (theorem) of random variables, stated as the following claim.
Let X1, X2 ... Xn be random variables defined on Ω. Then:
E[∑_{j=1}^{n} X_j] = ∑_{j=1}^{n} E[X_j]
The expected value of a sum of random variables is equal to the sum of the expectations of the individual variables. This holds even if the random variables are NOT independent.
In the case of two dice, the expected sum is the sum of the expected values of the individual dice. The expected value is the same for each die, namely 1/6 · (1+2+3+4+5+6) = 3.5, so the expected sum of two dice is 7.
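A sketch checking linearity on the two-dice space, including a pair of variables that are clearly not independent (a die and the total):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
p = {o: Fraction(1, 36) for o in omega}

def E(X):
    # E[X] = ∑_{i∈Ω} X(i) p(i)
    return sum(X(o) * p[o] for o in omega)

X1 = lambda o: o[0]          # first die
X2 = lambda o: o[1]          # second die
S = lambda o: o[0] + o[1]    # their sum; NOT independent of X1

assert E(X1) == E(X2) == Fraction(7, 2)                    # 3.5 per die
assert E(lambda o: X1(o) + X2(o)) == E(X1) + E(X2) == 7
assert E(lambda o: X1(o) + S(o)) == E(X1) + E(S)           # dependence is fine
```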
Conditional Probability
Conditional probability discusses the probability of one event given a second event. Let X, Y ⊆ Ω be two events. Then the probability of the event X given the event Y is:
Pr[X|Y] = Pr[X∩Y] / Pr[Y]
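A sketch of this formula on the two-dice space, with illustrative events X (the sum is 7) and Y (the first die shows 1); Pr[X|Y] comes out to 1/6 because exactly one of Y's six outcomes, (1,6), lies in X:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
p = {o: Fraction(1, 36) for o in omega}

def P(event):
    # Probability of an event, given as a set of outcomes.
    return sum(p[o] for o in event)

X = {o for o in omega if o[0] + o[1] == 7}   # sum is 7
Y = {o for o in omega if o[0] == 1}          # first die shows 1

# Pr[X|Y] = Pr[X∩Y] / Pr[Y]
assert P(X & Y) / P(Y) == Fraction(1, 6)
```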
Independent Events
Two events are independent if and only if the following equation holds:
Pr[X∩Y] = Pr[X]*Pr[Y]
If we express the conditional probability of X given Y and substitute the equation above:
Pr[X|Y] = Pr[X∩Y] / Pr[Y] = (Pr[X]·Pr[Y]) / Pr[Y] = Pr[X]
Intuitively, knowing that Y happened gives no information about the probability that X happened.
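A sketch checking the independence equation for two illustrative events on the dice space, one about the first die and one about the second:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
p = {o: Fraction(1, 36) for o in omega}
P = lambda event: sum(p[o] for o in event)

X = {o for o in omega if o[0] == 6}     # first die shows 6
Y = {o for o in omega if o[1] <= 3}     # second die shows at most 3

# Independent: Pr[X∩Y] = Pr[X]·Pr[Y]  (1/12 = 1/6 · 1/2)
assert P(X & Y) == P(X) * P(Y)
```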
Independence of Random Variables
Informally, knowing the value taken on by one of the random variables tells you nothing about what value is taken by the other random variable.
If the random variables A and B are independent, then the expectation satisfies:
E[A⋅B] = E[A]⋅E[B]
This only holds for independent variables; it does not hold in general for dependent variables.
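A sketch verifying the product rule for the two (independent) dice values, and showing it fail for a dependent pair, a die multiplied by itself:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
p = {o: Fraction(1, 36) for o in omega}
E = lambda X: sum(X(o) * p[o] for o in omega)

A = lambda o: o[0]   # first die
B = lambda o: o[1]   # second die, independent of A

# Independent: E[A·B] = E[A]·E[B]
assert E(lambda o: A(o) * B(o)) == E(A) * E(B)

# Dependent counterexample: A paired with itself (E[A²] ≠ E[A]²).
assert E(lambda o: A(o) * A(o)) != E(A) * E(A)
```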
Decomposition Principle
Relevant for the analysis of randomized algorithms. Return to lecture.