STAT 302 Intro to Probability
Instructor: Alexandre Bouchard
www.stat.ubc.ca/~bouchard/courses/stat302-sp2017-18/
Plan for today:
• Axioms of probability.
• Reliability, continued.
• Probability tree diagram.
• More examples.
Logistics
• Let me know if you cannot access the website
• Last reminders:
• Office hours: Contact tab.
• Clickers (see links in Syllabus tab)
• Piazza under Contact tab.
• Readings: posted in Schedule tab
• Slides under Files tab
• Additional practice problems: Syllabus tab
http://www.stat.ubc.ca/~bouchard/courses/stat302-fa2015-16/
Logistics
• I will start using clickers today,
• but I am not taking the grade into account (yet)
• We cannot have a larger room unfortunately, but keep monitoring the enrolment!
Disclaimer
• Workload and difficulty increase in the second half of the semester (continuous probability)
• Make sure you have the time to stay on top of the material
• Good things to review from pre-requisite courses:
• set theory notation
• bivariate integration
Review
• Outcome: a scenario, s
• Sample space S: all the possible scenarios
• Event: a set of outcomes, e.g. E = {s ∈ S : s is red}
• Probability: in the discrete, equally weighted case, P(E) = |E| / |S|
• Properties:
• P(E ⋃ F) = P(E) + P(F) when E & F are disjoint (non-overlapping)
• P(Ec) = 1 - P(E)
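As an aside not in the slides, these review facts can be checked by brute-force enumeration; the two-dice sample space below is my own illustrative choice:

```python
import itertools

# Illustrative sample space (not from the slides): two fair dice, ordered.
S = list(itertools.product(range(1, 7), repeat=2))

# Event E: the two dice sum to 7.
E = [s for s in S if s[0] + s[1] == 7]

# Discrete, equally weighted: P(E) = |E| / |S|
p_E = len(E) / len(S)

# Complement rule: P(E^c) = 1 - P(E)
p_Ec = 1 - p_E

print(p_E, p_Ec)  # 6/36 and 30/36
```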
Discrete, equally weighted
Definition (1): P(E) = |E| / |S|
Properties (2):
a) 0 ≤ P(E) ≤ 1
b) P(S) = 1
c) P(E ⋃ F) = P(E) + P(F) if E and F are disjoint

Not equally weighted (or not discrete)
Example: soon - the redundancy problem (ex. 11)

General rules of probability (Kolmogorov’s insight)
Assume:
a) 0 ≤ P(E) ≤ 1
b) P(S) = 1
c) P(E ⋃ F ⋃ ...) = P(E) + P(F) + ... if E, F, ... are all disjoint
These are called the axioms of probability.
Def. 3
Redundancy is not always bad
• Redundant systems:
• create one or more duplicates of an important part of a machine
• as long as one of the copies works, the machine works as a whole
• the machine only fails when both copies break
• Many examples in biology (kidneys), engineering
Redundant server power supply
Example of a typical reliability problem
• Known:
• Power supply #1 works 60% of the time at delivery
• Power supply #2 works 70% of the time at delivery
• You also observed that both power supplies work at delivery 40% of the time
• Question: what is the probability that both power supplies are broken at delivery?
Ex. 11
Strategy
1. Define a probability space S
2. Define the known information as events, e.g. W1 = power supply #1 works
3. Define the goal in terms of the probability of an event
4. Use properties 1 and 2 to find that probability (see the next 2 slides as well to make your life easier)
Prop. 3 De Morgan’s laws: Distributing complements
We have
(E1 ⋃ ... ⋃ En)^c = E1^c ⋂ ... ⋂ En^c   (complement of a union = intersection of complements)
(E1 ⋂ ... ⋂ En)^c = E1^c ⋃ ... ⋃ En^c   (complement of an intersection = union of complements)
Proof: Suppose x is an outcome of (⋃i Ei)^c; then x is not in ⋃i Ei. Thus it is not in any of the events Ei, i = 1, ..., n. Hence it is in Ei^c for all i = 1, ..., n, i.e. in ⋂i Ei^c. This proves (⋃i Ei)^c ⊆ ⋂i Ei^c.
Conversely, suppose x is an outcome of ⋂i Ei^c; hence it is in Ei^c for all i = 1, ..., n. Hence it is in none of the events Ei, i = 1, ..., n. Thus it is in (⋃i Ei)^c, and we have proven ⋂i Ei^c ⊆ (⋃i Ei)^c. The result (⋃i Ei)^c = ⋂i Ei^c follows.
To prove (⋂i Ei)^c = ⋃i Ei^c, we remark that
(⋃i Ei^c)^c = ⋂i (Ei^c)^c   [previous result applied to the events Ei^c]
= ⋂i Ei   [as (Ei^c)^c = Ei]
Thus by taking the complement on both sides, we obtain the result.
From earlier: P(E ⋃ F) = P(E) + P(F) if E and F are disjoint, i.e. E ⋂ F = ∅
What if we want the probability of non-disjoint unions?
Prop. 4 Inclusion-exclusion: Going between unions and intersections
P(E ⋃ F) = P(E) + P(F) - P(E ⋂ F)
Ex. 11
• Known:
• Power supply #1 works 60% of the time at delivery
• Power supply #2 works 70% of the time at delivery
• You also observed that both power supplies work at delivery 40% of the time
• Question: what is the probability that both power supplies are broken at delivery?
A. 8%
B. 10%
C. 12%
D. 14%
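A sketch of the computation behind this clicker question, combining Prop. 4 (inclusion-exclusion) with Prop. 3 (De Morgan) on the numbers from Ex. 11:

```python
# Numbers from Ex. 11; W1, W2 = "power supply #i works at delivery".
p_w1, p_w2, p_both_work = 0.60, 0.70, 0.40

# Inclusion-exclusion: P(W1 U W2) = P(W1) + P(W2) - P(W1 n W2)
p_at_least_one_works = p_w1 + p_w2 - p_both_work

# De Morgan + complement rule:
# P(both broken) = P(W1^c n W2^c) = P((W1 U W2)^c) = 1 - P(W1 U W2)
p_both_broken = 1 - p_at_least_one_works
print(round(p_both_broken, 2))  # answer B: 10%
```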
Def. 4
Partition of an event F
[Figure: a Venn-style diagram of an event F split into disjoint pieces E1, E2, E3, E4]
1. The union of the Ei ’s is equal to F: ⋃i Ei = F
2. The Ei ’s are disjoint: if i ≠ j, then Ei ⋂ Ej = ∅
The events Ei form a partition of the event F if:
Note: as a consequence of 1, 2 and the axioms of probability, P(F) = P(E1) + P(E2) + P(E3) + P(E4)
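A toy numerical check of this additivity, on an example of my own choosing (a fair die, with F = "even face" partitioned by value):

```python
# Def. 4 illustrated: S = faces of a die, F = even faces,
# partitioned into E1 = {2}, E2 = {4}, E3 = {6}.
S = {1, 2, 3, 4, 5, 6}
F = {2, 4, 6}
partition = [{2}, {4}, {6}]  # disjoint, and their union is F

def p(A):
    # equally weighted probability P(A) = |A| / |S|
    return len(A) / len(S)

# Additivity over the partition: P(F) = P(E1) + P(E2) + P(E3)
print(p(F), sum(p(Ei) for Ei in partition))
```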
Why is this useful?
[Figure: the sample space split by the partition {W1, W1c}, then refined down to the single-outcome events {s11}, {s10}, {s01}, {s00}]
Def. 5 Probability tree diagram
• Nodes are events; each node contains all the events below it
• Edges (arrows) coming out of an event form a partition of that event
• Leaves contain a single outcome
Ex. 12 Probability that n coins are all tails
Recall: probability when outcomes are equally likely:
P(E) = |E| / |S| = (# of outcomes of interest) / (# of outcomes)
Here: |E| = 1, |S| = ?
Ex. 12 Probability that n coins are all tails: finding |S|
[Figure: probability tree for n = 3 coins; the root *,*,* branches into H,*,* and T,*,*, these into H,H,* / H,T,* / T,H,* / T,T,*, down to the 8 leaves H,H,H through T,T,T]
depth = n = 3
Counting leaves per subtree, level by level:
|E1| = 1
|E2| = 2|E1| = 2
|E3| = 2|E2| = 2x2
|S| = 2|E3| = 2x2x2 = 2^depth = 2^n
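The tree-counting argument above can be verified by enumerating the sample space directly (a sketch, not part of the slides):

```python
import itertools

# All length-n coin sequences: one leaf of the tree per sequence.
n = 3
S = list(itertools.product("HT", repeat=n))
print(len(S))  # |S| = 2^depth = 8 for n = 3

# E = "all n coins are tails": exactly one leaf
E = [s for s in S if all(c == "T" for c in s)]
print(len(E) / len(S))  # P(E) = 1 / 2^n = 0.125
```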
Probability of winning the lottery
Ex. 13
• You pick your number when you buy a ticket
• The lottery company draws at random from an urn containing n numbered balls {1, 2, ..., n} [example: n=5]
• k times [example: k=3]
• without replacement (each number is either picked 0 or 1 time, not more)
Probability of winning the lottery (without replacement)
Ex. 13
• You win if the numbers you picked match those from the draw.
• See example on the right, do you win in this case?
Your picks: 12, 31, 10, 23, 8, 45
Draw result:
a) if order matters: NO
b) if order does not matter: YES
Note: most lotteries use (b), but let’s do (a) first---it is simpler
Probability of winning the lottery (order matters, without replacement)
Ex. 13a
Recall: probability when outcomes are equally likely:
P(E) = |E| / |S| = (# of outcomes of interest) / (# of outcomes)
Here: |E| = 1, |S| = ?
Find |S|: Ex. 13a
k = number of draws = 3
n = number of balls in urn = 5
Hint: the probability tree looks like:
[Figure: the root * * * branches into 1 * *, 2 * *, ..., 5 * *; each of these branches into the 4 remaining choices (e.g. 2 * * into 2 1 *, 2 3 *, 2 4 *, 2 5 *), and so on down to leaves such as 2 1 3, 2 1 4, 2 1 5]
A) 243
B) 125
C) 60
D) 15
|S|, without replacement, order matters (Ex. 13a)
[Figure: the branch of the tree under 1 * * and 1 2 *, with leaves 1 2 3, 1 2 4, 1 2 5]
Counting leaves per subtree, level by level:
|E1| = 1
|E2| = 3|E1| = 3
|E3| = 4|E2| = 4x3
|S| = 5|E3| = 5x4x3
|S| = 5x4x3 = (5x4x3x2x1)/(2x1) = n!/(n-k)!
k = number of draws = 3
n = number of balls in urn = 5
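A quick enumeration check of this count (a sketch, not from the slides), using the standard library:

```python
import itertools
import math

# Ex. 13a enumerated: ordered draws without replacement from {1, ..., n}.
n, k = 5, 3
draws = list(itertools.permutations(range(1, n + 1), k))

# 5 x 4 x 3 = 60 leaves, matching n!/(n-k)!
print(len(draws))  # 60
print(len(draws) == math.factorial(n) // math.factorial(n - k))

# With |E| = 1 (one winning ordered sequence), the winning probability:
print(1 / len(draws))
```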
Probability of winning the lottery (without replacement)
Ex. 13b
Your picks: 12, 31, 10, 23, 8, 45
Draw result:
a) if order matters: |S| = n!/(n-k)!
b) if order does not matter: |S| = ?
|S|, without replacement, order does not matter (Ex. 13b)
k = number of draws = 3
n = number of balls in urn = 5
[Figure: leaves of the tree grouped by the set they contain, e.g. Group 1 = {1, 2, 3} collects 1 2 3, 2 1 3, ...; Group 2 = {1, 2, 4} collects 1 2 4, 2 1 4, ...]
- Group the leaves that are equivalent when order is ignored
- How many in each group? 6 in each group; more generally, k!
|S|, without replacement, order does not matter (Ex. 13b)
k = number of draws = 3
n = number of balls in urn = 5
# leaves = n!/(n-k)!
# leaves per group = k!
Therefore, # groups = n!/(k!(n-k)!)
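The grouping argument can be checked by enumeration (a sketch of my own, not from the slides): each unordered group collapses exactly k! ordered leaves.

```python
import itertools
import math

# Ex. 13b enumerated: unordered draws without replacement from {1, ..., n}.
n, k = 5, 3
ordered = list(itertools.permutations(range(1, n + 1), k))   # the leaves
groups = list(itertools.combinations(range(1, n + 1), k))    # leaves grouped

# each group of k! ordered leaves collapses to one set
print(len(ordered) == math.factorial(k) * len(groups))

# n!/(k!(n-k)!) groups in total
print(len(groups))  # 10 for n = 5, k = 3
```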
DIY: important related problem / exercise
• You have a DNA sequence ACATG
• How many distinct genomes can you form with these 5 nucleotides (letters)?
• The answer is not 5!, because we have 2 A’s
• It is 5!/2 = 5!/(2! 1! 1! 1!)
• What is the general formula for a genome with nA A’s, nC C’s, nG G’s and nT T’s?
• The idea is similar to what we just saw (overcounting and dividing)
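The 5!/2! count for ACATG can be confirmed by the same overcount-and-divide idea (an illustrative sketch):

```python
import itertools
import math

# Distinct orderings of ACATG: 5! orderings, each distinct genome
# counted 2! times because the two A's are interchangeable.
seq = "ACATG"  # 2 A's, one each of C, T, G
distinct = set(itertools.permutations(seq))

print(len(distinct))  # 60
print(len(distinct) == math.factorial(5) // math.factorial(2))
```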
Probability of winning the lottery (without replacement)
Ex. 13b
Your picks: 12, 31, 10, 23, 8, 45
Draw result:
a) if order matters: |S| = n!/(n-k)!
b) if order does not matter: |S| = n!/(k!(n-k)!)
Probability of winning the lottery (without replacement)
Ex. 13b
a) if order matters: |S| = n!/(n-k)!
b) if order does not matter: |S| = n!/(k!(n-k)!)
Notation: (n choose k) = n!/(k!(n-k)!)
Note: a list where order does not matter and items appear at most once is a set.
‘Elementary permutations and combinations’
Def 6
a) if order matters: |S| = n!/(n-k)!
‘Number of permutations’; notation: nPk
= # of lists of size k, where each object is taken without replacement from n possible objects
b) if order does not matter: |S| = n!/(k!(n-k)!)
‘Number of combinations’; notation: nCk
= # of sets of size k, where each object is taken without replacement from n possible objects
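As a practical aside not in the slides, Python’s standard library (3.8+) computes both quantities directly:

```python
import math

# nPk = n!/(n-k)! and nCk = n!/(k!(n-k)!) for the lottery example
n, k = 5, 3
print(math.perm(n, k))  # nPk = 60
print(math.comb(n, k))  # nCk = 10
```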