Lecture 10 -- Part 2


    Modeling & Simulation

    Lecture 10

    Part 2

    Generating Random Variates

    Instructor:

    Eng. Ghada Al-Mashaqbeh, The Hashemite University, Computer Engineering Department


    Outline

    Introduction.

    Generating continuous random variates.

    Generating empirical random variates.

    Generating discrete random variates.

    Generating arrival processes.


    Introduction

    We have explored the general techniques to generate random variates.

    In this lecture we will explore the common algorithms used with the theoretical distributions (both continuous and discrete) that we have dealt with before.

    So, just pick the distribution and its algorithm and work with it.

    In addition, we will deal with generating empirical random variates and arrival processes.


    Continuous Distributions

    Uniform: use the inverse transform algorithm.

        X = a + (b − a)U

    Exponential: use the inverse transform algorithm.

        X = −β ln U  (β is the mean)

    Weibull: use the inverse transform algorithm.

        X = β[−ln(1 − U)]^(1/α)  (α is the shape, β is the scale)
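
    These three inverse-transform formulas translate directly into code. Below is a minimal Python sketch (the function and parameter names are illustrative, not from the lecture); random.random() supplies the U(0, 1) variate:

```python
import math
import random

def uniform_variate(a, b):
    # X = a + (b - a) * U
    u = random.random()
    return a + (b - a) * u

def exponential_variate(beta):
    # X = -beta * ln(U), where beta is the mean
    u = random.random()
    return -beta * math.log(u)

def weibull_variate(alpha, beta):
    # X = beta * [-ln(1 - U)]^(1 / alpha); alpha = shape, beta = scale
    u = random.random()
    return beta * (-math.log(1.0 - u)) ** (1.0 / alpha)
```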


    Continuous Distributions: Gamma I

    Pdf and cdf:

        f(x) = x^(α−1) e^(−x/β) / (β^α Γ(α)) for x > 0, and 0 otherwise

        F(x) = 1 − e^(−x/β) Σ_{j=0}^{α−1} (x/β)^j / j! for x > 0, and 0 otherwise
        (this closed form of the cdf applies when α is a positive integer)

    where α is the shape parameter and β is the scale parameter.


    Continuous Distributions: Gamma II

    No closed-form inverse, so we cannot use the inverse transform method.

    If X ~ gamma(α, 1), then βX ~ gamma(α, β).

    Gamma(1, 1) is exponential with mean 1.

    For this purpose we will consider the gamma distribution with β = 1 and the two cases 0 < α < 1 and α > 1.
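
    As a concrete illustration of the β-scaling property above, here is a hedged sketch for the special case where the shape α is a positive integer (the Erlang case), in which a gamma(α, 1) variate is just the sum of α independent exponentials with mean 1. This is not the general algorithm developed on the following slides; the names are illustrative.

```python
import math
import random

def gamma_integer_shape(alpha, beta):
    """gamma(alpha, beta) variate for a positive integer shape alpha (Erlang case)."""
    # Sum alpha independent exponential(mean 1) variates to get gamma(alpha, 1),
    # then scale by beta, since beta * gamma(alpha, 1) ~ gamma(alpha, beta).
    total = 0.0
    for _ in range(alpha):
        total += -math.log(random.random())
    return beta * total
```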


    Gamma(α, 1) Density


    Continuous Distributions: Gamma(0 < α < 1)





    Continuous Distributions: Gamma(α > 1) I

    Use acceptance-rejection with the following t(x), and use the inverse transform to generate Y from r(x):

        t(x) = c r(x), where c = 4 α^α e^(−α) / (λ Γ(α)) and λ = (2α − 1)^(1/2)

        r(x) = λ α^λ x^(λ−1) / (α^λ + x^λ)^2 for x > 0, and 0 otherwise

        R(x) = x^λ / (α^λ + x^λ) for x > 0, and 0 otherwise

        X = R^(−1)(u) = α [u / (1 − u)]^(1/λ) for 0 < u < 1


    Continuous Distributions: Normal

    The distribution function does not have a closed form (so neither does the inverse).

    We can use numerical methods for the inverse transform.

    Note that if X ~ N(0, 1), then μ + σX ~ N(μ, σ^2).

    If we can generate the unit normal (also called the standard normal distribution), then we can generate any normal.

    Many algorithms have been proposed to generate random variates from the normal distribution; they differ in their speed and accuracy.

    We will discuss two algorithms: the Box-Muller algorithm and the polar algorithm.


    Continuous Distributions Normal: Box-Muller Method I

    Algorithm:

    1. Generate independent U1, U2 ~ U(0, 1).

    2. Set X1 = √(−2 ln U1) cos(2π U2) and X2 = √(−2 ln U1) sin(2π U2).

    3. Return X1, and save X2 for the next function call.
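
    A minimal Python sketch of the Box-Muller steps above; it returns one N(0, 1) variate per call and caches the second one, as step 3 suggests (the names are illustrative):

```python
import math
import random

_saved = None  # the X2 saved for the next call

def box_muller():
    """Return one N(0, 1) variate per call using the Box-Muller method."""
    global _saved
    if _saved is not None:
        x, _saved = _saved, None
        return x
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    x1 = r * math.cos(2.0 * math.pi * u2)
    _saved = r * math.sin(2.0 * math.pi * u2)
    return x1
```

    An N(μ, σ^2) variate is then obtained as μ + σ·box_muller(), using the relation noted on the Normal slide.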


    Continuous Distributions Normal: Box-Muller Method II

    Technically X1 and X2 are independent N(0, 1), but there is a serious problem if the method is used with LCGs:

    If U1 and U2 come from the same stream they could be adjacent values, which can make X1 and X2 dependent.

    Solution:

    Generate U1 and U2 from different streams

    (i.e. start with different seed values).


    Continuous Distributions Normal: Polar Method

    An improved version of the Box-Muller algorithm; it is faster than the Box-Muller algorithm.

    Also generates random variates in pairs.

    Algorithm:

    1. Generate independent U1, U2 ~ U(0, 1). Let V1 = 2U1 − 1, V2 = 2U2 − 1, and W = V1^2 + V2^2.

    2. If W > 1, go back to step 1. Otherwise, let Y = √(−2 ln W / W), X1 = V1 Y, and X2 = V2 Y.
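
    A corresponding sketch of the polar method; the rejection loop implements steps 1 and 2 above, and the pair is returned directly for simplicity (names are illustrative):

```python
import math
import random

def polar_pair():
    """Return a pair of independent N(0, 1) variates using the polar method."""
    while True:
        v1 = 2.0 * random.random() - 1.0
        v2 = 2.0 * random.random() - 1.0
        w = v1 * v1 + v2 * v2
        if 0.0 < w <= 1.0:  # accept only points inside the unit circle
            y = math.sqrt(-2.0 * math.log(w) / w)
            return v1 * y, v2 * y
```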


    Continuous Empirical Distributions

    You have two cases: either you have an equation for the empirical distribution, or you use the actual observations.

    For the first case you can use the inverse transform method.

    For the second case you can use the following algorithm, which is equivalent to the inverse transform method.

    Algorithm (first you must rank the data so that X(1) ≤ X(2) ≤ ... ≤ X(n); n is the sample size, i.e. the total number of empirical Xs):

    1. Generate U ~ U(0, 1), let P = (n − 1)U, and let I = ⌊P⌋ + 1.

    2. Return X = X(I) + (P − I + 1)(X(I+1) − X(I)).
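
    A sketch of the empirical algorithm above, assuming the observations are supplied as a Python list (sorted inside the function so the ranking step is explicit; names are illustrative):

```python
import math
import random

def empirical_variate(data):
    """Continuous empirical variate by interpolating between sorted observations."""
    x = sorted(data)          # rank the data: X(1) <= X(2) <= ... <= X(n)
    n = len(x)
    u = random.random()
    p = (n - 1) * u
    i = math.floor(p)         # zero-based index of X(I), i.e. I = i + 1
    return x[i] + (p - i) * (x[i + 1] - x[i])
```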


    Empirical Distribution Function

    For example, for this distribution n = 6


    Discrete Distributions

    We can always use the inverse transform method, but it may not be the most efficient for all distributions, since it involves searching.

    Another general method is the alias method, which works for every finite-range discrete distribution. It requires some initial setup and extra storage, and will not be discussed here.

    We will explore how to generate random variates from common discrete theoretical distributions; the methods are mainly based on the inverse transform method.


    Bernoulli

    Mass function:

        p(0) = 1 − p, p(1) = p, and p(x) = 0 otherwise.

    Algorithm (discrete inverse transform):

    1. Generate U ~ U(0, 1).

    2. If U ≤ p, return X = 1. Otherwise return X = 0.
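
    The Bernoulli algorithm is a single comparison; a minimal sketch (illustrative name):

```python
import random

def bernoulli_variate(p):
    # Discrete inverse transform: return 1 if U <= p, else 0
    return 1 if random.random() <= p else 0
```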


    Binomial

    Mass function:

        p(x) = C(t, x) p^x (1 − p)^(t−x) for x ∈ {0, 1, ..., t}, and 0 otherwise.

    Use the fact that if X ~ binomial(t, p), then X = Y1 + Y2 + ... + Yt, where the Yi are independent Bernoulli(p).

    So use the convolution method, which is based on the discrete inverse transform of the Bernoulli distribution.
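
    A sketch of the convolution approach: sum t Bernoulli(p) indicators, each produced by the discrete inverse transform (illustrative names):

```python
import random

def binomial_variate(t, p):
    """binomial(t, p) variate as the sum of t independent Bernoulli(p) variates."""
    return sum(1 for _ in range(t) if random.random() <= p)
```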


    Geometric

    Mass function:

        p(x) = p(1 − p)^x for x ∈ {0, 1, 2, ...}, and 0 otherwise.

    Use the inverse transform:

    1. Generate U ~ U(0, 1).

    2. Return X = ⌊ln(1 − U) / ln(1 − p)⌋.
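
    A sketch of the geometric inverse transform above; the floor gives the integer number of failures before the first success (illustrative name):

```python
import math
import random

def geometric_variate(p):
    # X = floor( ln(1 - U) / ln(1 - p) ), valid for 0 < p < 1
    u = random.random()
    return math.floor(math.log(1.0 - u) / math.log(1.0 - p))
```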


    Negative Binomial

    Mass function:

        p(x) = C(s + x − 1, x) p^s (1 − p)^x for x ∈ {0, 1, 2, ...}, and 0 otherwise.

    Note that X ~ negbin(s, p) if and only if X = Y1 + Y2 + ... + Ys, where the Yi are independent Geometric(p).

    So use the convolution method, which is based on the discrete inverse transform of the geometric distribution.


    Poisson

    Mass function:

        p(x) = e^(−λ) λ^x / x! for x ∈ {0, 1, 2, ...}, and 0 otherwise.

    Algorithm:

    1. Let a = e^(−λ), b = 1, and i = 0.

    2. Generate U_(i+1) ~ U(0, 1) and replace b by b·U_(i+1). If b < a, return X = i. Otherwise go to step 3.

    3. Let i = i + 1 and go back to step 2.

    This algorithm is based on the relation between the Poisson and exponential distributions.

    It is rather slow, especially for large values of λ. There is no very good algorithm for the Poisson distribution.
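
    A sketch of the Poisson algorithm above, multiplying uniforms until the running product drops below e^(−λ) (illustrative name):

```python
import math
import random

def poisson_variate(lam):
    """Poisson(lam) variate via products of uniforms (slow for large lam)."""
    a = math.exp(-lam)
    b = 1.0
    i = 0
    while True:
        b *= random.random()   # step 2: replace b by b * U
        if b < a:
            return i           # return X = i
        i += 1                 # step 3: increment i and repeat
```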


    Generating Arrival Processes

    Remember that we model the arrival process as a Poisson process, and as a batch process based on the number of arrivals per arrival event.

    For the Poisson process we have:

    Stationary Poisson Process (SPP): the arrival rate is constant.

    Non-Stationary Poisson Process (NSPP): the arrival rate varies with time.

    We will explore algorithms used to generate arrival times for each type.


    Generating Arrival Times in SPP

    Stationary with rate λ > 0.

    The times between events, Ai = ti − ti−1, are IID exponential with mean 1/λ.

    Algorithm:

    1. Generate U ~ U(0, 1).

    2. Return ti = ti−1 − (1/λ) ln U.

    The above algorithm can be used with any IID interarrival-time distribution (not only exponential); just find the correct inverse of its cdf.
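
    A sketch that generates the first n arrival times of an SPP with rate lam by applying the recursion ti = ti−1 − (1/λ) ln U (illustrative names):

```python
import math
import random

def spp_arrival_times(lam, n):
    """First n arrival times of a stationary Poisson process with rate lam."""
    times = []
    t = 0.0
    for _ in range(n):
        t -= math.log(random.random()) / lam   # t_i = t_{i-1} - (1/lam) * ln U
        times.append(t)
    return times
```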


    Generating Arrival Times in NSPP

    Remember that an NSPP has an arrival rate that varies with time.

    Also remember the two methods that we used in Chapter 6 to fit an NSPP to a stationary one: the thinning method and the inversion method.

    Based on these two methods we will generate arrival times from an NSPP.

    [Figure: arrival rate λ(t) over the interval from ti−1 to ti]


    Thinning Algorithm

    1. Set t = ti−1.

    2. Generate U1, U2 as IID U(0, 1).

    3. Replace t by t − (1/λ*) ln U1, where λ* = max over t of λ(t).

    4. If U2 ≤ λ(t)/λ*, return ti = t. Otherwise, go back to step 2.

    As you see, the above algorithm is similar to the acceptance-rejection method.
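
    A sketch of the thinning algorithm above. The caller supplies the rate function rate(t) and its maximum rate_max (standing in for λ*); both names are assumptions of this sketch, not lecture definitions:

```python
import math
import random

def nspp_next_arrival_thinning(t_prev, rate, rate_max):
    """Next NSPP arrival time after t_prev, by thinning.

    rate     -- function giving lambda(t)
    rate_max -- lambda*, the maximum of lambda(t)
    """
    t = t_prev
    while True:
        u1, u2 = random.random(), random.random()
        t -= math.log(u1) / rate_max     # candidate arrival from the rate-lambda* SPP
        if u2 <= rate(t) / rate_max:     # accept with probability lambda(t) / lambda*
            return t
```

    For example, nspp_next_arrival_thinning(0.0, lambda t: 2.0 + math.sin(t), 3.0) would generate the first arrival for the (assumed) rate function λ(t) = 2 + sin(t).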


    Inversion Algorithm

    1. Generate U ~ U(0, 1).

    2. Set t′i = t′i−1 − ln U.

    3. Return ti = Λ^(−1)(t′i).

    Where (see Chapter 6):

        Λ(t) = ∫₀ᵗ λ(s) ds

    As you see, the above algorithm is similar to the inverse transform method.
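
    A sketch of the inversion algorithm for an assumed rate function λ(t) = a + b·t (chosen only so that Λ and its inverse have closed forms; this example is not from the lecture). Here Λ(t) = a·t + b·t^2/2, which is inverted with the quadratic formula:

```python
import math
import random

def nspp_next_arrival_inversion(t_prev, a, b):
    """Next NSPP arrival after t_prev for the assumed rate lambda(t) = a + b*t (a > 0, b >= 0)."""
    cum = a * t_prev + 0.5 * b * t_prev * t_prev     # Lambda(t_prev)
    s = cum - math.log(random.random())              # transformed-scale arrival time t'_i
    if b == 0.0:
        return s / a                                 # constant rate: Lambda(t) = a*t
    # Solve a*t + (b/2)*t^2 = s for t >= 0, i.e. t = Lambda^{-1}(s)
    return (-a + math.sqrt(a * a + 2.0 * b * s)) / b
```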


    Additional Notes

    The lecture covers the following sections from the textbook:

    Chapter 8

    Sections:

    8.3 (8.3.1, 8.3.2, 8.3.4, 8.3.5, 8.3.6, 8.3.16),

    8.4 (8.4.1, 8.4.4, 8.4.5, 8.4.6, 8.4.7),

    8.6 (8.6.1, 8.6.2)