bus 173 2nd hw



  • 8/3/2019 bus 173 2nd hw


    Homework: 2

BUS 173

Prepared By:

Name: Mohammed Aftab

    ID# 101 0017 030 Section: 8

    Submitted To:

    M. Siddique Hossain (SQH)

    Lecturer

    School of Business

North South University, Dhaka.

    Date of Submission: 21 November 2011


For a large sample size, the sampling distribution of chi-square can be closely approximated by a continuous curve known as the chi-squared distribution. If we can assume that a population distribution is normal, then it can be shown that the sample variance and the population variance are related through a probability distribution known as the chi-squared distribution.

Multiplying s² by (n − 1) and dividing it by σ² converts it into a chi-squared random variable:

χ² = (n − 1)s² / σ²

where the number of degrees of freedom is n − 1. It is then possible to find probabilities for this variable from the chi-squared tables.

o Important properties of the chi-squared distribution:

1. The χ² distribution is a continuous probability distribution which has the value zero at its lower limit and extends to infinity in the positive direction.

2. The exact shape of the distribution depends upon the number of degrees of freedom. When this value is small, the curve is skewed to the right; the distribution becomes more and more symmetrical as the value increases, and can then be approximated by the normal distribution.

3. The mean of the chi-squared distribution is given by E(χ²) = v and its variance by V(χ²) = 2v. The chi-square distribution with (n − 1) degrees of freedom is the distribution of the sum of the squares of (n − 1) independent standard normal variables.

    Chi-squared distribution


4. As v gets larger, χ² approaches the normal distribution with mean v and standard deviation √(2v).

5. The sum of independent χ² variables is also a χ² variable.

o Chi-square distribution of sample and population variances:

Given a random sample of n observations from a normally distributed population whose population variance is σ² and whose resulting sample variance is s², it can be shown that

(n − 1)s²/σ² = Σ(Xi − x̄)²/σ²

has a distribution known as the chi-square distribution with n − 1 degrees of freedom.

We can use the properties of the chi-square distribution to find the variance of the sampling distribution of the sample variance when the parent population is normal.
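Property 3 above can be checked numerically. A minimal simulation sketch (assuming NumPy and SciPy are available; all numbers are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, sigma2 = 10, 4.0              # sample size and true population variance

# Draw many normal samples and form (n-1) * s^2 / sigma^2 for each
samples = rng.normal(0.0, np.sqrt(sigma2), size=(100_000, n))
s2 = samples.var(axis=1, ddof=1)             # sample variances
chi2_stat = (n - 1) * s2 / sigma2

# The statistic should have mean v and variance 2v, here v = 9
v = n - 1
print(stats.chi2(v).mean(), stats.chi2(v).var())   # 9.0 18.0
print(round(chi2_stat.mean(), 1), round(chi2_stat.var(), 1))
```

The simulated mean and variance of the statistic come out close to v = 9 and 2v = 18, as the properties predict.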

The confidence interval for the variance is based on the sampling distribution of (n − 1)s²/σ², which follows a χ² distribution with (n − 1) degrees of freedom. A 100(1 − α)% confidence interval for σ² is constructed by first obtaining an interval about (n − 1)s²/σ²:

P( χ²(n−1, 1−α/2) < χ²(n−1) < χ²(n−1, α/2) ) = 1 − α

P( χ²(n−1, 1−α/2) < (n − 1)s²/σ² < χ²(n−1, α/2) ) = 1 − α

P( (n − 1)s²/χ²(n−1, α/2) < σ² < (n − 1)s²/χ²(n−1, 1−α/2) ) = 1 − α
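The last line translates directly into a small helper (a sketch assuming SciPy; the numbers plugged in are the n = 18, s = 10.4 setup that exercise 7.48 uses later):

```python
from scipy import stats

def variance_ci(s2, n, alpha):
    """100(1 - alpha)% CI for a normal population variance sigma^2."""
    df = n - 1
    lower = df * s2 / stats.chi2.ppf(1 - alpha / 2, df)   # divide by upper chi2 point
    upper = df * s2 / stats.chi2.ppf(alpha / 2, df)       # divide by lower chi2 point
    return lower, upper

# e.g. n = 18, s = 10.4, 90% confidence
lo, hi = variance_ci(10.4 ** 2, 18, 0.10)
print(round(lo, 1), round(hi, 1))
```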

o Confidence interval for the difference between two normal population means:

Dependent samples: for example, in the pharmaceutical industry, say a drug is already in the market and working well for the people, but a new drug is about to be launched. What we do is test the effectiveness of the new drug under similar

    Confidence interval for variance


condition as the old drug, and see whether there is any difference in the result. We consider samples to be dependent if the values in one sample are paired with the values in the other sample. In this scheme the sample members are chosen in pairs, one from each population; thus the procedure is often known as the method of matched pairs.

INDEPENDENT SAMPLES: In this scheme samples are drawn independently from two normally distributed populations with known population variances, so that the membership of one sample is not influenced by the membership of the other sample.

As business decision makers we may wish to decide, on the basis of sample data:

1. Whether a new medicine is really effective in curing a disease.
2. Whether one training procedure is better than another.
3. Whether light bulbs produced under system A are better than those produced under system B.

Such decisions are called statistical decisions. Hypothesis tests are widely used in business and industry for making decisions.

It is here that probability and sampling theory play an ever-increasing role in constructing the criteria on which business decisions are made.

The statistical testing of hypotheses is the most important technique in statistical inference. A hypothesis is an assumption to be tested.

o Procedure for hypothesis testing:

1. Set up a hypothesis: the approach to hypothesis testing is not to construct a single hypothesis about the population parameter but rather to set up two hypotheses. They are:

Null hypothesis, denoted by H0
Alternative hypothesis, denoted by H1

2. Set up a suitable significance level: the level of significance is generally specified before any samples are drawn, so that the results obtained will not influence our choice.

    Test of hypothesis


3. Determination of a suitable test statistic: the third step is to determine a suitable test statistic and its distribution. The general form of a test statistic is

Test statistic = (sample statistic − hypothesized parameter) / (std. dev. of the statistic)

4. Determine the critical region: it is important to specify, before the sample is taken, which values of the test statistic will lead to a rejection of H0 and which will lead to acceptance of H0.

5. Doing computations: we perform, from a random sample of size n, the computations necessary for the test statistic, and then see whether the sample result falls in the critical region or in the acceptance region.

6. Making decisions: draw the statistical conclusion, and the decision maker may take the decision.
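The six steps can be sketched as a short script (a one-sample two-sided z-test; the numbers are illustrative, not from the text, and σ is assumed known):

```python
import math
from scipy import stats

# 1-2. Hypotheses and significance level
mu0, alpha = 50.0, 0.05          # H0: mu = 50 vs H1: mu != 50

# 3. Test statistic: z = (x_bar - mu0) / (sigma / sqrt(n))
x_bar, sigma, n = 52.1, 6.0, 36
z = (x_bar - mu0) / (sigma / math.sqrt(n))

# 4. Critical region: reject H0 when |z| > z_{alpha/2}
z_crit = stats.norm.ppf(1 - alpha / 2)     # 1.96

# 5-6. Computation and decision
print(round(z, 2), round(z_crit, 2), abs(z) > z_crit)   # 2.1 1.96 True
```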

There are four possible results in a statistical hypothesis test:

1. the hypothesis is true but our test rejects it
2. the hypothesis is false but our test accepts it
3. the hypothesis is true and our test accepts it
4. the hypothesis is false and our test rejects it

The first two possibilities lead to error. If we reject a hypothesis when it should be accepted, we say that a Type I error has been made. If we accept a hypothesis when it should be rejected, we say that a Type II error has been made.

Type I and Type II errors


DECISION      H0 TRUE                    H0 FALSE
ACCEPT H0     Correct decision (1 − α)   Type II error (β)
REJECT H0     Type I error (α)           Correct decision (1 − β)


The probability of committing a Type I error is designated as α and is called the level of significance.

Though it is difficult to draw a clear-cut line of demarcation between large and small samples, it is generally agreed that if the size of a sample exceeds 30 it should be regarded as a large sample.

o Tests of hypothesis involving large samples are based on the following assumptions:

1. The sampling distribution of the statistic is approximately normal.
2. Values given by the samples are sufficiently close to the population values and can be used in their place for the standard error of the estimate.

o Interpretation of probability values or p-values: the p-value provides more precise information about the strength of the rejection of the null hypothesis that results from the observed sample mean.

The p-value is the smallest significance level at which H0 can be rejected. Consider a random sample of n observations from a population that has a normal distribution with mean μ and standard deviation σ, and the resulting computed sample mean, x̄.
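As a small sketch of this definition (illustrative numbers, not from the text), the p-value of a lower-tail z-test is just the normal tail probability of the observed statistic:

```python
import math
from scipy import stats

# Test H0: mu = 100 against H1: mu < 100, sigma assumed known
mu0, sigma, n, x_bar = 100.0, 15.0, 25, 94.0
z = (x_bar - mu0) / (sigma / math.sqrt(n))   # -2.0
p_value = stats.norm.cdf(z)                  # P(Z <= -2.0)
print(round(z, 2), round(p_value, 4))        # -2.0 0.0228
```

H0 can be rejected at any significance level above 0.0228, which is exactly what "smallest significance level at which H0 can be rejected" means.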

1. The farther the true mean is from the hypothesized mean, the greater is the power of the test, everything else being equal.

Tests of hypothesis concerning large samples

    The Features of the Power Function


2. The smaller the significance level of the test, the smaller the power, everything else being equal. Thus reducing the probability of a Type I error increases the probability of a Type II error; but reducing α by 0.01 does not generally increase β by 0.01, because the change is not linear.

3. The larger the population variance, the lower the power of the test, everything else being equal.

4. The larger the sample size, the greater the power of the test, everything else being equal.

5. The power of the test at the critical value x̄c equals 0.5, because the probability that a sample mean is above x̄c when μ = x̄c is, of course, 0.50.
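These features can be seen from a small power function (a sketch for an upper-tail z-test; the parameter values are illustrative, not from the text):

```python
import math
from scipy import stats

def power(true_mu, mu0=100.0, sigma=15.0, n=25, alpha=0.05):
    """P(reject H0 | true mean) for the upper-tail test H1: mu > mu0."""
    se = sigma / math.sqrt(n)
    x_crit = mu0 + stats.norm.ppf(1 - alpha) * se    # critical sample mean
    return 1 - stats.norm.cdf((x_crit - true_mu) / se)

# Feature 1: farther true mean -> greater power; at mu = mu0 the power is alpha
print(round(power(105), 3), round(power(110), 3), round(power(100), 3))
```

Raising `alpha`, lowering `sigma`, or raising `n` in the call all increase the returned power, matching features 2-4.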

SUMS FROM THE BOOK

8.2

Pairs   Before   After   Dif. (di)
1       6        8       −2
2       12       14      −2
3       8        9       −1
4       10       13      −3
5       6        7       −1

x̄1 = 8.4   x̄2 = 10.2   d̄ = −1.8

Std. deviation, sd = √[Σ(di − d̄)²/(n − 1)]
= √(2.8/4)
= 0.8367

ME = t(n−1, α/2) × sd/√n
= t(4, 0.05) × 0.8367/√5


= 2.132 × 0.8367/√5
= 0.798

LCL = d̄ − t(n−1, α/2) sd/√n = −2.598
UCL = d̄ + t(n−1, α/2) sd/√n = −1.002

For 95% confidence:
ME = t(4, 0.025) × sd/√n
= 2.776 × 0.8367/√5
= 1.04

Width = 2 × ME = 2 × 1.04 = 2.08
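The 90% interval for 8.2 can be re-checked with a few lines (assuming NumPy and SciPy):

```python
import math
import numpy as np
from scipy import stats

before = np.array([6, 12, 8, 10, 6])
after = np.array([8, 14, 9, 13, 7])
d = before - after                  # paired differences, mean -1.8
n = len(d)

sd = d.std(ddof=1)                                     # 0.8367
me90 = stats.t.ppf(0.95, n - 1) * sd / math.sqrt(n)    # 90% margin of error
print(round(d.mean() - me90, 3), round(d.mean() + me90, 3))   # -2.598 -1.002
```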

8.4

Pairs   Without passive solar   With passive solar   Dif. (di)
1       485                     452                  33
2       423                     386                  37
3       515                     502                  13
4       425                     376                  49
5       653                     605                  48
6       386                     380                  6
7       426                     395                  31
8       473                     411                  62
9       454                     415                  39
10      496                     441                  55

x̄1 = 473.6   x̄2 = 436.3

d̄ = Σdi/n = 37.3

sd = √[Σ(di − d̄)²/(n − 1)]


= 17.658

t(n−1, α/2) = t(9, 0.05) = 1.833

90% confidence interval:

d̄ − t(n−1, α/2) sd/√n ≤ μ1 − μ2 ≤ d̄ + t(n−1, α/2) sd/√n
37.3 − (1.833 × 17.658)/√10 ≤ μ1 − μ2 ≤ 37.3 + (1.833 × 17.658)/√10
27.06 ≤ μ1 − μ2 ≤ 47.54

8.6

a. Degrees of freedom, n1 + n2 − 2 = 12 + 14 − 2 = 24
b. Degrees of freedom, n1 + n2 − 2 = 6 + 7 − 2 = 11
c. n1 + n2 − 2 = 9 + 12 − 2 = 19

8.10

v = (s1²/n1 + s2²/n2)² / [ (s1²/n1)²/(n1 − 1) + (s2²/n2)²/(n2 − 1) ]

a. v = (6/12 + 10/14)² / [ (6/12)²/11 + (10/14)²/13 ]
= (289/196)/(1737/28028)
= 23.79 ≈ 24

b. v = (30/6 + 36/10)² / [ (30/6)²/5 + (36/10)²/9 ]
= 73.96/(5 + 1.44) = 11.48 ≈ 11

c. v = (16/9 + 25/12)² / [ (16/9)²/8 + (25/12)²/11 ]
= 18.88 ≈ 19

d. v = (30/6 + 36/7)² / [ (30/6)²/5 + (36/7)²/6 ]
= 10.93 ≈ 11
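These degrees-of-freedom calculations are easy to mechanize (a sketch; each pair of arguments is a sample variance and its sample size):

```python
def satterthwaite_df(s2x, nx, s2y, ny):
    """Approximate df for unequal variances (Satterthwaite)."""
    num = (s2x / nx + s2y / ny) ** 2
    den = (s2x / nx) ** 2 / (nx - 1) + (s2y / ny) ** 2 / (ny - 1)
    return num / den

print(round(satterthwaite_df(6, 12, 10, 14), 2))    # part a -> 23.79
print(round(satterthwaite_df(30, 6, 36, 10), 2))    # part b -> 11.48
```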


8.18

ME = z(α/2) √[ p1(1 − p1)/n1 + p2(1 − p2)/n2 ]

a. ME = z(α/2) √(0.75 × 0.25/260 + 0.68 × 0.32/200) = 0.0425 z(α/2)
b. ME = z(α/2) √(0.6 × 0.4/400 + 0.68 × 0.32/500) = 0.0322 z(α/2)
c. ME = z(α/2) √(0.2 × 0.8/500 + 0.25 × 0.75/375) = 0.0286 z(α/2)

8.20

n1 = 120   p1 = 0.708
n2 = 163   p2 = 0.479
α = 0.02, α/2 = 0.01

ME = z(α/2) √[ p1(1 − p1)/n1 + p2(1 − p2)/n2 ]
= 2.33 √(0.708 × 0.292/120 + 0.479 × 0.521/163)
= 2.33 × 0.057 = 0.133

Confidence interval:
(0.708 − 0.479) − 0.133 < P1 − P2 < (0.708 − 0.479) + 0.133
0.096 < P1 − P2 < 0.362
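The same interval can be checked with exact z values (a sketch assuming SciPy; `norm.ppf` gives 2.326 rather than the tabled 2.33, which rounds to the same interval):

```python
import math
from scipy import stats

def two_prop_ci(p1, n1, p2, n2, alpha):
    z = stats.norm.ppf(1 - alpha / 2)
    me = z * math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p1 - p2) - me, (p1 - p2) + me

lo, hi = two_prop_ci(0.708, 120, 0.479, 163, 0.02)
print(round(lo, 3), round(hi, 3))     # 0.096 0.362
```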


8.26

ME = z(α/2) √[p(1 − p)/n]; the required n is largest at p = 0.5, so n = 0.25 (z(α/2)/ME)².

a. n = (1.96/0.03)² × 0.25 = 4268.44 × 0.25 = 1067.11 ≈ 1067
b. n = (1.96/0.05)² × 0.25 = 1536.64 × 0.25 = 384.16 ≈ 384
c. As the margin of error increases, the required sample size decreases.

8.28

n = 0.25 (z(α/2)/ME)²

a. ME = 0.04, α = 0.1
n = (1.645² × 0.25)/0.04² = 422.82 ≈ 423

b. α/2 = 0.025
n = (1.96² × 0.25)/0.04² = 600.25 ≈ 600

c. ME = 0.05, α = 0.02
n = (2.33² × 0.25)/0.05² = 542.89 ≈ 543

8.30

α = 0.1   ME = 0.03
n = 0.25 (z(α/2)/ME)² = (1.645² × 0.25)/0.03² = 751.67 ≈ 752
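As a one-line helper for these sample-size calculations (z is passed in directly, matching the table values used above):

```python
def sample_size(me, z):
    """Required n for estimating a proportion, worst case p = 0.5."""
    return 0.25 * (z / me) ** 2

print(round(sample_size(0.03, 1.96), 2))     # 8.26(a) -> 1067.11
print(round(sample_size(0.03, 1.645), 2))    # 8.30    -> 751.67
```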


8.40

a. α = 0.1, α/2 = 0.05

x̄ − ȳ = 480 − 520 = −40

d.f. = nx + ny − 2 = 10 + 12 − 2 = 20

s²p = [(nx − 1)sx² + (ny − 1)sy²]/(nx + ny − 2)
= (9 × 30² + 11 × 25²)/20 = 748.75

Confidence interval:

(x̄ − ȳ) − t(nx+ny−2, α/2) √(s²p/nx + s²p/ny) ≤ μ1 − μ2 ≤ (x̄ − ȳ) + t(nx+ny−2, α/2) √(s²p/nx + s²p/ny)
−40 − t(20, 0.05) √(748.75/10 + 748.75/12) ≤ μ1 − μ2 ≤ −40 + t(20, 0.05) √(748.75/10 + 748.75/12)
−40 − 1.725 × 11.716 ≤ μ1 − μ2 ≤ −40 + 1.725 × 11.716
−60.21 ≤ μ1 − μ2 ≤ −19.79

b. α = 0.1, α/2 = 0.05

Degrees of freedom, v = (sx²/nx + sy²/ny)² / [ (sx²/nx)²/(nx − 1) + (sy²/ny)²/(ny − 1) ]
= (30²/10 + 25²/12)² / [ (30²/10)²/9 + (25²/12)²/11 ]
≈ 17

Confidence interval:
(x̄ − ȳ) − t(v, α/2) √(sx²/nx + sy²/ny) ≤ μ1 − μ2 ≤ (x̄ − ȳ) + t(v, α/2) √(sx²/nx + sy²/ny)


= −40 − 1.74 × 11.92 ≤ μ1 − μ2 ≤ −40 + 1.74 × 11.92
= −60.74 ≤ μ1 − μ2 ≤ −19.26
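Part (a) of 8.40 can be verified directly (a sketch assuming SciPy):

```python
import math
from scipy import stats

nx, ny, sx, sy = 10, 12, 30.0, 25.0
diff = 480.0 - 520.0
df = nx + ny - 2
sp2 = ((nx - 1) * sx**2 + (ny - 1) * sy**2) / df   # pooled variance, 748.75
se = math.sqrt(sp2 / nx + sp2 / ny)
t = stats.t.ppf(0.95, df)                          # t_{20,0.05} for a 90% CI
print(round(diff - t * se, 2), round(diff + t * se, 2))   # -60.21 -19.79
```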

8.42

nx = 21   x̄ = 72.1   sx = 11.3
ny = 18   ȳ = 73.8   sy = 10.6

d.f. = nx + ny − 2 = 21 + 18 − 2 = 37
x̄ − ȳ = 72.1 − 73.8 = −1.7
α = 0.2, α/2 = 0.1

s²p = [(nx − 1)sx² + (ny − 1)sy²]/(nx + ny − 2)
= (20 × 11.3² + 17 × 10.6²)/37
= 120.65

ME = t(37, 0.1) √(120.65/21 + 120.65/18)
= 1.303 × 3.53
= 4.597

Confidence interval: (x̄ − ȳ) − ME ≤ μ1 − μ2 ≤ (x̄ − ȳ) + ME
−1.7 − 4.597 ≤ μ1 − μ2 ≤ −1.7 + 4.597
−6.297 ≤ μ1 − μ2 ≤ 2.897

    8.50

    n1 = 400 p1 = 0.75

    n2 = 500 p2 = 0.45


α = 0.05, α/2 = 0.025   p1 − p2 = 0.75 − 0.45 = 0.3

√[(0.75 × 0.25)/400 + (0.45 × 0.55)/500] = 0.031

Confidence interval:
(p1 − p2) − z(α/2) √[p1(1 − p1)/n1 + p2(1 − p2)/n2] ≤ P1 − P2 ≤ (p1 − p2) + z(α/2) √[p1(1 − p1)/n1 + p2(1 − p2)/n2]
0.3 − 1.96 × 0.031 ≤ P1 − P2 ≤ 0.3 + 1.96 × 0.031
0.239 ≤ P1 − P2 ≤ 0.361

7.42

a. P( χ²(n−1, 1−α/2) < χ²(n−1) < χ²(n−1, α/2) ) = 1 − α
P[ (n − 1)s²/χ²(n−1, α/2) < σ² < (n − 1)s²/χ²(n−1, 1−α/2) ] = 1 − α

Lower limit = (n − 1)s²/χ²(n−1, α/2)
= 20 × 16/χ²(20, 0.01) = 20 × 16/37.57 = 8.52

b. 15 × 8²/χ²(15, 0.025) = 15 × 64/27.49 = 34.92

c. 27 × 15²/χ²(27, 0.005) = 27 × 225/49.64 = 122.38

7.48

n = 18   s = 10.4   α = 0.1

P[ (n − 1)s²/χ²(n−1, α/2) < σ² < (n − 1)s²/χ²(n−1, 1−α/2) ] = 1 − α
P[ 17 × 10.4²/χ²(17, 0.05) < σ² < 17 × 10.4²/χ²(17, 0.95) ] = 0.90

Confidence interval: 17 × 108.16/27.59 < σ² < 17 × 108.16/8.67


66.64 < σ² < 212.08

9.4

a. H0: the food is unhealthy
H1: the food is healthy

b. H0: the food is healthy
H1: the food is unhealthy

9.8

As the sample size increases, the critical value x̄c decreases.
As the population variance σ² increases, the critical value increases.

9.10

z-values for the p-values:

a. (70 − 80)/(15/√25) = −3.33
b. (70 − 80)/(√900/√25) = −1.67
c. (70 − 80)/(√400/√25) = −2.5
d. (70 − 80)/(√600/√25) = −2.04

9.12

n = 9   α = 0.1
H0: μ ≥ 50
H1: μ < 50


z = (48.5 − 50)/(3/√9) = −1.5

As −1.5 < −z(0.1) = −1.28, we reject the null hypothesis.

9.14

Reject H0 if z = (x̄ − 100)/(σ/√n) > 1.96.

a. (106 − 100)/(15/√25) = 2 > 1.96, so reject H0.
b. (104 − 100)/(10/√25) = 2 > 1.96, so reject H0.
c. (95 − 100)/(10/√25) = −2.5 < 1.96, so accept H0.
d. (92 − 100)/(18/√25) = −2.22 < 1.96, so accept H0.

9.20

H0: μ = 0
H1: μ < 0

x̄ = −2.91   s = 11.33   n = 170

z = (−2.91 − 0)/(11.33/√170) = −3.35

Since −3.35 < −z(0.05) = −1.645, we reject H0.

    9.24


H0: μ = 20
H1: μ ≠ 20
α = 0.05   n = 9   x̄ = 20.36

Reject H0 if t = (x̄ − μ0)/(s/√n) < −t(n−1, α/2) or t = (x̄ − μ0)/(s/√n) > t(n−1, α/2)

s² = 1/(n − 1) Σ(xi − x̄)²
= 1/8 (1.04² + 0.66² + 0.66² + 0.24² + 0.44² + 0.26² + 0.66² + 0.06² + 0.54²)
= 1/8 (3.0024) = 0.3753

So, s = 0.6126.

t = (20.36 − 20)/(0.6126/√9) = 1.76

Since −2.306 < 1.76 < 2.306, we fail to reject H0.
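A quick check of 9.24 from the summary numbers (a sketch assuming SciPy):

```python
import math
from scipy import stats

n, x_bar, mu0 = 9, 20.36, 20.0
s = math.sqrt(3.0024 / (n - 1))              # 0.6126
t = (x_bar - mu0) / (s / math.sqrt(n))       # 1.76
t_crit = stats.t.ppf(0.975, n - 1)           # t_{8,0.025} = 2.306
print(round(t, 2), round(t_crit, 3), abs(t) > t_crit)   # 1.76 2.306 False
```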

9.28

Reject H0 if (p̂ − P0)/√[P0(1 − P0)/n] > z, with z = 2.17.

a. p̂ − 0.25 > 2.17 √(0.25 × 0.75/400)
p̂ − 0.25 = 0.047
p̂ = 0.297

b. p̂ − 0.25 > 2.17 √(0.25 × 0.75/225)
p̂ − 0.25 = 0.0626
p̂ = 0.3130

c. p̂ − 0.25 > 2.17 √(0.25 × 0.75/625)
p̂ − 0.25 = 0.038
p̂ = 0.2880


d. p̂ − 0.25 > 2.17 √(0.25 × 0.75/900)
p̂ − 0.25 = 0.0313
p̂ = 0.2810

9.34

p̂ = 28/50 = 0.56   n = 50

H0: P = 0.5
H1: P > 0.5
α = 0.05, z(0.05) = 1.645

Reject H0 if (p̂ − P0)/√[P0(1 − P0)/n] > zα
(0.56 − 0.50)/√(0.5 × 0.5/50) = 0.8485

As 0.8485 < 1.645, we fail to reject H0.
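The same one-sided test of a proportion in a few lines (a sketch assuming SciPy):

```python
import math
from scipy import stats

n, x, p0, alpha = 50, 28, 0.5, 0.05
p_hat = x / n                                        # 0.56
z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)      # 0.8485
z_crit = stats.norm.ppf(1 - alpha)                   # one-sided: 1.645
print(round(z, 3), round(z_crit, 3), z > z_crit)     # 0.849 1.645 False
```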

9.38

Reject H0 if p̂x > 0.54 or p̂x < 0.46.

a. H0 is not rejected:
P(p̂x < 0.46) = P(Z < (0.46 − 0.52)/√(0.52 × 0.48/600))
= P(Z < −2.94)
= 0.5 − P(0 < Z < 2.94)
= 0.5 − 0.4984
= 0.0016
and P(p̂x > 0.54) = P(Z > (0.54 − 0.52)/√(0.52 × 0.48/600)) = P(Z > 0.98)
= 0.5 − 0.3365
= 0.1635
Probability = 0.0016 + 0.1635 = 0.1651


b. H0 is rejected:
P(0.46 < p̂x < 0.54) = P[ (0.46 − 0.58)/√(0.58 × 0.42/600) < Z < (0.54 − 0.58)/√(0.58 × 0.42/600) ]
= P(−5.96 < Z < −1.99)
= 0.4999 − 0.4767
= 0.0232

c. H0 is not rejected:
P(p̂x < 0.46) = P(Z < (0.46 − 0.53)/√(0.53 × 0.47/600)) = P(Z < −3.44) = 0.0003
P(p̂x > 0.54) = P(Z > (0.54 − 0.53)/√(0.53 × 0.47/600)) = P(Z > 0.49)
= 0.5 − 0.1879
= 0.3121
Probability = 0.0003 + 0.3121 = 0.3124

d. H0 is not rejected:
P(p̂x > 0.54) = P(Z > (0.54 − 0.48)/√(0.52 × 0.48/600)) = P(Z > 2.94)
= 0.5 − 0.4984
= 0.0016
P(p̂x < 0.46) = P(Z < (0.46 − 0.48)/√(0.52 × 0.48/600)) = P(Z < −0.98)
= 0.5 − 0.3365
= 0.1635
Probability = 0.0016 + 0.1635 = 0.1651

e. H0 is rejected:
P(0.46 < p̂x < 0.54) = P[ (0.46 − 0.43)/√(0.57 × 0.43/600) < Z < (0.54 − 0.43)/√(0.57 × 0.43/600) ]


= P(1.48 < Z < 5.44)
= 0.5 − 0.4306
= 0.0694
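All five parts of 9.38 follow the same pattern, which a small helper makes explicit (a sketch assuming SciPy; exact normal tails, so the results differ slightly from the hand-rounded table values above):

```python
import math
from scipy import stats

def reject_prob(true_p, n=600, lo=0.46, hi=0.54):
    """P(p_hat falls in the rejection region) when the true proportion is true_p."""
    se = math.sqrt(true_p * (1 - true_p) / n)
    below = stats.norm.cdf((lo - true_p) / se)
    above = 1 - stats.norm.cdf((hi - true_p) / se)
    return below + above

print(round(reject_prob(0.52), 3))        # part a: P(reject)
print(round(1 - reject_prob(0.58), 3))    # part b: P(0.46 < p_hat < 0.54)
```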

9.40

x̄ = 3.07   μ0 = 3   n = 64   σ = 0.4   z = 1.96

a. H0: μ = 3
H1: μ > 3

Reject H0 if (x̄ − μ0)/(σ/√n) > z
(3.07 − 3)/(0.4/√64) = 1.4
As 1.4 < 1.96, we fail to reject H0.

b. Reject H0 if (x̄ − μ0)/(σ/√n) > z, i.e. if
x̄ > 1.96 × (0.4/√64) + 3 = 3.098

P(X̄ > 3.098) when μ = 3.1:
P(Z > (3.098 − 3.1)/(0.4/√64)) = P(Z > −0.04) = 0.5 + 0.016 = 0.516

9.48

n = 8

s² = 1/(n − 1) Σ(xi − x̄)²
= 1/7 (1.625² + 43.625² + 21.625² + 45.375² + 18.38² + 22.625² + 34.375²)
= 1/7 (6463.5) = 923.4

s = 30.39

H0: σ² ≤ 500
H1: σ² > 500

Reject H0 if (n − 1)s²/σ0² > χ²(n−1, α)
= (7 × 923.4)/500


= 12.93

χ²(7, 0.1) = 12.02

As 12.93 > 12.02, reject H0.