What Is It?
• Analysis of Variance (ANOVA): allows for the simultaneous comparison of differences among two or more means
• Partition: a statistical procedure in which the total variance is divided into separate components
  – Partitioning of variance is what gives the ANOVA its name
• One-Way ANOVA: compares more than two levels of a single IV
General Linear Model
• Factor: the term used for an IV in an ANOVA
  – Factors have several Levels (values or conditions)
• Major Assumptions:
  – The only difference in means is due to the levels of the IV
  – The variances of the groups are equivalent (homogeneity of variance)
Assumptions of ANOVA
• Data meet the criteria for parametric statistics (interval/ratio level data).
• The data are normally distributed in each group.
• There is homogeneity of variance across groups.
• The observations in each sample are independent of one another.
Components of Variance
• Total Variance: the variance of all scores in the data set regardless of experimental group
• Comprised of:
  – Between-Groups Variance
  – Within-Groups Variance

  ŝ²total = Σ(Xij − X̄)² / (N − 1)

  where X̄ is the grand mean
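The total-variance formula can be sketched in a few lines of Python. The scores and group labels below are hypothetical, chosen only to illustrate the computation; they are not the slide deck's sleep data set.

```python
# Hypothetical scores for three groups (illustrative only, not the
# slide's sleep data).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}

all_scores = [x for g in groups.values() for x in g]
N = len(all_scores)
grand_mean = sum(all_scores) / N  # X-bar: mean of every score, ignoring group

# Total variance: squared deviation of every score from the grand mean,
# summed, then divided by N - 1.
ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
s2_total = ss_total / (N - 1)
```

Note that group membership plays no role here: every score is compared to the one grand mean.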
Within-Groups Variance
• Within-Groups Variance: estimate of the average variance within each group
• Homogeneity of Variance: σ²₁ = σ²₂ = σ²₃ = … = σ²ⱼ

  ŝ²within = Σⱼ Σᵢ (Xij − X̄j)² / Σⱼ (nj − 1), summing over the k groups

  k = the number of groups
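The pooled within-groups estimate can be computed directly from the formula: each group contributes its sum of squared deviations from its own group mean, and the divisor is the summed degrees of freedom. The data below are hypothetical, for illustration only.

```python
# Hypothetical scores for k = 3 groups (illustrative only).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}

# Pooled within-groups variance: each group's squared deviations from
# its OWN group mean, summed across groups, divided by sum(n_j - 1).
ss_within = 0.0
df_within = 0
for scores in groups.values():
    group_mean = sum(scores) / len(scores)
    ss_within += sum((x - group_mean) ** 2 for x in scores)
    df_within += len(scores) - 1

s2_within = ss_within / df_within
```

Because every deviation is taken from the score's own group mean, this estimate is unaffected by how far apart the group means are: it reflects only error variance.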
Between-Groups Variance
• Between-Groups Variance: estimate of variance between group means
• Two Sources:
  – Error Variance: uncontrolled and unpredicted differences among individual scores; the within-groups variance estimates the error variance
  – Treatment Variance: the variance among group means that is due to the effects of the IV

  ŝ²between = Σ nj (X̄j − X̄)² / (k − 1)
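The between-groups estimate uses only the group means: each mean's squared deviation from the grand mean, weighted by its group size. Again the data are hypothetical, for illustration only.

```python
# Hypothetical scores for k = 3 groups (illustrative only).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}

k = len(groups)
all_scores = [x for g in groups.values() for x in g]
grand_mean = sum(all_scores) / len(all_scores)

# Between-groups variance: each group mean's squared deviation from the
# grand mean, weighted by group size n_j, divided by k - 1.
ss_between = 0.0
for scores in groups.values():
    group_mean = sum(scores) / len(scores)
    ss_between += len(scores) * (group_mean - grand_mean) ** 2

s2_between = ss_between / (k - 1)
```

A useful check on any hand computation: SS-between plus SS-within must equal SS-total, because the partition is exact.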
The F-Ratio
• F-Ratio: the between-groups variance divided by the within-groups variance
• Can be expressed as:

  F = (treatment variance + error variance) / error variance

• Or:

  F = between-groups variance / within-groups variance

• Or:

  F = σ²between / σ²within
The F-Ratio
• No Treatment Effect:

  F = (0.0 + 5.0) / 5.0 = 1.00

• Treatment Effect Present:

  F = (5.0 + 5.0) / 5.0 = 2.00
Is It Significant?
• Same Concept: F distributions represent the probability of various F-ratios when the null hypothesis is true
• Two types of degrees of freedom determine the shape of the distribution:
  – Between-Groups: df between = k − 1
  – Within-Groups: df within = Σ(nj − 1), or equivalently df within = N − k
• If computed F > critical F (or if the computer tells you it is), the F statistic is significant.
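Putting the pieces together with the same hypothetical data: compute both variance estimates, form F, and compare it against the critical value for (df between, df within). For df = (2, 9), the tabled critical F at α = .05 is roughly 4.26, so the F produced below would be judged significant.

```python
# Hypothetical scores for k = 3 groups (illustrative only).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}

k = len(groups)
all_scores = [x for g in groups.values() for x in g]
N = len(all_scores)
grand_mean = sum(all_scores) / N

ss_between = ss_within = 0.0
for scores in groups.values():
    m = sum(scores) / len(scores)
    ss_between += len(scores) * (m - grand_mean) ** 2   # weighted by n_j
    ss_within += sum((x - m) ** 2 for x in scores)      # deviations from own mean

df_between = k - 1   # 2
df_within = N - k    # 9; identical to summing (n_j - 1) over groups
f_ratio = (ss_between / df_between) / (ss_within / df_within)
```

With such widely separated group means and tight groups, F comes out far above any reasonable critical value, which is the pattern the "treatment effect present" case on the previous slide describes.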
Post What?
• The F-ratio does not specify which means are different from other means.
• It only indicates that at least one difference between means is large enough to be statistically significant.
• Post hoc tests use pairwise comparisons to determine which groups are statistically different.
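The pairwise idea can be sketched as below, with hypothetical data. Note this only lists the mean differences; an actual post hoc test such as Tukey's HSD also tests each difference against a critical value from the studentized range distribution to control the familywise error rate, which is omitted here.

```python
from itertools import combinations

# Hypothetical scores (illustrative only).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}
means = {name: sum(s) / len(s) for name, s in groups.items()}

# Every pairwise mean difference (I - J), as laid out in SPSS's
# Multiple Comparisons box.  Tukey's HSD would additionally attach a
# corrected significance level to each difference.
diffs = {(i, j): means[i] - means[j] for i, j in combinations(means, 2)}
```

Each pair appears once here; SPSS prints both (I, J) and (J, I), which is why its table repeats every number with the sign flipped.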
Using SPSS to Compute a One-Way ANOVA
• Analyze → General Linear Model → Univariate
• Move the independent variable to the Fixed Factor(s) box; move the dependent variable to the Dependent Variable box
• Click Options → highlight the independent variable in the Factor(s) box and move it to the Display Means for box → under Display, check Descriptive Statistics, Estimates of Effect Size, and Homogeneity Tests → note that the significance level is already set at 0.05 → click Continue
• Click Post Hoc → highlight the independent variable in the Factor(s) box and move it to the Post Hoc Tests for box → under Equal Variances Assumed, check Tukey (not Tukey's-b) → click Continue
What Does It All Mean?
The descriptive statistics box provides the mean, standard deviation, and number for each group.
Levene’s test is designed to compare the error variance of the dependent variable across groups. We do not want this result to be significant.
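Levene's test is, in its mean-centered form (the variant SPSS's GLM output is generally described as using; treat that as an assumption here), just a one-way ANOVA run on the absolute deviations of each score from its group mean. A minimal sketch with hypothetical data:

```python
# Hypothetical scores (illustrative only).
groups = {
    "little":     [10, 12, 11, 13],
    "average":    [14, 15, 13, 14],
    "sufficient": [17, 18, 16, 17],
}

# Step 1: replace each score with its absolute deviation from its
# group mean.  Groups with larger variances get larger deviations.
abs_dev = {}
for name, scores in groups.items():
    m = sum(scores) / len(scores)
    abs_dev[name] = [abs(x - m) for x in scores]

# Step 2: run an ordinary one-way ANOVA on those deviations.
k = len(abs_dev)
all_dev = [d for g in abs_dev.values() for d in g]
N = len(all_dev)
grand = sum(all_dev) / N

ss_b = ss_w = 0.0
for devs in abs_dev.values():
    m = sum(devs) / len(devs)
    ss_b += len(devs) * (m - grand) ** 2
    ss_w += sum((d - m) ** 2 for d in devs)

levene_W = (ss_b / (k - 1)) / (ss_w / (N - k))
```

A small W (non-significant) is the desired outcome: it means the spread of scores does not differ detectably across groups.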
Descriptive Statistics
Dependent Variable: Number of Objects Recalled

  sleep_cat     Mean      Std. Deviation   N
  Little        11.1250   1.55265          8
  Average       14.4286   3.55233          7
  Sufficient    17.4000   2.40832          5
  Total         13.8500   3.55816          20

Levene's Test of Equality of Error Variances(a)
Dependent Variable: Number of Objects Recalled

  F      df1   df2   Sig.
  .732   2     17    .496

Tests the null hypothesis that the error variance of the dependent variable is equal across groups.
a. Design: Intercept + sleep_cat
Understanding the Output
The row you are interested in is the row with the name of your variable in it. The between-groups df appear in this row; the within-groups df appear in the Error row. F is your test statistic, and Sig is its probability.
The estimated marginal means box (the next box in the output) is not shown here; it merely provides the 95% confidence intervals for each of the means.
Tests of Between-Subjects Effects
Dependent Variable: Number of Objects Recalled

  Source            Type III Sum of Squares   df   Mean Square   F         Sig.   Partial Eta Squared
  Corrected Model   124.761(a)                2    62.380        9.159     .002   .519
  Intercept         3943.531                  1    3943.531      578.983   .000   .971
  sleep_cat         124.761                   2    62.380        9.159     .002   .519
  Error             115.789                   17   6.811
  Total             4077.000                  20
  Corrected Total   240.550                   19

a. R Squared = .519 (Adjusted R Squared = .462)
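The figures in this table hang together: F is the effect's mean square over the error mean square, and partial eta squared is SS effect / (SS effect + SS error). A quick check using the printed sums of squares:

```python
# Values copied from the Tests of Between-Subjects Effects table.
ss_effect, df_effect = 124.761, 2   # sleep_cat row
ss_error, df_error = 115.789, 17    # Error row

f_ratio = (ss_effect / df_effect) / (ss_error / df_error)
partial_eta_sq = ss_effect / (ss_effect + ss_error)
# Both reproduce the printed output (9.159 and .519) within rounding.
```

The same arithmetic also explains why the Corrected Model and sleep_cat rows are identical here: with a single factor, the model contains nothing but that factor.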
Post Hoc Analysis
The Multiple Comparisons box provides the mean difference between each pair of levels of the IV and its significance. The numbers in this box repeat themselves (each comparison appears twice, with the sign flipped), so it is only necessary to interpret one of each pair… which one depends on the hypothesis.
Multiple Comparisons
Dependent Variable: Number of Objects Recalled
Tukey HSD

                                 Mean Difference                     95% Confidence Interval
  (I) sleep_cat  (J) sleep_cat   (I-J)          Std. Error   Sig.    Lower Bound   Upper Bound
  Little         Average         -3.3036        1.35071      .063    -6.7686       .1615
                 Sufficient      -6.2750*       1.48782      .002    -10.0918      -2.4582
  Average        Little          3.3036         1.35071      .063    -.1615        6.7686
                 Sufficient      -2.9714        1.52815      .157    -6.8917       .9488
  Sufficient     Little          6.2750*        1.48782      .002    2.4582        10.0918
                 Average         2.9714         1.52815      .157    -.9488        6.8917

Based on observed means.
*. The mean difference is significant at the .05 level.