Experimental Design, Statistical Analysis. CSCI 4800/6800, University of Georgia. March 7, 2002. Eileen Kraemer


Page 1: Experimental Design, Statistical Analysis

Experimental Design, Statistical Analysis

CSCI 4800/6800
University of Georgia
March 7, 2002
Eileen Kraemer

Page 2: Experimental Design, Statistical Analysis

Research Design

Elements:
Observations/Measures
Treatments/Programs
Groups
Assignment to Group
Time

Page 3: Experimental Design, Statistical Analysis

Observations/Measure

Notation: ‘O’
Examples:
Body weight
Time to complete
Number of correct responses

Multiple measures: O1, O2, …

Page 4: Experimental Design, Statistical Analysis

Treatments or Programs

Notation: ‘X’
Use of medication
Use of visualization
Use of audio feedback
Etc.

Sometimes see X+, X-

Page 5: Experimental Design, Statistical Analysis

Groups

Each group is assigned a line in the design notation

Page 6: Experimental Design, Statistical Analysis

Assignment to Group

R = random
N = non-equivalent groups
C = assignment by cutoffs

Page 7: Experimental Design, Statistical Analysis

Time

Moves from left to right in diagram

Page 8: Experimental Design, Statistical Analysis

Types of experiments

True experiment – random assignment to groups
Quasi-experiment – no random assignment, but has a control group or multiple measures
Non-experiment – no random assignment, no control group, no multiple measures

Page 9: Experimental Design, Statistical Analysis

Design Notation Example

R   O1   X   O1,2
R   O1       O1,2

Pretest-posttest, treatment versus comparison group, randomized experiment

Page 10: Experimental Design, Statistical Analysis

Design Notation Example

N   O   X   O
N   O       O

Pretest-posttest, non-equivalent groups, quasi-experiment

Page 11: Experimental Design, Statistical Analysis

Design Notation Example

X   O

Posttest Only

Non-experiment

Page 12: Experimental Design, Statistical Analysis

Goals of Design

Goal: to be able to show causality
First step: internal validity: if X, then Y, AND if not X, then not Y

Page 13: Experimental Design, Statistical Analysis

Two-group Designs

Two-group, posttest only, randomized experiment

R X O

R O

Compare by testing for differences between the means of the groups, using a t-test or one-way Analysis of Variance (ANOVA)

Note:
2 groups
post-only measure
two distributions, each with a mean and variance
testing for a statistical (non-chance) difference between groups
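
A minimal sketch of this comparison in code (the scores are made up for illustration and are not from the slides; SciPy's standard two-sample t-test is used):

```python
# A minimal sketch of the two-group, posttest-only comparison; the scores
# are made up for illustration and are not from the slides.
import numpy as np
from scipy import stats

treatment = np.array([23, 25, 28, 30, 27, 26, 29, 24])  # group: R X O
control   = np.array([21, 22, 25, 24, 23, 20, 26, 22])  # group: R   O

# Independent-samples t-test for a non-chance difference between the means.
t, p = stats.ttest_ind(treatment, control)
print(f"t = {t:.3f}, p = {p:.4f}")  # small p suggests a real difference
```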

Page 14: Experimental Design, Statistical Analysis

To analyze …

What do we mean by a difference?

Page 15: Experimental Design, Statistical Analysis

Possible Outcomes:

Page 16: Experimental Design, Statistical Analysis

Measuring Differences …

Page 17: Experimental Design, Statistical Analysis

Three ways to estimate effect

Independent t-test
One-way Analysis of Variance (ANOVA)
Regression Analysis (most general)

All three are equivalent for the two-group design.
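
The equivalence can be checked directly; a sketch with the same made-up data as above, using SciPy (for two groups, regression on a 0/1 group dummy gives the same p value, and F = t²):

```python
# Sketch checking that t-test, ANOVA, and regression agree on two groups.
import numpy as np
from scipy import stats

treatment = np.array([23, 25, 28, 30, 27, 26, 29, 24])
control   = np.array([21, 22, 25, 24, 23, 20, 26, 22])

t, p_t = stats.ttest_ind(treatment, control)   # independent t-test
F, p_f = stats.f_oneway(treatment, control)    # one-way ANOVA

# Regression: y = b0 + b1*z + e, with z a 0/1 dummy marking the treatment group.
z = np.concatenate([np.ones(len(treatment)), np.zeros(len(control))])
y = np.concatenate([treatment, control])
reg = stats.linregress(z, y)

print(p_t, p_f, reg.pvalue)  # all three p values agree
print(F, t**2)               # and F equals t squared
```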

Page 18: Experimental Design, Statistical Analysis

Computing the t-value
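
The body of this slide is a figure that did not survive extraction. A standard form of the independent-samples t statistic, which is most likely what it showed (a hedged reconstruction, not the slide's own equation):

$$ t = \frac{\bar{X}_T - \bar{X}_C}{\sqrt{\dfrac{s_T^2}{n_T} + \dfrac{s_C^2}{n_C}}} $$

where $\bar{X}_T, \bar{X}_C$ are the group means, $s_T^2, s_C^2$ the group variances, and $n_T, n_C$ the group sizes.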

Page 19: Experimental Design, Statistical Analysis

Computing the variance
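
This slide's formula was also lost in extraction; the usual unbiased sample variance it presumably computes is:

$$ s^2 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1} $$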

Page 20: Experimental Design, Statistical Analysis

Regression Analysis

Solve the overdetermined system of equations for β0 and β1, while minimizing the sum of the squared e-terms (least squares)
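
A hedged reconstruction of the model behind this slide (its own equations were lost): with a 0/1 group dummy variable $Z_i$,

$$ y_i = \beta_0 + \beta_1 Z_i + e_i, \qquad \min_{\beta_0,\,\beta_1} \sum_i e_i^2 $$

Under this coding, $\hat{\beta}_0$ is the control-group mean and $\hat{\beta}_1$ is the difference between the group means, so testing $\beta_1 = 0$ reproduces the t-test comparison.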

Page 21: Experimental Design, Statistical Analysis

Regression Analysis

Page 22: Experimental Design, Statistical Analysis

ANOVA

Compares differences within groups to differences between groups
For 2 populations and 1 treatment, same as the t-test
The statistic used is the F value, which equals the square of the t-value from the t-test

Page 23: Experimental Design, Statistical Analysis

Other Experimental Designs

Signal enhancers:
Factorial designs

Noise reducers:
Covariance designs
Blocking designs

Page 24: Experimental Design, Statistical Analysis

Factorial Designs

Page 25: Experimental Design, Statistical Analysis

Factorial Design

Factor – a major independent variable
Setting, time_on_task

Level – a subdivision of a factor
Setting = in_class, pull-out
Time_on_task = 1 hour, 4 hours

Page 26: Experimental Design, Statistical Analysis

Factorial Design

Design notation as shown
2×2 factorial design (2 levels of one factor × 2 levels of a second factor)

Page 27: Experimental Design, Statistical Analysis

Outcomes of Factorial Design Experiments

Null case
Main effect
Interaction effect

Page 28: Experimental Design, Statistical Analysis

The Null Case

Page 29: Experimental Design, Statistical Analysis

The Null Case

Page 30: Experimental Design, Statistical Analysis

Main Effect - Time

Page 31: Experimental Design, Statistical Analysis

Main Effect - Setting

Page 32: Experimental Design, Statistical Analysis

Main Effect - Both

Page 33: Experimental Design, Statistical Analysis

Interaction effects

Page 34: Experimental Design, Statistical Analysis

Interaction Effects

Page 35: Experimental Design, Statistical Analysis

Statistical Methods for Factorial Design

Regression Analysis
ANOVA

Page 36: Experimental Design, Statistical Analysis

ANOVA

Analysis of variance tests hypotheses about differences between two or more means
Pairwise comparisons could be done with t-tests, but that carries a higher probability than ANOVA of rejecting a true null hypothesis (a Type I error)
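
To see the inflation, assume (for illustration) k independent pairwise tests, each at significance level $\alpha = 0.05$:

$$ P(\text{at least one Type I error}) = 1 - (1 - \alpha)^k $$

With three groups there are $k = 3$ pairwise t-tests, giving $1 - 0.95^3 \approx 0.14$, well above the nominal 0.05; a single ANOVA F test keeps the overall rate at $\alpha$.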

Page 37: Experimental Design, Statistical Analysis

Between-subjects design

Example: Effect of intensity of background noise on reading comprehension

Group 1: 30 minutes reading, no background noise
Group 2: 30 minutes reading, moderate level of noise
Group 3: 30 minutes reading, loud background noise

Page 38: Experimental Design, Statistical Analysis

Experimental Design

One factor (noise), three levels (a = 3)
Null hypothesis: μ1 = μ2 = μ3

Noise:   None   Moderate   High
R          O       O         O

Page 39: Experimental Design, Statistical Analysis

Notation

If all sample sizes are the same, use n; the total is N = a · n
Else N = n1 + n2 + n3

Page 40: Experimental Design, Statistical Analysis

Assumptions

Normal distributions
Homogeneity of variance: the variance is equal in each of the populations
Random, independent sampling
ANOVA still works well when the assumptions are not quite true (it is “robust” to violations)

Page 41: Experimental Design, Statistical Analysis

ANOVA

Compares two estimates of variance:
MSE – Mean Square Error, the variance within samples
MSB – Mean Square Between, the variance of the sample means

If the null hypothesis is true, then MSE ≈ MSB, since both are estimates of the same quantity
If it is false, then MSB will be sufficiently greater than MSE

Page 42: Experimental Design, Statistical Analysis

MSE
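
The body of this slide was a lost figure. For equal group sizes $n$, the MSE is the mean of the within-group variances, equivalently $SS_{\text{error}}/(N-a)$; this is the standard formula, not recovered from the slide itself:

$$ \mathrm{MSE} = \frac{s_1^2 + s_2^2 + \cdots + s_a^2}{a} = \frac{SS_{\text{error}}}{N - a} $$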

Page 43: Experimental Design, Statistical Analysis

MSB

Use the sample means to estimate the variance of the sampling distribution of the mean; in the example, this estimated variance = 1

Page 44: Experimental Design, Statistical Analysis

MSB

MSB = (variance of the sampling distribution of the mean) × n
In the example, MSB = (n)(variance of the sample means) = (4)(1) = 4
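
In symbols (the standard formula for equal group size $n$; the $n = 4$ and variance-of-means $= 1$ numbers are the slide's example):

$$ \mathrm{MSB} = n \, s_{\bar{X}}^2 = n \cdot \frac{\sum_{j=1}^{a} (\bar{X}_j - \bar{X})^2}{a - 1} = (4)(1) = 4 $$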

Page 45: Experimental Design, Statistical Analysis

Is it significant?

Depends on the ratio of MSB to MSE
F = MSB / MSE
The probability value is computed from the F value, whose sampling distribution has numerator degrees of freedom (a − 1) and denominator degrees of freedom (N − a)
Look up the F value in a table to find the p value
For one numerator degree of freedom, F = t²
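
A sketch of the table lookup done in code instead: MSB = 4 comes from the slides' example (a = 3 groups of n = 4), while MSE = 1 is an assumed value for illustration only.

```python
# p value for an observed F ratio, using SciPy's F distribution.
from scipy import stats

MSB, MSE = 4.0, 1.0            # MSB from the example; MSE assumed for illustration
a, n = 3, 4                    # 3 noise groups, 4 subjects each => N = 12
F = MSB / MSE
dfn, dfd = a - 1, a * n - a    # numerator df = 2, denominator df = 9

p = stats.f.sf(F, dfn, dfd)    # survival function: P(F >= observed F)
print(f"F = {F:.2f}, p = {p:.4f}")
```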

Page 46: Experimental Design, Statistical Analysis

Factorial Between-Subjects ANOVA, Two factors

Three significance tests:
Main factor 1
Main factor 2
Interaction

Page 47: Experimental Design, Statistical Analysis

Example Experiment

Two factors (dosage, task)
3 levels of dosage (0, 100, 200 mg)
2 levels of task (simple, complex)
2×3 factorial design, 8 subjects per group

Page 48: Experimental Design, Statistical Analysis

Summary table:

SOURCE    df    Sum of Squares    Mean Square          F        p
Task       1        47125.3333     47125.3333     384.174    0.000
Dosage     2           42.6667        21.3333       0.174    0.841
TD         2         1418.6667       709.3333       5.783    0.006
ERROR     42         5152.0000       122.6667
TOTAL     47        53738.6667

Sources of variation: Task, Dosage, Interaction (the TD row), Error

Page 49: Experimental Design, Statistical Analysis

Results

Sum of squares (as before)
Mean Squares = (sum of squares) / (degrees of freedom)
F ratios = (mean square of the effect) / (mean square error)
p value: given the F value and the degrees of freedom, look up the p value
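
Applying these formulas to the Page 48 summary table reproduces its entries; for the Task row:

$$ MS_{\text{Task}} = \frac{SS_{\text{Task}}}{df_{\text{Task}}} = \frac{47125.3333}{1} = 47125.3333, \qquad F_{\text{Task}} = \frac{47125.3333}{122.6667} \approx 384.17 $$

The degrees of freedom also check out: $N = 2 \times 3 \times 8 = 48$, so $df_{\text{error}} = 48 - 6 = 42$ and $df_{\text{total}} = 47$.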

Page 50: Experimental Design, Statistical Analysis

Results - example

Mean time to complete the task was higher for the complex task than for the simple task
The effect of dosage was not significant
An interaction exists between dosage and task: increasing the dosage decreases performance on the complex task while increasing performance on the simple task

Page 51: Experimental Design, Statistical Analysis

Results