
Significance Testing

10/15/2013

Readings

• Chapter 3 Proposing Explanations, Framing Hypotheses, and Making Comparisons (Pollock) (pp. 58-76)

• Chapter 5 Making Controlled Comparisons (Pollock)

• Chapter 4 Making Comparisons (Pollock Workbook)

Homework and Exams

• Homework 3 due today

• Exam 2 on Thursday

OPPORTUNITIES TO DISCUSS COURSE CONTENT

Office Hours For the Week

• When
– Wednesday 10-12
– Thursday 8-12
– And by appointment

Course Learning Objectives

1. Students will learn the basics of research design and be able to critically analyze the advantages and disadvantages of different types of design.

2. Students will achieve competency in conducting statistical data analysis using the SPSS software program.

Bivariate Data Analysis

CROSS-TABULATIONS and Compare Means

Running a Test

• Select and Open a Dataset in SPSS

• Run either (a syntax sketch follows this list):
– A cross tab with column %’s (two categorical variables)
– A compare means test (involves a categorical and a continuous variable)
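Both tests can also be typed into an SPSS Syntax window instead of clicked through the menus. A minimal sketch of the first step, opening a dataset; the file path below is a placeholder, so point it at your own copy of the .sav file:

    * Load a saved SPSS dataset (edit the path to match your machine).
    GET FILE='C:\data\States.sav'.

The CROSSTABS and MEANS commands that run the two tests themselves are sketched under their slides below.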

What are Cross Tabs?

• A simple and effective way to measure relationships between two variables.

• Also called contingency tables, because they help us look at whether the value of one variable is "contingent" upon that of another

When To Use Compare Means?

• A way to compare ratio variables by controlling for an ordinal or nominal variable
– One ordinal vs. a ratio or interval
– One nominal vs. a ratio or interval

• This shows the average of the continuous variable within each category

Running Cross Tabs

• Select, Analyze

– Descriptive Statistics

– Cross Tabulations

Running Cross-Tabs

• Dependent variable is usually the row

• Independent variable is usually the column.

We have to use the measures available

Let's Add Some Percents

• Click on Cells and, in the Cell Display dialog, check Column under Percentages (syntax sketch below)
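The point-and-click steps above have a syntax equivalent. A hedged sketch, assuming a dependent variable named dv and an independent variable named iv (both placeholder names, not variables from the course dataset):

    * The variable before BY defines the rows (dependent), the one after BY the columns (independent).
    * CELLS=COUNT COLUMN prints raw counts plus column percentages.
    CROSSTABS
      /TABLES=dv BY iv
      /CELLS=COUNT COLUMN.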

In SPSS

• Open the States.SAV

• Analyze – Compare Means – Means

Where the Stuff Goes

• Your categorical variable goes in the Independent List

• Your continuous variable goes in the Dependent List (see the syntax sketch below)
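In syntax form, the Dependent List is whatever comes before BY and the Independent List is whatever comes after it. A sketch with placeholder names (a continuous variable income and a categorical variable region, both hypothetical):

    * Continuous (dependent) variable before BY, categorical (independent) variable after BY.
    * CELLS=MEAN COUNT STDDEV reports the mean, N, and standard deviation for each category.
    MEANS TABLES=income BY region
      /CELLS=MEAN COUNT STDDEV.

The resulting table shows the average of income within each category of region, which is exactly the comparison described above.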

Hypothesis Testing

Why Hypothesis Testing?

• To determine whether a relationship between two variables exists and did not arise by chance (Statistical Significance)

• To measure the strength of the relationship between an independent and a dependent variable (association)

What is Statistical Significance?

• The ability to say that an observed relationship is not happening by chance. It is not causality.

• It doesn't mean the finding is important or that it has any real world application (beware of large samples)

• Practical significance is often more important

Determining Statistical Significance

• Establishing parameters or “confidence intervals”

• Are we confident that our relationship is not happening by chance?

• We want to be rigorous (we usually use the 95% confidence interval; anyone remember why?)

How do we establish confidence?

• Establishing a threshold for the “p” value, also called the alpha value

• This is the amount of error we are willing to accept and still say a relationship exists

P-values or Alpha levels

• p<.05 (95% confidence level) - There is less than a 5% chance that we will be wrong.

• p<.01 (99% confidence level) - 1% chance of being wrong

• p<.001 (99.9% confidence level) - 1 in 1,000 chance of being wrong

Problems of the Alpha level (p-value)

• Setting it too high (e.g. .10)

• Setting it too low (.001)

• We have to remember our concepts and our units of analysis

You should always use the 95% confidence interval (p<.05) unless there is a good reason not to.
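In SPSS, the p-value for a cross tab comes from adding a chi-square test to the CROSSTABS command and comparing the significance value in the output to the alpha level you chose. A sketch, reusing the placeholder variable names dv and iv from above:

    * STATISTICS=CHISQ adds a chi-square test of independence to the crosstab.
    * In the output, compare the chi-square significance (Sig.) value to .05.
    * Below .05, treat the relationship as statistically significant; at or above .05, do not.
    CROSSTABS
      /TABLES=dv BY iv
      /CELLS=COUNT COLUMN
      /STATISTICS=CHISQ.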

STATING HYPOTHESES

Testing a hypothesis

• Before we can test it, we have to state it
– The Null Hypothesis: there is no relationship between my independent and dependent variable
– The Alternate Hypothesis

• We are testing for significance: we are trying to disprove the null hypothesis!

About the Null

The Alternate Hypothesis

• Also called the research hypothesis

• State it clearly

• State an expected direction (see the example below)
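As a concrete, invented illustration (these variables are not from the course data): for a compare means question such as "is voter turnout lower in southern states?", the null and a directional alternate hypothesis could be written as

    H_0:\ \mu_{\text{South}} = \mu_{\text{non-South}}
    H_a:\ \mu_{\text{South}} < \mu_{\text{non-South}}

The "<" is what gives the alternate hypothesis its expected direction; the null is the "no difference / no relationship" statement.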

After testing, the Null is either

• True: no relationship between the groups, in which case the alternate hypothesis is false. Nothing is going on (except by chance)!

• False: there is a relationship and the alternative hypothesis is correct. Something is going on (statistically)!

It seems pretty obvious whether or not you have a statistically significant relationship, but we can often goof things up.

DECISION TYPES AND ERRORS

Keep or Reject the Null?

Errors and Decisions

A Type I Error

• Type I Error: the incorrect or mistaken rejection of a true null hypothesis (a false alarm). The alpha level we set is the Type I error rate we are willing to accept.

A Type II error

• Type II Error: failing to reject (accepting) a null hypothesis when it should have been rejected (denial). A real relationship exists, but we miss it.

Type I and II (Climate Change)

You do not want to make either error
