
CMNS 260: Empirical Communication Research Methods 13-Review and Overview of the Course


Professor: Jan Marontate Teaching Assistants: Nawal Musleh-Motut, Megan Robertson

Lab Instructor: Chris Jeschelnik

School of Communication, Simon Fraser University, Fall 2011

Outline of Class Activities Today
• Syllabus & outline of class sessions – objectives
• Selected excerpts of lecture material to review for the final examination
• Study tips for the final examination
• Discussion of the last assignment

Course Content

• Introduce different forms of research
• Analyze relationships between goals, assumptions, theories and methods
• Study basic data collection and analysis techniques
• Research process, focusing on empirical methods

Why study methods? Practical aspects
– learn to read other people’s research & critically evaluate it
– learn ways to find your own “data” to answer your own research questions
– acquire skills potential employers seek
– self-defense (against misinformation) & responsible citizenship

Babbie (1995: 101)

The Research Process

Why study methods?
• “Knowledge is power” (to acquire skills for social action or change)
– “Savoir pour pouvoir, pouvoir pour prévoir” (Auguste Comte): to know in order to be able (to have power), to be able in order to predict the future and plan for it
• “Knowledge is understanding”
– “Décrire, comprendre, expliquer” (Gilles Gaston Granger): to describe, to understand and to explain

Research has the potential to inform and misinform
• even well-done research is not always used accurately
• some research is technically flawed
• knowledge of methods: an important tool for understanding the logic and limits of claims about research

Research Methodology (Scholarly Perspectives)
• Process
– methods
– logic of inquiry (assumptions & hypotheses)
• Produces
– laws, principles and theories that can be tested (Karl Popper & the notion of falsifiability, for politically engaged scholars interested in the fight against genocide in the early 20th century)


Other Ways of Knowing

– authority (parents, teachers, religious leaders, media gurus)
– tradition (past practices)
– common sense
– media (TV, etc.)
– personal experience

(Photos: talk show host Oprah Winfrey; Cory Doctorow, Electronic Frontier Foundation & Boingboing.net)

Ordinary Inquiry vs. Scholarly Inquiry

Risks of “Errors” associated with non-scholarly knowledge
• selective observation: only noticing some phenomena and missing others
• overgeneralization: evidence applied to too wide a range of conditions
• premature closure: jumping to conclusions
• halo effect: being influenced by prestige

Communication as a Science?

• The field is more recent, with affiliations with the sciences, social sciences & the humanities
• Scholarly work (like old ideas of science) is distinguished from mythology by methods AND goals
• Many different approaches

Relations between theory and empirical observation

• Theory and empirical research
– Testing theories through empirical observation (deductive)
– Using empirical observation to develop theories (inductive)

The Scientific Process (figure): Theories → Predictions (Hypotheses) → Observations → Empirical Generalizations → Theories

Empirical and logical foundations of research (the process does not have to start with theory)

Source: Singleton & Straits (1999: 27); Babbie (1995: 55)

Scholarly Communities: Norms
• universalism: research judged on “scientific” merit
• organized scepticism: challenge and question research
• disinterestedness: openness to new ideas, non-partisan
• communalism: sharing with others
• honesty

Research Questions

• Questions researchers ask themselves, not the questions they ask their informants
• Must be empirically testable
• Not:
– too vague
– too general
– untestable (with implicit, untested assumed outcomes)

Developing research topics

“Dimensions” of Research (Neuman 2000: 37)
• Purpose of Study: Exploratory, Descriptive, Explanatory
• Intended Use of Study: Basic; Applied (Action, Impact, Evaluation)
• Treatment of Time in Study: Cross-sectional; Longitudinal (Panel, Time series, Cohort analysis); Case study; Trend study
• Space: dependent, independent
• Unit of Analysis (examples): individual, family, household, artifact (media, technology)

Exploratory Research
• When not much is known about a topic
• Surprises (e.g. serendipity effect)
• Acquire familiarity with basic concerns and develop a picture
• Explore the feasibility of additional research
• Develop questions

Descriptive Research
• Focuses on “who”, “what” and “how”
• Background information, to stimulate new ways of thinking, to classify types, etc.

Explanatory Research
• To test theories, predictions, etc.
• Idea of “advancing” knowledge

Intended Use of Study
• Basic
• Applied
– action research (We can make a difference)
– social impact assessment (What will be the effects?)
– evaluation research (Did it work?)
– needs assessment (Who needs what?)
– cost-benefit analysis (What is it worth?)

Basic or Fundamental Research
• Concerns of the scholarly community
• Inner logic and relation to theoretical issues in the field

Applied Research
• commissioned, judged and used by people outside the field of communication
• goal of practical applications: usefulness of results

Types of Applied Research
• Action research
• Social impact assessment
• Needs assessment
• Evaluation research
– formative (built in)
– summative (final outcomes)
• Cost-benefit analysis

Treatment of Time
• Cross-sectional (one point in time)
• Longitudinal (more than one point in time)

Main Types of Longitudinal Studies
• Panel study
– exactly the same people, at least twice
• Cohort analysis
– the same category of people or things (but not exactly the same individuals) who/which shared an experience, at at least two points in time
– examples: birth cohorts, graduating classes, video games invented in the same year
– example table (figure): age groups 41–50, 51–60, 61–70, 71–80 compared in 2000 and 2010
• Time series
– the same type of information, not exactly the same people, multiple time periods (e.g. the same place: Burnaby residents in 2006 and in 2011)
• Case studies may be longitudinal or cross-sectional

Lexis Diagram (To study Cohort Survival)

Importance of Choosing Appropriate Unit of Analysis

• example: Ecological Fallacy (cheating)

Ecological Fallacy & Reductionism
• ecological fallacy: wrong unit of analysis (too high)
• reductionism: wrong unit of analysis (too low)

Relationship of Theory & Empirical Observation (Wheel of Science)

Deductive & Inductive Methods (p. 71)

Conceptualization & Operationalization of Research Questions
• Conceptualization: development of abstract concepts
• Operationalization: finding concrete ways to do research

Reliability & Validity

• Reliability (dependability): is the indicator consistent? Does it give the same result every time?
• Validity (measurement validity): how well the conceptual and operational definitions mesh with each other; does the measurement tool measure what we think it measures?

Hypothesis Testing

Possible outcomes in testing hypotheses (using empirical research):
• support (confirm) the hypothesis
• reject (not support) the hypothesis
• partially confirm or fail to support
• avoid use of PROVE

Causal diagrams

• X → Y: direct relationship (positive correlation)
• X → Y: indirect relationship (negative correlation)

Causal Diagrams (figure): examples with positive (+) and negative (−) relationships — a single cause (X → Y), two independent variables (X1, X2 → Y), an intervening variable (X → Z → Y), and combinations of these.

Neuman (2000: 56)

Types of Errors in Causal Explanation

• ecological fallacy
• reductionism
• tautology
• teleology
• spuriousness

Double-Barrelled Hypothesis & Interaction Effect

• A double-barrelled hypothesis (two independent variables in one hypothesis) means one of THREE things:
1. the first independent variable alone has the effect
2. the second independent variable alone has the effect
3. there is an interaction effect (the variables have the effect only in combination)

Recall: Importance of Choosing Appropriate Unit of Analysis

• Recall example: Ecological Fallacy (cheating)

Ecological Fallacy (the cheating example, figure)

Ecological Fallacy & Reductionism
• ecological fallacy: wrong unit of analysis (too high)
• reductionism: wrong unit of analysis (too low)

Teleology & Tautology

tautology--circular reasoning (true by definition)
teleology--too vague for testing

Neuman (2000: 140)

Spurious Relationship

spuriousness--false relationship (unseen third variable or simply not connected)

Neuman (2000: 140)

Example: Storks & Babies
• Observations:
– Lots of storks seen around apartment buildings in a new neighbourhood with low-cost housing
– An increase in the number of pregnancies
– Did the storks bring the babies?

But...

• The relationship is spurious.
– The storks liked the heat coming from the smokestacks on the roof of the building, and so were more likely to be attracted to that building.
– The tenants of the building were mostly young newlyweds starting families.
– So the storks didn’t bring the babies after all.

Causal Diagram for Storks

• Stork = S, Baby = B, Newlywed = N, Chimneys on Building = C
• S → B (+): the observed, spurious relationship
• N → B (+): newlyweds lead to babies
• C → S (+): chimneys attract storks

Another example of spurious relationships: number of firefighters & damage

• The larger the number of firefighters, the greater the damage

But...

• A larger number of firefighters is necessary to fight a larger fire. A larger fire will cause more damage than a small one.
• Debate about the hockey riots in Vancouver:
– Did the size of the crowd & the amount of drinking cause the riots?
– Did bad planning and inadequate policing cause the riots?

Causal Diagram

• Firefighter = F, Damage = D, Size of Fire = S
• F → D (+): the observed, spurious relationship
• S → F (+) and S → D (+): the size of the fire explains both

Ethics & Legality: Typology of Legal and Moral Actions in Research
• Ethical and legal: both moral and legal
• Ethical but illegal: illegal only
• Unethical but legal: immoral only
• Unethical and illegal: both immoral and illegal

Source: figure adapted from Neuman (2000: 91)

Privacy, Anonymity, Confidentiality
• privacy: a legal right (note: public vs. private domain), even if the subject is dead
• anonymity: subjects remain nameless & responses cannot be connected to them (a problem in small samples)
• confidentiality: subjects’ identity may be known but is not disclosed by the researcher; identity can’t be linked to responses

4-Measurement—Scales & Indices (Part 2 of 2 slideshows)

Neuman & Robson, Chapter 6
• systematic observation
• can be replicated

Creating Measures

Measures must have response categories that are:
• mutually exclusive: possible observations must fit in only one category
• exhaustive: categories must cover all possibilities
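A minimal sketch (in Python, with invented age categories) of what mutually exclusive and exhaustive response categories mean in practice: every observation should fall into exactly one category.

```python
# Hypothetical sketch: checking that response categories for "age group"
# are mutually exclusive (no overlap) and exhaustive (cover all cases).
# Category boundaries are invented; they cover an adult sample (18-119).

age_categories = {
    "18-29": range(18, 30),
    "30-44": range(30, 45),
    "45-64": range(45, 65),
    "65+":   range(65, 120),
}

def classify(age):
    """Return every category an observation fits into."""
    return [label for label, ages in age_categories.items() if age in ages]

for age in [18, 29, 30, 64, 65, 99]:
    matches = classify(age)
    # Mutually exclusive: exactly one match; exhaustive: at least one match.
    assert len(matches) == 1, f"age {age} fits {matches}"
    print(age, "->", matches[0])
```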

Composite Measures
• Composite measures are instruments that use several questions to measure a given variable (construct).
• A composite measure should be unidimensional (all items measure the same construct).
– Indices (plural of index) and scales

Logic of Index Construction
• actions combined in a single measure, often at an ordinal level of measurement

Logic of Scales
• actions ranked

Logic of an index: example (figure); logic of a scale: example (figure)
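A minimal sketch of index logic, assuming three invented questionnaire items coded 1–5: the items are simply summed into one composite score.

```python
# Hypothetical sketch of index construction: several items combined into a
# single (ordinal) score. Item names and the 1-5 coding are invented.

respondent = {
    "q1_trust_tv_news": 4,       # each item coded 1 (low) to 5 (high)
    "q2_trust_newspapers": 3,
    "q3_trust_online_news": 5,
}

# Unweighted additive index: simply sum the item scores.
media_trust_index = sum(respondent.values())   # ranges from 3 to 15 here
print("Media trust index:", media_trust_index)
```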

Treatment of Missing Data
• eliminate cases with missing data?
• substitute the average score?
• guess?
• insert a random value?

Rates & Standardization
• deciding what measure to use for reference populations (example: employment rates)
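A small illustration of why the reference population matters when computing a rate; all counts below are invented.

```python
# Hypothetical sketch: the "employment rate" depends on the denominator
# (reference population) chosen. Counts are invented for illustration.

employed = 52_000
working_age_population = 80_000    # e.g. residents aged 15-64
labour_force = 60_000              # employed + actively looking for work

rate_vs_working_age = employed / working_age_population * 100   # 65.0%
rate_vs_labour_force = employed / labour_force * 100             # ~86.7%

print(f"Employment rate (working-age population): {rate_vs_working_age:.1f}%")
print(f"Employment rate (labour force):           {rate_vs_labour_force:.1f}%")
```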

Sampling: key ideas & terms

Bad sampling frame = parameters do not accurately represent the target population
– e.g., a list of people in the phone directory does not reflect all the people in a town, because not everyone has a phone or is listed in the directory.

Types of NonprobabilitySamples


Types of Probability Samples
• Stratified
Link to a useful webpage: http://www.socialresearchmethods.net/kb/sampprob.php
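A minimal sketch (with an invented sampling frame of students and staff) contrasting a simple random sample with a proportionate stratified sample.

```python
import random

# Hypothetical sketch of simple random vs. proportionate stratified sampling.
# The sampling frame and strata are invented for illustration.

frame = [{"id": i, "stratum": "student" if i < 700 else "staff"}
         for i in range(1000)]                      # 70% students, 30% staff

random.seed(1)

# Simple random sample: every element has an equal chance of selection.
srs = random.sample(frame, 100)

# Proportionate stratified sample: sample within each stratum so the
# sample keeps the population's 70/30 split.
stratified = []
for stratum, share in [("student", 0.7), ("staff", 0.3)]:
    members = [e for e in frame if e["stratum"] == stratum]
    stratified += random.sample(members, int(100 * share))

print(len(srs), len(stratified))
```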

Evaluating Sampling

• Is the sample representative of the population under study?
• Assess whether every case had an equal chance of being chosen
• Examine the sampling distribution of parameters of the population
• Use the Central Limit Theorem to calculate confidence intervals and estimate the margin of error
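A small worked example of a margin of error and 95% confidence interval for a sample proportion, using the normal approximation that the Central Limit Theorem justifies; the survey numbers are invented.

```python
import math

# Hypothetical sketch: 95% confidence interval for a sample proportion.
# The survey numbers are invented for illustration.

n = 400          # sample size
p_hat = 0.55     # sample proportion (e.g. 55% agree with some statement)
z = 1.96         # z-value for a 95% confidence level

margin_of_error = z * math.sqrt(p_hat * (1 - p_hat) / n)   # about 0.049
print(f"Margin of error: +/- {margin_of_error * 100:.1f} points")
print(f"95% CI: {p_hat - margin_of_error:.3f} to {p_hat + margin_of_error:.3f}")
```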

Asking Questions That Can Be Answered

Types of Surveys & Survey Instruments
• Self-administered surveys
– mail
– web
• Surveys based on interactive interviews
– telephone
– online (interactive)
– face-to-face (individuals or focus groups)
• Survey instruments:
– questionnaires (self-administered; the respondent reads questions & records answers)
– interview schedules (the interviewer reads questions & records responses)

Main Types of Unobtrusive Measures
• Physical traces
– erosion (e.g. wear on the floor in museum displays as a measure of the popularity of a display)
– accretion (e.g. garbage)
• Simple observation
• Media analysis such as content analysis or critical discourse analysis (e.g. advertisements, news reports, films, music lyrics, etc.)
• Analysis of archives, existing statistics & running records (e.g. shoppers’ records, library borrowers’ histories)

Types of Equivalence for comparative research using existing statistics
• lexicon equivalence (technique of back translation)
• contextual equivalence (e.g. role of religious leaders in different societies)
• conceptual equivalence (e.g. income)
• measurement equivalence (e.g. different measures for the same concept)

Discrete & Continuous Variables
• Continuous
– the variable can take an infinite (or large) number of values within a range
– e.g. age measured by exact date of birth
• Discrete
– attributes of the variable are distinct but not necessarily continuous
– e.g. age measured by age groups (note: techniques exist for making assumptions about discrete variables in order to use techniques developed for continuous variables)

Cleaning Data

• checking accuracy & removing errors
– Code cleaning: check for impossible codes (errors)
• some software checks at data entry
• examine distributions to look for impossible codes
– Contingency cleaning: inconsistencies between answers (impossible logical combinations, illogical responses to skip or contingency questions)
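A minimal sketch of code cleaning and contingency cleaning on a few invented survey records.

```python
# Hypothetical sketch: code cleaning and contingency cleaning on raw survey
# records. Variable names, codes and values are invented for illustration.

records = [
    {"id": 1, "sex": 1, "employed": 2, "hours_worked": 0},
    {"id": 2, "sex": 7, "employed": 1, "hours_worked": 40},   # impossible code
    {"id": 3, "sex": 2, "employed": 2, "hours_worked": 35},   # contradiction
]

VALID_SEX = {1, 2}   # code cleaning: any code outside this set is an error

for r in records:
    if r["sex"] not in VALID_SEX:
        print(f"Record {r['id']}: impossible code for sex ({r['sex']})")

    # Contingency cleaning: employed == 2 means "not employed", so the
    # skip question hours_worked should be 0 for that respondent.
    if r["employed"] == 2 and r["hours_worked"] > 0:
        print(f"Record {r['id']}: not employed but reports hours worked")
```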

Treatment of Missing Data (%)
• Comparison with the medium & low categories collapsed

Table 5-1 Alienation of Workers (non-respondents included)
Level of Alienation     F      %
High                    30     14
Medium & Low            120    58
No Response             60     29
(Total)                 210    100

Table 5-1 Alienation of Workers (non-respondents eliminated)
Level of Alienation     F      %
High                    30     20
Medium & Low            120    80
(Total)                 150    100
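A quick check of the “High” row shows how the percentage base changes: with non-respondents included the base is 210, so 30 / 210 ≈ 14%; with the 60 non-respondents eliminated the base is 150, so 30 / 150 = 20% and 120 / 150 = 80%.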

Grouping Response Categories (%)
• Comparison with the high & medium response categories collapsed

Table 5-1 Alienation of Workers (non-respondents eliminated)
Level of Alienation     Freq    %
High & Medium                   87
Low                             13
(Total)                 150

Table 5-1 Alienation of Workers (non-respondents included)
Level of Alienation     Freq    %
High & Medium                   62
Low                             10
No Response                     29
(Total)                 210     100

Core Notions in Basic Univariate Statistics

Ways of describing data about one variable (“uni” = one):
• Measures of central tendency
– summarize information about one variable
– three types of “averages”: arithmetic mean, median, mode
• Measures of dispersion
– analyze variation or “spread”
– range, standard deviation, percentiles, z-scores
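A minimal sketch of these univariate measures using Python's standard statistics module; the data values are invented.

```python
import statistics

# Hypothetical sketch of the univariate measures named above.
# The data values are invented for illustration.

ages = [19, 21, 21, 22, 24, 25, 30, 34, 42]

mean = statistics.mean(ages)            # arithmetic mean
median = statistics.median(ages)        # middle value
mode = statistics.mode(ages)            # most frequent value

spread = max(ages) - min(ages)          # range: simplest measure of dispersion
sd = statistics.stdev(ages)             # sample standard deviation
z_scores = [(x - mean) / sd for x in ages]   # distance from the mean in SD units

print(mean, median, mode, spread, round(sd, 2))
print([round(z, 2) for z in z_scores])
```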

Normal & Skewed Distributions

Details on the Calculation of Standard Deviation

Neuman (2000: 321)

The Bell Curve & standard deviation

If Time: Begin Bivariate Statistics (Results with Two Variables)
• Types of relationships between two variables:
– Correlation (or covariation)
• when two variables ‘vary together’
• a type of association, not necessarily causal
• can be in the same direction (positive correlation or direct relationship)
• can be in different directions (negative correlation or indirect relationship)
– Independence
• no correlation, no relationship
• cases with values on one variable do not have any particular value on the other variable
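A minimal sketch of measuring the direction of a bivariate relationship with Pearson's r; this assumes SciPy is available and the paired values are invented.

```python
from scipy.stats import pearsonr   # assumes SciPy is installed

# Hypothetical sketch: direction of a bivariate relationship via Pearson's r.
# The paired values are invented for illustration.

hours_online = [1, 2, 3, 5, 6, 8, 9]
hours_tv     = [6, 5, 5, 4, 3, 2, 1]

r, p_value = pearsonr(hours_online, hours_tv)
print(f"r = {r:.2f}")   # negative r: as one variable rises, the other falls
```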

Recall (Lecture 2): Types of Variables
• independent variable (cause)
• dependent variable (effect)
• intervening variable (occurs between the independent and the dependent variable temporally)
• control variable (temporal occurrence varies; illustrations later today)

Causal Relationships

• proposed for testing (NOT like assumptions)
• 5 characteristics of a causal hypothesis (p. 128):
– at least 2 variables
– cause-effect relationship (cause must come before effect)
– can be expressed as a prediction
– logically linked to a research question and a theory
– falsifiable

Types of Correlations & Causal Relationships between Two Variables

X = independent variable, Y = dependent variable
• Positive correlation (direct relationship): when X increases, Y increases (or vice versa). Diagram: X → Y (+)
• Negative correlation (indirect or inverse relationship): when X increases, Y decreases (or vice versa). Diagram: X → Y (−)
• Independence: no relationship (null hypothesis)
• Co-variation: vary together (a type of association, but not necessarily causal)

Five Common Measures of Association between Two Variables

General Idea of Statistical Significance

• In everyday English ‘significance’ means important or meaningful, but this is NOT how the term is used in statistics
• Tests of statistical significance show how likely it is that a result is due to chance
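A minimal sketch of a significance test, here a chi-square test of independence on an invented cross-tabulation; this assumes SciPy is available.

```python
from scipy.stats import chi2_contingency   # assumes SciPy is installed

# Hypothetical sketch of a test of statistical significance: a chi-square
# test of independence on a small cross-tabulation (counts are invented).

#                reads print news   does not
observed = [[40, 60],    # age 18-29
            [70, 30]]    # age 30+

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
# A small p (e.g. < 0.05) says the pattern is unlikely to be due to chance
# alone; it does not say the relationship is large or important.
```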

Multivariate Statistics: Elaboration Paradigm (Types of Patterns)
• Replication: the same relationship appears in both partial tables as in the bivariate table
• Specification: the bivariate relationship is only seen in one of the partial tables
• Interpretation: the bivariate relationship weakens greatly or disappears in the partial tables (the control variable is intervening: it happens between the independent & dependent variables)
• Explanation: the bivariate relationship weakens or disappears in the partial tables (the control variable comes before the independent variable)
• Suppressor: no bivariate relationship; the relationship only appears in the partial tables

Elaboration Paradigm Summary

Study Tips for Final Exam

• Practice questions
• Other ideas for preparation

Recommended