
Page 1

Using the climate debate to revitalize general chemistry

Kimberly Cossey, Catrena Lisse, Julia Metzker,

Chavonda Mills, Rosalie Richards

Does planning your course make you feel like you are in a race to complete a list of content from the ever-expanding standards? Do

you dread giving yet another lecture about balancing equations? Imagine a general chemistry classroom where instead of listening

to a lecture, students are leading a discussion about the chemistry behind climate issues. Imagine students studying, but instead of making flashcards and lists, they are engaging with complex civic issues and devising potential approaches to solve them. Imagine yourself with a renewed enthusiasm for the craft of teaching.

The Innovative Course-building Group (IC-bG) http://icbg.wordpress.com

Page 2

Using the climate debate to revitalize general chemistry

Kimberly Cossey, Catrena Lisse, Julia Metzker, Chavonda Mills, Rosalie Richards

Day 1

Big Idea & Learning Goals

Page 3

Adapted from Understanding by Design by Wiggins & McTighe

Big Idea: What is the “big picture” or lofty idea?

Goals: What do you want your student to be able to do?

Activities: Debates, reports, experiments, posters, presentations, interviews, essays, exams

Assessment: Did students achieve the goals? Include formative & summative assessment.

Reflection: What worked? What can be improved?

Workshop Overview

Page 4


This document was modified from the IC-bG resource library at http://icbg.wordpress.com/resources

About the Innovative Course-building Group

The Innovative Course-Building Group (IC-bG) is a grass-roots social network for learning that supports teaching faculty and staff across disciplines. We use civic issues as a catalyst for designing engaging courses that will result in important student learning. We are a community of life-long learners. We encourage our participants to become mentors to future IC-bG participants and engage in long- and short-term planning for the group. Our mentors represent many disciplines and interests, which adds to the unique character of this group.

GUIDING PRINCIPLES
• Time is valuable: gatherings are deliberately designed to be productive, meaningful, and enjoyable uses of this limited resource.
• Good ideas recycled, refined, and adapted become great ideas.
• Teaching and learning rarely happen in isolation: collaboration supports innovation.

FACILITATORS
Kimberly Cossey <[email protected]>
Catrena Lisse <[email protected]>
Julia Metzker <[email protected]>
Chavonda Mills <[email protected]>
Rosalie Richards <[email protected]>


http://icbg.wordpress.com Twitter: @ic_bg #icbgsi


Page 5


Activity #1: Speed Dating Ice-Breaker

1. Civic Issue:
   Course Context:
   Connections between course context and civic issue:

2. Civic Issue:
   Course Context:
   Connections between course context and civic issue:

3. Civic Issue:
   Course Context:
   Connections between course context and civic issue:

Page 6

Adapted from: Understanding by Design by Wiggins & McTighe

Cutting Edge Course Design Tutorial by Barbara Tewksbury http://serc.carleton.edu/NAGTWorkshops/coursedesign/tutorial/

Big Idea: What is the “big picture” or lofty idea?

Goals: What do you want your student to be able to do?

Activities: Debates, reports, experiments, posters, presentations, interviews, essays, exams

Assessment: Did students achieve the goals? Include formative & summative assessment.

Backward Course Design

Page 7


Notes

Page 8


Activity #2: Goal Bucket Activity

Task
• Read each course goal and content objective on the provided pieces of colored paper
• In your group, decide which bucket is most appropriate for each course goal and content objective
• Place the appropriate course goal and content objective in the bucket
• Record your choice (GP, CC, OD, or NF) on the activity sheet

Color Key
• White paper = Course Goal
• Blue paper = Content Objective

Bucket Category Abbreviations
Ground-Level Pollution = GP, Climate Change = CC, Ozone Depletion = OD, No Fit = NF

Course Goal    Bucket

CG1. Students will demonstrate an understanding of the chemical properties of atoms, molecules, ions and gases.

CG2. Students will be able to demonstrate an understanding of the chemical principles of stoichiometry, reactions in solutions, thermochemistry, atomic structure, periodicity and bonding.

CG3. Students will construct strategies to solve problems with integrated concepts and evaluate solutions.

CG4. Students will be able to implement effective search strategies and evaluate sources of chemical information for relevance and authority.

CG5. Students will be able to explain multiple approaches that respond to problems in chemistry.

CG6. Students will be able to explain and analyze scientific evidence.

CG7. Students will be able to form logical conclusions from the chemical information presented.

CG8. Students will be able to explain multiple approaches that respond to problems in chemistry.

Content Objective    Bucket

CO1. Redox Reactions

CO2. Mole

CO3. Solution Concentration and Molarity

CO4. Chemical Bonding

Page 9


Content Objective    Bucket

CO5. Gas Laws

CO6. Combustion Chemistry

CO7. Stoichiometry

CO8. Naming

CO9. Reaction Types

CO10. Hess’s Law

CO11. Electromagnetic Spectrum

CO12. Petroleum Refining & Coal

CO13. Acid/Base Reactions

CO14. Spectroscopy

CO15. Lewis Structures

CO16. VSEPR & Molecular Geometry

CO17. Limiting Reactants

CO18. Energy Balance

CO19. Dimensional Analysis

CO20. Electron Configuration

CO21. Quantum Numbers & Orbitals

CO22. Atomic Structure

CO23. Ionization Energy

CO24. Periodic Trends

CO25. Thermochemistry

Page 10


Activity #3: “Big Idea” Think-Group-Share

Activity Tasks
• Choose a “Big Idea” to design a course/module around.
• Formalize your “Big Idea” by answering the guiding questions below.
• Share your “Big Idea” with your partner and discuss.

Guiding Questions

1. Is your “Big Idea” broad enough to cover your course content?

2. Is your “Big Idea” engaging to students? Is it relatable to your student population? e.g., Is it an idea of their generation?

Page 11


3. Is your “Big Idea” appropriately complex? e.g., Complex enough to be challenging but not so complex that students spend all their time on background knowledge.

4. What skills, dispositions, and content knowledge will a student have at the end of your course?


Page 12


SCHMI Criteria

Learning outcomes describe what we want students to be good at doing by the end of the course.

SOME TIPS WHEN WRITING OUTCOMES:
• Your focus should be on helping your students develop the ability to solve problems in a discipline and apply what they have learned to future tasks. Your focus should not be on exposure to a topic.
• Avoid: “students will appreciate” or “students will understand” or “students will be exposed to”
• What do you do as a professional in your discipline? Ask yourself this question to help you set outcomes.

CHECK YOUR OUTCOMES. DO THEY MEET THE SCHMI CRITERIA?
• Student-centered?
• Concrete?
• Higher-order?
• Measurable?
• Inclusive?

Student-centered
• Is the outcome student-focused, rather than teacher-focused?
• An outcome indicates what you expect your students to be able to do upon successful completion of the course, not what you are going to do as the course instructor.
• This outcome is not student-centered: “This course will introduce students to the fundamental concepts of calculus.”
• This outcome is student-centered: “Students will demonstrate their understanding of a fundamental concept of calculus by calculating derivatives.”

Concrete
• Is the outcome concrete, rather than vague and abstract?
• Will your students understand what you mean when they read the course outcomes?
• This outcome is not concrete: “Students will enrich their critical thinking skills.”
• This outcome is concrete: “Students will demonstrate their critical thinking skills by … (add what they will do in your course here).”

Goals? Objectives? Outcomes? Different schools and different disciplines use different jargon. When we talk about outcomes or goals, we are referring to the course outcomes or goals that you would list on your syllabus. Use whatever language is most appropriate for your situation.

Page 13


SCHMI Criteria cont’d

Higher-order
• Does the outcome focus on higher-order thinking skills (predict, analyze, develop or evaluate) rather than lower-order skills (list, identify, classify)?
• Students will acquire the lower-order skills as they move towards achieving the higher-order outcomes.
• This outcome does not require higher-order thinking skills: “Students will list the enzymes used in the process of photosynthesis.”
• This outcome requires higher-order (and lower-order) skills: “Students will compare and contrast the processes of respiration and photosynthesis.”

Measurable
• Can you design an activity that would allow you to determine whether students have met the outcome or not?
• Is the outcome assessable?
• Caution: The verbs to know, learn, and understand indicate internal mental states that are not automatically accessible to outsiders.
• Students must demonstrate their knowledge, learning, and understanding in some way to make assessment possible.
• This outcome is not measurable: “Students will learn to think critically.”
• This outcome is measurable: “Students will implement effective search strategies and evaluate sources of information for relevance and authority.”

Inclusive
• Does your goal represent and recognize the diversity of students?
• This outcome is not inclusive: “Students will choose resources from the Journal of the American Chemical Society for relevance and authority.”
• This outcome is inclusive: “Students will evaluate scientific and popular information for relevance and authority.”

Page 14


Activity #4: Evaluating Goals

Notes:

Students will understand the role of science and scientists in the global community.
• Student-centered? Y / N
• Concrete? Y / N
• Higher-order? Y / N
• Measurable? Y / N
• Inclusive? Y / N

An improvement: Students will demonstrate their understanding of the role of science and scientists in the global community by describing an example of how scientific research has influenced public policy.

Students will improve their understanding of statistics.
• Student-centered? Y / N
• Concrete? Y / N
• Higher-order? Y / N
• Measurable? Y / N
• Inclusive? Y / N

An improvement: Students will collect, organize and analyze data, choose a model to fit their data, and defend the choice of their model in the context of the data.

Appreciate the living world around you, and be able to view it from a more informed perspective.
• Student-centered? Y / N
• Concrete? Y / N
• Higher-order? Y / N
• Measurable? Y / N
• Inclusive? Y / N

An improvement: Students will explain a current event (newspaper article, podcast, broadcast, etc.) based on course material. This goal could be made more specific by adding relevant subject information.

Students will list misrepresentations of the criminal justice system that they see on television.
• Student-centered? Y / N
• Concrete? Y / N
• Higher-order? Y / N
• Measurable? Y / N
• Inclusive? Y / N

An improvement: Students will differentiate between media and real-world representations of the criminal justice system and highlight the differences.

Page 15


Activity #5: Writing Goals

(1) What civic issue do you want to use in your course?

(2) Jot down skills, dispositions and abilities that a student who successfully completes the course will possess.

(3) Individually, use these skills, dispositions and abilities to write three to five learning outcomes (not activities!) for the course. Evaluate each outcome according to the SCHMI criteria.

Learning Outcome. Upon successful completion of this course, a student will:
(Mark each: Student-centered? Concrete? Higher-order? Measurable? Inclusive?)

1.

2.

3.

4.

5.

(4) Share your learning outcomes with your partner and discuss. Is the language clear? Do you think students will be able to understand the objectives? Revise if necessary.

(5) Write your learning outcomes on a large post-it note.

(6) Gallery Walk: post (using sticky notes) one piece of warm feedback and one piece of cool feedback for each set of goals, including your own.

Page 16

May 12, 2014 by Chronicle Staff

The TickerBreaking news from all corners of academe.

Active Learning Is Found to Foster Higher Pass Rates in STEM Courses

Report: “Active Learning Increases Student Performance in Science, Engineering, and Mathematics” (http://www.pnas.org/cgi/doi/10.1073/pnas.1319030111)

Authors: Scott Freeman, Mary Wenderoth, Sarah Eddy, Miles McDonough, Nnadozie Okoroafor, Hannah Jordt, and Michelle Smith

Organizations: The lead researchers are at the University of Washington. The paper was published in the Proceedings of the National Academy of Sciences.

Summary: The researchers conducted a meta-analysis of 225 studies of undergraduate education in science, technology, engineering, and mathematics, the STEM disciplines. The studies compared the failure rates of students whose STEM courses used some form of active-learning methods—like requiring students to participate in discussions and problem-solving activities while in class—with those of students whose courses were traditional lectures, in which they generally listened.

The studies were conducted at two- and four-year institutions chiefly in the United States and previously appeared in STEM-education journals, databases, dissertations, and conference proceedings. To be included, the studies had to assure that the students in each kind of course were equally qualified and able, their instructors were largely similar, and the examinations they took either were alike or used questions from the same pool.

Results: A 12-point difference emerged. While 34 percent of students in the lecture courses failed, 22 percent of students failed in courses that used active-learning methods.
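The gap reported above can be checked with simple arithmetic; a minimal sketch using only the failure rates quoted in the summary:

```python
# Failure rates reported in the Chronicle summary (percent of students).
lecture_fail = 34.0   # traditional lecture courses
active_fail = 22.0    # courses using active-learning methods

# The "12-point difference" is the absolute gap in failure rates.
gap = lecture_fail - active_fail

# Relative risk: lecture students failed about 1.5 times as often.
relative_risk = lecture_fail / active_fail

print(gap)                      # 12.0
print(round(relative_risk, 2))  # 1.55
```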

Page 17

Bottom Line: Calls for more STEM graduates have long been stymied by attrition in those majors, and introductory courses have often proved to be a big obstacle. Different teaching methods may help remedy that pattern.

Copyright © 2014 The Chronicle of Higher Education

Page 18

Active learning increases student performance in science, engineering, and mathematics

Scott Freeman, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth

Department of Biology, University of Washington, Seattle, WA 98195; and School of Biology and Ecology, University of Maine, Orono, ME 04469

Edited* by Bruce Alberts, University of California, San Francisco, CA, and approved April 15, 2014 (received for review October 8, 2013)

To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under traditional lecturing versus active learning. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies), and that the odds ratio for failing was 1.95 under traditional lecturing (n = 67 studies). These results indicate that average examination scores improved by about 6% in active learning sections, and that students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. Heterogeneity analyses indicated that both results hold across the STEM disciplines, that active learning increases scores on concept inventories more than on course examinations, and that active learning appears effective across all class sizes—although the greatest effects are in small (n ≤ 50) classes. Trim and fill analyses and fail-safe n calculations suggest that the results are not due to publication bias. The results also appear robust to variation in the methodological rigor of the included studies, based on the quality of controls over student quality and instructor identity. This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. The results raise questions about the continued use of traditional lecturing as a control in research studies, and support active learning as the preferred, empirically validated teaching practice in regular classrooms.

constructivism | undergraduate education | evidence-based teaching | scientific teaching

Lecturing has been the predominant mode of instruction since universities were founded in Western Europe over 900 y ago (1). Although theories of learning that emphasize the need for students to construct their own understanding have challenged the theoretical underpinnings of the traditional, instructor-focused, “teaching by telling” approach (2, 3), to date there has been no quantitative analysis of how constructivist versus exposition-centered methods impact student performance in undergraduate courses across the science, technology, engineering, and mathematics (STEM) disciplines. In the STEM classroom, should we ask or should we tell?

Addressing this question is essential if scientists are committed to teaching based on evidence rather than tradition (4). The answer could also be part of a solution to the “pipeline problem” that some countries are experiencing in STEM education: For example, the observation that less than 40% of US students who enter university with an interest in STEM, and just 20% of STEM-interested underrepresented minority students, finish with a STEM degree (5).

To test the efficacy of constructivist versus exposition-centered course designs, we focused on the design of class sessions—as opposed to laboratories, homework assignments, or other exercises. More specifically, we compared the results of experiments that documented student performance in courses with at least some active learning versus traditional lecturing, by metaanalyzing 225 studies in the published and unpublished literature. The active learning interventions varied widely in intensity and implementation, and included approaches as diverse as occasional group problem-solving, worksheets or tutorials completed during class, use of personal response systems with or without peer instruction, and studio or workshop course designs. We followed guidelines for best practice in quantitative reviews (SI Materials and Methods), and evaluated student performance using two outcome variables: (i) scores on identical or formally equivalent examinations, concept inventories, or other assessments; or (ii) failure rates, usually measured as the percentage of students receiving a D or F grade or withdrawing from the course in question (DFW rate). The analysis, then, focused on two related questions. Does active learning boost examination scores? Does it lower failure rates?

Results

The overall mean effect size for performance on identical or equivalent examinations, concept inventories, and other assessments was a weighted standardized mean difference of 0.47 (Z = 9.781, P << 0.001)—meaning that on average, student performance increased by just under half a SD with active learning compared with lecturing. The overall mean effect size for failure rate was an odds ratio of 1.95 (Z = 10.4, P << 0.001). This odds ratio is equivalent to a risk ratio of 1.5, meaning that on average, students in traditional lecture courses are 1.5 times more likely to fail than students in courses with active learning. Average failure rates were 21.8% under active learning but 33.8% under traditional lecturing—a difference that represents a 55% increase (Fig. 1 and Fig. S1).
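The simple conversions in this paragraph can be reproduced from the reported mean failure rates; a minimal sketch (note that the paper's odds ratio of 1.95 is a weighted meta-analytic estimate, so an odds ratio recomputed from the two pooled mean rates alone comes out somewhat lower):

```python
# Mean failure rates reported in the Results, as proportions.
p_lecture = 0.338   # traditional lecturing
p_active = 0.218    # active learning

# Risk ratio: lecture students are about 1.5 times as likely to fail.
risk_ratio = p_lecture / p_active

# The "55% increase" is the relative change in failure rate.
pct_increase = 100 * (p_lecture - p_active) / p_active

# Unweighted odds ratio from the mean rates alone; the paper's
# weighted meta-analytic estimate is 1.95.
odds_ratio = (p_lecture / (1 - p_lecture)) / (p_active / (1 - p_active))

print(round(risk_ratio, 2))  # 1.55
print(round(pct_increase))   # 55
print(round(odds_ratio, 2))  # 1.83
```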

Significance

The President’s Council of Advisors on Science and Technology has called for a 33% increase in the number of science, technology, engineering, and mathematics (STEM) bachelor’s degrees completed per year and recommended adoption of empirically validated teaching practices as critical to achieving that goal. The studies analyzed here document that active learning leads to increases in examination performance that would raise average grades by a half a letter, and that failure rates under traditional lecturing increase by 55% over the rates observed under active learning. The analysis supports theory claiming that calls to increase the number of students receiving STEM degrees could be answered, at least in part, by abandoning traditional lecturing in favor of active learning.

Author contributions: S.F. and M.P.W. designed research; S.F., M.M., M.K.S., N.O., H.J., and M.P.W. performed research; S.F. and S.L.E. analyzed data; and S.F., S.L.E., M.M., M.K.S., N.O., H.J., and M.P.W. wrote the paper.

The authors declare no conflict of interest.

*This Direct Submission article had a prearranged editor.

Freely available online through the PNAS open access option. To whom correspondence should be addressed. E-mail: [email protected].

This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1319030111/-/DCSupplemental.

www.pnas.org/cgi/doi/10.1073/pnas.1319030111 PNAS Early Edition | 1 of 6

PSYCHOLOGICAL AND COGNITIVE SCIENCES

Page 19

Heterogeneity analyses indicated no statistically significant variation among experiments based on the STEM discipline of the course in question, with respect to either examination scores (Fig. 2A; Q = 910.537, df = 7, P = 0.160) or failure rates (Fig. 2B; Q = 11.73, df = 6, P = 0.068). In every discipline with more than 10 experiments that met the admission criteria for the meta-analysis, average effect sizes were statistically significant for either examination scores or failure rates or both (Fig. 2, Figs. S2 and S3, and Tables S1A and S2A). Thus, the data indicate that active learning increases student performance across the STEM disciplines.

For the data on examinations and other assessments, a heterogeneity analysis indicated that average effect sizes were lower when the outcome variable was an instructor-written course examination as opposed to performance on a concept inventory (Fig. 3A and Table S1B; Q = 10.731, df = 1, P << 0.001). Although student achievement was higher under active learning for both types of assessments, we hypothesize that the difference in gains for examinations versus concept inventories may be due to the two types of assessments testing qualitatively different cognitive skills. This explanation is consistent with previous research indicating that active learning has a greater impact on student mastery of higher- versus lower-level cognitive skills (6–9), and the recognition that most concept inventories are designed to diagnose known misconceptions, in contrast to course examinations that emphasize content mastery or the ability to solve quantitative problems (10). Most concept inventories also undergo testing for validity, reliability, and readability.

Heterogeneity analyses indicated significant variation in terms of course size, with active learning having the highest impact on courses with 50 or fewer students (Fig. 3B and Table S1C; Q = 6.726, df = 2, P = 0.035; Fig. S4). Effect sizes were statistically significant for all three categories of class size, however, indicating that active learning benefitted students in medium (51–110 students) or large (>110 students) class sizes as well.

When we metaanalyzed the data by course type and course level, we found no statistically significant difference in active learning’s effect size when comparing (i) courses for majors versus nonmajors (Q = 0.045, df = 1, P = 0.883; Table S1D), or (ii) introductory versus upper-division courses (Q = 0.046, df = 1, P = 0.829; Tables S1E and S2D).

Fig. 1. Changes in failure rate. (A) Data plotted as percent change in failure rate in the same course, under active learning versus lecturing. The mean change (12%) is indicated by the dashed vertical line. (B) Kernel density plots of failure rates under active learning and under lecturing. The mean failure rates under each classroom type (21.8% and 33.8%) are shown by dashed vertical lines.

Fig. 2. Effect sizes by discipline. (A) Data on examination scores, concept inventories, or other assessments. (B) Data on failure rates. Numbers below data points indicate the number of independent studies; horizontal lines are 95% confidence intervals.

2 of 6 | www.pnas.org/cgi/doi/10.1073/pnas.1319030111 Freeman et al.

Page 20

To evaluate how confident practitioners can be about theseconclusions, we performed two types of analyses to assesswhether the results were compromised by publication bias, i.e.,the tendency for studies with low effect sizes to remain un-published. We calculated fail-safe numbers indicating how manymissing studies with an effect size of 0 would have to be pub-lished to reduce the overall effect sizes of 0.47 for examinationperformance and 1.95 for failure rate to preset levels that wouldbe considered small or moderate—in this case, 0.20 and 1.1, re-spectively. The fail-safe numbers were high: 114 studies on exam-ination performance and 438 studies on failure rate (SI Materialsand Methods). Analyses of funnel plots (Fig. S5) also support alack of publication bias (SI Materials and Methods).To assess criticisms that the literature on undergraduate

STEM education is difficult to interpret because of methodo-logical shortcomings (e.g., ref. 11), we looked for heterogeneityin effect sizes for the examination score data, based on whetherexperiments did or did not meet our most stringent criteria forstudent and instructor equivalence. We created four categoriesto characterize the quality of the controls over student equivalencein the active learning versus lecture treatments (SI Materials andMethods), and found that there was no heterogeneity based onmethodological quality (Q = 2.097, df = 3, P = 0.553): Experi-ments where students were assigned to treatments at randomproduced results that were indistinguishable from three typesof quasirandomized designs (Table 1). Analyzing variation withrespect to controls over instructor identity also produced noevidence of heterogeneity (Q = 0.007, df = 1, P = 0.934): Morepoorly controlled studies, with different instructors in the twotreatment groups or with no data provided on instructor equiv-alence, gave equivalent results to studies with identical or ran-domized instructors in the two treatments (Table 1). Thus, theoverall effect size for examination data appears robust to variationin the methodological rigor of published studies.

Discussion

The data reported here indicate that active learning increases examination performance by just under half a SD and that lecturing increases failure rates by 55%. The heterogeneity analyses indicate that (i) these increases in achievement hold across all of the STEM disciplines and occur in all class sizes, course types, and course levels; and (ii) active learning is particularly beneficial in small classes and at increasing performance on concept inventories.

Although this is the largest and most comprehensive metaanalysis of the undergraduate STEM education literature to date, the weighted, grand mean effect size of 0.47 reported here is almost identical to the weighted, grand mean effect sizes of 0.50 and 0.51 published in earlier metaanalyses of how alternatives to traditional lecturing impact undergraduate course performance in subsets of STEM disciplines (11, 12). Thus, our results are consistent with previous work by other investigators.

The grand mean effect sizes reported here are subject to important qualifications, however. For example, because struggling students are more likely to drop courses than high-achieving students, the reductions in withdrawal rates under active learning that are documented here should depress average scores on assessments—meaning that the effect size of 0.47 for examination and concept inventory scores may underestimate active learning’s actual impact in the studies performed to date (SI Materials and Methods). In contrast, it is not clear whether effect sizes of this magnitude would be observed if active learning approaches were to become universal. The instructors who implemented active learning in these studies did so as volunteers. It is an open question whether student performance would increase as much if all faculty were required to implement active learning approaches.

Assuming that other instructors implement active learning and achieve the average effect size documented here, what would

Fig. 3. Heterogeneity analyses for data on examination scores, concept inventories, or other assessments. (A) By assessment type—concept inventories versus examinations. (B) By class size. Numbers below data points indicate the number of independent studies; horizontal lines are 95% confidence intervals.

Table 1. Comparing effect sizes estimated from well-controlled versus less-well-controlled studies

                                                                         95% confidence interval
Type of control                                      n   Hedges’s g   SE      Lower limit   Upper limit

For student equivalence
  Quasirandom—no data on student equivalence         39  0.467        0.102   0.268         0.666
  Quasirandom—no statistical difference in
    prescores on assessment used for effect size     51  0.534        0.089   0.359         0.709
  Quasirandom—no statistical difference on metrics
    of academic ability/preparedness                 51  0.362        0.092   0.181         0.542
  Randomized assignment or crossover design          16  0.514        0.098   0.322         0.706

For instructor equivalence
  No data, or different instructors                  59  0.472        0.081   0.313         0.631
  Identical instructor, randomized assignment,
    or ≥3 instructors in each treatment              99  0.492        0.071   0.347         0.580

Freeman et al. PNAS Early Edition | 3 of 6

PSYCHOLOGICAL AND COGNITIVE SCIENCES


a shift of 0.47 SDs in examination and concept inventory scores mean to their students?

i) Students performing in the 50th percentile of a class based on traditional lecturing would, under active learning, move to the 68th percentile of that class (13)—meaning that instead of scoring better than 50% of the students in the class, the same individual taught with active learning would score better than 68% of the students being lectured to.

ii) According to an analysis of examination scores in three introductory STEM courses (SI Materials and Methods), a change of 0.47 SDs would produce an increase of about 6% in average examination scores and would translate to a 0.3 point increase in average final grade. On a letter-based system, medians in the courses analyzed would rise from a B− to a B or from a B to a B+.
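The percentile conversion in point i can be reproduced with the standard normal CDF; a minimal sketch (the conversion method is the standard one the paper cites via ref. 13; only the 0.47 effect size comes from this study):

```python
from math import erf, sqrt

def percentile_after_shift(g):
    """Percentile (0-100) that a median student reaches when the
    score distribution shifts up by g standard deviations:
    the standard normal CDF evaluated at g."""
    return 100 * 0.5 * (1 + erf(g / sqrt(2)))

# A 0.47 SD improvement moves the median (50th percentile) student
# to roughly the 68th percentile of the lecture-only class.
print(round(percentile_after_shift(0.47)))  # -> 68
```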

The result for undergraduate STEM courses can also be compared with the impact of educational interventions at the precollege level. A recent review of educational interventions in the K–12 literature reports a mean effect size of 0.39 when impacts are measured with researcher-developed tests, analogous to the examination scores analyzed here, and a mean effect size of 0.24 for narrow-scope standardized tests, analogous to the concept inventories analyzed here (14). Thus, the effect size of active learning at the undergraduate level appears greater than the effect sizes of educational innovations in the K–12 setting, where effect sizes of 0.20 or even smaller may be considered of policy interest (14).

There are also at least two ways to view an odds ratio of 1.95 for the risk of failing a STEM course:

i) If the experiments analyzed here had been conducted as randomized controlled trials of medical interventions, they may have been stopped for benefit—meaning that enrolling patients in the control condition might be discontinued because the treatment being tested was clearly more beneficial. For example, a recent analysis of 143 randomized controlled medical trials that were stopped for benefit found that they had a median relative risk of 0.52, with a range of 0.22 to 0.66 (15). In addition, best-practice directives suggest that data management committees may allow such studies to stop for benefit if interim analyses have large sample sizes and P values under 0.001 (16). Both criteria were met for failure rates in the education studies we analyzed: The average relative risk was 0.64 and the P value on the overall odds ratio was << 0.001. Any analogy with biomedical trials is qualified, however, by the lack of randomized designs in studies that included data on failure rates.

ii) There were 29,300 students in the 67 lecturing treatments with data on failure rates. Given that the raw failure rate in this sample averaged 33.8% under traditional lecturing and 21.8% under active learning, the data suggest that 3,516 fewer students would have failed these STEM courses under active learning. Based on conservative assumptions (SI Materials and Methods), this translates into over US$3,500,000 in saved tuition dollars for the study population, had all students been exposed to active learning. If active learning were implemented widely, the total tuition dollars saved would be orders of magnitude larger, given that there were 21 million students enrolled in US colleges and universities alone in 2010, and that about a third of these students intended to major in STEM fields as entering freshmen (17, 18).
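The arithmetic behind point ii can be checked directly from the quoted raw rates (a sketch; the raw-rate relative risk of about 0.64 matches the average reported in point i, while the weighted meta-analytic odds ratio of 1.95 is computed differently):

```python
n_lecture = 29_300     # students in the 67 lecturing treatments
fail_lecture = 0.338   # raw failure rate under traditional lecturing
fail_active = 0.218    # raw failure rate under active learning

# Failures avoided if the lecture-group students had instead
# experienced active learning
avoided = n_lecture * (fail_lecture - fail_active)
print(round(avoided))            # -> 3516

# Relative risk of failing under active learning vs. lecturing
relative_risk = fail_active / fail_lecture
print(round(relative_risk, 2))   # -> 0.64
```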

Finally, increased grades and fewer failures should make a significant impact on the pipeline problem. For example, the 2012 President’s Council of Advisors on Science and Technology report calls for an additional one million STEM majors in the United States in the next decade—requiring a 33% increase from the current annual total—and notes that simply increasing the current STEM retention rate of 40% to 50% would meet three-quarters of that goal (5). According to a recent cohort study from the National Center for Education Statistics (19), there are gaps of 0.5 and 0.4 in the STEM-course grade point averages (GPAs) of first-year bachelor’s and associate’s degree students, respectively, who end up leaving versus persisting in STEM programs. A 0.3 “bump” in average grades with active learning would get the “leavers” close to the current performance level of “persisters.” Other analyses of students who leave STEM majors indicate that increased passing rates, higher grades, and increased engagement in courses all play a positive role in retention (20–22).

In addition to providing evidence that active learning can

improve undergraduate STEM education, the results reported here have important implications for future research. The studies we metaanalyzed represent the first generation of work on undergraduate STEM education, where researchers contrasted a diverse array of active learning approaches and intensities with traditional lecturing. Given our results, it is reasonable to raise concerns about the continued use of traditional lecturing as a control in future experiments. Instead, it may be more productive to focus on what we call “second-generation research”: using advances in educational psychology and cognitive science to inspire changes in course design (23, 24), then testing hypotheses about which type of active learning is most appropriate and efficient for certain topics or student populations (25). Second-generation research could also explore which aspects of instructor behavior are most important for achieving the greatest gains with active learning, and elaborate on recent work indicating that underprepared and underrepresented students may benefit most from active methods. In addition, it will be important to address questions about the intensity of active learning: Is more always better? Although the time devoted to active learning was highly variable in the studies analyzed here, ranging from just 10–15% of class time being devoted to clicker questions to lecture-free “studio” environments, we were not able to evaluate the relationship between the intensity (or type) of active learning and student performance, due to lack of data (SI Materials and Methods).

As research continues, we predict that course designs inspired

by second-generation studies will result in additional gains in student achievement, especially when the types of active learning interventions analyzed here—which focused solely on in-class innovations—are combined with required exercises that are completed outside of formal class sessions (26).

Finally, the data suggest that STEM instructors may begin to

question the continued use of traditional lecturing in everyday practice, especially in light of recent work indicating that active learning confers disproportionate benefits for STEM students from disadvantaged backgrounds and for female students in male-dominated fields (27, 28). Although traditional lecturing has dominated undergraduate instruction for most of a millennium and continues to have strong advocates (29), current evidence suggests that a constructivist “ask, don’t tell” approach may lead to strong increases in student performance—amplifying recent calls from policy makers and researchers to support faculty who are transforming their undergraduate STEM courses (5, 30).

Materials and Methods

To create a working definition of active learning, we collected written definitions from 338 audience members, before biology departmental seminars on active learning, at universities throughout the United States and Canada. We then coded elements in the responses to create the following consensus definition:

Active learning engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening

4 of 6 | www.pnas.org/cgi/doi/10.1073/pnas.1319030111 Freeman et al.

Page 22: Using the climate debate to revitalize general chemistry€¦ · Kimberly Cossey, Catrena Lisse, Julia Metzker, Chavonda Mills, Rosalie Richards Does planning your course make you

to an expert. It emphasizes higher-order thinking and often involves group work. (See also ref. 31, p. iii.)

Following Bligh (32), we defined traditional lecturing as “. . . continuous exposition by the teacher.” Under this definition, student activity was assumed to be limited to taking notes and/or asking occasional and unprompted questions of the instructor.

Literature Search. We searched the gray literature, primarily in the form of unpublished dissertations and conference proceedings, in addition to peer-reviewed sources (33, 34), for studies that compared student performance in undergraduate STEM courses under traditional lecturing versus active learning. We used four approaches (35) to find papers for consideration: hand-searching every issue in 55 STEM education journals from June 1, 1998 to January 1, 2010 (Table S3); searching seven online databases using an array of terms; mining reviews and bibliographies (SI Materials and Methods); and “snowballing” from references in papers admitted to the study (SI Materials and Methods). We had no starting time limit for admission to the study; the ending cutoff for consideration was completion or publication before January 1, 2010.

Criteria for Admission. As recommended (36), the criteria for admission to the coding and final data analysis phases of the study were established at the onset of the work and were not altered. We coded studies that (i) contrasted traditional lecturing with any active learning intervention, with total class time devoted to each approach not differing by more than 30 min/wk; (ii) occurred in the context of a regularly scheduled course for undergraduates; (iii) were largely or solely limited to changes in the conduct of the regularly scheduled class or recitation sessions; (iv) involved a course in astronomy, biology, chemistry, computer science, engineering, geology, mathematics, natural resources or environmental science, nutrition or food science, physics, psychology, or statistics; and (v) included data on some aspect of student academic performance.

Note that criterion i yielded papers representing a wide array of active learning activities, including vaguely defined “cooperative group activities in class,” in-class worksheets, clickers, problem-based learning (PBL), and studio classrooms, with intensities ranging from 10% to 100% of class time (SI Materials and Methods). Thus, this study’s intent was to evaluate the average effect of any active learning type and intensity contrasted with traditional lecturing.

The literature search yielded 642 papers that appeared to meet these five criteria and were subsequently coded by at least one of the authors.

Coding. All 642 papers were coded by one of the authors (S.F.) and 398 were coded independently by at least one other member of the author team (M.M., M.S., M.P.W., N.O., or H.J.). The 244 “easy rejects” were excluded from the study after the initial coder (S.F.) determined that they clearly did not meet one or more of the five criteria for admission; a post hoc analysis suggested that the easy rejects were justified (SI Materials and Methods).

The two coders met to review each of the remaining 398 papers and reach consensus (37, 38) on

i) The five criteria listed above for admission to the study;

ii) Examination equivalence—meaning that the assessment given to students in the lecturing and active learning treatment groups had to be identical, equivalent as judged by at least one third-party observer recruited by the authors of the study in question but blind to the hypothesis being tested, or comprising questions drawn at random from a common test bank;

iii) Student equivalence—specifically, whether the experiment was based on randomization or quasirandomization among treatments and, if quasirandom, whether students in the lecture and active learning treatments were statistically indistinguishable in terms of (a) prior general academic performance (usually measured by college GPA at the time of entering the course, Scholastic Aptitude Test, or American College Testing scores), or (b) pretests directly relevant to the topic in question;

iv) Instructor equivalence—meaning whether the instructors in the lecture and active learning treatments were identical, randomly assigned, or consisted of a group of three or more in each treatment; and

v) Data that could be used for computing an effect size.

To reduce or eliminate pseudoreplication, the coders also annotated the effect size data, using preestablished criteria to identify and report effect sizes only from studies that represented independent courses and populations. If the data reported were from iterations of the same course at the same institution, we combined data recorded for more than one control and/or treatment group from the same experiment. We also combined data from multiple outcomes from the same study (e.g., a series of equivalent midterm examinations) (SI Materials and Methods). Coders also extracted data on class size, course type, course level, and type of active learning, when available.

Criteria iii and iv were meant to assess methodological quality in the final datasets, which comprised 158 independent comparisons with data on student examination performance and 67 independent comparisons with data on failure rates. The data analyzed and references to the corresponding papers are archived in Table S4.

Data Analysis. Before analyzing the data, we inspected the distribution of class sizes in the study and binned this variable as small, medium, and large (SI Materials and Methods). We also used established protocols (38, 39) to combine data from multiple treatments/controls and/or data from multiple outcomes, and thus produce a single pairwise comparison from each independent course and student population in the study (SI Materials and Methods).

The data we analyzed came from two types of studies: (i) randomized trials, where each student was randomly placed in a treatment; and (ii) quasirandom designs, where students self-sorted into classes, blind to the treatment at the time of registering for the class. It is important to note that in the quasirandom experiments, students were assigned to treatment as a group, meaning that they are not statistically independent samples. This leads to statistical problems: The number of independent data points in each treatment is not equal to the number of students (40). The element of nonindependence in quasirandom designs can cause variance calculations to underestimate the actual variance, leading to overestimates for significance levels and for the weight that each study is assigned (41). To correct for this element of nonindependence in quasirandom studies, we used a cluster adjustment calculator in Microsoft Excel based on methods developed by Hedges (40) and implemented in several recent metaanalyses (42, 43). Adjusting for clustering in our data required an estimate of the intraclass correlation coefficient (ICC). None of our studies reported ICCs, however, and to our knowledge, no studies have reported an ICC in college-level STEM courses. Thus, to obtain an estimate for the ICC, we turned to the K–12 literature. A recent paper reviewed ICCs for academic achievement in mathematics and reading for a national sample of K–12 students (44). We used the mean ICC reported for mathematics (0.22) as a conservative estimate of the ICC in college-level STEM classrooms. Note that although the cluster correction has a large influence on the variance for each study, it does not influence the effect size point estimate substantially.
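The variance inflation that the cluster adjustment corrects for can be illustrated with the usual design-effect formula. This is a sketch only: Hedges’ (2007) correction adjusts g and its variance more fully. The ICC of 0.22 is from the paper; the class and sample sizes below are hypothetical.

```python
def design_effect(cluster_size, icc):
    """Factor by which the variance is understated when students
    enter treatments as intact classes (clusters)."""
    return 1 + (cluster_size - 1) * icc

def effective_n(n_students, cluster_size, icc):
    """Number of statistically independent observations implied by
    n_students taught in clusters of the given size."""
    return n_students / design_effect(cluster_size, icc)

# Hypothetical quasirandom study: 200 students in classes of 50,
# with the ICC of 0.22 borrowed from the K-12 literature.
print(round(design_effect(50, 0.22), 2))  # -> 11.78
print(round(effective_n(200, 50, 0.22)))  # -> 17
```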

We computed effect sizes and conducted the metaanalysis in the Comprehensive Meta-Analysis software package (45). All reported P values are two-tailed, unless noted.

We used a random effects model (46, 47) to compare effect sizes. The random effects model was appropriate because conditions that could affect learning gains varied among studies in the analysis, including the (i) type (e.g., PBL versus clickers), intensity (percentage of class time devoted to constructivist activities), and implementation (e.g., graded or ungraded) of active learning; (ii) student population; (iii) course level and discipline; and (iv) type, cognitive level, and timing—relative to the active learning exercise—of examinations or other assessments.
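A random-effects pooled estimate can be sketched with the DerSimonian-Laird moment estimator of the between-study variance. This is illustrative only: the paper’s analysis was run in the Comprehensive Meta-Analysis package, and the effect sizes and variances below are made up.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled mean using the DerSimonian-Laird
    moment estimate of the between-study variance tau^2."""
    k = len(effects)
    w = [1.0 / v for v in variances]          # fixed-effect weights
    sw = sum(w)
    mean_fe = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - mean_fe) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)

# Made-up per-study effects (Hedges' g) and their variances
pooled = dersimonian_laird([0.3, 0.5, 0.6], [0.02, 0.04, 0.03])
```

The pooled mean is a weighted average, so it always falls between the smallest and largest study effects; the larger tau^2 is, the more the weights equalize across studies.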

We calculated effect sizes as (i) the weighted standardized mean difference as Hedges’ g (48) for data on examination scores, and (ii) the log-odds for data on failure rates. For ease of interpretation, we then converted log-odds values to odds ratio, risk ratio, or relative risk (49).
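Both effect-size measures named above can be sketched in a few lines (the formulas are the standard ones; all numbers below are hypothetical, not drawn from the dataset):

```python
from math import exp, log, sqrt

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference with Hedges' small-sample
    correction J applied."""
    s_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1 - 3.0 / (4 * (n1 + n2 - 2) - 1)   # small-sample correction
    return j * d

def log_odds(fail_a, ok_a, fail_b, ok_b):
    """Log-odds of failing in condition b relative to condition a."""
    return log((fail_b / ok_b) / (fail_a / ok_a))

# Hypothetical course: exam means 72 (active) vs. 66 (lecture), SD 12
g = hedges_g(72, 66, 12, 12, 50, 50)        # about 0.5
# Hypothetical failure counts: 20 of 100 (lecture) vs. 10 of 100 (active)
odds_ratio = exp(log_odds(10, 90, 20, 80))  # 2.25
```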

To evaluate the influence of publication bias on the results, we assessed funnel plots visually (50) and statistically (51), applied Duval and Tweedie’s trim and fill method (51), and calculated fail-safe Ns (45).
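For intuition, a fail-safe number in the spirit of Orwin’s formula can be computed as below. This is a sketch only: the paper used the fail-safe routine in the Comprehensive Meta-Analysis package, so this simpler formula does not reproduce the reported values of 114 and 438.

```python
from math import ceil

def orwin_failsafe_n(k, observed, criterion):
    """Orwin-style fail-safe N: how many unpublished studies with
    zero effect would pull a mean effect of `observed`, estimated
    from k studies, down to `criterion`."""
    return ceil(k * (observed - criterion) / criterion)

# Examination-score data: 158 comparisons, mean g = 0.47,
# threshold for a "small" effect = 0.20
print(orwin_failsafe_n(158, 0.47, 0.20))  # -> 214
```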

Additional Results. We did not insist that assessments be identical or formally equivalent if studies reported only data on failure rates. To evaluate the hypothesis that differences in failure rates recorded under traditional lecturing and active learning were due to changes in the difficulty of examinations and other course assessments, we evaluated 11 studies where failure rate data were based on comparisons in which most or all examination questions were identical. The average odds ratio for these 11 studies was 1.97 ± 0.36 (SE)—almost exactly the effect size calculated from the entire dataset.

Although we did not metaanalyze the data using “vote-counting” approaches, it is informative to note that of the studies reporting statistical tests of examination score data, 94 reported significant gains under active learning whereas only 41 did not (Table S4A).

Additional results from the analyses on publication bias are reported in Supporting Information.




ACKNOWLEDGMENTS. We thank Roddy Theobald for advice on interpreting odds ratios; the many authors who provided missing data upon request (SI Materials and Methods); Colleen Craig, Daryl Pedigo, and Deborah Wiegand for supplying information on examination score standard deviations and grading thresholds; Kelly Puzio and an anonymous reviewer for advice on analyzing data from quasirandom studies; and Steven Kroiss, Carl Wieman, and William Wood for comments that improved the manuscript. M.S. was supported in part by National Science Foundation Grant 0962805.

1. Brockliss L (1996) Curricula. A History of the University in Europe, ed de Ridder-Symoens H (Cambridge Univ Press, Cambridge, UK), Vol II, pp 565–620.
2. Piaget J (1926) The Language and Thought of the Child (Harcourt Brace, New York).
3. Vygotsky LS (1978) Mind in Society (Harvard Univ Press, Cambridge, MA).
4. Handelsman J, et al. (2004) Education. Scientific teaching. Science 304(5670):521–522.
5. PCAST STEM Undergraduate Working Group (2012) Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics, eds Gates SJ, Jr, Handelsman J, Lepage GP, Mirkin C (Office of the President, Washington).

6. Haukoos GD, Penick JE (1983) The influence of classroom climate on science process and content achievement of community college students. J Res Sci Teach 20(7):629–637.
7. Martin T, Rivale SD, Diller KR (2007) Comparison of student learning in challenge-based and traditional instruction in biomedical engineering. Ann Biomed Eng 35(8):1312–1323.
8. Cordray DS, Harris TR, Klein S (2009) A research synthesis of the effectiveness, replicability, and generality of the VaNTH challenge-based instructional modules in bioengineering. J Eng Ed 98(4).
9. Jensen JL, Lawson A (2011) Effects of collaborative group composition and inquiry instruction on reasoning gains and achievement in undergraduate biology. CBE Life Sci Educ 10(1):64–73.
10. Momsen JL, Long TM, Wyse SA, Ebert-May D (2010) Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE Life Sci Educ 9(4):435–440.
11. Ruiz-Primo MA, Briggs D, Iverson H, Talbot R, Shepard LA (2011) Impact of undergraduate science course innovations on learning. Science 331(6022):1269–1270.
12. Springer L, Stanne ME, Donovan SS (1999) Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology. Rev Educ Res 69(1):21–51.
13. Bowen CW (2000) A quantitative literature review of cooperative learning effects on high school and college chemistry achievement. J Chem Educ 77(1):116–119.
14. Lipsey MW, et al. (2012) Translating the Statistical Representation of the Effects of Educational Interventions into Readily Interpretable Forms (US Department of Education, Washington).
15. Montori VM, et al. (2005) Randomized trials stopped early for benefit: A systematic review. JAMA 294(17):2203–2209.
16. Pocock SJ (2006) Current controversies in data monitoring for clinical trials. Clin Trials 3(6):513–521.

17. National Center for Education Statistics (2012) Digest of Education Statistics (US Department of Education, Washington).
18. National Science Board (2010) Science and Engineering Indicators 2010 (National Science Foundation, Arlington, VA).
19. National Center for Education Statistics (2012) STEM in Postsecondary Education (US Department of Education, Washington).
20. Seymour E, Hewitt NM (1997) Talking About Leaving: Why Undergraduates Leave the Sciences (Westview Press, Boulder, CO).
21. Goodman IF, et al. (2002) Final Report of the Women’s Experiences in College Engineering (WECE) Project (Goodman Research Group, Cambridge, MA).
22. Watkins J, Mazur E (2013) Retaining students in science, technology, engineering, and mathematics (STEM) majors. J Coll Sci Teach 42(5):36–41.
23. Slavich GM, Zimbardo PG (2012) Transformational teaching: Theoretical underpinnings, basic principles, and core methods. Educ Psychol Rev 24(4):569–608.
24. Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT (2013) Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psych Sci Publ Int 14(1):4–58.
25. Eddy S, Crowe AJ, Wenderoth MP, Freeman S (2013) How should we teach tree-thinking? An experimental test of two hypotheses. Evol Ed Outreach 6:1–11.
26. Freeman S, Haak D, Wenderoth MP (2011) Increased course structure improves performance in introductory biology. CBE Life Sci Educ 10(2):175–186.
27. Lorenzo M, Crouch CH, Mazur E (2006) Reducing the gender gap in the physics classroom. Am J Phys 74(2):118–122.
28. Haak DC, HilleRisLambers J, Pitre E, Freeman S (2011) Increased structure and active learning reduce the achievement gap in introductory biology. Science 332(6034):1213–1216.
29. Burgan M (2006) In defense of lecturing. Change 6:31–34.
30. Henderson C, Beach A, Finkelstein N (2011) Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. J Res Sci Teach 48(8):952–984.

31. Bonwell CC, Eison JA (1991) Active Learning: Creating Excitement in the Classroom (George Washington Univ, Washington, DC).
32. Bligh DA (2000) What’s the Use of Lectures? (Jossey-Bass, San Francisco).
33. Reed JG, Baxter PM (2009) Using reference databases. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 73–101.
34. Rothstein H, Hopewell S (2009) Grey literature. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 103–125.
35. White HD (2009) Scientific communication and literature retrieval. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 51–71.
36. Lipsey MW, Wilson DB (2001) Practical Meta-Analysis (Sage Publications, Thousand Oaks, CA).
37. Orwin RG, Vevea JL (2009) Evaluating coding decisions. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 177–203.
38. Higgins JPT, Green S, eds (2011) Cochrane Handbook for Systematic Reviews of Interventions, Version 5.1.0 (The Cochrane Collaboration, Oxford). Available at www.cochrane-handbook.org. Accessed December 14, 2012.
39. Borenstein M (2009) Effect sizes for continuous data. The Handbook of Systematic Review and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 221–235.
40. Hedges LV (2007) Correcting a significance test for clustering. J Educ Behav Stat 32(2):151–179.
41. Donner A, Klar N (2002) Issues in the meta-analysis of cluster randomized trials. Stat Med 21(19):2971–2980.
42. Davis D (2012) Multiple Comprehension Strategies Instruction (MCSI) for Improving Reading Comprehension and Strategy Outcomes in the Middle Grades (The Campbell Collaboration, Oxford). Available at http://campbellcollaboration.org/lib/project/167/. Accessed December 10, 2013.
43. Puzio K, Colby GT (2013) Cooperative learning and literacy: A meta-analytic review. J Res Ed Effect 6(4):339–360.

44. Hedges LV, Hedberg EC (2007) Intraclass correlation values for planning group-randomized trials in education. Educ Eval Policy Anal 29:60–87.
45. Borenstein M, et al. (2006) Comprehensive Meta-Analysis (Biostat, Inc., Englewood, NJ).
46. Hedges LV (2009) Statistical considerations. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 38–47.
47. Raudenbush SW (2009) Analyzing effect sizes: Random-effects models. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 295–315.
48. Gurevitch J, Hedges LV (1999) Statistical issues in ecological meta-analyses. Ecology 80(4):1142–1149.
49. Fleiss J, Berlin JA (2009) Effect sizes for dichotomous data. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 237–253.
50. Greenhouse JB, Iyengar S (2009) Sensitivity analysis and diagnostics. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 417–433.
51. Sutton AJ (2009) Publication bias. The Handbook of Research Synthesis and Meta-Analysis, eds Cooper H, Hedges LV, Valentine JC (Russell Sage Foundation, New York), pp 435–452.




This document was modified from the IC-bG resource library at http://icbg.wordpress.com/resources

HW: Why use Active Learning?

“I hear and I forget. I see and I remember. I do and I understand.”
- Confucius

Read the summary and article. Use them to make your case as to the importance of shifting the focus in your classroom from what you, the instructor, do to what students should be able to do with course material.

Ways to use Active Learning in your classroom:

What are the Student Gains from using Active Learning in your classroom?

Reasons to switch to Active Learning in your classroom:


Using the climate debate to revitalize general chemistry

Kimberly Cossey, Catrena Lisse, Julia Metzker, Chavonda Mills, Rosalie Richards

Day 2

Activities & Assessments


Adapted from Understanding by Design by Wiggins & McTighe

Big Idea: What is the “big picture” or lofty idea?
Goals: What do you want your student to be able to do?
Activities: Debates, reports, experiments, posters, presentations, interviews, essays, exams
Assessment: Did students achieve the goals? Include formative & summative assessment.
Reflection: What worked? What can be improved?

Workshop Overview


THE EDUCATION ALLIANCE at Brown University

Changing Systems to Personalize Learning: Teaching to Each Student

Chalk Talk

Chalk Talk is a silent way to reflect, generate ideas, check on learning, develop projects, or solve problems. It can be used productively with any group: students, faculty, workshop participants, committees. Because it is done completely in silence, it gives groups a change of pace and encourages thoughtful contemplation. It can be an unforgettable experience.

Format

Time: Varies according to need, can be from 5 minutes to an hour.

Materials: Chalk board and chalk or paper roll on the wall and markers.

Process:

1. The facilitator explains VERY BRIEFLY that Chalk Talk is a silent activity. (No one may talk at all. Anyone may add to the chalk talk as they please.) You can comment on other people's ideas simply by drawing a connecting line to the comment. It can also be very effective to say nothing at all except to put a finger to lips in a gesture of silence and simply begin with #2.

2. The facilitator writes a relevant question in a circle on the board. Sample questions:
• What did you learn today?
• So what? Or now what?
• What do you think about social responsibility and schooling?
• How can we involve the community in the school, and the school in the community?
• How can we keep the noise level down in this room?
• What do you want to tell the scheduling committee?
• What do you know about Croatia?
• How are decimals used in the world?

3. The facilitator either hands a piece of chalk to everyone or places many pieces of chalk at the board and hands several pieces to people at random.

4. People write as they feel moved. There are likely to be long silences; that is natural, so allow plenty of wait time before deciding it is over.

5. How the facilitator chooses to interact with the Chalk Talk influences its outcome. The facilitator can stand back and let it unfold, or expand thinking by:
• circling other interesting ideas, thereby inviting comments to broaden them
• writing questions about a participant comment
• adding his/her own reflections or ideas

Source: Coalition of Essential Schools. Reprinted with permission.


Activity #6: Active Learning Feedback

WARM FEEDBACK | COOL FEEDBACK


A short (and by no means exhaustive) list of active learning strategies:

• Analogical or metaphorical description / creative expression (drama/role play). This is formalized in the 'Reacting to the Past' pedagogy.
• Case study: use of a relevant scenario to provoke application of learning.
• "Clickers": responding to teacher-made questions electronically; results can be displayed for the whole class.
• Concept mapping: students construct a diagram showing how different ideas and topics are related to one another.
• Cooperative learning: a structured vehicle where the only means for the small group to succeed is each individual making a substantive contribution to the task/goal. There are many types of cooperative models.
• Debate or mock trial: current or historical.
• Fishbowl discussion: inner participants discuss while outer viewers remain silent. One variation is to have students lead the discussion.
• Goal-setting
• Group quizzes
• Inquiry/discovery learning
• Jigsaw: different groups of students each become 'experts' on a certain topic. Then new groups form, each containing one expert from the original groups, and the students teach each other in the new groups.
• News article summaries: finding, summarizing, and explaining relevant content in a current news article.
• Peer and/or self-grading: a reflective technique that promotes critical thinking.
• Peer tutoring / peer instruction
• Presentations: poster or oral.
• Problem-based learning (PBL)
• Research projects
• Service-learning: applying academic content learned to help address a real community need.
• Simulations: virtual or real-time interactions.
• Social learning: small groups discuss answers to a teacher-posed question before responding to the large group. Team-based learning is a more structured variation.
• Surveys
• Team-based learning


Activity #7: Jigsaw of Active Learning Strategies

Each team member will select an activity from the table.
1) Become an expert on your selected activity (individual)
   a) Read and research the topic
   b) Be able to explain it to your group
2) Put the puzzle together (group)
   a) Each team member will explain their activity to the team
   b) Take notes and be able to explain other types of activities
3) Report out (group)
   a) Reflect
   b) Written or oral summary

Active Learning Strategy | Description of Activity | Notes
(blank rows for note-taking)


Activity #8: Design an Activity for Buckets

INSTRUCTIONS
• Choose a bucket of goals and dump it out
• Brainstorm ideas for activities that incorporate your goals
   o Try to design activities that use more than one goal at once
   o Incorporate activities that you already have
• Formalize your thoughts into a fully designed activity by answering the guiding questions below.

Guiding Questions

1. What do you need? (Materials)

2. What will you do? (Preparation)

Activity #8: Design an Activity for Buckets, cont'd


3. What will students do?

4. How does the activity meet the goals?

5. How will you know if students are successful?


Activity #9: Design Your Own Activity

INSTRUCTIONS
• Pick one of your own goals
• Brainstorm ideas for activities that incorporate your goals
   o Try to design activities that use more than one goal at once
   o Incorporate activities that you already have
• Formalize your thoughts into a fully designed activity by answering the guiding questions below.

Guiding Questions

1. What do you need? (Materials)

2. What will you do? (Preparation)


Activity #9: Design Your Own Activity, cont'd

3. What will students do?

4. How does the activity meet the goals?

5. How will you know if students are successful?


Activity #10: Assessment Anticipation Guide

DIRECTIONS: Following is a set of statements about assessment practices for higher educators that affect student learning. Before the session begins, complete this anticipation guide by marking on the line to the left. Write A for those statements that you agree with; D for those with which you disagree. YOU HAVE TO CHOOSE ONE OR THE OTHER!

_____ 1) The terms assessment, grading, test, and evaluation mean effectively the same thing.

_____ 2) The test grade is an effective means for students to receive feedback and improve performance.

_____ 3) Rubrics developed and distributed before an assignment generally help to improve learning.

_____ 4) It is unfair to test students with ill-structured problems that they haven't been exposed to in solving those from the textbook.

_____ 5) Generally, multiple-choice questions work better to measure content knowledge, while open-ended essay questions are more suited to assessing higher-order skills.

_____ 6) Information from student performance on assignments can assist the professor in teaching better.

_____ 7) Reviewing student work products with colleagues from time to time can inform the professor about his/her teaching effectiveness.

_____ 8) If students are expected to arrive in your class knowing certain information that you do not teach them, it is good practice to test and grade them on it so they know you are serious.

_____ 9) If you want students to work together collaboratively, you should expect to teach and assess skills such as conflict resolution.

_____ 10) An anticipation guide can be a useful assessment tool for the instructor and/or the student.


Information for Anticipation Guide

1) Assessment is the process of gathering information, which will be used to make a judgment. Thus noting that most students missed item 12 on a test is an assessment process that might inform you that you included a vaguely-worded question on the test, or that item 12 tests a sophisticated concept most students did not "get". Giving a test is an assessment, but so is asking a question orally that is answered correctly (or incorrectly). Even noting which students, if any, raise their hands to respond to a question offers you valuable information upon which you might make a judgment such as "I'll proceed with what I planned to do next because many students understand this portion."

Grading is the process of scoring student work, the results of which can be used to rank student performance. It is common to believe that high scores mean students learned a great deal and low scores mean they did not, but this is not a fair conclusion to draw. Students may score well because they already knew the material, or score poorly because the material being assessed was not taught. Many people (students, professors, and the public alike) equate grades with learning, but that is often wrong.

A test is any formalized attempt to determine what a learner knows and/or is able to do in relationship to certain criteria or standards. Tests are often scored and assigned a grade; either way, the information about performance is documented. The speech referred to below can be considered a test even though it is a performance, but so too is a traditional paper-and-pencil quiz. Even a checklist of behaviors can be a test.

Evaluation is the process of determining what level of performance meets the standard or does not. It is common to believe that scores above 70 or letter grades of "C" or above show a performance that meets the standard, but these are arbitrary designations. In a performance such as giving a speech, the level judged to be adequate might include that the speaker's voice projection should be sufficient to be heard in a 20'x24' room. That criterion (voice projection) and standard (heard anywhere in a 20'x24' room) can be assigned a grade of "A", or "B", or "D", or anything you'd like, such as "butterfly", "gobbledygook", etc. The point is that the association between assessment, grading, test, and evaluation has evolved as an unintended consequence of the system without much consideration of the goal. Those who understand the differences in the terms and use them appropriately are able to be intentional about their teaching and make better judgments regarding student learning.

2) For many people, tests and grades cause some anxiety, and as a result they don't "hear" well. Feedback is meant to provide information to the learner about his/her performance in terms of certain criteria and standards. If one's hearing is obstructed due to anxiety, then even in the best of worlds one might not be able to use the feedback of a test grade. On top of that, tests and grades are often idiosyncratic: your 80 is my 90; your desire to measure application counters my quest to test content acquisition; and so on. Therefore, if you want to offer learners the opportunity to improve, providing them feedback in a less-intimidating and meaningful way will improve their performance more than a test grade will, because they will be able to apply the information you convey.

3) Assessment can be formative, taking place while learning is in process, or summative, occurring at the end of a learning cycle. Summative assessment is often graded. Learning increases as formative assessment is conducted along the way, as it can provide feedback to students. Feedback can be made available in various ways, but a potent vehicle is rubrics, which explicate the criteria upon which learners will be judged and the standards they will be expected to meet. Benchmarks (levels of performance such as "basic" or "meets") show students the learning targets, and they can then use the information to improve in advance of a final judgment. Designing effective rubrics is an art that requires much practice.


Information for Anticipation Guide, cont'd

4 & 8) Unfortunately, much of what is taught and assessed in college courses is low-level, recall material, in part because it is the easiest to test. However, most professors would say they want students to think critically. Addressing ill-structured problems, those that mirror real-world situations, calls for critical and creative thinking, and so they should be used in assessments. Ill-structured problems have uncertainty and messiness built into them and do not have a single correct answer. They require the higher-order skills of evaluation and synthesis as well as application of previously learned skills in novel contexts. The key to whether they should be used on summative assessments is to ensure that students have had to apply their learning before a graded test, not merely parroted a response that can be found in the textbook. Similarly, if you are not teaching material, or students are not provided an opportunity to learn it in your course, it is not something that you should be assessing. After all, you will be judged ineffective if students perform poorly on your assessments; it means they aren't learning, so assessing them on material that they haven't mastered from other courses hurts you, not them.

5) Students must know content to which they can apply more sophisticated thinking, so there is nothing wrong with teaching and assessing lower-order, recall material. Multiple-choice, matching, and other items where students select a response are an efficient way to do this. As with the ill-structured problems referred to above, assessing students' ability to use their knowledge is better done in a format for which there is not a single, correct response. Responding to an essay question generally gets at students' capacity for applying the content material. You can create multiple-choice questions that test higher-order skills, for instance by asking for a justification of a causal relationship; however, these are generally harder to create. You can also increase the difficulty level of an assessment of comprehension material in selected-response, matching items by including more terms than definitions to be matched, or by including terms that can be matched to more than one definition.

6 & 7) Assessment isn't just for judging students; it is especially useful for improving teaching. Professors should look not just at grade distributions but consider item analysis, or review student performance for patterns that would indicate how teaching can be improved. Sometimes seeing what students can do affirms that your teaching strategies are working, while determining what they can't do provides insight into what you might change. For instance, poor performance on short quizzes at the beginning of class might suggest that students aren't reading the material closely. You might change your practice so that rather than merely being assigned reading, students must come to class with two questions they have about the reading. Or you might change your practice so that you model how you complete a close reading of material. Having colleagues work with you from time to time might bring a fresh perspective to your own observations and conclusions about what students are learning.

9) Just as in #8, where you wouldn't test what you hadn't taught, you are obliged to teach what you will assess. Especially with group work, professors presume that students have the skills to work together when in fact they haven't been taught those skills. Conflict resolution is one area where most people are deficient, and it is often why group work is ineffective. If you will be assigning group work, you need to assess whether students have the knowledge and skills to complete it, and if they do not, you should teach those skills. Once taught, those skills should be assessed summatively.

10) You be the judge.


Formative vs. Summative Assessment

Formative Assessment
Goal:
• On-going assessments, reviews, and observations
• Provides immediate feedback in order to improve student learning and instructional methods
• Usually not graded (sometimes worth a few participation points)
• Can be anonymous
• Helps students prepare for graded (summative) assessments and monitor progress
• "Low stakes"
Examples: posing a question in class and asking for a show of hands; minute papers; one-sentence summaries; observing students; homework; online quizzes

Summative Assessment
Goal:
• Evaluates the effectiveness of instructional programs and services at the end of an academic year or at a pre-determined time
• The goal is to make a judgment of student competency and determine if students have mastered specific competencies
• Typically graded
• Non-anonymous
• "High stakes"
Examples: graded final exam; national standardized test; critique of a senior project; portfolio; lab report; online quizzes


Northern Illinois University, Faculty Development and Instructional Design Center [email protected], http://facdev.niu.edu, 815.753.0595

[Figure: the assessment → evaluation → decision-making cycle]

Formative and Summative Assessment

Assessment is the process of gathering data. More specifically, assessment is the ways instructors gather data about their teaching and their students' learning (Hanna & Dettmer, 2004). The data provide a picture of a range of activities using different forms of assessment, such as pre-tests, observations, and examinations. Once these data are gathered, you can then evaluate the student's performance. Evaluation, therefore, draws on one's judgment to determine the overall value of an outcome based on the assessment data. It is in the decision-making process, then, where we design ways to improve the recognized weaknesses, gaps, or deficiencies.

The figure below represents the systematic process of assessment, evaluation, and decision-making. The results (data) of the assessment (examinations, observations, essays, self-reflections) are evaluated based on judgment of those data. What to do next, the decision-making step, is based on the evaluation.

Types of Assessment

There are three types of assessment: diagnostic, formative, and summative. Although all three are generally referred to simply as assessment, there are distinct differences between them.

1. Diagnostic Assessment

Diagnostic assessment can help you identify your students' current knowledge of a subject, their skill sets and capabilities, and clarify misconceptions before teaching takes place (Just Science Now!, n.d.). Knowing students' strengths and weaknesses can help you better plan what to teach and how to teach it.

Types of Diagnostic Assessments
• Pre-tests (on content and abilities)
• Self-assessments (identifying skills and competencies)
• Discussion board responses (on content-specific prompts)
• Interviews (brief, private, 10-minute interview of each student)

2. Formative Assessment

Formative assessment provides feedback and information during the instructional process, while learning is taking place. Formative assessment measures student progress, but it can also assess your own progress as an instructor. For example, when


implementing a new activity in class, you can, through observation and/or surveying the students, determine whether or not the activity should be used again (or modified). A primary focus of formative assessment is to identify areas that may need improvement. These assessments typically are not graded; they act as a gauge of students' learning progress and of teaching effectiveness (implementing appropriate methods and activities).

In another example, at the end of the third week of the semester, you can informally ask students questions which might be on a future exam to see if they truly understand the material. An exciting and efficient way to survey students' grasp of knowledge is through the use of clickers. Clickers are interactive devices which can be used to assess students' current knowledge on specific content. For example, after polling students you may see that a large number of students did not correctly answer a question or seem confused about some particular content. At this point in the course you may need to go back and review that material or present it in such a way as to make it more understandable to the students. This formative assessment has allowed you to "rethink" and then "re-deliver" that material to ensure students are on track. It is good practice to incorporate this type of assessment to "test" students' knowledge before expecting all of them to do well on an examination.

Types of Formative Assessment

• Observations during in-class activities, including students' non-verbal feedback during lecture
• Homework exercises as review for exams and class discussions
• Reflection journals that are reviewed periodically during the semester
• Question-and-answer sessions, both formal (planned) and informal (spontaneous)
• Conferences between the instructor and student at various points in the semester
• In-class activities where students informally present their results
• Student feedback, collected by having students periodically answer specific questions about the instruction and self-evaluate their performance and progress

3. Summative Assessment

Summative assessment takes place after the learning has been completed and provides information and feedback that sums up the teaching and learning process. Typically, no more formal learning is taking place at this stage, other than incidental learning which might take place through the completion of projects and assignments. Rubrics, often developed around a set of standards or expectations, can be used for summative assessment. Rubrics can be given to students before they begin working on a particular project so they know what is


expected of them (precisely what they have to do) for each of the criteria. Rubrics also can help you be more objective when deriving a final, summative grade, by following the same criteria students used to complete the project.

High-stakes summative assessments typically are given at set points during the semester, or at its end, to assess what has been learned and how well it was learned. Grades are usually an outcome of summative assessment: they indicate whether the student has an acceptable level of knowledge gain. Is the student able to effectively progress to the next part of the class? To the next course in the curriculum? To the next level of academic standing? See the section "Grading" for further information on grading and its effect on student achievement.

Summative assessment is more product-oriented and assesses the final product, whereas formative assessment focuses on the process toward completing the product. Once the project is completed, no further revisions can be made. If, however, students are allowed to make revisions, the assessment becomes formative, where students can take advantage of the opportunity to improve.

Types of Summative Assessment

• Examinations (major, high-stakes exams)
• Final examination (a truly summative assessment)
• Term papers (drafts submitted throughout the semester would be a formative assessment)
• Projects (project phases submitted at various completion points could be formatively assessed)
• Portfolios (could also be assessed during their development as a formative assessment)
• Performances
• Student evaluation of the course (teaching effectiveness)
• Instructor self-evaluation

Summary

Assessment measures whether and how students are learning and whether the teaching methods are effectively relaying the intended messages. Hanna and Dettmer (2004) suggest that you should strive to develop a range of assessment strategies that match all aspects of your instructional plans. Instead of trying to differentiate between formative and summative assessments, it may be more beneficial to begin planning assessment strategies to match instructional goals and objectives at the beginning of the semester and implement them throughout the entire instructional experience. The selection of appropriate assessments should also match course and program objectives necessary for accreditation requirements.


References

Hanna, G. S., & Dettmer, P. A. (2004). Assessment for effective teaching: Using context-adaptive planning. Boston, MA: Pearson A&B.

Just Science Now! (n.d.). Assessment-inquiry connection. http://www.justsciencenow.com/assessment/index.htm


Activity #11: Assessment Examples

For each example below, classify it as Direct or Indirect (D or I), Formal or Informal (F or I), and Formative or Summative (F or S), and add any notes.

Example Description | Direct/Ind | Formal/Inform | Form/Summ | Notes

1. minute papers
2. clickers
3. games
4. written exams
5. poster presentations
6. oral presentations
7. portfolios
8. concept maps
9. rubrics
10. SALG online survey


Activity #12: Evaluation Criteria

Refer to the goals and activity that you designed. Think about what you would expect from your ideal student. What knowledge, skills, or dispositions would you observe? What does mastery look like?

WILL STUDENTS BE ABLE TO DEMONSTRATE GAINS IN THE CRITERIA BY COMPLETING THE PLANNED TASK? IS MASTERY POSSIBLE? DESCRIBE THE IDEAL STUDENT PERFORMANCE BELOW:

WRITE YOUR EXPECTATIONS AS 3-4 CRITERIA FOR EVALUATION:

DO THE CRITERIA ALIGN TO THE OUTCOMES IN YOUR COURSE/UNIT GOALS? DESCRIBE THE ALIGNMENT BELOW:


Performance levels:
• Student demonstrates an exemplary mastery of the learning goal.
• Student demonstrates an accomplished mastery of the learning goal.
• Student demonstrates a developing mastery of the learning goal.
• Student demonstrates a low level of mastery.
• So little work is provided, the level of mastery can't be assessed.

Activity:

INFORMATION
Work presented is ...
• accurate and concise.
• thorough and appropriate.
• clearly related to the activity probes.
• intelligible.

EXPLANATION
Explanations ...
• are thorough and appropriate.
• are intelligible and easily understood.
• are linked to subjects discussed in class and reading.
• clearly demonstrate the student's understanding of the material.
• show creative or unexpected connections to other material.
(Explanations must demonstrate student understanding of relevant concepts.)

ORGANIZATION & APPEARANCE
• map is organized, legible, and visually appealing
• formatting improves readability
• arrangement is creative
• makes effective use of drawings, attachments, or other visuals
Comments:

Page 46

Team Project Peer Evaluation Rubric

Your Name: _________________________
Team Member's Name (that you are evaluating): ____________________________

Scale: 4 - exemplary | 3 - accomplished | 2 - developing | 1 - beginning or incomplete | 0 - not college level work | Score

Preparation
• 4: Discussion clearly indicates the student has thoroughly read and is prepared to explain and discuss with others.
• 3: Has done the reading with some thoroughness, may lack some detail or critical insight.
• 2: Has done the reading; lacks thoroughness of understanding or insight.
• 1: Has not read the text and cannot sustain any reference to it in the course of discussion.
• 0: Unable to refer to text for evidence or support of remarks.
Comments:

Appropriateness
• 4: Posture, demeanor and behavior clearly demonstrate respect and attentiveness to others. Student is always respectful of others.
• 3: Listens to others most of the time, does not stay focused on others' comments (too busy formulating own) or loses continuity of discussion. Shows consistency in responding to the comments of others; responses are respectful.
• 2: Listens to others some of the time, does not stay focused on others' comments (too busy formulating own) or loses continuity of discussion. Shows some consistency in responding to the comments of others; responses are respectful.
• 1: Drifts in and out of discussion, listening to some remarks while clearly missing or ignoring others. Responses are sometimes disrespectful.
• 0: Disrespectful of others when they are speaking; behavior indicates total noninvolvement with group or discussion.
Comments:

Participation
• 4: The student organized team meetings. The student attended all meetings and was on time.
• 3: The student attended all meetings and was on time.
• 2: The student missed one meeting.
• 1: The student missed several meetings.
• 0: The student did not meet the team.
Comments:

Total Score

Page 47

Activity #13: Build a Rubric

ACTIVITY TITLE:

Criteria | Exemplary | Accomplished | Developing | Emerging
(blank rows for your criteria)

WILL THE RUBRIC EVALUATION GIVE YOU THE INFORMATION YOU WANT? DESCRIBE THE DATA YOU EXPECT BELOW:

Page 48

GC1Y CRITICAL THINKING: CHEMISTRY & CLIMATE | LISSE - METZKER - RICHARDS

Climate Change/Depletion Major Assessment

You are a museum curator for the local science museum. Recently your community has experienced unusually hot summers and a lack of rain that has led to drought conditions. Your museum director has tasked you with developing an interactive unit for all ages (5-80) that shows the scientific explanation for climate change and addresses the following list of questions:

1. Is the planet warming?
2. How is global warming connected to energy use?
3. Does global warming affect our local weather? If so, how?

As you prepare your exhibit you are expected to:
• be creative
• use high-quality, scientific sources (no Wikipedia or websites)
• provide clear, simple explanations of the phenomenon without omitting any of the scientific support

These exhibits will be displayed to the public as an art exhibit after the assessment is complete.

Page 49

DR. METZKER - GEORGIA COLLEGE & STATE UNIVERSITY | 25 OCTOBER 2012 | PAGE 1 OF 2 | ScientificPosterRubric-Group.docx

Science Poster - Group Presentation Rubric

POSTER NUMBER: __________

Scale: Exceeds Expectations (4 points) | Meets ALL expectations (2-3 points) | Meets SOME expectations (1-2 points) | Does not meet expectations (0-1 point) | Score

Research and Presentation Criteria

Quality of Information (x2)
• Exceeds: Information is of high quality, accurate and concise. Results are clearly outlined and the significance of the work is clearly explained. Contains sufficient and appropriate source material.
• Meets ALL: All assignment criteria addressed but some information is not high quality. Information is accurate but not concise. Significance of the work is implied (not explicit). Contains sufficient and appropriate source material.
• Meets SOME: Addresses some assignment criteria OR much information is not high quality OR information is not concise OR significance of work is not addressed. Source material is not appropriate or does not support the work.
• Does not meet: Does not address assignment criteria. Information is not accurate OR information has little or nothing to do with the main topic. Source materials are of poor quality or not included.

Interpretation (x2)
• Exceeds: Work presents a logical and rational interpretation of the topic drawn from source materials.
• Meets ALL: Work presents a logical and rational interpretation of the topic but is not supported by source materials.
• Meets SOME: Interpretation is not based in logic or interpretation of material is not strong.
• Does not meet: No interpretation performed (information presented is directly taken from source material).

Demonstrated Knowledge (x2)
• Exceeds: Presenters have sufficient knowledge of material to communicate chemical information to chemists and general audiences.
• Meets ALL: Presenters have passable knowledge of material and/or communication with audience is adequate.
• Meets SOME: Presenters have passable knowledge of material but have difficulty communicating beyond a rudimentary level.
• Does not meet: Presenters cannot communicate with audience at a rudimentary level.

Communication Criteria [i]

Delivery
• Exceeds: Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation compelling, and speaker appears polished and confident.
• Meets ALL: Delivery techniques make the presentation interesting, and speaker appears comfortable.
• Meets SOME: Delivery techniques make the presentation understandable, and speaker appears tentative.
• Does not meet: Delivery techniques detract from the understandability of the presentation, and speaker appears uncomfortable.

Organization
• Exceeds: Organizational pattern (specific introduction and conclusion, sequenced material within the body, and transitions) is clearly and consistently observable, is skillful, and makes the content of the presentation cohesive.
• Meets ALL: Organizational pattern is clearly and consistently observable within the presentation.
• Meets SOME: Organizational pattern is intermittently observable within the presentation.
• Does not meet: Organizational pattern is not observable within the presentation.

Page 50


Scale: Exceeds Expectations (4 points) | Meets ALL expectations (2-3 points) | Meets SOME expectations (1-2 points) | Does not meet expectations (0-1 point) | Score

Central Message
• Exceeds: Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported).
• Meets ALL: Central message is clear and consistent with the supporting material.
• Meets SOME: Central message is basically understandable but is not often repeated and is not memorable.
• Does not meet: Central message can be deduced, but is not explicitly stated in the presentation.

Presentation Criteria

Preparedness / Demeanor
• Exceeds: Group is clearly prepared and has obviously rehearsed. Does not read directly from poster. Professional, confident demeanor enhances presenter's credibility.
• Meets ALL: Group has prepared but more rehearsal is needed. Occasionally reads directly from poster. Demeanor is mostly professional and confident.
• Meets SOME: Group is somewhat prepared but it is clear that rehearsal is lacking. Frequently reads directly from poster. Demeanor is not professional or lacks confidence.
• Does not meet: Group is unprepared to present. Demeanor is unprofessional or disrespectful to others.

Participation
• Exceeds: Group functions as a team; equal participation by all members.
• Meets ALL: Group mostly functions as a team; some unevenness in participation.
• Meets SOME: Group participation is uneven. One member dominates or one member doesn't participate as much as the rest of the group.
• Does not meet: Group does not function as a team; one or more members do not participate at all.

Demonstration of Understanding
• Exceeds: Each team member is equally knowledgeable and able to answer questions confidently.
• Meets ALL: Each team member is equally knowledgeable and able to answer questions, but answers are not confident.
• Meets SOME: Some team members clearly have more understanding than others.
• Does not meet: One or more team members is clearly unprepared and demonstrates little or no understanding.

Comments:

Total Score:

[i] Excerpted with permission from Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics, edited by Terrel L. Rhodes. Copyright 2010 by the Association of American Colleges and Universities.

Page 51

Using the climate debate to revitalize general chemistry

Kimberly Cossey, Catrena Lisse, Julia Metzker, Chavonda Mills, Rosalie Richards

Day 3

Reflection

Page 52


Activity #14: Reflection

(1) From a student's perspective, compare and contrast your new course and your previous course (or traditional course) covering the same content.
(2) How does your newly designed course move the responsibility of learning to the students?
(3) How have your skills improved in the following areas:
    (a) using a big idea as an organizer
    (b) setting goals
    (c) using active learning strategies
    (d) designing assessments
    (e) mapping goals, activities, assessment
(4) Were there any other topics that you wished had been presented as part of this workshop? Or was there anything that you wanted to discuss more?

WRITE ANSWER ON YOUR NAMETAG - TICKET OUT OF THE DOOR

Page 53

NSF-sponsored Chemistry Collaborations, Workshops & Communities of Scholars (ccwcs.org)

cCWCS Small Grants Programs

The Chemistry Collaborations Workshops and Communities of Scholars program (cCWCS, www.ccwcs.org) invites proposals for the following activities:

• Dissemination activities related to high-quality curriculum materials and pedagogies for the undergraduate chemistry curriculum. cCWCS already offers a series of 5-day intensive workshops each summer that make extensive use of hands-on activities. The aim of this grant program is to encourage other types of (shorter, less intensive) activities. Deadline: May 1, Oct 1.

• Implementation of high-quality curriculum materials and pedagogies in classes at the proposer's home institution. Awards are limited to the purchase of small non-consumable items related to implementation of activities related to a cCWCS workshop that the proposer has attended. The awards require matching support from the proposer's home institution. Deadline: May 1, Oct 1.

Page 54


Resources for Continued Learning

COURSE DESIGN
• McTighe, J. & Wiggins, G. The Understanding by Design Professional Development Workbook (2004).
• Designing Effective and Innovative Courses (Tutorial) - http://serc.carleton.edu/NAGTWorkshops/coursedesign/tutorial/
• SENCER Model Courses - http://sencer.net
• The Innovative Course-building Group blog - http://icbg.wordpress.com
• Backward Design template available at: http://digitalliteracy.mwg.org/curriculum/template.html

SETTING GOALS / WRITING OUTCOMES
• Bloom's Taxonomy and Verbs (many online)
• L. B. Nilson, The Graphic Syllabus and the Outcomes Map: Communicating Your Course (2007).

ACTIVITIES & ACTIVE LEARNING
• Barkley, E. F. (2009). Student Engagement Techniques: A Handbook for College Faculty. John Wiley & Sons.
• Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers.
• (Eds.). (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Report to the President. ERIC.
• Gollub, J. P., Bertenthal, M. W., Labov, J. B., & Curtis, P. C. (2002). Learning and Understanding: Improving Advanced Study of Mathematics and Science in US High Schools. National Academy Press.
• National Center for Case Study Teaching in Science - http://sciencecases.lib.buffalo.edu/cs/
• Problem-Based Learning Clearinghouse (U. Delaware) - https://primus.nss.udel.edu/Pbl/
• ChemConnections Modules (some are outdated) - http://chemconnections.org/
• Virtual Inorganic Pedagogical Electronic Resource - https://www.ionicviper.org/
• POGIL - http://www.pogil.org/
• Concept Tests for General Chemistry - http://people.brandeis.edu/~herzfeld/conceptests.html
• General Chemistry Case Studies - http://www.chemcases.com/

ASSESSMENT
• Student Assessment of Learning Gains (SALG) Survey - http://www.salgsite.org/
• AAC&U VALUE Rubrics - http://www.aacu.org/value/rubrics
• Field-tested Learning Assessment Guide - http://www.flaguide.org/
• Julia's Rubric Library - http://chemistry.gcsu.edu/~metzker/courses/rubrics-guidelines/