
Effects of Active Learning on Enhancing Student Critical Thinking in an Undergraduate General Science Course

Kyoungna Kim & Priya Sharma & Susan M. Land & Kevin P. Furlong

Published online: 12 September 2012
© Springer Science+Business Media, LLC 2012

Abstract To enhance students' critical thinking in an undergraduate general science course, we designed and implemented active learning modules by incorporating group-based learning with authentic tasks, scaffolding, and individual reports. This study examined the levels of critical thinking students exhibited in individual reports and the students' critical thinking level change over time. Findings indicated that students' average critical thinking level fell in the category of "developing," but students' scores on individual reports revealed a statistically significant increase. The study suggested that the active learning strategies employed in the study were useful to promote student critical thinking.

Keywords Active learning · Geoscience · Undergraduate education · Scaffolding · Critical thinking


Innov High Educ (2013) 38:223–235. DOI 10.1007/s10755-012-9236-x

Kyoungna Kim is Instructor at Diplomatic Language Services. She received a Ph.D. in Instructional Systems from Pennsylvania State University. Her interests include the design of technology-enhanced, learner-centered learning environments; experience and interaction design; and everyday learning.

Kevin P. Furlong is Professor of Geosciences at Pennsylvania State University. He received a Ph.D. in Geophysics from the University of Utah. His interests include design of learning modules to engage undergraduate students in authentic, scientific thinking about natural hazards.

Susan M. Land is Associate Professor of Education at the Pennsylvania State University. She received a Ph.D. in Instructional Technology from Florida State University. Her research investigates frameworks for the design of open-ended, technology-rich learning environments. She studies learning environments and design connected to everyday contexts, mobile devices, social networking, and student-created design projects.

Priya Sharma is Associate Professor of Education at the Pennsylvania State University. She received a Ph.D. in Instructional Technology from the University of Georgia. Her work focuses on design and learning as it occurs in the context of online networks and with ubiquitous digital and connectivity tools.

K. Kim (*) Diplomatic Language Services, Arlington, VA, USA; e-mail: [email protected]

P. Sharma · S. M. Land, Department of Education, Pennsylvania State University, University Park, PA, USA

K. P. Furlong, Department of Geosciences, Pennsylvania State University, University Park, PA, USA


New developments in the science of learning emphasize the importance of developing student competency to deal with complex problems in real-life contexts (Bransford et al. 2000; National Research Council 1996). Dealing with complex problems requires students to engage in active critical thinking processes, which include purposeful, reasoned, and goal-directed higher-order thinking (Halpern 1999) as well as identifying problems in context, considering influences, analyzing appropriate data and evidence, making inferences and sound decisions, and evaluating relevant elements (Paul 1995; Perkins 1998).

A primary educational goal of colleges and universities is to help students develop the ability to think critically, to communicate effectively, and to solve problems (National Education Goals Panel 1991). In undergraduate general science education, for example, it is important to develop students' ability to understand concepts about the natural world, to use scientific information to make daily life choices, and to engage in public discourse about important issues involving science (National Research Council 1996; www.project2061.org), that is, to promote students' scientific thinking and critical thinking (Halpern 1999; Yuretich 2004). There is broad consensus that it is important to engage students in authentic practices in science education (National Research Council 1996, 2012; www.project2061.org) by providing meaningful contexts that will enhance their ability to apply what they have learned (Edelson and Reiser 2006).

Introductory Undergraduate Science Courses

Prior research has demonstrated that college science students often fail to apply concepts they have learned in the classroom to real-life situations, presumably the result of limited application opportunities in science classrooms (Gupta 2005). In particular, undergraduate general education courses taught in large lecture halls pose challenges for promoting critical thinking due to barriers such as physical space, an emphasis on memorization of facts from lecture for multiple-choice exams (McConnell 2005), and passive learning (Chapman 2001).

Many introductory science classes tend to focus on lower-level cognitive tasks, which offer few opportunities for students to engage in higher cognitive tasks such as application, analysis, synthesis, and evaluation (Yuretich 2004). In addition, many instructors are unaware of the possible impact of such instructional strategies on the development of student thinking, and many are ill-prepared to cultivate students' thinking skills (McConnell 2005). Few introductory science courses provide students with learning environments where they engage in tasks and assignments that encourage their critical thinking (McConnell 2005). Even when courses are well designed to develop critical thinking, appropriate assessment tools also need to be developed in order to measure the level of thinking and to monitor student development in the context of their learning. Despite these challenges, based on a review of over 300 studies on undergraduate science course innovations, Ruiz-Primo et al. (2011) confirmed overall positive effects of course innovations on student learning when the innovation emphasized transforming the course from traditional lecture-based learning to a more student-centered instructional approach, focusing on students' active role in their learning and on developing deep understanding of critical concepts.

Our research focused on the use of appropriate design strategies to foster innovation within an undergraduate science classroom. Our goal was to advance students' thinking ability despite the presence of commonly cited barriers such as large numbers of students, fixed seating, and a lecture-hall-style arrangement. Our design focused on supporting students to engage actively in authentic practices by providing a systematically designed learning environment to support the use of scientific knowledge in solving real-life problems (Edelson and Reiser 2006; National Research Council 1996, 2012). Our design included supports for cognitive processes, such as scaffolding strategies and tools, that are necessary to support knowledge building and the problem-solving process in learner-centered, authentic environments (Kim 2009; Ge and Land 2003). By providing a theoretically and empirically driven evaluation of our efforts, we aim to contribute specifically to an understanding of how to support and advance the critical thinking and scientific thinking competencies of students within large, undergraduate science classrooms.

Critical Thinking

Critical thinking is defined as the "purposeful, reasoned, and goal-directed" use of cognitive skills and strategies (Halpern 1999, p. 70). It requires students to be engaged actively in the process of conceptualizing, applying, analyzing, synthesizing, evaluating, and communicating information (Scriven and Paul 1996). However, the range of perspectives on critical thinking is quite broad, and the literature offers various definitions, such as argument analysis, problem solving, decision making, and cognitive process. Some scholars have included the notion of reflection on one's own thinking and decision making (e.g., Scriven and Paul 1996, as cited in MacKnight 2000) and metacognition as part of critical thinking (e.g., Ennis 1991; Halpern 1999). For instance, Halpern (1999) addressed the notion of metacognition and characterized critical thinking, stating, "When we think critically, we are evaluating the outcomes of our thought processes–how good a decision is or how well a problem is solved" (p. 70).

In relation to pedagogy, Moon (2008) linked critical thinking to activities such as reflection and argument while considering the progression of student learning in higher education. Aligned with Moon's considerations, King (1995) suggested that critical thinking includes skills and the specific process of analyzing the presented arguments, making inferences, drawing logical conclusions, and critically evaluating all relevant elements as well as the possible consequences of each decision.

Active Learning Strategies to Promote Critical Thinking

Problem tasks that require students to engage in data retrieval, analysis, evaluation, and synthesis are believed to support complex thinking (Paul 1995; Perkins 1998). Ill-structured problems (Jonassen 1997), or those having multiple possible answers, provide students with opportunities to engage in critical thinking processes such as seeking alternatives and considering other points of view. Hager, Sleet, Logan, and Hooper (2003) incorporated critical-thinking tasks by using open problems that required students to apply chemistry and physics concepts to the problems of everyday life; they found that the use of open problems and tasks in small cooperative groups was effective for enhancing students' thinking skills. Similarly, Kronberg and Griffin (2000) selected analysis problems as a means for developing student critical thinking in an introductory biology course, requiring students to apply their knowledge and understanding of the situation and offering choices and alternatives depending on the justifications students made for their selections. Requiring students to justify each response was found to be effective in developing critical thinking and improving student achievement and retention by helping them to analyze and synthesize information in an applied manner (Kronberg and Griffin 2000).

Collaborative learning can also facilitate student critical thinking. Prior research and theory have detailed the intellectual benefits of student collaboration with peers (e.g., Scardamalia and Bereiter 1996; Vygotsky 1978). The use of dialogue and social interaction in group-based learning can be viewed as a form of scaffolding. Theoretically, students help each other carry out a task beyond their individual capabilities (Vygotsky 1978). Peer interaction during collaborative learning or small-group learning can be beneficial for the development of critical thinking. For instance, Yuretich (2004) employed an "in-class investigation" approach that was intended to introduce critical thinking skills into large classes; students were given questions that required them to synthesize and evaluate information from the lectures and readings and to engage in group discussion and cooperative learning activities. Students completed these investigations while discussing them with their peers and then reviewed the answers as a class. Yuretich argued that this active learning strategy would improve higher-order thinking skills. Thus, "having the opportunity to pause and reflect on, analyze, and discuss processes and concepts is the key" (p. 44).

Scaffolding, or the support of students' cognitive processes in complex tasks, is also needed to facilitate student critical thinking. Without external support, it is challenging for students to ask thought-provoking questions, to activate and use their relevant prior knowledge, and to solve problems in a purposeful manner (King 1995; Land 2000). Providing appropriate scaffolding is critical for the promotion of critical thinking, which is the key to the development of high-level science performance on ill-structured problems that reflect everyday scientific practice (Lajoie et al. 2001).

Writing essays or research reports is another way to engage students in critical thinking in a general science course. It has been shown to be effective in helping students to identify problems in context, to consider influences, to analyze appropriate data and evidence, to make inferences, to make sound decisions, and to evaluate relevant elements (Bunce and VandenPlas 2006; Russell 2004).

The Study

The context for our study relied upon learning activities that engage students in argumentation and reflective learning. We adapted both Moon's and King's notions of critical thinking, defining it as the type of reasoning skill that requires students not only to acquire knowledge of scientific phenomena (e.g., natural disasters) but also to apply this knowledge. Thus, for our purposes critical thinking skills refer to the ability to identify issues, analyze data and evidence, make judgments, critically and reflectively evaluate relevant elements, and draw conclusions. More specifically, the purposes of this study were twofold: (1) to examine the levels of critical thinking exhibited in individual reports over the semester and (2) to explore the effect of active learning on undergraduate students' critical thinking.

The students' critical thinking skills were articulated in individual and group artifacts as they engaged in learning activities dealing with authentic science-related problems. They were involved in solving specific, real-life, natural disaster problems in small groups, after which they produced individual reports that analyzed, synthesized, and evaluated the problems discussed during the group work. Thus, we implemented active learning by incorporating three primary instructional support mechanisms: group-based learning with authentic tasks, scaffolding, and the preparation of written individual reports.

Activities were designed to engage students in several collaborative group activities as a means to encourage understanding of the concepts and integration of prior knowledge while dealing with the geoscience issues of a natural disaster. We provided two types of support in our instructional materials: procedural scaffolding, which makes explicit the sequence of activities for complex tasks (Kim 2009), and cognitive scaffolding, which helped learners reason through complex problems and guided them in what to consider (Hannafin et al. 1999). For instance, students were asked to consider several perspectives that dealt with the authentic problems, and they were required to respond to questions and provide reasoning for the decisions they made.

In addition to group activities using authentic problems and scaffolding, a series of individual assignments was designed to promote critical and reflective thinking. For example, students were asked to write an individual report that provided their own decisions regarding a situation and the reasoning behind their decisions. By engaging in this individual assignment after group work, students were encouraged to review their own thinking about the problem, which could help them to analyze and synthesize group decisions. One learning activity ("Hurricane Smith") involved a scenario where students were appointed as Special Aides for Disaster Management to the office of the Mayor of their specific community. The students were required to articulate their analysis of the hurricane situation, make suggestions about the evacuation decision with data, and propose decisions with justification.

Method

Participants and Context of the Study

The research participants were undergraduate students from an introductory geoscience course at a large, northeastern public university. One hundred seventy-three students were enrolled, and 155 of them agreed to participate in the study, which had been approved by the human subjects research board at the University. The study context involved two multifaceted and intentionally designed instructional modules, Active Learning Modules I and II, on natural disasters. Each module was implemented over three 1.5-hour sessions as part of the hurricane and global warming units. These sessions were conducted during class time twice weekly.

The two learning modules share the following design elements (Kim 2009): they (a) use current events and situations as contexts for the activities; (b) provide visible supports, or scaffolds, for student thinking; and (c) provide opportunities for students to engage in peer discussions and collaborative activities. The first module, developed by a research team including experts from the fields of Instructional Design and Earth Sciences, presented a real-life complex problem related to hurricanes. This scenario is titled "Hurricane Smith" and revolves around decision-making processes for an evacuation plan in the event of an imminent hurricane. The second module, titled "Bangladesh Global Warming," has a structure similar to the first and presents an authentic problem associated with global warming. Each module was implemented over a week during which group activities were conducted. After the group activities were completed, students were asked to write an individual report at the end of the module, which, as explained above, required students to show clear links to research and data and to provide justification for all their conclusions.

Procedures

The two active learning modules were implemented during weeks 6 and 12 of the semester. The instructional materials and associated handouts were distributed to all participants, and participants were randomly divided into groups for the in-class learning activities, which were designed for groups of four or five participants. For example, in the Hurricane Smith module, four students were assigned to each group (six groups working on commerce, disability, emergency, infrastructure, media, and schools), and students from each group were then gathered within larger location-based community groups (e.g., the community of Hilton Head) to discuss their decisions. Students were required to prepare their individual reports.

Data Analysis

To examine the changes in student critical thinking through active learning, we applied quantitative analyses. A data set that included students' individual reports from the modules was analyzed to gauge changes in critical thinking abilities over time, that is, across the two written reports. A collaborative research team developed a coding scheme for critical thinking (see Appendix A) that was used to analyze the reports. The critical thinking level was scored based on rubrics and included four subcategories to evaluate students' ability in (1) identifying problems while considering social context; (2) evaluating group and community decisions; (3) developing a perspective by justifying one's own decisions, presenting evidence/data, and integrating issues; and (4) communicating effectively. For each subcategory, the rubric was further broken down into three levels of critical thinking that were termed "emerging," "developing," and "mastering."
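As a concrete illustration only, the coding scheme can be thought of as a small data structure that maps each subcategory to a 1–6 score, with the three named bands derived from the score. The subcategory keys and the band cut points (1–2, 3–4, 5–6) follow the rubric in Appendix A, while the variable and function names below are hypothetical.

# Illustrative sketch of the Report I coding scheme (four subcategories,
# each scored 1-6 and labeled emerging / developing / mastering).
report1_scores = {
    "identifying_problems": 5,
    "evaluating_decisions": 4,
    "developing_and_justifying_own_decisions": 4,
    "communication": 5,
}

def level(score: int) -> str:
    """Map a 1-6 subcategory score to its critical thinking band."""
    if score <= 2:
        return "emerging"
    if score <= 4:
        return "developing"
    return "mastering"

total = sum(report1_scores.values())      # out of 24 for Report I
percentage = total / 24 * 100             # percentage score used in later analyses
bands = {name: level(s) for name, s in report1_scores.items()}
print(total, round(percentage, 1), bands)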

In an effort to ensure the raters' scoring reliability, four raters from the research team assessed the same 10 % of the two sets of students' individual reports (n = 15; 10 % of the total N). After establishing a high inter-rater reliability (.97) among the raters, the four raters then scored the reports separately: two raters assessed the set of Individual Report I, and the other two raters assessed the set of Individual Report II.
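As a rough illustration of this reliability check, the sketch below computes an agreement statistic on an overlapping sample of reports scored by two raters. The paper does not state which reliability statistic was used, so a simple Pearson correlation between raters' total rubric scores is assumed here, and all score values are hypothetical.

import numpy as np

# Hypothetical total rubric scores (0-24 scale) given by two raters to the same
# 15 overlapping reports; the study used four raters and did not publish raw scores.
rater_a = np.array([18, 20, 15, 22, 17, 19, 21, 16, 23, 14, 20, 18, 22, 19, 17])
rater_b = np.array([18, 21, 15, 22, 16, 19, 20, 16, 23, 15, 20, 17, 22, 19, 18])

# Pearson correlation between the two raters' scores as a reliability index.
r = np.corrcoef(rater_a, rater_b)[0, 1]
print(f"Inter-rater correlation: {r:.2f}")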

Results

The results of the data analyses are presented according to the following categories: (a) evidence of critical thinking in student reports, (b) mean performance on the two individual reports, and (c) changes in critical thinking over time.

The Evidence of Critical Thinking

For the first module (Hurricane Smith), data from 131 students were collected (based on students who agreed to participate in the study as well as those who submitted individual reports), and the mean score of performance on the individual reports was 27.74 out of a total of 30. This shows that students' average critical thinking level fell in the category of Developing for all four subcategories. Mean scores for the four subcategories are presented in Table 1.

Table 1 Summary of Score Analysis for Individual Reports I & II

Subcategory                                     Raw mean score   Percentage mean score α°   SD α°    Range α° (low–high)

Individual Report I (n = 131)                                    68.34                      19.36    16.7–100
  1. Identifying problems                       4.67
  2. Evaluating decisions                       4.05
  3. Developing and justifying own decisions    3.66
  4. Communication                              4.46

Individual Report II (n = 125)                                   75.66                      12.28    36.0–100
  1. Selection of impacts                       4.86
  2. Selection of supporting material           4.06
  3. Presentation of data                       4.61
  4. Quality of integration                     4.32
  5. Self-reflection                            4.35
  6. Language & mechanics                       5.09

Note. Total score for each subcategory = 0–6. Total scores for Individual Reports I & II = 24 and 36, respectively. Total percentage score = 100. α° n = 105: participants who submitted both reports.


The category for "Developing own perspective by justifying decisions…" was low compared to the other three subcategories (Mean = 3.66), even though this score was rated at the same Developing level.

For the second module (Bangladesh Global Warming), 125 students participated, and the mean performance score on the individual reports was 27.30 out of a total of 36 (see Table 1). This shows that students' average critical thinking level fell in the category of Developing for all six subcategories. Even though there was little difference among the mean scores for the six subcategories, the category for "Selection of supporting material" was slightly lower (Mean = 4.06) compared to the other five subcategories, even though this score was rated at the same Developing level.

For both reports, students' average critical thinking levels fell in the category of Developing for all subcategories (e.g., identifying problems while considering social context, developing a perspective, justifying own decisions, presenting evidence/data, and integrating issues).

Summary of Mean Performance

To identify whether or not there were any significant differences in the students' average performance between the two reports, raw scores based on each rubric were converted to percentage mean scores, as the total scores differed (totals = 24 and 36, respectively). These converted percentage mean scores from the students who submitted both individual reports were analyzed with a paired t-test (n = 105) (see Table 2 for means and standard deviations). The range of scores was from 16.7 to 100 on Individual Report I (Hurricane Smith) and from 36.1 to 100 on Individual Report II (Bangladesh Global Warming).

On average, participants performed higher on Individual Report II (M = 75.66, SE = 1.20) than on Individual Report I (M = 68.34, SE = 1.89), t(104) = −3.53, p = .001, r = .45. This gain of 7.31 points in students' percentage mean scores from Individual Report I to Individual Report II was significant at the .05 level, with a medium effect size (0.45; see Table 2).
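A minimal sketch of this kind of comparison is shown below: raw report totals are converted to percentages (Report I is out of 24, Report II out of 36), and a paired t-test and an effect size are computed for students who submitted both reports. The file and column names are hypothetical, and the effect-size formula (mean difference divided by the standard deviation of the differences) is one common convention for paired designs; the paper does not specify which variant was used.

import pandas as pd
from scipy import stats

# Hypothetical data file with one row per student who submitted both reports.
scores = pd.read_csv("individual_report_scores.csv")

# Convert raw rubric totals to percentage scores (Report I max = 24, Report II max = 36).
pct_1 = scores["report1_raw"] / 24 * 100   # Hurricane Smith report
pct_2 = scores["report2_raw"] / 36 * 100   # Bangladesh Global Warming report

# Paired t-test across the two reports.
t_stat, p_value = stats.ttest_rel(pct_2, pct_1)

# Effect size: mean gain divided by the SD of the paired differences.
diff = pct_2 - pct_1
effect_size = diff.mean() / diff.std(ddof=1)

print(f"mean gain = {diff.mean():.2f}, t({len(diff) - 1}) = {t_stat:.2f}, "
      f"p = {p_value:.3f}, d = {effect_size:.2f}")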

Changes in Critical Thinking Level

To investigate whether or not there was a significant association between changes in students' critical thinking abilities and the two assigned individual reports, the critical thinking levels of the 105 students who submitted both individual reports were used to conduct a chi-square analysis. A median split technique was employed to assign the two sets of individual report scores to three critical thinking levels. For Individual Report I, scores ranging from 1 to 70 were assigned to the "Low" level, scores from 71 to 82 to the "Medium" level, and scores from 83 to 100 to the "High" level. For Individual Report II, scores from 1 to 75 (Low), 76 to 82 (Medium), and 83 to 100 (High) were assigned.
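The level assignment described above can be sketched as simple binning of the percentage scores, using the cut points stated in the text; pct_1 and pct_2 are the hypothetical percentage-score series from the earlier sketch.

import pandas as pd

# Bin Report I percentages: 1-70 = Low, 71-82 = Medium, 83-100 = High.
levels_1 = pd.cut(pct_1, bins=[0, 70, 82, 100],
                  labels=["Low", "Medium", "High"], include_lowest=True)

# Bin Report II percentages: 1-75 = Low, 76-82 = Medium, 83-100 = High.
levels_2 = pd.cut(pct_2, bins=[0, 75, 82, 100],
                  labels=["Low", "Medium", "High"], include_lowest=True)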

Table 3 presents the category of critical thinking level for Individual Reports I and II. Overall, for Individual Report I, out of a total of 105 students, almost 50 % (n = 56) scored at the "Low" level for critical thinking, 21 students (20 %) scored at the "Medium" level, and 28 students (27 %) scored at the "High" level.

Table 2 Paired Comparison Results from Individual Report I to Individual Report II

Individual reports   Mean difference   SD      t      df    Sig. (2-tailed)   Cohen's d
I to II              7.31              21.20   3.53   104   .00*              0.45

Note. n = 105. *p < .05


For Individual Report II, 53 (51 %), 21 (20 %), and 31 (30 %) students scored at the Low, Medium, and High levels, respectively. This result indicates that almost 50 % of the students scored at the "Low" critical thinking level on both individual reports, and the other 50 % scored at either the "Medium" or "High" critical thinking level.

In terms of students' critical thinking level changes over time, we present the following summary. Out of 105 students, almost 50 % (n = 47) stayed at the same level of critical thinking. In contrast, the critical thinking level of 31 students (30 %) increased from Individual Report I to Individual Report II, including 13 students who moved from the "Low" (L) level to the "Medium" (M) level, 13 students who moved from the Low level to the "High" (H) level, and 5 students who moved from the Medium level to the High level. However, 27 students' critical thinking levels (26 %) decreased, including 12 students who moved from Medium to Low, 4 students who moved from High to Medium, and 11 students who moved from High to Low. The 11 students who dropped from the High level to the Low level would be of particular interest for further examination to explain the drop in performance. Because both variables have more than two categories (three levels for each variable), the Cramér's V statistic (.16) was used to determine whether the change in critical thinking level between the two individual reports was statistically significant. Overall, there was no significant association between students' critical thinking levels on Individual Reports I and II, χ²(4), Cramér's V = .16, p = .25, indicating that there was no significant change in critical thinking level over time.
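The association test reported above can be sketched as follows: the two level variables are cross-tabulated, a chi-square test of independence is run, and Cramér's V is derived from the chi-square statistic. Here levels_1 and levels_2 are the hypothetical level assignments from the binning sketch; this is an illustration of the analysis described, not the authors' script.

import numpy as np
import pandas as pd
from scipy import stats

# 3 x 3 contingency table of Report I levels against Report II levels.
table = pd.crosstab(levels_1, levels_2)

# Chi-square test of independence on the contingency table.
chi2, p_value, dof, expected = stats.chi2_contingency(table)

# Cramer's V for an r x c table: sqrt(chi2 / (n * (min(r, c) - 1))).
n = table.values.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

print(f"chi2({dof}) = {chi2:.2f}, p = {p_value:.3f}, Cramér's V = {cramers_v:.2f}")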


Table 3 Cross Tabulation of Three Critical Thinking Level Groups for Two Individual Reports

                                          Report II level (Bangladesh Global Warming)
Report I level (Hurricane Smith)          Low       Medium    High      Total

Low       Count                           30        13        13        56
          % within Report I level         53.6 %    23.2 %    23.2 %    100.0 %
          % within Report II level        56.6 %    61.9 %    41.9 %    53.3 %
          % of Total                      28.6 %    12.4 %    12.4 %    53.3 %

Medium    Count                           12        4         5         21
          % within Report I level         57.1 %    19.0 %    23.8 %    100.0 %
          % within Report II level        22.6 %    19.0 %    16.1 %    20.0 %
          % of Total                      11.4 %    3.8 %     4.8 %     20.0 %

High      Count                           11        4         13        28
          % within Report I level         39.3 %    14.3 %    46.4 %    100.0 %
          % within Report II level        20.8 %    19.0 %    41.9 %    26.7 %
          % of Total                      10.5 %    3.8 %     12.4 %    26.7 %

Total     Count                           53        21        31        105
          % within Report I level         50.5 %    20.0 %    29.5 %    100.0 %
          % within Report II level        100.0 %   100.0 %   100.0 %   100.0 %
          % of Total                      50.5 %    20.0 %    29.5 %    100.0 %

Note. Total n = 105.

Discussion

This study addressed common challenges to enhancing student critical thinking in a large undergraduate science class. Three active learning strategies were proposed as supportive mechanisms to enhance student critical thinking: small-group learning with authentic tasks, scaffolding, and individual writing. Emphasizing a principled approach to the design and evaluation of a learning environment that supports students' active engagement in processes of critical thinking, this study focused on the effects of these active learning strategies.

The students' performance showed statistically significant improvement in scores between Individual Reports I and II. This supports the belief that the instructional approaches incorporated in the design of active learning can facilitate students' engagement in critical thinking in the context of authentic problem solving. The finding is consistent with previous studies on the effects of external supports designed to help learners engage in articulation and a reflective process in open-ended learning environments by providing the means to make the ongoing processes visible (Lajoie et al. 2001). The use of scaffolding was expected to support student thinking as students engaged in complex problems (Hannafin et al. 1999; Ge and Land 2003).

However, our findings also showed that the students' thinking level did not move beyond the category of "developing" over the semester. This finding could be explained by the results of previous research suggesting that the development and refinement of critical thinking is influenced by multiple factors, including epistemological readiness (e.g., Brookfield 1987; King and Kitchener 1994), the amount of time devoted to engaging in critical thinking tasks (e.g., Lynch and Wolcott 2001), and the availability of sustained opportunities to engage in critical thinking tasks (Gellin 2003; www.ctlt.wsu.edu; Andriessen 2006). In terms of epistemological readiness, King and Kitchener (1994) argued that many traditional college-age students tend to hold pre- and quasi-reflective epistemological assumptions about knowledge and knowing: that is, they may tend to assume that all problems are well structured and have definite answers or, if they acknowledge that problems are ill-structured, they are unable to see how evidence enables an appropriate conclusion. Students tend to progress in their epistemological sophistication from the freshman to senior years; however, because our context was a large-enrollment general education undergraduate course, it is possible that the profile of the students falls largely into the pre- and quasi-reflective stages, thus leading to most student thinking being characterized as "developing." Another possible explanation for the lack of change could be the relatively short time between the two modules, as well as the use of only two modules within the entire course. In reference to the "Steps for Better Thinking" developmental model, Lynch and Wolcott (2001) stated: "It is unrealistic to believe that experience in a single course can produce major changes in complex skills" (p. 7). Based on the findings of this study and other previous research, we suggest two curricular implications for the development of students' critical thinking abilities: (a) integrating critical thinking skills into complex, student-centered environments across the curriculum within the educational program and (b) assessing students' critical thinking development in diverse disciplines (Lynch and Wolcott 2001; www.ctlt.wsu.edu; Andriessen 2006).

In our study the active learning environment employed small-group learning with authentic tasks. These strategies may have helped students to be engaged cognitively, resulting in enhanced student learning and critical thinking. In addition to report scores, we gathered survey and interview data, which suggested the active learning strategies served a role in enhancing student engagement in various facets of critical thinking that are required in the geoscience field: applying, analyzing, evaluating, and synthesizing what they learned to solve real-life problems. Students reported small-group learning to be helpful in developing their ability to approach the problem from various perspectives and to apply scientific concepts to real-world problems by giving them a chance to share ideas, give feedback, and consider alternative views and multiple perspectives (Kim 2009). The following excerpts from interviews illustrate these points: "…I think in group activities you're getting more different viewpoints, so when you're actually talking about something, other people are giving their own feedback, so you are learning couple of times more…."; "…having the group discussion and being able to compare your responses and even get a new perspective from somebody in the class is very helpful"; "…They [small-group problem-solving activities] put me in realistic situations and offered new perspectives…".


In support of this notion, Blumenfeld et al. (2006) noted that it helps students to be more cognitively engaged when learning environments employ specific learning sciences principles such as authenticity and collaboration. The concept of cognitive engagement includes "students' willingness to invest and exert effort in learning, while employing the necessary cognitive, metacognitive, and volitional strategies that promote understanding" (p. 475). While cognitively engaged, students think deeply about the content and construct an understanding that entails integration and application of the key ideas of the discipline. The "authenticity" of learning environments that involves connections to the real world and to practice can enhance student interest and engagement in their learning. In addition, promoting collaboration in learning environments may encourage students' motivation and cognitive engagement (Blumenfeld et al. 2006). When students are productively engaged in explaining, clarifying, debating, and critiquing their ideas, collaboration can lead to cognitive engagement. Thus, employing small-group learning with authentic tasks as a part of the active learning strategies in this study was expected to support students' cognitive engagement, resulting in enhanced student learning and critical thinking.

In terms of factors fostering critical thinking development, some previous studies have focused on students' perceptions of learning environments in support of their learning and critical thinking in higher education. However, few studies on critical thinking among college students have examined the impact of instructional factors (Tsui 2002). Tsui's study (1999) revealed that self-assessed growth in critical thinking is positively related to such instructional factors as conducting independent research, working on a group project, giving a class presentation, and taking essay exams (Tsui 1999, 2002). Based on the evidence derived from institutional case studies, her findings suggested that the development of critical thinking is likely to be linked to an emphasis on writing and rewriting as well as class discussion (Tsui 2002). Investigating the effects of specific instructional strategies using direct indicators and observational data is therefore needed, because studies addressing classroom experiences with active learning tend to rely on self-reported data (Tsui 2002). Our research was designed and implemented in an ecologically valid instructional context with the goal of observing and assessing critical thinking using indicators tied to instructional products rather than self-report instruments.

Individual reports, employed as a means of student engagement in the critical thinking process, may have played an important role in facilitating students' ability to construct arguments by encouraging them to use data and evidence for their decisions and reasoning (Takeo et al. 2002). Writing scientific arguments is a complex task that requires the use of a set of complex cognitive skills (Takeo et al. 2002). Yet prior studies have suggested that undergraduate students in introductory science courses are often limited in their ability to write responses to essay questions as well as to construct arguments (Bunce and VandenPlas 2006; Takeo et al. 2002). Making explicit the need to connect data to theoretical assertions in scientific writing can encourage students to use data and evidence (Takeo et al. 2002). Students' use of the question prompts provided as scaffolding in this study may have assisted them in connecting evidence and data to their claims (Ge and Land 2003; Mayer 1999). Data from interviews and surveys revealed that the question prompts afforded opportunities for engaging in critical thinking, such as evaluating resources and making justifications, which in turn may have assisted students in constructing arguments in their individual reports. Examples of students' comments include: "…When you have to assess 'why something is important' or, 'which is most important', that is going to improve your critical thinking."; "I think that [question prompts] is very helpful in order to lay things out, really rate what you think would be just an impact." Further research is suggested for a more extensive investigation into the effects of scaffolding in facilitating the subcategories of the critical thinking process, including identifying the situation, considering multiple perspectives, justifying reasoning, and connecting evidence and data to claims.


Conclusion

In conclusion, the findings of our study could inform instructors and instructional designers about how to use active learning strategies to address the needs and challenges of undergraduate science education and to ensure appropriate instructional design supports for advancing critical thinking and scientific thinking within such contexts. Further research should investigate how each strategy supports students' active engagement in higher levels of thinking and constructive knowledge-building processes.

Acknowledgement This material is based in part upon work supported by the National Science Foundation under Grant Number 0607995. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Appendix A

Table 4 Sample of Scoring Rubric for Individual Essay

Levels: Emerging (1–2), Developing (3–4), Mastering (5–6)

1. Identifies decisions appropriately from group and community discussions
   Emerging: Fails to provide any introduction to important issues raised in either discussion, or only presents one of the issues.
   Developing: Clearly identifies issues raised in group and community discussions. May summarize the most important questions raised in both groups and provide own perspective.
   Mastering: Clearly recognizes and summarizes the embedded and implicit danger and impact of Hurricane Smith. Identifies integral relationships essential to analyzing this issue.

2. Identifies and presents evaluation of group and community decisions
   Emerging: Offers own evaluation without any reference to group or community discussions. Does not provide reasoning or evidence to support evaluation.
   Developing: Provides own evaluation based on group and community discussions. Acknowledges differences/similarities with group and community perspectives.
   Mastering: Clearly states evaluation of group and community discussions. Acknowledges differences/similarities with group and community perspectives. Provides reasoning/evidence for own evaluation.

3. Provides a clear and appropriate solution
   Emerging: Offers an unclear or simplistic solution or position. Presents position based on group/community discussions without any indication of own consideration.
   Developing: Offers a generally clear solution/position, although gaps may exist. Presents own position such that it includes some original thinking that acknowledges, refutes, synthesizes, or extends assertions from group/community, although some aspects may have been adopted.
   Mastering: Offers a solution/position that demonstrates sophisticated, integrative thought and is developed clearly. Presents position in such a way that it demonstrates ownership for constructing knowledge or framing original questions, while integrating and acknowledging other influences.


References

Andriessen, J. (2006). Arguing to learn. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 443–459). New York, NY: Cambridge University Press.

Blumenfeld, P. C., Kempler, T. M., & Krajcik, J. S. (2006). Motivation and cognitive engagement in learning environments. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 475–488). New York, NY: Cambridge University Press.

Bransford, J., Brown, A., & Cocking, R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Brookfield, S. D. (1987). Developing critical thinkers: Challenging adults to explore alternative ways of thinking and acting. San Francisco, CA: Jossey-Bass.

Bunce, D., & VandenPlas, J. R. (2006). Student recognition and construction of quality chemistry essay responses. Chemistry Education Research and Practice, 7(3), 160–169.

Chapman, B. S. (2001). Emphasizing concepts and reasoning skills in introductory college molecular cell biology. International Journal of Science Education, 23(11), 1157–1176.

Edelson, D. C., & Reiser, B. J. (2006). Making authentic practices accessible to learners: Designing challenges and strategies. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 335–354). New York, NY: Cambridge University Press.

Ennis, R. (1991). Critical thinking: A streamlined conception. Teaching Philosophy, 14(1), 5–25.

Ge, X., & Land, S. (2003). Scaffolding students' problem-solving processes in an ill-structured task using question prompts and peer interactions. Educational Technology Research and Development, 51(1), 21–38.

Gellin, A. (2003). The effect of undergraduate student involvement on critical thinking: A meta-analysis of the literature 1991–2000. Journal of College Student Development, 44(6), 746–762.

Gupta, G. (2005). Improving students' critical-thinking, logic, and problem-solving skills. Journal of College Science Teaching, 34(4), 48.

Hager, P., Sleet, R., Logan, P., & Hooper, M. (2003). Teaching critical thinking in undergraduate science courses. Science Education, 12, 303–313.

Halpern, D. F. (1999). Teaching for critical thinking: Helping college students develop the skills and dispositions of a critical thinker. New Directions for Teaching & Learning, 80, 69–74.

Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. II, pp. 115–140). Mahwah, NJ: Lawrence Erlbaum Associates.

Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.

Kim, K. (2009). Exploring undergraduate students' active learning for enhancing their critical thinking and learning in a large class. Dissertation Abstracts International, 70(11), 3380934.

King, A. (1995). Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22(1), 13–17.

King, P. M., & Kitchener, K. S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. San Francisco, CA: Jossey-Bass.

Kronberg, J. R., & Griffin, M. S. (2000). Analysis problems - A means to developing students' critical-thinking skills: Pushing the boundaries of higher-order thinking. Journal of College Science Teaching, 29(5), 348–352.

Lajoie, S. P., Lavigne, N. C., Guerrera, C., & Munsie, S. (2001). Constructing knowledge in the context of BioWorld. Instructional Science, 29, 155–186.

Land, S. M. (2000). Cognitive requirements for learning with open-ended learning environments. Educational Technology Research and Development, 48(3), 61–78.

Lynch, C. L., & Wolcott, S. K. (2001). Helping your students develop critical thinking skills. IDEA Paper #37. Manhattan, KS: The IDEA Center.

MacKnight, C. B. (2000). Teaching critical thinking through online discussions: Faculty can play a key role in fostering critical thinking among students using web communication tools. Educational Quarterly, 4, 38–41.

Mayer, R. H. (1999). Designing instruction for constructivist learning. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. II, pp. 141–160). Mahwah, NJ: Lawrence Erlbaum Associates.

McConnell, D. A. (2005). How students think: Implications for learning in introductory geoscience courses. Journal of Geoscience Education, 53, 462–470.

Moon, J. (2008). Critical thinking: An exploration of theory and practice. London and New York: Routledge, Taylor & Francis Group.

National Education Goals Panel (1991). The national education goals report: Building a nation of learners. Washington, DC: U.S. Printing Office.

National Research Council (1996). From analysis to action: Undergraduate education in science, mathematics, engineering, and technology. Washington, DC: National Academy Press.

National Research Council (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

Paul, R. (1995). Critical thinking: How to prepare students for a rapidly changing world. Santa Rosa, CA: Foundation for Critical Thinking.

Perkins, D. (1998). What is understanding? In M. S. Wiske (Ed.), Teaching for understanding: Linking research with practice (pp. 39–57). San Francisco, CA: Jossey-Bass.

Ruiz-Primo, M. A., Briggs, D., Iverson, H., Talbot, R., & Shepard, L. A. (2011). Impact of undergraduate science course innovations on learning. Science, 331(6022), 1269–1270.

Russell, A. A. (2004). Calibrated peer review: A writing and critical-thinking instructional tool. In Y. S. George (Ed.), Invention and impact: Building excellence in undergraduate science, technology, engineering and mathematics (STEM) education. Washington, DC: American Association for the Advancement of Science.

Scardamalia, M., & Bereiter, C. (1996). Adaptation and understanding: A case for new cultures of schooling. In S. Vosniadou (Ed.), International perspectives on the design of technology-supported learning environments (pp. 149–163). Mahwah, NJ: Lawrence Erlbaum.

Scriven, M., & Paul, R. (1987). Defining critical thinking: A draft statement for the National Council for Excellence in Critical Thinking. Retrieved December 8, 2009, from the Foundation for Critical Thinking: http://www.criticalthinking.org/University/defining.html

Takeo, A., Prothero, W. A., & Kelly, G. J. (2002). Applying analysis to assess the quality of university oceanography students' scientific writing. Journal of Geoscience Education, 50(1), 40–48.

Tsui, L. (1999). Courses and instruction affecting critical thinking. Research in Higher Education, 40(2), 185–200.

Tsui, L. (2002). Fostering critical thinking through effective pedagogy: Evidence from four institutional case studies. Journal of Higher Education, 73(6), 749–763.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Yuretich, R. F. (2004). Encouraging critical thinking. Journal of College Science Teaching, 33(3), 40–45.
