
OUTCOMES ASSESSMENT SUMMARY REPORT FOR 2010-2011

College of Liberal Arts and Sciences, University of Colorado Denver

Submitted by Jeff Franklin, Associate Dean for Undergraduate Curriculum and Student Affairs, June 2011

The College of Liberal Arts and Sciences (CLAS) has now established an active culture of assessment in academic programs and other areas. This is the result of major efforts in recent years by chairs/directors, their faculty, and College administrators. This is not to say that everyone who should be engaged is engaged, or that there are no improvements to be made, but this is an occasion for congratulations, especially for the chairs/directors and participating faculty. The College has come a long way, not merely in compliance but in coming to use assessment for the right reasons: improving our programs, curricula, pedagogy, and student learning.

To enhance the culture of assessment in the College, CLAS created an assessment web page--http://www.ucdenver.edu/academics/colleges/CLAS/faculty-staff/faculty-resources/teaching/Pages/OutcomesAssessment.aspx--containing information on college-wide assessment activities as well as information on best practices in assessment. Assessment plans and resulting reports for every degree program (plus some minors), as well as assessments of the CLAS general-education (Core + CLAS graduation requirement) courses, are on file with the University Director of Assessment, Dr. Kenneth Wolf, http://www.ucdenver.edu/about/departments/assessmentoffice/Pages/default.aspx. Additionally, assessment activities occurred in the Advising Office and the Writing Center. Based on the assessment activities in 2010-2011, modifications to courses and course sequencing are occurring, as are changes to services in the Writing Center and the Advising Office.

The purpose of this report is twofold: 1) to provide an overview of assessment activities within CLAS, along with examples of how different units have used assessment to revise their practices; 2) to provide feedback and suggestions to chairs/directors and other faculty for making our assessments increasingly useful to us.

College-Level Assessment Activities in 2010-2011

Cross-College assessment activities occurred in three areas in 2010-2011: CLAS general-education assessment, the Writing Center, and the Advising Office. Each of these activities is summarized below.

General-Education Assessment: In fall 2010, 18 of the 26 CLAS academic units or programs (including majors, minors, graduate-only programs, and the Writing Center) participated in gen-ed-course assessments. Sixty CLAS courses and 141 sections of those courses were assessed (out of the 400 sections offered by CLAS at all levels in fall 2010). This level of participation is in itself a major accomplishment for the College. The learning of an estimated 3,807 students was assessed. To summarize at the broadest level, 70-90% of our students, on average, demonstrated that they are learning what we want them to learn, performing at the "exceeding expectations" or "meeting expectations" levels.

Details of this assessment appear in the "CLAS General-Education Assessment, Summary Report for 2010." This report includes examples of assessment practices and of how assessment results have been used for revision of pedagogy and curriculum. It also includes recommendations for future gen-ed assessment within CLAS. These results were sent to the faculty of CLAS during the spring semester through the CLAS listserv. Additionally, they were sent to the Director of Assessment, the CU Denver Core Curriculum Oversight Committee (CCOC), and the CU Denver downtown campus Assessment Committee for comment. The report is (or soon will be) available on the CLAS assessment website.

The Writing Center: During the spring 2009 semester, the Writing Center defined a series of learning goals, which it pilot-tested during summer 2009. Based on the results of the pilot test, the goals were revised and the learning outcomes were clarified. The Writing Center uses a variety of data-collection methods, including student surveys, instructor surveys, individual session data, and analysis of student projects for which multiple drafts of a paper were available to determine improvement. In the fall 2009 semester, an assessment committee collected data for a full assessment of goals 1-3 (students can compose a clear, concise thesis statement; students can organize information into distinct paragraphs that support an argument; and students can develop ideas to support an argument). Specifically, for each of these goals, each member of the committee read and scored papers without knowing whether a paper was an early draft written before consultation with the Writing Center or a later draft written after consultation. A comparison of the scores for early versus late drafts demonstrates an improvement in student papers.

Advising Office: The CLAS Advising Office provides a broad range of valuable services to students and to the College, contributing to everything from orienting new students, transfer-credit checking, and semester-by-semester advising to serving as staff liaisons to College committees, working with the major advisors, degree auditing, and graduation checking. Since their primary responsibility of advising students throughout their academic progress begins with the initial contact, their current "Assessment Plan" focuses on the effectiveness of their efforts to orient students about topics such as what a liberal arts education is, how to use the advising system, and what the various degree requirements are. They therefore designed a survey to be administered to students after attending one of their "Advising 1001" orientation sessions. The survey was piloted in summer 2009, refined, and administered in full in the summer and fall 2010 orientations. The data, details of which are available from Assistant Dean Carol Morken, show the orientations already to be highly effective. Several years of data will allow longitudinal analysis and consideration of where improvements are taking place and could take place.


Program/Department Assessment Activities in 2010-2011

As Table 3 below shows, nearly all degree-granting programs within CLAS, as well as the Composition Program and the Religious Studies minor, generated program assessment reports for 2010-2011. This level of participation is in itself an indicator of engagement in assessment.

Table 3: Status of Program Outcomes Assessment Reports 2010-2011

Department and degree | Current chair/director | 2010-2011 assessment reports submitted
Anthropology (BA, MA) | Steve Koester |
Biology (BS, MS) | Diana Tomback |
Biology, Health Careers | Charlie Ferguson |
Chemistry (BS, MS) | Mark Anderson | Department reports, BS, MS
Communication (BA, MA) | Stephen Hartnett | Department reports, BA, MA
Economics (BA, MA) | Buhong Zheng | Department reports, BA, MA
English (BA English, BA Writing major, MA) | Nancy Ciccone | Department reports, BAs, MAs (Lit/Film, Writing, CW, APL)
Composition Program | Amy Vidali | Program report
Ethnic Studies | Donna Langston |
Geography and Environmental Sciences (BA) | Brian Page | Department report, BA
GES: Environ. Sci. (MS) | John Wyckoff | Program report, MS
Health and Behavioral Sciences (PhD) | Debbie Main | Department report, PhD
History (BA, MA) | Marjorie Levine-Clark | Department report, BA, MA
Individually Structured Major (BA) | n.a. | n.a.
Humanities & Social Sciences (MA & MS) | Omar Swartz & Margaret Woodhull | Program reports, MA, MS
Integrated Sciences (MIS) | Mary Coussons-Read | Program report, MIS
International Studies (BA) | Greg Whiteside | Program report, BA
Mathematical and Statistical Sciences (BS, MS, PhD) | Mike Jacobson | Program reports: BS, MS, PhD, gen-ed courses
Modern Languages (BA French, BA Spanish, MA Spanish) | Devin Jenkins | French report, German report, Spanish report, BA, MA
Philosophy (BA) | Rob Metcalf | Department report, BA
Religious Studies (minor) | Sharon Coggan | Program report, minor
Physics (BS) | Weldon Lodwick | Department report, BS
Political Science (BA, MA) | Jana Everett | Department reports, BA, MA
Psychology (BA, BS, MA, PhD) | Peter Kaplan | Department reports, BA, BS
Social Justice (minor) | Chad Kautzer |
Sociology (BA, MA) | John Freed | Department reports, BA, MA
Sustainability (minor) | John Brett |
Women & Gender Stud. (minor) | Gillian Silverman |


Rather than provide here a summary of each program's report, I have chosen to take three alternative actions: 1) to excerpt highlights from the CLAS program assessment reports and provide them in the appendices below; 2) to invite those interested in more detail to request exemplary program reports, which either Kenny Wolf or I would be glad to send; and, 3) to take the current opportunity to offer chairs/directors and other faculty some recommendations for conceiving of and undertaking next year's program assessment.

Recommendations for CLAS Academic Program Assessment in 2011-2012

The following recommendations come from two sources. First are the examples provided by this year's assessment reports. Different academic units and programs are at different stages in terms of their knowledge about and execution of program assessment. Some departments are fully engaged and making very constructive use of assessment for pedagogical and curricular improvement. Others are still struggling with the process. The CLAS Dean's Office would like all departments/programs to come up to the level exemplified by the best of this year's program assessment reports. The second source is the body of scholarship on assessment; many scholars in this field say the same things, some of which I will now repeat or elaborate on.

1. Building engagement with assessment: Are your faculty aware of the program's learning goals/outcomes? Do they incorporate relevant learning goals in their syllabi? Do you distribute the assessment reports to faculty and then hold discussions of them? Do your faculty participate in deciding how to use assessment outcomes for revising teaching and curriculum? Has your faculty recognized that the assessment effort is paying off in terms of improving delivery of your learning goals? Is your faculty's involvement reflected in your program assessment report? See "Appendix A" below for selected examples from this year's program assessments.

2. Integrating your curriculum: This means that specific learning goals are tied to specific courses and that the curriculum is considered holistically, as a sequence that builds student learning through 2000-, 3000-, and 4000-level courses, for example. Here is a generic example:

Table 4: Alignment Matrix (Curriculum Map)

Course | Outcome 1 | Outcome 2 | Outcome 3 | Outcome 4 | Outcome 5
100 | I, D | I | | |
101 | | I | D | |
102 | D | D | D | |
103 | | | | D |
200 | D | | D | |
229 | | | | | D
230 | D, M | M | | |
280 | | | | |
290 | M | | D, M | M |

I = Introduced, D = Developed & Practiced with Feedback, M = Demonstrated at the Mastery Level Appropriate for Graduation. Some variations [to consider adding]: R = Review; review of basics added to junior-level courses to ensure that all students have the background for upper-division work, or review of basics for beginning graduate students. C = Consolidation; students given opportunities to consolidate their learning of outcomes that have been previously mastered in the curriculum. Also, one could add "A"s to this matrix to indicate where key assessments are being administered. (Source: Mary Allen, author of Assessing Academic Programs in Higher Education)

Is your curriculum designed to deliver your learning goals? Does an alignment matrix appear in your program assessment report? See "Appendix B" below for selected examples from this year's program assessment reports.

3. Including learning goals in syllabi: You and your students know that your learning goals are integrated with your curriculum when those goals appear in course syllabi at several course levels. Different courses will emphasize different learning goals, depending on the level of the course and its role in delivering particular learning at a particular time in your students' progression through your curriculum. Is anyone checking the syllabi of your program's courses to see that they are delivering the learning outcomes that your faculty has agreed they should? See "Appendix C" below for selected examples from this year's program assessments.

4. Using rubrics: I used to hate the very word "rubric" until I reluctantly designed one for an essay assignment and realized that I now understood the assignment and my own grading criteria for the first time and that I owed it to my students to give them that level of clarity about what I expect them to do. Has your program written a rubric for the senior project? For the MS or MA exam? For the MA thesis? See "Appendix D" below for selected examples from this year's program assessment reports.

5. Closing the loop: Is your faculty discussing the assessment outcomes? Beyond conversation, which in this case is inherently good, are you using the assessment results to make revisions to how learning goals are taught and how curriculum is structured? Does your program assessment report identify specific changes you have made or will make to teaching or curriculum? Are you tracking those changes to determine whether they are having the desired effect on student learning? See "Appendix E" below for selected examples from this year's program assessment reports.

6. Assessing the assessment: Is your program assessment providing you with the most useful information you can imagine it providing? Has your program ever assessed the assessment? Are you making revisions to the assessment process itself in order to improve its usefulness to you? Is it time to revisit and rewrite your learning goals/outcomes? See "Appendix F" below for selected examples from this year's program assessment reports.

7. Practicing the full assessment cycle: The primary purpose of program assessment is not to produce a post-hoc report in May that shows that students did well enough last year. The purpose is to improve teaching, curriculum, and student learning now and next year. This is best facilitated by practicing a full assessment cycle: i) revisiting last year's assessment and the changes you told yourself you were going to try; ii) cementing consensus about learning goals and agreement on how to put them into practice in teaching and curriculum; iii) incorporating learning goals into syllabi, teaching, and curriculum; iv) agreeing upon assessment methods and planning the assessment. These are among the first steps that lead, next spring or summer, to using the 2011-2012 program assessment to make revisions to your courses and curriculum. The cycle is continuous, and next year's assessment begins with revised learning goals, assessment plans, and curricular adjustments now, this August.

"Appendix G" contains a table that may be useful next year for checking off the above activities in which your department/program has engaged.

Assessment Activities Planned for CLAS in the 2011-2012 Academic Year

Assessment data will be recorded, used for program improvements, and reported on in all CLAS departments in the 2011-2012 academic year. Those reports will be submitted to the Director of Assessment in May 2012. Data will be collected for the CLAS graduation requirements during the fall 2011 semester and will be the subject of faculty conversations in spring 2012. Additionally, assessment activities will continue to expand in both the Writing Center and the Advising Office.


Appendix A: Building Engagement with Assessment

"The department faculty discussed the outcomes assessment process at the October 2010 faculty meeting. We decided that this year we would focus on the specific objective of 'an ability to integrate and apply multiple social and behavioral science theoretical perspectives to particular health and health care problems' among our first-year students. The faculty formed a committee of three people: David Tracer, Jean Scandlyn, and Debbi Main (department Chair). To assess the outcomes for each student, we rated students' performance on three criteria taken from the core competency; we evaluated their final papers from each of the two required courses they take in their first semester in the doctoral program: HBSC 7031 Human Ecology and Environmental Adaptation and HBSC 7011 Theoretical Perspectives in Health and Behavioral Sciences 1. Together the committee developed a rubric based on the three criteria and then graded student final papers; the results and interpretation of our findings are presented below." -- Health and Behavioral Sciences, Debbi Main

"Dr. Sonja Foss prepared a detailed draft document [in response to last year's assessment] for faculty discussion indicating the skills students should be able to demonstrate cumulatively in the writing produced in the Department’s 1000, 2000, 3000 and 4000 level courses. This document is now in the hands of Dr. Stratman and Adjunct Instructor Mary Domenico who will attempt to draft some required and discretionary writing assignment design and evaluation methods for use in faculty classes at each of these levels. This document will attempt to incorporate and be guided by what we have been learning from the use of the Department’s base scoring rubric in our annual assessments." -- Communication, Jim Stratman

"Over the last few years, we have been having discussions about the number of weak comprehensive exams in the department. We decided to institute a new comprehensive exam process. Rather than having students complete three essays in one day, we now give them one week, which means they receive the exam questions on a Monday morning and return the exam on a Friday afternoon. We also now limit students to 1500-1800 words for each essay question. The results have been much better exams across the board. Students have the time to think more productively, edit their writing, and concentrate on creating arguments and analysis." -- History, Marjorie Levine-Clark

"A subcommittee examined Math 3000 and Math 3140. The instructor of each course chose one question that s/he considered to be of “average” difficulty and indicative of standard course material. The subcommittee, which did not include the course instructor, then developed a common rubric that went from 0 to 5. This rating essentially corresponded with a grading scale of 5=A, 4=B, 3=C, 2=D, 1=F, and 0=no progress. Thus, it should in particular be noted that it is not linear. Each problem was then independently graded, and scores that differed were discussed until the committee reached a consensus." -- Mathematical and Statistical Sciences, Mike Ferrara and Diana White

"In any case, a faculty retreat is planned for August 2011, part of which will address our ongoing procedures to assess learning outcomes. This report will be shared with the Philosophy faculty, with input from our two reviewers, so that we can discuss how we may better achieve these learning outcomes in our upper-level courses." -- Philosophy, Robert Metcalf

"The results of the assessment were shared during our faculty meeting in March 2011, with our discussion focused on interpreting findings and identifying the need to reinforce mastery of the integration of theory in later years of the program." -- Health and Behavioral Sciences, Debbi Main

"For 2010-11, the political science faculty conducted the outcomes assessment through an examination of four MA theses and seven MA projects that were completed in AY 2010-11. Five different faculty members each assessed two or three theses/projects (for which they served as Chair). The faculty members used the assessment rubric in their examination." -- Political Science, Jana Everett

"At the final seminar they [the students] were subjected to vigorous questioning by the faculty, and by some of their fellow students. Each group presentation, and the contribution of each individual, was graded by three members of the faculty (one organic chemist, one biochemist and one analytical chemist) in addition to the course instructor, all using the standard Department presentation grading rubric, which had been given to the students in advance of their progress presentations. Each student’s final presentation grade was based on a simple average of the four faculty grades." -- Chemistry, Mark Anderson


Appendix B: Integrating your Curriculum

"Briefly, in the fall 2010, all 4000 level instructors were contacted and requested to produce . . . a spreadsheet indicating which [department-wide] assessment goals and corresponding rubric items they saw their courses as addressing; in particular, they were to include an explanation of how and where each rubric item was addressed by the course. . . . Then, at the start of the spring term 2011, and based upon this spreadsheet, we requested all 4000–level instructors to include brief statements in their course syllabi explaining for a student which of the Department’s learning goals the course would address. . . . The idea . . . was to motivate instructors to think carefully about ways they could connect their existing course learning goals with those in the Department’s assessment plan, and to do more to pedagogically implement these goals in their course assignment, exercises, and classroom activities" --Communication, Jim Stratman

"Courses where assessment takes place:

Goals | Pol Theory [Political Theory] | Exper Ed [Experiential Education] | Res[earch] Methods | 4000 Level Am[erican Politics] | 4000 Level CP [Comparative Politics] | 4000 Level IR [International Relations]
1 | X | | | | |
2 | | X | | | |
3 | X | | X | | |
4 | | | | X | X | X
5 | X | | X | X | X | X
6 | X | X | X | X | X | X

Learning Goals [in the first column above]: By the time of completion of the political science major, students will be able to:

1. Place themselves in conversation with multiple philosophical perspectives, including those outside of the traditional Western canon, by developing their own positions and by seeing how their positions build from and contribute to these perspectives.

2. Reflect on the relationship between theory and political practice and potential career paths through engagement in an internship, service learning, or other experiential learning.

3. Evaluate conflicting arguments, assemble and present empirical evidence using appropriate methods of research and data analysis, and make reasoned conclusions from the evidence available.

4. Demonstrate their knowledge about the American political system, other political systems, and international relations.

5. Engage in critical thinking.

6. Express their views effectively in written and verbal communication."

-- Political Science, Jana Everett


"EWRT [English Writing Major]– Create more coherency in the types of writing assignments throughout the program: provide students with a broad range of writing opportunities but not so many that their ability to gain significant mastery is curtailed by the number of different types of writing assigned. Faculty may want to consider choosing three or four genres which students are assigned more often throughout their academic career in addition to the great variety of experiences we currently offer. Faculty may also want to agree upon the multimedia skills most needed by our students and train themselves in the teaching of these skills within our current curriculum and its focus on critical rhetorics." -- English, Nancy Ciccone

"It is clear both from our assessment of student work and the comments of students that there are ways that we can improve the major. In spring 2010, we conducted a series of discussions – first among the department’s executive committee, and then two meetings of the whole department – to work on our curriculum structure. In particular, we are concerned that our undergraduate courses reinforce at different levels the skills that students learn in Theory and Practice." -- History, Marjorie Levine-Clark


Appendix C: Including Learning Goals in Syllabi

"As a result of our curriculum discussions, we have created guidelines for clearer and more thorough syllabi (adopting in principle the CLAS syllabus template), as well as expectations for the amount of writing our courses should contain (no less than 50% formal writing, in essay exams or papers), and participation (no more than 20% in classes larger than 25 students). This is in response to concerns among both faculty members and our students that we need more consistency across our courses. We agreed that ALL History courses must include:

- Discussion of what history is, how to think historically, and what historians do
- Work with primary sources in their historical context
- Discussion of the relationships between primary and secondary sources
- Practice making an historical argument
- Assessment and development of writing skills

We are still working on making sure that all our courses actually address these elements." -- History, Marjorie Levine-Clark

"Briefly, in the fall 2010, all 4000 level instructors were contacted and requested to produce . . . a spreadsheet indicating which assessment goals and corresponding rubric items they saw their courses as addressing; in particular, they were to include an explanation of how and where each rubric item was addressed by the course . . . . Then, at the start of the spring term 2011, and based upon this spreadsheet, we requested all 4000–level instructors to include brief statements in their course syllabi explaining for a student which of the Department’s learning goals the course would address." -- Communication, Jim Stratman


Appendix D: Using Rubrics

Many departments/programs have developed rubrics. The rubrics vary considerably in thoroughness and detail across units. Many of the most thorough rubrics--such as those developed by Communication, English, and Geography/Environmental Science--are too lengthy for reproduction here. Giving only a few examples of the use or intended use of rubrics here, I refer readers to the program assessment reports online.

As a recommendation: "In 4000-level courses, professors should produce a clear grading rubric as to what is expected (for blog posts, research papers, take-home essay exams, oral presentations) and make exemplary examples of various types of writing available to students." -- Political Science, Jana Everett

History general undergraduate rubric:

"Research

Exemplary: Solid and imaginative exploration and use of appropriate primary sources, scholarly articles and monographs. Sources build on each other, open new avenues of thought, and establish argument with originality.

Competent: Too much reliance on secondary sources, or inadequate use of primary sources. Unimaginative but adequate exploration and use of materials.

Inadequate: Too little evidence of any kind to address analytical questions with originality and depth. Heavy reliance on a single source or fragmentary use of secondary sources. Note that any evidence of plagiarism will result in a failing grade.

Writing

Exemplary: Organization is unified and coherent. The order and structure of the paper, paragraphs, and sentences are compelling and move the reader along. Transitions are purposeful and clear. Grammar, spelling, punctuation, capitalization, and vocabulary usage are correct and appropriate. The tone is consistent and appropriate. Citations are thorough, accurate, and in correct format.

Competent: Writing’s logical order and structure may be inappropriate and do not advance the paper’s goals. Paragraphs, sentences, and/or transitions are sometimes effective but sometimes not. Grammar, spelling, punctuation, capitalization, and vocabulary usage contain some flaws that do not impede readability. The tone is inconsistent and/or inappropriate. Citations are accurate but formats are erratic.

Inadequate: Lacks clear structure and order. Paragraphs and sentences may be convoluted and difficult to understand, or they may be too choppy. Transitions are abrupt and unclear. Grammar, spelling, punctuation, and vocabulary usage contain major flaws that impede readability. Citations are missing, and/or they appear in erratic formats.

Analysis

Exemplary: Identifies and develops main themes with depth and completeness, strong support, and adequate detail. Uses evidence to argue a point. Asks interesting and novel questions of the evidence. Considers context, contingency, actors’ roles and purposes, and significance of findings. Applies concepts from course. Seeks explanation.

Competent: Narrative with some consideration of context and other explanatory factors. Crude or simple application of course ideas, methods, or materials. Identifies and develops main themes in a vague way, or not as deeply as they might be. Supporting evidence and analyses are lacking in detail or they are unclear.

Inadequate: Simply accumulates evidence within a narrative that lacks contextualization and other explanatory factors. No use of course ideas, methods, or materials. Identifies and develops main themes poorly or not at all. Analysis is missing, as is supporting evidence."


"Beginning in Fall 2011, we will begin adding an assessment rubric to the two introductory MHMSS core courses in order to assess student growth from their first course in the program to their last." -- Masters of Humanities/Masters of Social Science, Margaret Woodhull and Omar Swartz

"As the attached plan further explains, faculty whose courses have been chosen in a given year [to have their courses be the focus of the assessment] will collaborate with the Department’s Outcomes Assessment Committee (OAC) to develop a scoring rubric that can be used to evaluate students’ work in each course in relation to the first four objectives shown above." -- Communication, Jim Stratman

"The heavy weighting on the written reports reflects one of the major aims of the course: that students should be able to present and interpret their research results in an understandable, defensible form, using proper English. Each report is written in the format of the Journal of Organic Chemistry, the lead American Chemical Society journal of this sub-discipline. A grading rubric for each report (increasingly stringent) was posted for the students at least two weeks before the due date. The grade was reported as a set of scores, one for each of the areas described in the rubric. Extensive comments on each area of weakness (and on areas of strength) were provided." -- Chemistry, Mark Anderson


Appendix E: Closing the Loop

"The following suggestions will be presented to the Political Science faculty at the first department meeting in Fall 2011, along with the attached reports. Recommendations:

1. In Comparative Politics courses, set aside class periods that enable the students to engage in comparative analysis and provide the students with models of comparative analysis (done by students as well as academic journal articles).

2. In 4000-level courses, professors should produce a clear grading rubric as to what is expected (for blog posts, research papers, take-home essay exams, oral presentations) and make exemplary examples of various types of writing available to students." -- Political Science, Jana Everett

"Because this assessment took place after only one semester of coursework . . ., we were pleased with the findings; however, we shared a common observation that integration of theory could be strengthened within other HBS courses, including our methods classes." -- Health and Behavioral Sciences, Debi Main

"Finally, given the importance of research in English 2030, I think it is time to work with the Auraria library to craft a sort of “module” for 2030 students (which I think might help raise this outcome)." -- Composition Program, Amy Vidali

"We still need to spend more time on very basic research skills. Though I thought I had talked endlessly about how to do a comprehensive bibliography, the students still didn’t do a good job. I’m preparing a sheet with criteria and instructions for use the next time I teach Theory and Practice in History." -- History, Marjorie Levine-Clark

"Feedback Loop: We have learned valuable information regarding the level of our Program’s effectiveness from this exercise. We intend to address areas where we could improve effectiveness. To that end, we plan to implement these changes:

1. First and foremost, we will need to examine our list of courses to identify where there might be built-in opportunities to better ground our students in basic methodologies utilized in the field. For example, opening lectures in many of our classes could include a summary of the birth of the field of Religious Studies out of Enlightenment “scientific” methodologies and the various dimensions of religion the discipline studies.

2. We need to refine our questions so as to reduce the level of subjectivity and vagueness in the way the questions are posed and in the subsequent evaluation of the answers.

3. We are still in the process of refining our data collection methods, working at devising better ways to embed the information and the questions we want to assess.

4. Likewise, we are still attempting to develop a better “triangulation” ideal, to provide different lines of evidence to discover whether they lead to the same results. However, this process is complicated by the problem detailed above.

5. Continue to refine and add to the master list of questions utilized in the Quizzes.


6. We would like to see further discussion with our colleagues assessing Core requirements in the Humanities area to better clarify specific goals, and reduce the ambiguity of the specific Learning Objectives. For example, we have found it difficult to design specific embedded exam questions that can really assess how students can “explain fundamental concepts to analyze ethical and social issues in local and global contexts.”

7. Develop and administer an Exit Survey to graduating seniors." -- Religious Studies Minor, Sharon Coggan

"We need to improve our performance in both [MATH] 1070 and 1110. Much thought and time has been devoted to these courses and to adjusting to the new student population (students coming in without the prerequisite knowledge necessary since the placement exam was discontinued). Student feedback from FCQs indicates that the recitation sections in 1110 have been beneficial in helping to reduce some of the deficiencies in students’ prerequisite knowledge. We will continue to explore other ways to help students in these two courses find the mathematics course best suited for their abilities and succeed in the course." -- Math, Mike Jacobsen

"CRW [Creative Writing Program, English] -Develop curricular structures or practices that help students develop a structuralist vocabulary that enables them to discuss their work as a writer instead of solely evaluating the ideas of a work as a scholar or critic." -- English, Nancy Ciccone

"English [Literature] added a new gateway course (2450) to prepare majors for the upper-division Critical Theory course already in place (3001). All the options were modified to adopt the new course." -- English, Nancy Ciccone

"As was noted in last year’s review, the committee once again observed that the highest quality research was done by students who started their research projects well before the start of ECON 6073. As a result of the recommendations contained in last year’s MA outcomes assessment, the department encouraged more students to begin working on their MA research projects at least one semester before taking the capstone research seminar course. We believe that this is largely responsible for the aforementioned improvement in the quality of the MA papers. It is our recommendation that this practice continue." -- Economics, Buhong Zheng

"Again, these data are valuable to help with restructuring the orientation presentation as well as what ongoing workshops and information can be offered to students throughout their time at UCD. The advising office can also determine how well we are integrating the teaching/learning mission and objectives of the college into our advising programs. Furthermore, advisors can also use this knowledge to see how to structure their interactions with advisees and how to assess student learning." -- CLAS Advising Office, Jeff Schweinfest

"The department must do a better job of emphasizing the sequencing of the course curriculum to students, and of rigorously applying course prerequisites. This became clear this year based on the difficulty that many students who did not meet course prerequisites had with the courses they were enrolled in. This will be a point of emphasis with department advising, and we will emphasize to faculty the need to check that students meet the prerequisites for courses. As a result, the department has reorganized its advising, adding additional faculty as student advisors. We also have discussed, but have not yet implemented, the requirement that students speak with an advisor before they are allowed to register for courses. The department decided to collect one additional year of data to determine if this step is necessary, or if the last year was anomalous." -- Chemistry, Mark Anderson


Appendix F: Assessing the Assessment

"This year, I sought to expand and improve the assessment process for the Composition Program. The improvements included:

(1) providing an online assessment procedure so more teachers could participate (in Fall 2010, seven classes were assessed; this semester, 27 of 35 were assessed);

(2) supplying additional space on the online form for teachers to provide feedback on students who were struggling;

(3) including all online and UHL courses; and

(4) providing more suggestions to “close the loop” (see end of report)."

-- Composition Program, Amy Vidali

"Second, at the same time, it seems that OAC [Outcomes Assessment Committee in the department] should spend more time meeting with faculty individually and collectively about the Department’s learning goals and their assessment. In this regard, in last year’s report we recommended that faculty evaluate a sample student paper or two using the Department’s rubric (as a pilot) to improve inter-rater reliability. Unfortunately, this activity did not occur, with lack of time again being the cause. Nevertheless, this activity would seem potentially valuable not just for the sake of calibrating OAC raters using the rubric but more importantly for helping faculty tighten the links between their assignment directions, their feedback to students when grading, and underlying course and Department learning goals." -- Communication, Jim Stratman

"Additional quantitative measures need to be added to evaluation of student work. Follow-up interviews of graduates should also be added to determine what worked well for them in the program and what could be improved, as well as how they ended up using their degree." -- Integrated Sciences, Mary Coussons-Read

"The following new additional outcomes assessment procedures will be put into place beginning in AY 2011/12. These changes were originally scheduled to be put in place during AY 2010/11 but were delayed when the department acted to establish a faculty assessment committee and decided to give that committee additional time to undertake their assigned tasks.

a) Online Senior Exit Exam: Senior exit exams will be administered each semester to our graduating seniors. The exam provides a quantifiable view and measures a student’s comprehension of the key courses in Geography and their option’s core curriculum. The following outline provides courses from which questions are generated for each Geography option.

b) Online Senior Exit Questionnaire: The exit questionnaire provides us with a qualitative measure of how a student perceives her/his experience in the Geography and Environmental Sciences department. Questions on the survey are ranked from 1 (disagree strongly) to 5 (agree strongly) and address both the academic and collegial aspects of our department. A “comments” section is also provided for additional student input. The questionnaire is anonymous, and results go directly to the Chair."

-- Geography / Environmental Science, Brian Page


Appendix G: Program Assessment Recommendations Checklist for 2011-2012 (check each box that applies)

Checklist columns: Building Engagement with Assessment | Integrating Your Curriculum | Including Learning Goals in Syllabi | Using Rubrics | Closing the Loop | Assessing the Assessment | Practicing the Full Assessment Cycle

Departments/Programs:

Anthropology
Biology
Biology, Health Careers
Chemistry
Communication
Economics
English
Composition
Ethnic Studies
Geography and Environ. Sci.
GES: Environ. Sci. MS
Health and Behavioral Sciences
History
Individually Structured Major
Humanities & Soc. Sci. MA/MS
Integrated Sciences
International Studies
Mathematical and Statistical Sciences
Modern Languages
Philosophy
Religious Studies (minor)
Physics
Political Science
Psychology
Social Justice (minor)
Sociology
Sustainability (minor)
Women & Gender Studies (minor)
