Assessing Assessment: Examining the Assessment Plans of 70 Political Science Programs in the United States
John Ishiyama, Professor of Political Science, University of North Texas


• Two parts of my talk today:
– Provide a description of the context in the United States that frames the current discussions over assessment (particularly program assessment)
• Explains the focus in the US on program assessment as opposed to course-based assessment
– An empirical examination of the kinds of techniques used in program assessment in political science

Understanding the context in the US

• Higher education accreditation system
• Trends toward greater federal intervention into accreditation and assessment
• Reaction of the political science discipline and APSA

Accreditation System

• The accreditation system in higher education in the United States is decentralized; there are several regional accrediting organizations
• Resulted from renewals of the Higher Education Act of 1965 (especially the 1974 renewal)
• The federal Department of Education does conduct accreditation; however, it is general and largely based on spending
• General higher education accreditation is conducted by the regional organizations
• Accreditation is coordinated by the Council for Higher Education Accreditation (CHEA)
• Purpose: national advocacy for self-regulation of academic quality
• Self-policing as opposed to direct government regulation

Seven regions
• 1) Middle States Association of Colleges and Schools
• 2) New England Association of Schools and Colleges
• 3) North Central Association of Colleges and Schools
• 4) Northwest Commission on Colleges and Universities
• 5) Southern Association of Colleges and Schools
• 6) Western Association of Schools and Colleges, Accrediting Commission for Community and Junior Colleges
• 7) Western Association of Schools and Colleges, Accrediting Commission for Senior Colleges and Universities

• The regional associations are relatively autonomous; CHEA does not direct their activities
• Standards are different across regional associations
• A loss of accreditation could prevent a school from participating in federal financial-aid programs.

Political environment has also changed

• Previously, university-wide summative data was considered sufficient, and research universities were given a “pass”
• Texas Tech’s probation in 2007, for failing to provide data that students were acquiring skills, sent a chill through higher education
– That assessment was course-based, but covered only core classes
• The 2008 HEA reauthorization sought to impose external summative goals for educational assessment (much like “No Child Left Behind” for higher education)
– This was removed from the legislation
• However, accreditation associations are now pushing hard for department-level program assessments
• Motivation: desire for fiscal accountability by political leaders
• Both sides of the aisle (Democrats and Republicans) support this

• Political science has been under particular pressure recently, apart from assessment
• Greater pressure on political science
– Example: the challenge from Senator Tom Coburn; the discipline was only saved by Elinor Ostrom winning the Nobel Prize

• Reaction in political science has been mixed; four groups:
– Actively hostile
• Two subgroups: those who don’t care about teaching, and those who see threats to academic freedom
– Positively supportive (political scientists who work in higher education administration and are largely outside the field); very small
– Passively resigned; resigned to the inevitable
– A growing number who are sensitive to the pressures and believe we should take charge of the process, instead of leaving others to impose alien standards

• APSA has begun to address this directly
• Assessment Track at the APSA Teaching and Learning Conference (TLC)
• Assessment handbook (published 2009)
• Standing Teaching and Learning Committee (established 2007)
• Assessment Task Force (established 2010)
• No guidance on assessment (but lots of inquiry)

Our study
• A study we conducted in 2007: what existed?
• Examined 70 program assessment plans across the United States
• Coupled with a study by Kelly and Klunk (2003), which employed a survey of department chairs (n = 213)
• Interested in the techniques used (what the plans reported and what chairs said they did)
• And whether these were related to institutional type

Table 2: Learning Outcomes

Learning Outcome | % reported in Ishiyama (2008)* | % reported in Kelly and Klunk (2003)
Knowledge of Theories | 65.2 | 54.0
Knowledge of Political Institutions and Processes | 63.8 | Not asked in study
Knowledge of Fields in Political Science | 66.7 | 46.0
Critical Thinking | 68.1 | 55.7
Methods/Research Skills | 62.3 | Not asked in study
Written Communication Skills | 66.7 | 57.1
Oral Communication/Presentation Skills | 53.6 | 30.7
Citizenship | 24.6 | Not asked in study
Career Goals | 23.2 | Not asked in study
Cultural Diversity | 17.4 | 26.5
Ethics/Values | 11.6 | Not asked in study

• Distinguish between what Barbara Wright (2005) calls internal and external methods of program assessment
• Internal refers to techniques that can be accomplished without any additional work by faculty members beyond the day-to-day operation of a class
• E.g., analysis of grades, reviewing existing coursework for signs of student learning, examination of existing course syllabi, or student course evaluations
• External approaches require effort (often collaborative among faculty members in a department) outside of classroom activities
• These include a comprehensive examination, exit interviews, alumni surveys, portfolio analysis, a capstone experience, or graduating student surveys/questionnaires

Table 3: Most Frequently Mentioned Assessment Techniques

Assessment Technique | % reported in Ishiyama (2008) | % reported in Kelly and Klunk (2003)
Graduating Student Survey/Questionnaire | 50.0 | 22.2
Analysis of Student Grades/Performance | 45.7 | Not asked in study
Senior Seminar/Capstone | 35.7 | 39.6
Comprehensive Exam | 34.3 | Not asked in study
Senior Thesis | 32.9 | 20.3
Senior Exit Interview | 24.3 | 24.1
Portfolio | 22.9 | 17.9
Random Reading of Student Papers | 17.1 | Not asked in study
Student Course Evaluations | 17.1 | Not asked in study
Alumni Surveys/Interviews | 21.4 | Not asked in study
Syllabi Analysis | 7.1 | Not asked in study
Pretest/Post-test | Not in study | 9.9
Post-test Only | Not in study | 14.2
Faculty Observations | Not in study | 25.0

Institutional Characteristics
• The distribution of external assessment techniques used, by number of schools (the counts below sum to the 70 programs examined):
• Nine programs employed none of the external assessment techniques (i.e., they employed only internal techniques)
• 16 employed one,
• 22 employed two,
• and eight employed three
• Only fifteen programs used four or more of the external techniques (with three political science programs using all six).

Figure 2: External Assessment Techniques Score by Selectivity of Institution (scale 0–6): selective/less selective = 2.28; more selective/most selective = 2.17

Figure 3: External Assessment Techniques Score by Public or Private Institution (scale 0–6): public = 2.27; private = 2.25

Figure 4: External Assessment Techniques Score by Student-Faculty Ratio (scale 0–6): ratio greater than 17 = 2.11; ratio less than or equal to 17 = 2.41

Figure 5: External Assessment Techniques Score by Highest Degree Offered by Department (scale 0–6): graduate degree offered = 1.81; bachelor’s degree only = 2.71

Figure 6: External Assessment Techniques Score by Size of Student Population (scale 0–6): student population 9,000 and over = 1.74; less than 9,000 = 2.85

Conclusions
• A wide variety of learning outcomes expressed and assessment techniques employed; there is not a single model for conducting student learning assessment.
• Yet there are discernible patterns, which is all the more remarkable given that there has been little discipline-wide guidance on how to construct an assessment program.
• Many political science departments have hit upon similar learning outcomes and similar kinds of assessment techniques.
• Most of the learning outcomes expressed deal directly with the content of the political science discipline and the promotion of critical thinking, writing, and oral communication skills, as opposed to the promotion of citizenship and ethics/values.
• Internal techniques are more likely to be used than external techniques.
• The use of external assessment activities does not vary by level of institutional selectivity, by whether the institution is public or private, or by the institution’s student-faculty ratio.
• Primarily undergraduate departments and smaller institutions are more likely to employ external assessment techniques than larger, graduate-degree-granting departments.
• This might indicate that the external techniques are more easily implemented at smaller institutions, which have smaller class sizes and fewer majors than the larger state institutions.
• However, it might also reflect the greater emphasis that smaller, primarily undergraduate institutions place on teaching and learning: these departments seek to go beyond merely employing the easiest assessment techniques to meet the “letter of the law.”