
Assessment & Technology

UH-M COLLEGE OF EDUCATION

COE Outreach & Technology

Electronic Exhibit Room

• Systematic assessment of evidence of student learning at multiple points in program

• Program assessment data compiled to an internal website

• NCATE reviewers access at their leisure
• Data remain available between reviews
• Easy for program faculty to maintain

Program Assessment

Assessed by:
– Candidates (exit surveys, course evaluations)
– Alumni (survey, focus groups)
– Employers
– Mentor Teachers
– COPR Process
– Learned Societies
– Candidate Learning Outcomes Review

Assessing Programs by Assessing Candidates

Candidate Assessment

Candidate Assessment
– Evidence Collection = Portfolio
• Grade reports
• Exam scores
• Work samples
• Faculty observation summaries
• Students’ work samples
– Candidate Portfolio Tools
• PowerPoint (hyperlinks, external files, branching)
• Task Stream – online
• CD-R

Assessing Learning Outcomes

1. Define Program Objectives

2. Define Points of Measurement

3. Define Evidence for Objectives

4. Define Rubric for Assessing Evidence

5. Delineate Who Evaluates and When

Example (part 1: Program defines Objectives)

ABC Program
Objective 1:
(Knowledge) Candidates know …
(Skill) Candidates are able to …
(Disposition) Candidates exhibit …

Objective 2:

Objective 3:

Objective 4:

Secondary Program Objectives

Courses (matrix columns): EDUC 401, EDPSYC 311/611, EDUC 402, EDUC 404, TECS 440, SPED 445, ETEC 414, EDUC 405-6

Course titles: Introduction to Secondary Education, Educ Psychology, Field Methods, Multi-Cultural, Special Education, Educ Technology, Student Teaching

Objectives (matrix rows), marked x/X against the courses above:

Professional Legal Responsibilities: x x x X x X

Foundations of Secondary Education: X x x x x

Example (part 2: Program Chooses Points of Measurement)

Program will Assess Candidates

– Beginning (defined: immediately upon admission)

– Middle (defined: conclusion of EDUC XXXX course and/or prior to student teaching)

– End (defined: conclusion of field experience XXXX)

Example (part 3: Program assigns evidence to objectives)

Objective 1: Professional Legal Responsibilities

The teacher candidate demonstrates an understanding of (knowledge) and ability to apply (skill) and model (disposition) legal responsibilities expected of professional educators.

Sample artifacts

Program Objective: 1. Professional Legal Responsibilities – The teacher candidate demonstrates an understanding of and ability to apply and model legal responsibilities expected of professional educators.

Suggested Artifact Evidence:
• Case study responses
• Reflective journals and logs
• Performance evaluation
• Evidence of ability to maintain required reports, records, and legal documents
• IEP from a case study report

Rubric Scale: 1 2 3

Example (part 4: Program defines rubric scale for evidence)

3 Target: Evidence reflects in-depth knowledge and understanding of standard; outstanding data and evidence of application

2 Acceptable: Evidence indicates knowledge and understanding of standard; satisfactory data and evidence of application

1 Unacceptable: Evidence shows little or inadequate knowledge of standard; limited data and evidence of application

Example (part 5: Program states who will measure and when)

Candidate Outcomes Review
• Faculty assigned to review candidate outcomes
• Review Committee determines program completion for candidate
• Candidate outcomes aggregated
• Summary data on cohort provided to Associate Dean

Example (part 6: Composite candidate scores defined, measured)

Summarize each Candidate’s Assessment

Mid-Point Assessment:
– e.g. Overall Unacceptable: 1 or more unacceptable scores
– e.g. Overall Acceptable: 0 unacceptables, <4 superiors
– e.g. Overall Superior: 0 unacceptables, 5 or more superior scores
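The example cut-offs above amount to a small scoring rule, which can be sketched as a function. This is only an illustration of the stated thresholds; the function name is hypothetical, and the boundary case of exactly 4 superior scores (which the cut-offs above leave undefined) is treated here as Acceptable.

```python
def overall_rating(scores):
    """Composite mid-point rating from per-artifact rubric scores
    (1 = Unacceptable, 2 = Acceptable, 3 = Target).

    Thresholds follow the example cut-offs: any unacceptable score
    yields Overall Unacceptable; 5 or more target scores with no
    unacceptables yields Overall Superior; otherwise Overall
    Acceptable (exactly 4 target scores, undefined in the example,
    falls here by assumption).
    """
    unacceptables = sum(1 for s in scores if s == 1)
    superiors = sum(1 for s in scores if s == 3)
    if unacceptables >= 1:
        return "Overall Unacceptable"
    if superiors >= 5:
        return "Overall Superior"
    return "Overall Acceptable"
```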

Example (part 7: Summary data submitted)

ABC Program – Midpoint Assessment 2004

Incomplete: 1
Overall Unacceptable: 1
Overall Acceptable: 23
Overall Superior: 12
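The cohort summary above is a tally of per-candidate composite ratings. A minimal sketch of that aggregation, assuming ratings are stored as the category strings used in the table:

```python
from collections import Counter

# Category labels as they appear in the summary table.
CATEGORIES = ("Incomplete", "Overall Unacceptable",
              "Overall Acceptable", "Overall Superior")

def cohort_summary(ratings):
    """Tally per-candidate composite ratings into the cohort
    summary reported to the Associate Dean; categories with no
    candidates are reported as 0."""
    counts = Counter(ratings)
    return {category: counts.get(category, 0) for category in CATEGORIES}
```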

NCATE Review Website (mock-up)

Measurements: 5 year period

Use of Technology

Candidate use to collect and present evidence

Program use to assess candidate learning

Program use of aggregated data for program review

College use to maintain data over time for accreditation purposes

Challenges

Requires a shift in thinking: from grades to authentic assessment of learning outcomes

Program objectives must be made explicit

Faculty agreement on rubrics and scales

Need to identify ways to manage the process

Technology must be helpful, not burdensome

Linda Johnsrud