Program Evaluation
Evaluation
• Systematic investigation of merit or worth using information gathered to make that decision (Guskey, 2000)
• Needed in physical education to:
– Keep the program current and dynamic
– Inform curricular change decisions
• Should evaluation strengthen ends or means?
Purpose of Program Evaluation
• To determine a new program plan
• To document the validity and/or importance of the expectations
• To document the way in which the program is being implemented
• To determine the effect of the program on participants
• To provide recommendations for revisions based on identified weaknesses
Curriculum Evaluation
• Examine curricular goals
• Student performance assessments
• Views of stakeholders
• Teacher evaluations
• Facilities assessments
Defensible Data
• Considers:
– Reliability: findings are replicable
– Validity: appropriateness of measures
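Reliability in this sense can be checked empirically, for example with a test-retest correlation: administer the same test twice and correlate the scores. A minimal sketch, using hypothetical curl-up scores (the data and one-week interval are illustrative assumptions, not from the slides):

```python
# Test-retest reliability sketch: correlate two administrations of the
# same fitness test. All scores below are hypothetical illustration data.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

trial_1 = [22, 30, 18, 25, 27, 20]   # curl-ups, first administration
trial_2 = [24, 29, 17, 26, 28, 21]   # same students, one week later

r = pearson_r(trial_1, trial_2)
print(f"test-retest r = {r:.2f}")    # values near 1.0 suggest replicable findings
```

A coefficient near 1.0 supports the claim that findings are replicable; a low coefficient means the measure itself is not defensible, regardless of what the scores show.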
Program Implementation
• Are the students enrolled in the program representative of the type of students for whom the program was planned?
• Is the program being implemented by representative teachers at the planned teacher-student ratios?
• Has the content planned for inclusion been taught?
Program Effectiveness
• Program evaluation seeks to describe the number of students who are making gains on the program objectives
– Evaluation of the program is merely an extension of the evaluation of individual students
Program Effectiveness
• Did change occur?
• Was the change statistically significant?
• Was the effect educationally significant?
• Can effects be replicated?
• Did the observed effects result from the program?
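The "statistically significant" and "educationally significant" questions can be separated with two numbers: a paired t statistic for pre/post change, and an effect size (Cohen's d) for the size of the change. A minimal sketch using the standard library; the push-up scores are hypothetical illustration data:

```python
# Sketch: did change occur, and was it educationally meaningful?
# Pre/post scores are hypothetical; d benchmarks follow Cohen (0.2/0.5/0.8).
from math import sqrt
from statistics import mean, stdev

pre  = [14, 20, 11, 16, 18, 13, 15, 17]   # push-ups before the unit
post = [18, 21, 12, 19, 17, 16, 15, 22]   # same students after the unit

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / sqrt(n))   # paired t statistic
d = mean(diffs) / stdev(diffs)               # effect size for paired data

print(f"mean gain = {mean(diffs):.1f}, t = {t:.2f}, d = {d:.2f}")
```

A large t with a small d would mean the change is statistically detectable but educationally trivial; the two questions on this slide are answered by different statistics.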
• If students don’t meet the program outcomes, one must consider:
– Characteristics of the teacher
– Characteristics of the students
– Characteristics of the instructional setting or context
– Characteristics of program implementation
• The strength of these relationships often provides insight into potential program revisions
Program Improvement
• Document individual student achievement and assess the nature & impact of the hidden curriculum as well as intended outcomes
• Consider possible changes in program objectives or modifications of the existing program standards
Determining needed changes
• Knowledge of ‘what’ to improve must be supplemented with information suggesting ‘why’ the weakness exists
• Weakness observed in program implementation usually results from a lack of knowledge about the processes involved in planning, implementing, or evaluating; therefore, in-service training is indicated
Evaluation Models
• Desired outcome model
– Primary focus is student achievement
– Evaluation is limited to those outcomes that can be precisely stated and for which objective measures can be developed
– Insensitive to ‘process’ and the humanistic aspect of education
Evaluation Model
• Goal-free model:
– Attention goes beyond outcomes to all that is relevant
– Follow a checklist
• Use a wide variety of techniques
– Product tests, e.g., fitness tests, motor skill tests
– Self-reports may be utilized
– Use dress-outs, absences, assignments
Evaluation Model
• Goal-free
– Primary value: evaluation is more complete and representative
– Disadvantage: may rely too heavily on subjective information; may not look at a full range of evaluative needs
Developing the eval. plan
• Look at the total picture rather than isolated “units”
• Plan to evaluate effects of the program that do not easily lend themselves to measurement
– e.g., affective development
• If state-mandated standards are in the curriculum, evaluation must be structured with those mandates in mind
Selection of Eval Instruments
• Outcomes-based evaluation will use quantitative data from objective tests to assess changes in students
– These can provide formative & summative data
– Check individual curriculum models for examples of evaluation tools
Qual. vs Quan. Eval
• Qualitative eval processes are description, disclosure of meaning, and judgment
– Used when the perspective of the curriculum does not specify mastery
• Quantitative eval: inferences made by statistical significance on the most easily observed characteristics of the environment
– Most often used during content mastery
Quantitative vs Qualitative
• Curr. eval will generally use both quantitative and qualitative types of eval.
• Types of instruments suggested for program needs assessment may also be used for program evaluation
The instructional process
• Qualitative evals tend to be used
• Study student-teacher interactions
– Study the target of the teacher’s attention
– Verbal interactions
– Nature of discipline
– Classroom climate
Preformative Evaluation
• Prior to activity, program, or project
• Identifies goals
• Estimates impact
• Analysis of program implementation
• Helps to avoid costly mistakes
Formative Evaluation
• Occurs during activity
• Helps to redirect– Time, money, personnel, and resources
• Proactive
• Occurs multiple times
Summative Evaluation
• Occurs at conclusion of project
• Determines what was accomplished
• Used for accountability
• Frequently uses quantitative data
Indirect Measures
• Afterschool program participation
• Non-school program participation
• Student readiness
• Enrollment in elective classes
• Attendance, dress, and participation
Systematic Model
• A performance assessment based on authentic assessment
• The work sampling system assesses and documents a full range of skills, behaviors & values
– Components: developmental checklists, portfolio collection, summary reports
Student Fitness Levels
• Many schools choose to focus on
– How to get fit or devising personal plans
• Caution about
– Expecting all students to achieve a certain level
– Setting criteria for particular tests (e.g., a 6-minute mile)
– Curriculum aligning with fitness goals
NASPE STARS
• Time
• Teacher
– Qualifications
– Professional development
– Professional involvement
– Student ratio
• Student health and safety
• Facilities and equipment
• Program mission
• Curriculum
• Instructional practices
• Student assessment
• Inclusion
• Communication
• Program evaluation
PECAT
• Physical Education Curriculum Analysis Tool
• Based on NASPE standards
• Developed by CDC in partnership with experts
• http://www.cdc.gov/HealthyYouth
Student Assessment: Portfolio Use
• Keeps track of student progress
• Provides students an opportunity to assess their own accomplishments
• Determines the extent to which learning objectives have been mastered
• Helps parents understand their child’s effort & progress
• Serves as a basis for program evaluation
Portfolio Evaluation Methods
• Reflection: by students; by parents; by peers
– All should compare the entries to the standards for the evaluation
• Conferences: meetings with individuals or small groups to discuss individual growth & achievement compared to the teacher’s judgment
• Progress report: look holistically; create rubrics
Developmental checklists
• Used for observing, recording, & evaluating behaviors
• The performance indicators reflect expectations for developmentally appropriate activities; ratings are “not yet”, “in process”, or “proficient”
– e.g., uses strength and control to perform gross motor tasks
– uses eye-hand coordination to perform fine motor tasks
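The checklist structure above is simple enough to sketch in code: each indicator gets one of the three ratings, and a count by rating level is what feeds the later summary report. The indicator wording is adapted from the slide; the student ratings and the `summarize` helper are hypothetical illustrations:

```python
# Minimal developmental-checklist sketch. The three rating levels come
# from the slide; this student's ratings are hypothetical.
RATINGS = ("not yet", "in process", "proficient")

checklist = {
    "uses strength and control to perform gross motor tasks": "in process",
    "uses eye-hand coordination to perform fine motor tasks": "proficient",
}

def summarize(checklist):
    """Count how many indicators fall at each rating level."""
    counts = {r: 0 for r in RATINGS}
    for rating in checklist.values():
        counts[rating] += 1
    return counts

print(summarize(checklist))
```

Keeping the ratings as a fixed three-level scale (rather than free text) is what makes checklists comparable across students and over time.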
Portfolio Collection
• Samples are selected that are common to all learners
• Other items that capture the uniqueness of individual learners may also be chosen
– The learner is allowed to be involved in the selection process and may judge the quality of their own work
Summary Report
• The checklists & portfolios are reviewed & judged
– Judgments in terms of “developing as expected” or “needs improvement”
– Progress is “as expected” or “not as expected”
• Report gives comments on strengths & weaknesses as well as steps to support the learner’s academic growth
Evaluation Summary
• Good evaluation
– Informs programmatic change
– Occurs on a regular basis
– Is planned
– Is based on multiple data sources
• Data should inform the decision, not make it
• How do you look at instructional effectiveness, course productivity, and program effectiveness in regards to the curriculum that you are mapping?