Elements of Compelling Evaluation Reporting. Jon K. Price, K-12 Evaluation Research Manager



Elements of Compelling Evaluation Reporting

Jon K. Price, K-12 Evaluation Research Manager


IIE K-12 Evaluation Goals

To learn how to improve the effectiveness of the program.

To collect data on, and observe, the extent and quality of teacher implementation of new techniques in the classroom.

To determine the effectiveness and impact of K-12 programs on teachers' classroom performance.

To communicate effectiveness, thereby encouraging participating teachers to continue learning and implementing new techniques, and encouraging nonparticipating teachers to join.

To provide evidence of effective curriculum, pedagogy, and processes of classroom interaction that directly influence learning.


Basic Report Design

Executive Summary
I. Methodology and Data Sources
II. Impact of Essentials Course
   Teacher Use of Technology
   Student Use of Technology
   Variations
   Analysis and Synthesis of Data and its Meaning
III. Conclusion
Appendices:
   Intel® Teach End of Training or Impact Survey
   Reported Class Size (as a range)
   References


Making Your Evaluation Data Work for You

What do you need from the data? Quality control? Ministry support? Policy and/or education reform?

Background: Some details of the program being evaluated. Some details of the evaluation methods: what, to whom, when, where, how, and for how long? Does it correlate to existing data, country standards, or efforts?

Findings: What was the practice that needed to be changed? What did the evaluation say?

Did you do what you set out to do?
Did you do it well?
Did you do it to whom you intended?
Did they learn what you expected them to learn?
Did they change the way they do things as a result of what you did?
Were there any unintended outcomes?
What barriers to implementation did you find?


Making Your Evaluation Data Work for You

Intel Teach:
1. Use the End of Training evaluation as your initial quality measurement. We will no longer collect EoT data for a global dataset and report. How will you manage this quality measurement internally?
2. Impact evaluations start with measuring integration of technology in the classroom and develop into a tool for reform. Examples:

SharePoint site: http://teamsites.ch.ith.intel.com/sites/Education/K12/evaluation/default.aspx
Core surveys: can still be used to benchmark against 5 years of data.
Geo Evaluation Catalog: contains proposals, instruments, and reports.
Optional qualitative research modules: SharePoint Case Study folder.

Actions: Highlights (capture key points). Issues (capture key points). Plans to address issues/next steps. What did you do to change the program? (Shifted, reduced, changed, or added program support, money, people, policy, etc.? Other?) Are there ongoing evaluation efforts?

Recommendations: Marketing tips?


Terms/Definitions

Evaluation: A detailed study for the purpose of program review and continuous improvement.

Assessment: A detailed study of the impact/outcome of student-centered interventions.

Survey: Ad hoc interviews with users, in which a set of questions is asked and the users' responses are recorded.

Questionnaire: A written list of questions distributed to a target response audience.

Interview: A method of formal/structured field observation that allows the interviewer to interact directly with individual respondents to investigate their opinions, experiences, and preferences regarding the product.

Focus Group: A method of formal/structured field observation that allows the interviewer to interact directly with a group of respondents to investigate their opinions, experiences, and preferences regarding the product. The interaction among multiple participants may raise additional issues or identify common ones. (Often, the issues identified by interviews and focus groups work well for constructing surveys and questionnaires.)

Journals/Self-Report Logs: Online or paper-and-pencil journals in which users are asked to note their actions and observations while interacting with a product. This technique allows an evaluator to perform user evaluation at a distance.

Formative: A continuous improvement study to assist in the formation or development of a program.

Summative: A study of outcomes to determine the effects of an intervention (causal relationships between the intervention and the outcome measures).

Quantitative Analysis: Procedures for analyzing numeric data using inferential statistical techniques.

Qualitative Analysis: Procedures for deriving meaning from non-quantified narrative information, often involving an inductive, interactive, and iterative process.


Evaluation Methods

Evaluation Plan: A living document that identifies the evaluation project's evaluators, stakeholders, scope of work, budget, participants, methodology, localization plans, timeline forecast, and deliverables.

End of Training Survey: A set of questions whose responses are recorded immediately following an Intel Teach to the Future training session. Data collected should provide feedback on the training context, content, and process.

Impact Survey: A set of questions whose responses are recorded no earlier than 6 months after an Intel Teach to the Future training session. Data collected should provide feedback on participant (Master) Teacher use and application of the material in the classroom.

Additional Evaluation Efforts: Field observations such as interviews, focus groups, case studies, or journals that provide qualitative data regarding opinions, experiences, and/or application of Intel Teach to the Future pedagogy.


Suggested Themes for Exploration

Suggested themes for program implementation:
Did teachers like the training? Do teachers ask more essential questions? Did teachers use unit plans? Do teachers use technology more? How many teachers were trained? What are the barriers to implementation? What makes teachers successful?

Suggested themes for teacher studies:
Teachers' perceptions of technology. Teachers' experience with technology. Teachers' professional development using technology. Teachers' involvement in innovative curriculum development/reform. Teachers' perceptions of administrative support for technology use. Teachers' perceptions of student application of knowledge using technology. Teachers' perceptions of application to students' general life skills and attitudes. Teachers' perceptions of application to subject skills. Teachers' perceptions of "21st Century Thinking Skills."

Suggested themes for student studies:
Students' experience with technology. Students' attitudes toward technology. Students' views of subjects taught using technology. Students with special needs. Gender issues. 'Disaffected' students.


Elements of Compelling Evaluation

Compelling numbers (high or low, depending on context):
Demonstrate progress on government objectives or initiatives. Show the success of the program with the participants.
EXAMPLE: Learner completion rates averaging 97% for an informal education program.

Testimonials that evoke strong emotion in the audience:
Display program success in terms of the individual, on a personal level. Convey importance and impact in people's lives.
EXAMPLE: "Intel® Teach to the Future is amazing. It's changed my teaching practice. Now, I am utilizing technology in my curriculum and seeing a difference in my students' critical thinking skills." – PT 9/2005, Chiapas, Mexico

Change, improvement, and milestones:
Reveal the chain reaction of change that results in education reform, economic growth, technology adoption, increased technology literacy and 21st century skills, or improvement of public services. Show areas for improvement and constructive feedback on how to improve the program.


Elements of Compelling Evaluation Reporting

BAD EXAMPLE: “Question 5: Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training? 44.21% of the teachers answered: Yes, more than once; 21.03 % answered: Yes, once; 19.28% answered: Not yet, but I plan to use the lesson before the end of this school year; and 15.48% of the teachers answered: No, never.”

GOOD EXAMPLE: “A majority of the respondents reported positive changes in their teaching practices, such as making greater use of the following: essential questions to structure lessons, computer technology to present information to students and create handouts, and rubrics to evaluate students. Several MT and PT respondents claimed positive effects of the ITTF program on their students, such as greater concept understanding, development of higher-level thinking skills, increased motivation and involvement in class, and more students working together. In-depth evaluation validated positive effects of the program on the development of ICT skills of MTs and PTs. In cases where the MTs and PTs implemented their unit plans, the students demonstrated improved ICT skills, motivation, teamwork, class participation, and multiple intelligences in their outputs.”


What Matters Most? How to Read Results

1. End of Training: Look for teacher reactions. Look for indications of teacher learning.

2. Impact: Look for organizational support. Look for classroom implementation.

Impact +: Use qualitative methods. Look for impact on the school ecosystem and policies. Look for evidence of classroom interaction that directly influences learning.

* Handouts: “Evaluation Report Checklist” & “Making Evaluation Meaningful…”



Benchmark Key Objectives

Global Benchmark Objective: To identify Intel Teach Essentials End of Training and Impact Evaluation benchmarks that will enable immediate measurement of local evaluation data when compared to established indicators.

End of Training Benchmarks
Resulting from the analysis of existing longitudinal End of Training evaluation data.
Benchmarks identified based on 3 questions that look at training effectiveness:
Question 2. To what extent do the following statements describe the Intel® Teach to the Future training in which you participated? (Great and Moderate Extent)
Benchmarks identified based on 3 questions that look at teachers' reported readiness to implement technology in their classrooms:
Question 3. Having completed your training, how well prepared do you feel to do the following activities with your students? (Very Well and Moderately Prepared)

A review of new program data for the first three quarters in which data was submitted indicates no significant deviation from the sustaining benchmarks. However, the data indicate that most countries receive relatively high scores initially, followed by a dip the next quarter, then an increase in scores and stabilization in the following quarters. In addition, overall scores are higher for the training description items than for the teacher preparedness items.


Impact Benchmarks

Resulting from the analysis of existing longitudinal Impact evaluation data. Benchmarks identified based on 4 questions that look at responses indicating the level of classroom implementation of key program components:

Question 7. Have you used technology with your students in new ways since you participated in the training? (Yes)

Question 14. Since completing your Intel® Teach to the Future training, has there been a change in how frequently you do the following? (Do ‘listed Activities a-f’ more)

Question 14. Since completing your Intel® Teach to the Future training, has there been a change in how frequently you do the following? (Do ‘listed Activities g-k’ more)

Question 5. Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training? (Yes, more than once and Yes, once)


Global Benchmarks

End of Training Benchmarks
1. 89% of teacher respondents indicate the training focused on integration of technology into their curriculum.
2. 81% of teacher respondents indicate the training provided teaching strategies to apply with their students.
3. 86% of teacher respondents indicate the training illustrated effective uses of technology with students.
4. 80% of teacher respondents indicate they are prepared to implement teachings that emphasize independent work by students.
5. 85% of teacher respondents indicate they are prepared to integrate educational technology into the grade or subject they teach.
6. 82% of teacher respondents indicate they are prepared to support their students in using technology in their schoolwork.

Impact Benchmarks
1. 75% of teacher respondents indicate increased use of technology activities with their students.
2. 80% of teachers increase use of technology for lesson planning and prep.
3. 60% of teachers increase use of project-based approaches in their teaching.
4. 75% of teachers use the unit/lesson they developed in training back in their schools.
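As an illustration only (not part of the original deck), a country team comparing its local End of Training percentages against the global benchmarks above might sketch the check like this. The item labels paraphrase the six benchmark statements, and the local figures and the 5-point tolerance are hypothetical assumptions.

```python
# Hypothetical sketch: flag local End of Training scores that fall
# noticeably below the global benchmarks. Item labels paraphrase the
# deck's six benchmark statements; local figures are illustrative.

GLOBAL_EOT_BENCHMARKS = {
    "training focused on curriculum integration": 89,
    "training provided teaching strategies": 81,
    "training illustrated effective technology use": 86,
    "prepared to emphasize independent student work": 80,
    "prepared to integrate educational technology": 85,
    "prepared to support student technology use": 82,
}

def compare_to_benchmarks(local_results, benchmarks, tolerance=5):
    """Return a per-item verdict: 'comparable' when the local score is
    within `tolerance` percentage points of the global benchmark,
    'below benchmark' otherwise, and 'no local data' when missing."""
    report = {}
    for item, benchmark in benchmarks.items():
        local = local_results.get(item)
        if local is None:
            report[item] = "no local data"
        elif local < benchmark - tolerance:
            report[item] = f"below benchmark ({local}% vs {benchmark}%)"
        else:
            report[item] = f"comparable ({local}% vs {benchmark}%)"
    return report
```

A review at the country level could then focus discussion on the "below benchmark" items rather than on the raw percentages.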


Success Criteria

Deliver Essentials at quality levels comparable to established benchmarks.

End of Training evaluation data indicate scores comparable to established benchmarks. To be reviewed individually at the country level.

Impact evaluation data indicate scores comparable to established benchmarks. To be reviewed individually at the country level.

We no longer require country evaluation data to be submitted for a global roll-up and report. It is vital that countries continue evaluation efforts, complete reports, and submit the reports in order to maintain visibility into the quality of our program.

Key stakeholders accept and support Essentials data.


Marketing Considerations for Intel Teach Essentials Benchmarks

For Teacher Audience:
Ensure teachers understand Intel's involvement.
Communicate course design and desired outcomes to the teachers.
Have a means to track usage and results, which will help us tell the story with proof points/data.
Establish a long-term relationship with teachers.
Achieve better understanding of teacher usage and results for impact stories and continuous improvement.

For MOE Audience:
Consistent communication of Intel messaging throughout the program (pre, during, and post).
Establish a user-friendly, easy-to-navigate resource for communicating training and impact evaluation results: Evidence of Impact web pages (Evaluation Web Resources).
Enable co-marketing opportunities with Ministries of Education.
