Office of Institutional Research, Planning and Assessment January 24, 2011 UNDERSTANDING THE DIAGNOSTIC GUIDE

Page 1:

Office of Institutional Research, Planning and Assessment

January 24, 2011

UNDERSTANDING THE DIAGNOSTIC GUIDE

Page 2:

INTRODUCTION

• Effective teaching is complex

• Purpose of student ratings is to improve instruction

• Student ratings do not provide all of the information needed by an instructor to improve instruction

• Student ratings should not account for more than 50% of an instructor's annual review

Page 3:

DIAGNOSTIC FORM REPORT

The IDEA Diagnostic Form Report is designed to respond to five questions:

• 1. Overall, how effectively was this class taught?

• 2. How does this compare with the ratings of other teachers?

• 3. Were you more successful in facilitating progress on some class objectives than on others?

• 4. How can instruction be made more effective?

• 5. Do some salient characteristics of this class and its students have implications for instruction?

Page 4:

RELIABILITY AND VALIDITY

Reliability – consistency of a set of measurements or of a measuring instrument

Validity – study (or instrument) answers the questions it is intended to answer

Example – Bathroom scale

• If someone who weighs 200 pounds steps on the scale 10 times and gets readings of 15, 250, 95, 140, etc., the scale is not reliable

• If the scale consistently reads 150, then it is reliable, but not valid

• If it reads 200 each time, then the measurement is both reliable and valid

Are the findings of your diagnostic report reliable? Look at the top of the report in the shaded area.

Even if the findings are not reliable, they may still be useful as feedback
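The bathroom-scale analogy can be made concrete with a toy sketch. The readings, the "true" weight, and the tolerance below are made-up values for illustration only:

```python
# Toy illustration of reliability vs. validity using repeated scale readings.
# A measure is reliable if repeated readings agree with each other,
# and valid if they also agree with the true value.

def is_reliable(readings, tolerance=5):
    """Reliable: the readings are consistent (small spread)."""
    return max(readings) - min(readings) <= tolerance

def is_valid(readings, true_value, tolerance=5):
    """Valid: the readings center on the true value."""
    mean = sum(readings) / len(readings)
    return abs(mean - true_value) <= tolerance

erratic    = [15, 250, 95, 140]   # neither reliable nor valid
consistent = [150, 150, 150]      # reliable, but not valid for a 200-lb person
accurate   = [200, 200, 200]      # both reliable and valid

print(is_reliable(erratic), is_valid(erratic, 200))        # False False
print(is_reliable(consistent), is_valid(consistent, 200))  # True False
print(is_reliable(accurate), is_valid(accurate, 200))      # True True
```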

Page 5:

AVERAGE AND CONVERTED SCORES

• Average scores – based on a five-point rating scale

• See small box on left side of page one

• Criterion Referenced

• Error for classes in the range of 15-24 is ± 0.2

• Error is slightly higher for smaller classes and lower for larger classes

• Converted scores – all have an average of 50 and a standard deviation (a measure of variability) of 10; also called standard scores, they are used for comparative purposes

• See large box on right side of page one

• Norm Referenced

• Both average and converted scores are presented in the “raw” or unadjusted and “adjusted” forms
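The conversion described above is a standard linear (T-score) transformation. A minimal sketch, assuming the norm group's mean and standard deviation are known (the values below are hypothetical, not actual IDEA norms):

```python
# Convert a raw average (on the 5-point scale) to a standard score
# with mean 50 and standard deviation 10, relative to a norm group.

def converted_score(raw_average, norm_mean, norm_sd):
    return 50 + 10 * (raw_average - norm_mean) / norm_sd

# Hypothetical norm group: mean 3.8, SD 0.4 on the 5-point scale
print(round(converted_score(4.2, 3.8, 0.4), 1))  # 60.0 -> one SD above the norm
print(round(converted_score(3.8, 3.8, 0.4), 1))  # 50.0 -> exactly at the norm
```

A converted score of 60 therefore means the class's raw average was one standard deviation above the norm group's average, regardless of what the raw numbers were.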

Page 6:

OVERALL, HOW EFFECTIVELY WAS THIS CLASS TAUGHT?

• Examine student ratings of progress on Important or Essential Objectives

• Average rating provides a good indication of effective teaching, especially if

• At least 75% of enrollees responded

• At least 10 students provided ratings

• Progress rated on 5-point scale

• 1=no progress

• 2=slight progress

• 3=moderate progress

• 4=substantial progress

• 5=exceptional progress

• An average of 4.0 indicates that "substantial progress" is an appropriate summary of the class's progress

Page 7:

OVERALL INDEX OF TEACHING EFFECTIVENESS

• Progress on Relevant Objectives combines ratings of progress on the objectives identified by the instructor as important (weighted 1) or essential (weighted 2)

• The IDEA Center regards this as its single best estimate of teaching effectiveness
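The weighting scheme above can be sketched as follows. The objective ratings are hypothetical; only the 1/2 weighting reflects the report:

```python
# Progress on Relevant Objectives: a weighted average of the class's
# average progress ratings, with Important objectives weighted 1 and
# Essential objectives weighted 2. The ratings below are hypothetical.

def progress_on_relevant_objectives(objectives):
    """objectives: list of (average_rating, 'Important' or 'Essential')."""
    weights = {"Important": 1, "Essential": 2}
    total = sum(rating * weights[kind] for rating, kind in objectives)
    weight_sum = sum(weights[kind] for _, kind in objectives)
    return total / weight_sum

ratings = [(4.2, "Essential"), (3.6, "Important"), (4.0, "Important")]
print(round(progress_on_relevant_objectives(ratings), 2))  # 4.0
```

Because an Essential objective counts twice, a strong or weak rating on it moves this index twice as far as the same rating on an Important objective.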

Page 8:

SUMMARY OF TEACHING EFFECTIVENESS

• Progress on Relevant Objectives (A)

• Relevant objectives are those selected by the instructor on the FIF (Faculty Information Form)

• Weighted average of student ratings of progress on objectives selected as "important" or "essential"

• Overall Ratings

• Average student ratings that the teacher was excellent (B)

• Average student ratings that the course was excellent (C)

• Average of B and C is (D)

• Summary Evaluation: Average of A and D
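The composite described above reduces to simple averaging. A sketch with made-up component scores (only the A/B/C/D structure comes from the report):

```python
# Summary Evaluation: the average of (A) progress on relevant objectives
# and (D), which is itself the average of (B) "the teacher was excellent"
# and (C) "the course was excellent". The scores below are hypothetical.

A = 4.0   # progress on relevant objectives (weighted average)
B = 4.4   # average rating: the teacher was excellent
C = 4.2   # average rating: the course was excellent

D = (B + C) / 2
summary_evaluation = (A + D) / 2
print(round(D, 2), round(summary_evaluation, 2))  # 4.3 4.15
```

Note that the overall-excellence ratings (D) and the progress index (A) carry equal weight in the final summary.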

Page 9:

HOW DO YOUR RATINGS COMPARE WITH THOSE OF OTHER TEACHERS?

• Refer to the comparisons shown on the right hand side of Page 1 of the IDEA Diagnostic Form Report.

• Converted Averages are compared to three groups

• All classes in the standard IDEA database

• All classes in the same discipline

• All classes at RSU

• Institutional and disciplinary norms are updated annually and include the most recent five years of data.

• The IDEA database is updated periodically

Page 10:

WERE YOU MORE SUCCESSFUL IN FACILITATING PROGRESS ON SOME CLASS OBJECTIVES THAN ON OTHERS?

• Refer to the upper portion of Page 2 of the IDEA Diagnostic Form Report.

• Main purpose of this table: to help you focus your improvement efforts

• Twelve objectives are listed; ratings are shown for those objectives identified by the instructor on the FIF as either important or essential

• Ratings for objectives marked as Minor or None are not included

• In the last column:

• Percentage of students rating in the two lowest categories of 1 or 2

• No apparent progress or slight progress

• Percentage of students rating in the two highest categories of 4 or 5

• Substantial progress and exceptional progress

Page 11:

PROGRESS ON RELEVANT OBJECTIVES AS COMPARED TO GROUP AVERAGES

• Converted scores appear in the right-hand section and are compared with the three norm groups

• All classes in the IDEA database

• Discipline (IDEA data)

• RSU (Institutional data)

• The status of each objective is shown relative to other classes in the comparison group

• Much higher (highest 10%)

• Higher (next 20%)

• Middle (40%)

• Lower (next 20%)

• Much lower (lowest 10%)
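The five status bands partition the comparison group by percentile. A sketch that maps a class's percentile rank within its comparison group to the band labels above (the percentile inputs are hypothetical):

```python
# Map a class's percentile rank (0-100) within a comparison group to
# the five IDEA status bands: lowest 10%, next 20%, middle 40%,
# next 20%, highest 10%.

def status_band(percentile):
    """percentile: the class's rank within the comparison group."""
    if percentile >= 90:
        return "Much higher"
    if percentile >= 70:
        return "Higher"
    if percentile >= 30:
        return "Middle"
    if percentile >= 10:
        return "Lower"
    return "Much lower"

print(status_band(93))  # Much higher
print(status_band(55))  # Middle
```

Because the middle band spans 40% of all classes, a "Middle" status is the expected result for most classes and is not a negative finding.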

Page 12:

HOW CAN INSTRUCTION BE MADE MORE EFFECTIVE?

• Refer to Page 3 of the IDEA Diagnostic Form Report.

• Main purpose of instruction is to facilitate progress on objectives that the instructor selects as Important or Essential

• Progress is affected by many factors in addition to teaching methods (e.g., student motivation and willingness to work hard)

• Teaching methods are of critical importance in facilitating progress

• Teaching methods have been grouped into five categories that are linked to the relevant objectives selected by the instructor

• Review your average score, the percent of students rating 4 or 5, and the suggested action

Page 13:

SUGGESTED ACTION COLUMN

• “Strength to retain” – retain these methods regardless of other changes you may make in teaching strategy

• “Consider increasing use” – suggests that increasing the use of these methods may result in more success in facilitating progress

• “Retain current use or consider increasing” – methods currently employed with typical frequency; increasing their frequency may positively affect the learning outcomes

Page 14:

DO SOME SALIENT CHARACTERISTICS OF THIS CLASS AND ITS STUDENTS HAVE IMPLICATIONS FOR INSTRUCTION?

• Refer to the bottom portion of Page 2 of the IDEA Diagnostic Form Report

• Course Characteristics. Students described the class by comparing it to other classes they have taken in terms of

• (1) amount of reading

• (2) amount of work in non-reading assignments

• (3) difficulty

• Average ratings are compared with “All classes” in the IDEA database; if sufficient data were available, comparisons are also made with classes in the broad discipline group in which this class was categorized and all other classes at your institution. Because relatively large disciplinary differences have been found on these three characteristics, the disciplinary comparison may be especially helpful.

Page 15:

DO SOME SALIENT CHARACTERISTICS OF THIS CLASS AND ITS STUDENTS HAVE IMPLICATIONS FOR INSTRUCTION?

Student Characteristics

Students described their motivation by making self-ratings on the three items listed at the bottom of Page 2. These characteristics have been found to impact student ratings of progress.

Page 16:

DETAILED STATISTICAL SUMMARY

Page 4 of the Report provides a detailed statistical summary of student responses to each of the items on the IDEA form, as well as to optional locally devised items, if any.