
Regents Item Analysis Reports

CFN 603 – Larry Pendergast, Network Leader
Gary Carlin [email protected]

Michael Tancredi [email protected]


A Definition …

• Assessment is a process of gathering and documenting information (data) about the:

– Achievement
– Skills
– Abilities
– Personality variables of an individual


Assessments Allow Us …

• to learn more about the competencies and deficiencies of the individual being tested.

• to identify specific problem areas and/or needs.

• to evaluate the individual's performance in relation to others, or to a set of standards or goals.


And Also …

• Provide teachers with feedback on the effectiveness of:
– Instruction
– Equity Strategies
– Interventions

• Predict an individual's aptitudes or future capabilities.


It’s All About Data
“Data literacy” means that a person possesses a basic understanding of how data can be used to inform instruction.

ARIS

ARIS Private Community

• This will be done for you by your supervisor!

• Need to download the tool that was uploaded into your ARIS private community

• Enable all the macros of the tool

Tool Tabs
• Tab for each of the Regents subject areas (only scanned exams)

• Click on a tab to see all students who were registered to take that exam

• Will NOT show students who took a Regents at another school

This section displays individual student answer selections. Incorrect answers are shaded white. Answers that read ‘M’, ‘-’, or ‘X’ mean the student either left the item blank or bubbled in more than one answer. Students who were registered for the exam but did not submit an answer sheet will not have information in this section.

For multiple choice questions, this section shows you the percentage of students who selected each answer choice. The percentage of students who answered correctly is in the ‘check’ column.

This section shows you the question type (MC, CR, etc.), question #, and the correct answer.

These columns display the content information each question is linked to.

Student Information
• Student Name, OSIS number, and DBN of the school they were active in at time of test
• IEP & ELL status: ‘0’ - No, ‘1’ - Yes
• Ethnicity code: numeric (before last school yr) and by letter (past school yr)
• Status: ‘A’ - active, ‘D’ - discharged students
• Cohort and Grade Level (at time of test)
• Score: each student’s cumulative score, 65+ (green), 55-64 (yellow). At the top of row (S), the average score earned by a student.
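A minimal sketch of that score color-coding rule in Python; the red band for scores below 55 is an assumption extrapolated from the G/Y/R scheme used elsewhere in the tool:

```python
def score_color(score: int) -> str:
    """Color-code a cumulative Regents score the way the tool does."""
    if score >= 65:
        return "green"   # passing: 65+
    if score >= 55:
        return "yellow"  # 55-64 band
    return "red"         # assumption: scores below 55 shade red

print(score_color(72), score_color(58), score_color(41))  # green yellow red
```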

Ethnicity (# - before last school yr)
• 1 – American Indian or Alaskan Native
• 2 – Asian or Pacific Islander
• 3 – Hispanic
• 4 – Black (not of Hispanic origin)
• 5 – White (not of Hispanic origin)
• 6 – Parent refused to sign
• 7 – Multi-Racial

Ethnicity (letter - past school yr)
• A – Hispanic
• B – American Indian or Alaskan Native
• C – Asian
• D – Native Hawaiian or Other Pacific Islander
• E – Black
• F – White
• G – Multi-Racial
– Other

Cohort
• N - 2008
• O - 2009
• P - 2010
• Q - 2011
• R - 2012

For Each Question # …
• Content Standard (strand)
• Question Type (T): MC, CR, etc.
• Question Number (#)
• Correct Answer (✓): A, B, C, D
• Distractor Analysis (A, B, C, D)
• % Correct (G, Y, R)
• Individual Student Answers (A, B, C, D)

Individual Student Answers

• Incorrect – White
• Correct – Light Green
• M – Multiple Answers
• X – Missing Answer (left blank)
• (-) - ???

Row
• Green - High
• Yellow - Middle
• Red - Low


Multiple Choice Tests
• “Well-designed multiple choice tests are generally more valid and reliable than essay tests …”
– they sample material more broadly
– discrimination between performance levels is easier to determine
– scoring consistency is virtually guaranteed

The Center for Teaching and Learning, UNC, 1990


Item Analysis

• A method of assessing how well a question on a test measures student performance.

• Analyzing each item on a test to determine the proportions of students selecting each answer.

• Evaluates student strengths and weaknesses.
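As a minimal sketch of that proportion calculation, assuming a simple list of student answers for one item (the data below is made up):

```python
from collections import Counter

# Hypothetical responses to one multiple-choice item; "X" marks a blank.
responses = ["A", "C", "C", "B", "C", "D", "C", "A", "C", "X"]
correct_answer = "C"

counts = Counter(responses)
total = len(responses)

# Proportion of students selecting each choice (one row of a distractor analysis).
for choice in ("A", "B", "C", "D", "X"):
    pct = 100 * counts.get(choice, 0) / total
    marker = " <- key" if choice == correct_answer else ""
    print(f"{choice}: {pct:.0f}%{marker}")
```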

Main Uses of Regents Examination Item Analysis

• Understanding patterns of achievement in your school-wide Regents performance

• Identifying particular areas in which students need assistance


What Does Item Analysis Tell Us?

• Questions students were guessing on.
• Most difficult questions (reteach).
• Misconceptions based on incorrect responses.
• Areas of strength (compact).
• Flaws in the test (eliminate, do not count).

• “… it is five times faster to revise items that didn’t work, using item analysis, than trying to replace them with completely new questions.”


In Addition …
• The proportion of students answering an item correctly affects its discrimination power.

• Items answered correctly (or incorrectly) by a large proportion of students (85%+) have markedly reduced power to discriminate.

• Optimum discrimination power comes from items that will be answered correctly by 30% to 80% of the students.
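A sketch of the two quantities at play here: the Difficulty Index (proportion correct) and a simple upper-lower discrimination index. The (score, correct) pairs are invented for illustration:

```python
# Each tuple: (total test score, answered this item correctly?). Invented data.
students = [(92, True), (88, True), (75, True), (70, False), (66, True),
            (61, False), (58, True), (52, False), (45, False), (40, False)]

difficulty = sum(ok for _, ok in students) / len(students)  # higher = easier item

# Upper-lower method: compare correct counts in the top and bottom thirds by score.
ranked = sorted(students, key=lambda s: s[0], reverse=True)
third = len(ranked) // 3
upper_correct = sum(ok for _, ok in ranked[:third])
lower_correct = sum(ok for _, ok in ranked[-third:])
discrimination = (upper_correct - lower_correct) / third

print(f"difficulty index: {difficulty:.2f}")          # 0.30-0.80 is the sweet spot
print(f"discrimination index: {discrimination:.2f}")  # near 0 = weak discriminator
```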


Multiple-Choice Stem
• Poses a problem or states a question.

• Direct questions are best, but incomplete statements are necessary sometimes.

• Rule: Students should be able to understand the question without reading it several times and without having to read all the options.


Distractors
• Incorrect choices for a test item.

• For every test question, each distractor should be selected by some of the students (no less than 5% of the total).

• Information on the frequency of selection can help teachers identify student/class/school misconceptions and problems.
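A small sketch of that 5% rule, using the response counts from Sample 3 later in the deck (the keyed answer is assumed to be B):

```python
# Response counts for one item (Sample 3 in this deck); key assumed to be B.
counts = {"A": 6, "B": 85, "C": 0, "D": 9}
key = "B"
total = sum(counts.values())

# Flag any distractor drawing fewer than 5% of all responses.
for choice, n in counts.items():
    if choice != key and n / total < 0.05:
        print(f"Distractor {choice}: only {n}/{total} responses - revise or replace.")
```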


Remember the Standards

• Assessments must be valid: they must accurately assess what students learn in relation to the standards.

• Instructional unit objectives must be linked to the state content standards.


Activity #

Interpreting the Item Analysis


Sample 1 Question

Answer     Selected by (#)
A          60
B          10
C          70
D          10
O (omit)   0


1. Ambiguous Items
• Almost equal numbers of students chose A (60) and C (70).

• Students didn’t know the material?

• Item was defective? A could be defensible.

• Reteach/Rescore


Sample 2 Question

Answer   Selected by (#)
A        43
B        38
C        50
D        36
O        0


2. Equal Responses
• Students are responding about equally to all alternatives.

• Guessing?
• Wasn’t taught yet.
• Too difficult.
• Too badly written.

• Remove/Rescore
• Reteach
• Occasional challenge question – top students


Sample 3 Question

Answer   Selected by (#)
A        6
B        85
C        0
D        9
O        0


3. Distractors Not Chosen
• No one selected C.

• Replace alternative C.
• If a 4th distractor is NOT possible, use 3 or don’t reuse the item.

• Each distractor should attract at least 5% of the students!!!


Sample 4 Question

Answer   Selected by (#)
A        85
B        15
C        17
D        59
O        0


4. Distractor Too Attractive
• Too many students (Upper and Lower) select A.

• No one distractor should get more than the key (or more than about half the students).

• Use the item this time.

• Weaken the distractor in the future.


Sample 5 Question

Answer   Selected by (#)
A        3
B        990
C        2
D        1
O        0


5. Too Easy
• Almost everyone got the question correct (Upper and Lower).

• Won’t discriminate well: if difficulty is over 85%, the item is of little value.

• Remember: the higher the Difficulty Index, the easier the question.

• Too Difficult = a Difficulty Index below 30% - 35%
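As a worked check using the Sample 5 counts above, the Difficulty Index is simply the proportion answering correctly:

p = 990 / (3 + 990 + 2 + 1) = 990 / 996 ≈ 0.99

That is well above the 85% threshold, so this item has almost no power to discriminate.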


Sample 6 Question

Answer   Selected by (#)
A        13
B        19
C        17
D        156
O        32


6. Omitted Questions

• Many students omitted this question (O = 32).

• Were they near the end of the test? On the back of the page? (format problem)

• Was the test too long or too difficult?


Where did Students do Well?


Where Students had Difficulties

Before Viewing IA
• Provide subject area teachers with a copy of their specific Regents examinations.

• Task: identify 5 questions (each) from the M/C section that they believe students would find the …

– “Least Challenging” (circle) – Why?
– “Most Challenging” (star) – Why?

Item Discussion
What is this question asking students to know (content) and be able to do (process/skills)?

What is/are the reason(s) this was a difficult question for our students?

How does this question assess student mastery of an idea, concept, … , and/or skill/process?

What are the implications for instruction?

Most Challenging Questions
The Stem: Where’s the Problem?
– Pre-Question Wording
– Question Phrase/Incomplete Statement
– Diagram(s)/Table/Graph
– Content-Related

Why are the Distractors Attractive?
– Cross out the answer box
– List “attractive” characteristics

1. Uses a key word that was stressed in the course
2. Answer contains more content information than the other choices
3. Answer connects to an incomplete or wrong reading of the question
4.

Next …
• Put your 5 Most/Least Challenging questions into RANK ORDER.

– “Least Challenging”: 1. Easiest --- 5. More Complex
– “Most Challenging”: 1. Hardest --- 5. Less Complex

Now …

• For each question, put the answer choices (1, 2, 3, 4) into rank order based on what you believe your students’ frequency of response will be.

• 1 – most frequent --- 4 – least frequent

• Explain your ranking

Distractor Attention
The distractor that attracted the …

… most attention – why?
… least attention – why?

… “middle value” – revise it to make it more challenging.

… least attention – revise it to make it more challenging so it attracts some attention (if close to or at zero).

Let’s not forget the Standards
• How well do you think your students did on the Regents exam in terms of the Standards and/or Key Ideas?

• In your content area groups, look at the “Map to Core Curriculum” (MCQ by Standard) for your Regents. Discuss the questions in each Standard and/or KI.

• Rank them: Most to Least Challenging

English Standards
• Standards 1, 2, 3
• Listening, Reading, Writing

Integrated Algebra Content Strands
• Number Sense and Operations
• Algebra
• Geometry
• Measurement
• Statistics & Probability

Global/US History Standards
• 1. US and NY History
• 2. World History
• 3. Geography
• 4. Economics
• 5. Civics, Citizenship, and Government

Living Environment Standards/KI
• Standards 1 and 4; Appendix A, Part D, Labs 1-4
• Standard 4: Key Ideas 1-7

Earth Science
• Standards 1, 2, 4, 6, 7
• Key Ideas
• ESRT

Reflections on Item Analysis

What surprised me most about the data was …

From the data I learned …

The impact of the data on my future teaching/assessment will be …

In thinking about revising the curriculum, the data clearly indicates …

MCQ and Bloom’s Taxonomy
• In most MCQ tests we find 3 levels:

• 1. Knowledge: remembering (recalling) appropriate, previously learned information.

• 2. Comprehension: grasping (understanding) the meaning of informational materials.

• 3. Application: the use of previously learned information in new and concrete situations to solve problems that have single or best answers.

Verbs to Express Level
• 1. Knowledge: tell, list, describe, relate, locate, write, find, state, name

• 2. Comprehension: explain, interpret, outline, discuss, distinguish, predict, restate, translate, compare, describe

• 3. Application: solve, show, use, illustrate, construct, complete, examine, classify

Comprehension Question Stems

• Can you write in your own words...?
• Can you write a brief outline...?
• What do you think could have happened next...?
• Who do you think...?
• What was the main idea...?
• Who was the key character...?
• Can you distinguish between...?
• What differences exist between...?
• Can you provide an example of what you mean...?
• Can you provide a definition for...?

Application Question Stems

• Do you know another instance where...?
• Could this have happened in...?
• Can you group by characteristics such as...?
• What factors would you change if...?
• Can you apply the method used to some experience of your own...?
• What questions would you ask of...?
• From the information given, can you develop a set of instructions about...?
• Would this information be useful if you had a ...?

Knowledge Question Stems
• What happened after …?
• How many …?
• Who was it that …?
• Can you name …?
• Describe what happened at …?
• Who spoke to …?
• Can you tell why …?
• Find the meaning of …?
• What is …?
• Which is true/false …?

Activity #: Ranking Regents Questions with Bloom’s Taxonomy

#   Level (1, 2, 3)   Question Stem/Verb Used to Express Level   Your Evaluation: Appropriate (A), Reasonable Challenge (C), Too Difficult (D) --- Why?
1
2
3
4

Importing IA to Excel
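The deck gives no detail for this step; one possible route, sketched here with pandas, is to pull the exported item-analysis grid into a script for the tagging work that follows. The file and sheet names are assumptions:

```python
import pandas as pd

# Hypothetical export of the ARIS item-analysis tool; names are assumptions.
ia = pd.read_excel("regents_item_analysis.xlsx", sheet_name="Living Environment")
print(ia.head())  # inspect the question-level columns before tagging
```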

“Tagging”
• Collecting data on Regents M/C questions that goes beyond “Standard” and “Key Idea” (as seen in the scoring guide).

• Some data is discrete (yes or no) and some data varies (e.g., the type of diagram used in the question).

• Allows us to determine exactly how different conditions (as many as you identify) in a question affect student performance.
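A minimal sketch of that condition-by-condition comparison; the tags and percent-correct figures below are hypothetical:

```python
import pandas as pd

# One row per M/C item: two yes/no tags plus percent correct from the IA report.
questions = pd.DataFrame({
    "question":    [1, 2, 3, 4, 5, 6],
    "diagram":     ["Yes", "No", "Yes", "No", "Yes", "No"],
    "which_stmt":  ["Yes", "Yes", "No", "No", "No", "Yes"],
    "pct_correct": [48, 72, 55, 81, 39, 68],
})

# Mean performance under each condition: do tagged features coincide with lower scores?
for tag in ("diagram", "which_stmt"):
    print(questions.groupby(tag)["pct_correct"].mean(), "\n")
```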

Tagging Regents Questions

• Standards
• Key Ideas
--------------------
• Text, only
• Diagram
• Graph
• Table/List
• Arrows

Question Tags: Yes/No

TAG                             TAG OPTIONS
“Which Statement …”             Yes, No
“Most likely/direct/closely”    Yes, No
“Best describes/represents”     Yes, No
Diagram                         Yes, No
Inference/Conclusion            Yes, No
“Inc/dec/red/decline”           Yes, No

Tags: Variable Data

TAG            TAG OPTIONS
Question Type  1. Definition/Example, 2. Sequence/Equation, 3. Explain, 4. Result, 5. Pair, 6. Proof/Justification/Calculation/Read/Infer
Diagram Type   1. Compare & Contrast, 2. Before & After, 3. Step-by-Step, 4. Representational, 5. Parts, 6. Flow Chart, 7. One-Picture Story, 8. Web, 9. Pyramid, 10. Tree, 11. Map
Arrow Type     1. Solid, 2. Dashed, 3. Hollow, 4. With Words, 5. Uni-direction, 6. Multi-direction, 7. Straight, 8. Curved, 9. Wavy, 10. Single, 11. Double line, 12. R, 13. L, 14. Up, 15. Down, 16. Compass
Answer Type    1. Text, 2. Diagrams, 3. Graphs, 4. Tables, 5. Combination, 6. Numbers
Answer Text    1. 1-word set, 2. 1-2 word set, 3. 1-3 word set, 4. Phrase/Incomplete Sentence, 5. 1-Sentence, 6. Lists, 7. Pairings
Content/Non-Content Vocabulary   Number varies (# / #)

7 Diagram Types
• 1. Compare & Contrast
• 2. Before & After
• 3. Representational
• 4. Parts
• 5. Step-by-Step
• 6. Flow Chart
• 7. Complete Process or 1-Picture Story

Six Types of Questions
• 1. (Concept/Process) Def/Ex
• 2. Sequence/Equation
• 3. Explanation
• 4. Result
• 5. Pairing
• 6. Proof/Just (Calc/Read/Infer)

Non-Content Vocabulary June 2011
(blank A-Z grid for collecting non-content vocabulary, one cell per letter)

Test Item Analysis (TIA)
• Use released test items as the starting points for designing standards-based instruction.

• After the test item is thoroughly analyzed, the focus shifts toward researching the associated Learning Expectation using information found in the National/State/CCLS standards documents.

• The TIA process outlined in STEMresources.com provides a mechanism for developing an engaging standards-based lesson with an accompanying formative assessment.

Test Item Analysis (TIA)
• Using the IA, look for items in testing categories for which student performance has lagged (or in topic areas that have proven to be problematic for instruction).

• Teachers go to www.nysed.gov to locate the released NYS Regents test items.

• Or use the Standardized Test Item Finder to quickly and conveniently access a broad array of test items available on the Web.

Teachers identify the …
• Major Concepts and Big Ideas that students would need to understand to successfully complete the test item.

• The requisite process skills (inquiry or problem solving) that are embedded in the test item.

• The cognitive demand, as indicated by the Webb Depth of Knowledge level.

Start with Basic Item Information
– Stimulus: the section of the test item that creates the context for the question or task.
– Stem: the statement or question used to frame a multiple-choice item.
– Diagram:
– Distractors: incorrect options within a test item.
– Correct answer choice:
– What makes this the correct choice?

Then consider …
• Concepts and Big Ideas
– What major concepts/ideas do students need to understand to successfully complete this test item?

• Process (Inquiry or Problem Solving) Skills
– What abilities do students need to successfully complete this test item?

• Webb’s Depth of Knowledge (DOK) Levels
– What is the DOK Level for this test item?
– Recall • Skill/Concept • Strategic Thinking • Extended Thinking

Instructional Alignment

• Teachers design a Learning Experience that is directly related to the content found in the original test item.

• Includes: lesson and assessment