CURRICULUM-BASED MEASURES: MATH Kat Nelson, M.Ed University of Utah


Objectives

1. You will be able to define a CBM and articulate the big ideas of using math CBM with the CCSS and the MTSS model.

2. You will be able to administer and score screener and progress monitoring probes.

3. You will be able to use the problem solving process to interpret the data produced from the math CBM.

CBM: Big Ideas (Kelly, Hosp, & Howell, 2008)

• “CBM is a quick and reliable method for gathering information about student

performance and progress.”

• CBM is…
  • Aligned with curriculum
  • Valid and reliable
  • Standardized
  • A provider of low-inference information

CBM: Big Ideas (Kelly, Hosp, & Howell, 2008)

• CBM probes are repeated measures that are efficient and sensitive to growth.

• Sensitivity to growth = Informing your instruction frequently.

• Information about performance and growth can be easily shared with stakeholders

• Indicator of future reading and math achievement

Curriculum-Based Measurement and the Common Core State Standards: Big Ideas

Common Core & CBM (Shinn, 2012)

• The Common Core State Standards (CCSS) provide sets of College and Career focused outcomes and annual Criterion Referenced Tests to measure student learning as a summative evaluation.

• The assessment implications of CCSS are clearly related to summative evaluation and accountability

• No single test is sufficient for all of the data-based decisions schools make in their attempts to identify student learning needs: screening, intervention planning/diagnosis, progress monitoring, and accountability/program evaluation.

Common Core & CBM (Shinn, 2012)

• Assessment of CCSS need not be separate items or tests for each standard, but may include “rich tasks” that address a number of separate standards.

• AIMSweb’s Curriculum-Based Measurement (CBM) tests typically are based on these rich tasks that are validated as “vital signs” or “indicators” of general basic skill outcomes.

Common Core & CBM (Shinn, 2012)

• AIMSweb’s CBM tests are consistent with the CCSS. They are content valid.

• AIMSweb’s CBM tests are complementary to the assessment requirements to attain proficiency on the CCSS.

Curriculum-Based Measurement and Multi-Tiered System of Support: Big Ideas

Multi-Tiered System of Support

• Schools identify students at risk for poor learning outcomes

• Monitor student progress
• Provide evidence-based interventions and adjust the intensity and nature of those interventions depending on a student's responsiveness

(NCRtI, 2010)

Key Features of MTSS (Sugai, 2008)

• Universal Design

• Data-based decision making and problem solving

• Continuous progress monitoring

• Focus on successful student outcomes

• Continuum of evidence-based interventions

• A core curriculum is provided for all students
• A modification of this core is arranged for students who are identified as non-responsive

• A specialized and intensive curriculum for students with intensive needs

• Focus on fidelity of implementation

Problem Solving Process

Using CBM within MTSS

• Tier 1: Universal Screening
  • Establishes benchmarks three times throughout the school year

• Tier 2: Progress Monitoring
  • Monitoring at-risk students by assessing monthly

• Tier 3: Intensive Progress Monitoring
  • Frequent assessment for students at risk or with significant needs

Conducting a Math CBM: Directions and Scoring Procedures

Selecting the Measure

• At Kindergarten or Grade 1
  • Oral Counting
  • Quantity Array
  • Number Identification
  • Quantity Discrimination
  • Missing Number

• At Grades 1-8
  • Computation (Mixed and/or Facts)
  • Concepts & Applications

• As appropriate (Grade 9?)
  • Algebra

Let's Take a Look

• Early Numeracy Measures

Let's Take a Look

• Concepts and Applications (M-CAP)

Let's Take a Look

• Computation

Administration of Computation Probe

• Score: the number of correctly written digits in 2 minutes, on probes drawn from the end-of-year curriculum
• Correct digits
  • Not correct problems or answers
  • Why?
• 2 minutes
  • Timing depends on grade and publisher

Computation

• Student(s) are given a sheet of math problems and a pencil

• Student(s) complete as many math problems as they can in 2 minutes

• At the end of 2 minutes the number of correctly written digits is counted

Directions for Computation

• Give the child(ren) a math sheet and pencil
• Say:

“The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say ‘please begin’, start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an ‘X’ through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?”

Directions – Your Turn

• "The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say 'please begin', start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an 'X' through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?"

Directions Continued

• Say "Please begin" and start your timer
• Make sure students are not skipping problems in rows and do not skip around or answer only the easy problems
• Say "Please stop" at the end of 2 minutes

Scoring

• If the answer is correct, the student earns a score equal to the number of correct digits required by the "longest method" taught to solve the problem, even if the work is not shown
• If a problem has been crossed out, credit is given for the correct digits written
• If the problem has not been completed, credit is earned for any correct digits written

Scoring Continued

• Reversed digits (e.g., 3 written as E) or rotated digits, with the exception of 6 and 9, are counted as correct

• Parts of the answer above the line (carries or borrows) are not counted as correct digits

• In multiplication problems, a “0”, “X”, or <blank> counts as a place holder and is scored as a CD

Scoring Continued

• A division BASIC FACT is a problem in which both the divisor and the quotient are 9 or less. If the answer is correct, the total CD always equals 1

• In division problems, remainder zeroes (r 0) are not counted as correct digits

• In division problems, place holders are not counted as correct digits
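The core idea of correct-digit scoring can be sketched in code. The helper below is a hypothetical, simplified illustration, not the full AIMSweb scoring rubric: it right-aligns the student's written answer with the correct answer and counts matching digits, ignoring the special cases above (longest-method credit, multiplication place holders, and division remainder rules).

```python
def correct_digits(student_answer: str, correct_answer: str) -> int:
    """Count digits the student wrote that match the correct answer,
    comparing digit by digit from the right (ones place first).

    Simplified sketch: longest-method credit, place holders, and
    remainder rules from the scoring slides are not implemented.
    """
    written = student_answer[::-1]   # reverse so index 0 is the ones place
    correct = correct_answer[::-1]
    return sum(1 for w, c in zip(written, correct) if w == c)

# A fully correct answer earns one point per digit:
print(correct_digits("128", "128"))  # 3
# A partially correct answer still earns credit for each correct digit:
print(correct_digits("118", "128"))  # 2
```

This is why correct digits are preferred over correct problems: partial knowledge still earns partial credit, making the measure more sensitive to small increments of growth.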

Scoring

Computation Scoring – Your Turn

Put It to Practice: Benchmarking, Survey Level Assessment, and Progress Monitoring

Tier 1: Universal Screening Big Ideas (Hosp, Hosp, & Howell, 2007)

• Provides a reliable and valid way to identify
  • Students who are at risk for failure
  • Students who are not making adequate progress
  • Students who need additional diagnostic evaluation
  • Students' instructional level

• Administered 3 times a year to the entire school
  • 3 probes are given and you take the median score

What Is Proficient? How Much Progress Can We Expect? (Hosp, Hosp, & Howell, 2007)

• Benchmarks – use standards for level of performance that are empirically validated by researchers

• Norms – compare a student's score to the performance of others in her grade or instructional level

Proficiency Levels or Benchmarks for Math CBM (Burns, VanDerHeyden, & Jiban, 2006)

Grade | Placement Level | Correct Digits
2-3   | Frustration     | <14
2-3   | Instructional   | 14-31
2-3   | Mastery         | >31
4-5   | Frustration     | <24
4-5   | Instructional   | 24-49
4-5   | Mastery         | >49
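These cut scores can be turned into a small lookup, which is handy when classifying a whole class of screening results. The function below is a hypothetical helper built directly from the thresholds in the table; below the instructional range is Frustration, above it is Mastery.

```python
# Instructional ranges (inclusive) by grade band, from Burns,
# VanDerHeyden, & Jiban (2006).
RANGES = {"2-3": (14, 31), "4-5": (24, 49)}

def placement_level(grade_band: str, correct_digits: int) -> str:
    """Classify a correct-digit score as Frustration, Instructional,
    or Mastery for the given grade band."""
    low, high = RANGES[grade_band]
    if correct_digits < low:
        return "Frustration"
    if correct_digits <= high:
        return "Instructional"
    return "Mastery"

print(placement_level("2-3", 10))  # Frustration
print(placement_level("4-5", 30))  # Instructional
```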

Norms for Math CBM: Correct Digits (AIMSweb, 2006)

Grade | Percentile | Fall (CD) | Winter (CD) | Spring (CD)
2     | 90th       | 31        | 39          | 43
2     | 75th       | 20        | 30          | 42
2     | 50th       | 12        | 24          | 24
2     | 25th       | 8         | 16          | 17
2     | 10th       | 5         | 10          | 12
5     | 90th       | 68        | 76          | 85
5     | 75th       | 53        | 59          | 69
5     | 50th       | 37        | 45          | 52
5     | 25th       | 25        | 33          | 39
5     | 10th       | 16        | 23          | 27

Making Informed Data-Based Decisions

Spring Benchmark Data for 2nd Grade

Student | Median Score
1       | 22
2       | 35
3       | 37
4       | 10
5       | 42
6       | 47
7       | 13
8       | 27
9       | 42

Making Informed Data-Based Decisions

Spring Benchmark Data for 2nd Grade (sorted by median score)

Student | Median Score
6       | 47
9       | 42
5       | 42
3       | 37
2       | 35
8       | 27
1       | 22
7       | 13
4       | 10
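Ranking the class by median score and flagging students below the grade 2-3 frustration cutoff (<14 correct digits) takes only a few lines. The scores below are the spring benchmark data from these slides; the cutoff comes from the Burns, VanDerHeyden, & Jiban (2006) table earlier.

```python
# Spring benchmark medians for the nine 2nd-grade students above.
scores = {1: 22, 2: 35, 3: 37, 4: 10, 5: 42, 6: 47, 7: 13, 8: 27, 9: 42}

# Rank students from highest to lowest median, as on the slide.
ranked = sorted(scores, key=scores.get, reverse=True)

# Flag students below the grade 2-3 instructional range (<14 CD).
at_risk = sorted(s for s, cd in scores.items() if cd < 14)
print(at_risk)  # [4, 7]
```

Students 4 and 7 fall in the frustration range and would be candidates for Tier 2 or Tier 3 support.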

Survey Level Assessment (Hosp, 2012)

• Purposes
  • To determine the appropriate instructional placement level for the student
    • The highest level of materials in which the student can be expected to benefit from instruction
  • To provide baseline data, or a starting point for progress monitoring
    • In order to monitor progress toward a future goal, you need to know how the student is currently performing

Survey Level Assessment (Hosp, 2012)

1. Start with grade-level passages/worksheets (probes)

2. Administer 3 separate probes (at the same difficulty level) using standard CBM procedures

3. Calculate the median (i.e., find the middle score)

4. Is the student's score within the instructional range?
  • Yes: this is the student's instructional level
  • No: if above level (too easy), administer 3 probes at the next level of difficulty
  • No: if below level (too hard), administer 3 probes at the previous level of difficulty
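Steps 2-4 above can be sketched as a single decision function. This is a hypothetical sketch: the instructional range defaults to the grade 2-3 cut scores (14-31 correct digits) from the earlier benchmark table, and the example probe scores are the ones worked through on the SLA slide.

```python
from statistics import median

def sla_decision(probe_scores, instructional_range=(14, 31)):
    """Take the median of three probes at one difficulty level and
    decide whether to stay, move up, or move down a level."""
    m = median(probe_scores)
    low, high = instructional_range
    if m < low:
        return m, "too hard: administer 3 probes at the previous level"
    if m > high:
        return m, "too easy: administer 3 probes at the next level"
    return m, "instructional level found"

print(sla_decision([7, 12, 10]))   # median 10: move down a level
print(sla_decision([25, 23, 27]))  # median 25: instructional level found
```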

Survey Level Assessment

Grade Level | Probe 1 | Probe 2 | Probe 3 | Median | Placement
2           | 7       | 12      | 10      | 10     | Frustration
1           | 25      | 23      | 27      | 25     | Instructional

Progress Monitoring Big Ideas: Tiers 2 & 3

• Purpose (Hosp, Hosp, & Howell, 2007):
  • To ensure that instruction is working
  • To signal when a change is needed
  • To guide adjustments in the program

• Frequency:
  • Tier 2: monthly, to show progress and to inform instruction
  • Tier 3: weekly to bi-weekly, to ensure that the most treatment-resistant students are making progress

Progress Monitoring: Determine the Goal

Calculating the Aim Line

• Median score from SLA or benchmark + (number of weeks × rate of improvement) = goal

• Student 4: 25 + (20 × 0.50) = 35
  • Goal = 35 correct digits in 20 weeks
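The goal formula, and the week-by-week aimline it implies, can be computed with two short helpers; Student 4's numbers from this slide serve as the example.

```python
def goal_score(baseline: float, weeks: int, weekly_rate: float) -> float:
    """Median from SLA/benchmark + (number of weeks x rate of improvement)."""
    return baseline + weeks * weekly_rate

def aimline(baseline: float, weeks: int, weekly_rate: float) -> list:
    """Expected score at each week, from week 0 (the baseline) to the goal."""
    return [baseline + w * weekly_rate for w in range(weeks + 1)]

# Student 4: baseline of 25 CD, 20 weeks, ambitious grade-1 rate of 0.50 CD/week.
print(goal_score(25, 20, 0.50))  # 35.0
print(aimline(25, 4, 0.50))      # [25.0, 25.5, 26.0, 26.5, 27.0]
```

Plotting each weekly probe against the corresponding aimline value is what makes the "is the intervention working?" decision visual.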

Weekly Growth Rates for Math CBM: Correct Digits (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)

Grade | Realistic growth rate per week (CD) | Ambitious growth rate per week (CD)
1     | 0.30                                | 0.50
2     | 0.30                                | 0.50
3     | 0.30                                | 0.50
4     | 0.70                                | 1.15
5     | 0.75                                | 1.20
6     | 0.45                                | 1.00

Your Turn: Calculate the Goal for Student 1

2nd Grade: Spring Benchmark Scores

Student | Median Score
6       | 47
9       | 42
5       | 42
3       | 37
2       | 35
8       | 27
*1*     | 22
7       | 13
4       | 10

Calculate

Grade | Realistic growth rate per week (CD) | Ambitious growth rate per week (CD)
2     | 0.30                                | 0.50

Median score from SLA or benchmark + (number of weeks × rate of improvement) = goal
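One way to check your answer: applying the formula to Student 1 (spring median of 22) with the grade 2 rates gives the goals below. The 20-week goal period is an assumption carried over from the Student 4 example; a different period would change the result proportionally.

```python
baseline = 22   # Student 1's spring benchmark median
weeks = 20      # assumed goal period, matching the Student 4 example

# Grade 2 weekly growth rates (Fuchs et al., 1993)
realistic_goal = baseline + weeks * 0.30
ambitious_goal = baseline + weeks * 0.50

print(realistic_goal)  # 28.0
print(ambitious_goal)  # 32.0
```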

Making Informed Data-Based Decisions

[Line graph: Student 4, 1st grade probe. Progress monitoring scores (benchmark: 25; week 1: 23; week 2: 27; week 3: 33) plotted against the aimline.]

• Is our intervention working?
• What changes should we make?

Progress Monitoring: Another Look

[Line graph: Student 4, 1st grade progress monitoring over 8 weeks. Scores (25, 23, 27, 33, 35, 28, 27, 34, 25) plotted against the aimline. Annotation: "Intervention Recommended"]
Recommended