

Running head: USE OF INTERIM ASSESSMENTS

Exploring the Use of Interim Assessments

John Palmer Bunker

December 5, 2010

University of Colorado Denver

Research in Information & Learning Technologies

Dr. Laura Summers


Section 1: Focus and Framing 

Introduction and Problem Statement 

The enactment of the No Child Left Behind (NCLB) legislation in 2002 has caused an increased focus on the collection and use of student achievement data as a means of improving school performance. The amount of student data being collected and tracked, the investments in technological infrastructure to store and report data, and the amount of professional development focused on improving teachers' use of this data are clear indicators of this trend (Bambrick-Santoyo, 2008; Borja, 2006; Burch, 2010; Hoff, 2006; Honawar, 2006; Zehr, 2006).

Burch (2010) cites a survey of large urban school districts indicating that 82% of respondents had invested in interim assessment technology, which corresponds with the 70% of superintendents administering interim assessments and the 10% planning to start, as reported by Lynn Olson in a 2005 article in Education Week.

The tests students are taking and the data that they provide land on the desks of 

teachers, principals, curriculum directors, and district administrators with an almost implied

sense that "if we test and have the data, we will improve." What is written about interim

assessment is quite speculative in nature, amounting to conclusions that many of these systems

consume precious instructional time and resources, are unlikely to lead to student learning, and

may actually harm students through diminished motivation to learn and decreased self-efficacy

(Shepard, 2008).

It is important, then, to evaluate the ways this large amount of data is actually being used in practice by teachers, assessment coordinators, coaches, and administrators.

Purpose and Intended Audience 

Interim assessments are gaining popularity for helping educators make instructional decisions. These

systems claim to diagnose gaps in student learning, evaluate instructional approaches and curricula, and

provide useful information to improve classroom instruction and increase student learning. To

substantiate these claims, advocates rely on extensive research showing how formative assessment

practices can substantially impact student learning (e.g., Marzano, 2006). Few studies, however, have been conducted to examine whether interim assessment systems are being used for formative purposes or to assess their impact on teaching and learning.

This action research project is intended for teachers, instructional coaches, and administrators to

reflect upon and implement best practices for using interim assessment data in education.

Research Questions 

This paper investigates the following research questions:

• How are teachers and administrators across the state of Colorado using interim assessments in their educational practices?

• What are best practices for using interim assessment programs in education?

Section 2: Review of Literature 


Educators are keenly interested in leveraging the power of both formative and summative assessment purposes to boost student performance on state tests used for accountability, to analyze instructional approaches and curricula, and to provide information for improving classroom instruction and increasing student learning.

This privatization of traditionally public institutions has caused controversy regarding

how interim assessments are classified. Marian Perie, Scott Marion, and Brian Gong (2009) with

the National Center for the Improvement of Educational Assessment, Inc. spelled out a clear

definition of interim assessments as follows:

"Assessments administered during instruction to evaluate students' knowledge and skills relative to a specific set of academic goals in order to inform policymaker or educator decisions at the classroom, school, or district level. The specific interim assessment designs are driven by the purposes and intended uses, but the results of any interim assessment must be reported in a manner allowing aggregation across students, occasions, or concepts."

Perie et al. (2009) provide a framework for evaluating these popular assessments, which are being marketed to states and districts as "benchmark," "diagnostic," "formative," and/or "predictive," with potential for improving student performance and meeting requirements set forth in NCLB.


Advocates argue that the characteristics of frequency, brevity, and the ability to identify possible learning disabilities make this form of assessment useful and user-friendly to educators. Lai (2009) claims, "The theory of action underlying interim assessment makes intuitive sense." Additionally, formative assessment has significant empirical data supporting its use (Black & Wiliam, 1998).

The literature on formative assessment focuses on interim assessments' potential for identifying content and skills that students have not mastered, targeting instruction to ability-grouped students, selecting alternative pedagogies, and diagnosing students with particular learning difficulties who may profit from more intensive instruction (Bloom, Hastings, & Madaus, 1971; Cowie & Bell, 1999; Popham, 2006). Black and Wiliam (1998) drew upon this research to examine the effects of formative assessment practices on students' cognitive and affective outcomes and summarized their findings from approximately 250 empirical studies:

1) assessment information is actually used to modify teaching; 2) instruction is modified in real time, as opposed to after the instructional cycle is complete; and 3) the intent or purpose is to improve student learning. (Black & Wiliam, 1998)

In fact, the research behind formative assessment is so vast that Nichols, Meyers, and Burling (2009) go as far as to claim, "In technical discussions, the use of the phrase 'formative assessment' is an implied claim of validity. Just as validity refers to a particular interpretation …"


Many advocates and vendors are invoking the term "formative" to convey the effectiveness of their products (Popham, 2006). With this label, proponents draw on the extensive research done by Black and Wiliam (1998) and others for empirical evidence because they believe interim assessments might have the same potential for making a difference in student learning outcomes. Because interim assessments are designed to increase student achievement on summative tests, the comparison to formative assessment appears valid. Many educators, however, may not fully appreciate the differences between the types of assessment and end up investing scarce resources and professional development time in an essentially untested product (Lai, 2009).

The importance of assessment systems lies in going "beyond simply providing data" and in providing educators with strategies for the interpretation and use of data in modifying classroom instruction. Vital to a successful interim assessment system is knowing its intended purpose in the teaching and learning cycle (Perie et al., 2009).

Formative and Summative Purposes of Assessment 

Michael Scriven was among the first to distinguish formative from summative evaluation,

although he was writing about program evaluation rather than evaluation of students. He points

out that formative evaluation goes hand-in-hand with implementation. Information from

evaluation is used to modify and improve programs when a program is still malleable. On the

other hand, summative evaluation occurs after the program has concluded, primarily for the

purpose of making decisions about the program. This distinction offers guidance for complementing summative evaluation with formative student assessments. Scriven

(1996) has also emphasized that this dichotomy is misleading; what distinguishes the two is their

use, which is highly context-dependent.

Black and Wiliam (1998) have argued that "formative-ness" is not a quality of the assessment instrument itself, but rather describes the way it is used. Perie et al. (2009) categorize interim assessments by three major purposes:

• Predictive: forecasting students' success on end-of-year, summative tests

• Evaluative: providing information on the effectiveness of particular teaching methods or curricula

• Instructional: improving teaching and learning

Arguably, instructional use of data relates most closely to the definition of formative assessment. Predictive and evaluative uses of the data are more summative in nature and fall outside the realm of the studies conducted by Black and others (Popham, 2006).

Nichols et al. (2009) argue that for assessments to be accurately labeled as "formative," they must be accompanied by evidence demonstrating that analysis of their results directly leads to increased student achievement. They contend that interim assessments have been marketed as "formative," promising to increase student achievement, without any evidence that specific products actually lead to increased achievement (Linn, 2007; McMillan, 2007; Nichols, Meyers, & Burling, 2009; Perie et al., 2009; Popham, 2006; Shepard, 2008; Shepard, 2009; Wiliam & Thompson, 2008).


Commercialization and Critics 

Many of the vendors of these assessments describe their interim assessments as formative assessments, but using interim assessments to impact learning has very little research behind it. The connection between the formative assessment research and interim assessments is purely speculative, as the use and purpose of interim assessments vary greatly. Many of these interim assessments do not even appear to qualify under the definition of formative assessment set forth by Black and Wiliam (1998) and Popham (2006).

Assessment vendors stand to reap the benefits of this surge in the use of interim assessments. As Olson (2005) reports, one market research firm estimated that by the year 2006 sales of interim assessment systems would reach over $320 million annually. Applying institutional theory, Patricia Burch (2010) from the University of Southern California looked at the complex dynamic behind the highly profitable interim assessments being sold by private vendors. She notes how early institutional theory in education exists outside of the private sector and points to the importance of closely watching this growing trend and the relationship between the public and private sectors. With these assessments being marketed as "formative," the definition of that term becomes critical in their evaluation.

Notably, the Council of Chief State School Officers (CCSSO) established a group known as FAST (Formative Assessment for Students and Teachers), with its first focus being to establish this clear definition. Popham (2006) summarizes this definition as follows:

"An assessment is formative to the extent that information from the assessment is used, during the instructional segment in which the …"


Researchers have also emphasized the need for formative assessment to involve student participation in the process, to provide feedback to students, and to use high-quality instruments. Other components of successful

formative assessment are clear learning targets (Brookhart, 2008; Sadler, 1989; Wiliam & Thompson, 2008); learning progressions with next steps for students (Wilson & Draney, 2004); assessment tasks that are curriculum-embedded, designed to reveal students' thinking processes, and instructionally meaningful (Shepard, 2006); and the timely, almost immediate, availability of results (Popham, 2006). Specifically, Sadler (1989) envisioned a situation in which both students and teachers are engaged in the process of interpreting assessment results with respect to their own learning goals and their progress toward those goals.

Often referring to interim assessments as "a mile wide and an inch deep," critics contend the

results are too superficial to provide reliable estimates of sub-skills (Linn, 2007) or specific,

diagnostic information (Shepard, 2008); are too removed from classroom instruction to relate to

desired learning goals (Pellegrino & Goldman, 2008; Shepard, 2008); are too limited in item

format to elicit evidence of students‘ thinking processes (Shepard, 2008); and generally are not

properly suited to provide useful information to impact instruction. Another critique is that the

narrow focus of the test leads teachers to focus on "bubble" students at the expense of the

moderate and high achievers (Linn, 2007).

Effective Instructional Use

Assessments should be useful, provide actionable information for improving instruction, and be feasible and worth the money and time schools invest in them. According to Abrams (2007),

assessments should represent important learning goals, be substantively linked to instructional

units, be consistent with curriculum sequencing, and provide information that is not available

from other sources.

Multiple assessment types should complement one another with each being customized to

serve its purpose and linked to the same content standards (Pellegrino & Goldman, 2008).

Interim assessments may function as an intermediate level between external accountability

assessments used for summative purposes and classroom assessments used for formative

purposes (Perie, et al., 2009; Perie, Marion, Gong, & Wurtzel, 2007; Pellegrino & Goldman,

2008).

In attempting to address multiple purposes, Pellegrino and Goldman (2008) argue that

compromises will be unavoidable. Changes that bring the interim assessment closer in line with

classroom instruction will provide relevant results for improving teaching and learning but may

diminish the relationship between performance on the interim assessment and performance on

the accountability test.

In hopes of discovering best practices for how interim assessments can be used to improve student learning, Lai (2009) from the University of Iowa offered the first empirical evidence regarding the way these assessments were actually being used and the first known instruments for measuring teachers' interim assessment practices. Her research location, Iowa, provided a

unique opportunity to study how interim assessment data is used.

Iowa state legislation requires two additional types of assessments beyond the state accountability test. The first fulfills a "multiple measures" requirement, and its purpose is to facilitate comparisons between Iowa

students and other students in the nation. Second, all districts are required to administer a

diagnostic reading assessment to all students in grades K-3 at least twice a year so that parents

can be notified of their child‘s reading progress (Lai, 2009).

Her results indicated that score reports did not provide enough detailed information about student performance to enable teachers to give students high-quality feedback regarding strengths and areas for improvement. Thus, combination assessments appear to lack many of the features

hypothesized to facilitate formative use of results.

Lai‘s study points to several factors associated with significant learning gains:

1) qualities of the instruments themselves (item formats, alignment

of tasks with classroom instruction and content standards), 2)

aspects of use (communication of learning targets and quality

criteria, provision of effective feedback, and student involvement),

and 3) institutional supports likely to facilitate formative use

(professional development) were used to construct survey

instruments designed to measure how schools and teachers are

using interim assessment results. (Lai, 2009)

While school administrators try to use one instrument to satisfy several purposes, Lai's (2009) survey results suggest this strategy may not be successful, as the characteristics that make an assessment well suited to one purpose may limit its usefulness for another.

Assessments used for external accountability are far removed from classroom instruction on a number of dimensions: 1) the content represented on the tests is not a complete match to content emphasized during instruction; 2) the timing of the assessment is distal from the instructional cycle, occurring after relevant instruction has ended, which does not allow results to feed back into instructional improvement; 3) the purpose of the assessment is to enable inferences about the effectiveness of state and district educational systems and to guide the allocation of resources, rather than to provide information concerning an individual student's accomplishments; 4) the primary intended users are policymakers and administrators rather than teachers and students; and 5) the locus of control regarding assessment selection and administration is centralized (e.g., residing with the state or district) rather than decentralized (e.g., residing with the classroom teacher). In many ways, interim assessments constitute a compromise between these extremes (Pellegrino & Goldman, 2008).

Summary of Literature Findings 

Through an analysis of the available literature, it becomes clear that little debate exists over the potential behind interim assessments, but the key factor in their ability to increase student achievement lies in their instructional use. Many vendors and proponents of interim assessments rely on studies related to formative assessment; however, their actual use in a formative manner seems to be in question.


The first few weeks of school were largely devoted to testing. While I desperately wanted data to

 provide information on the students‘ knowledge base, I also had four weeks of teaching (with these

interruptions) when my information about the students was limited to my own classroom assessment. In

fact, it was two weeks after the exams were administered before I received any results.

When the assessment results finally arrived in my email, they simply contained a proficiency level and percentage for each individual student by subject, as well as aggregated data. I identified student ability groups quickly, but the report failed to provide me with useful information for adjusting my instruction.

When I had previously taught 4th grade in another local school district, I had received item-by-item analysis organized by individual student and learning objective. The computerized reports provided me with in-depth item analysis, including distractors, which provided information as to why students may have selected the incorrect answer. I was able to adapt instruction to better suit the needs of my classroom based upon common misunderstandings that emerged from analysis of the assessment data. Meetings with my school instructional coach helped ensure that my data inferences were correct, and she offered guidance on teaching strategies focused on the specific needs of my students. While analyzing and discussing the data required a lot of time in meetings and working at home, it played a direct role in my instructional approach and in focusing on the specific needs of students.

My new classroom consisted of 17 students almost evenly divided among third, fourth, and fifth

graders. Their ability levels varied from complete illiteracy to advanced proficiency at each grade level,

and this wide range made it critical to be as efficient as possible in instruction. If the type of data I had

previously experienced had been available, I could have identified specific objectives and targeted small groups for instruction.

Having experienced a data-rich environment, I knew the role interim assessment programs could play

in driving instruction within the classroom. I wondered how others used interim assessments within their

schools and districts.

Preliminary Interviews 

I began my research by interviewing five teachers in Denver area schools to investigate the process

and tools used by other teachers in analyzing interim assessment results. The interviews provided the

opportunity for these teachers to reflect upon and explain their assessment practices and experiences in their own terms. "The interview process not only provides a record of participants' views and perspectives but also symbolically recognizes the legitimacy of their experience" (Stringer, 2007).

Participating teachers were selected from five different schools across four districts in the Denver area in order to better represent varied approaches to the use of different interim assessment programs.

Teaching experience ranged from four to 26 years with a mean of 14 years of experience. The variety of 

teachers was selected in order to create greater transferability of the study for use in a wide range of 

teaching situations (Stringer, 2007).

Keeping personal experiences and possible bias in mind, I formulated a series of grand-tour-style questions to guide the interviews. The following questions were purposefully created to provide focus for the interviews and to help bracket my personal experience and potential bias (Stringer, 2007).

1.  What type of interim assessments do you use?

2.  What is the availability of data, how is it accessed, and what information is provided?

3.  What functions of the data system do you find most and least helpful?


8.  What are your overall views of using data provided by interim assessments?

I met with interview participants in local coffee shops and classrooms after school, and we briefly

discussed the purpose of my research and the interview. In the practice of informed consent, participants

read and signed a consent form and were asked if they had any questions. In addition to taking notes,

each interview was recorded using a portable digital audio device and later imported into audio-editing software for annotation and archiving.

While our discussions all focused on the above guiding questions, most of the questions were answered without my needing to prompt them directly. Throughout the interviews, I asked follow-up questions to

gain further insight or clarification.

Analysis of Preliminary Interviews 

Reviewing the interview notes, I began by looking at quotes and details, labeling them according to

the guiding questions. During this process, I noted general themes and sorted the information into

categories of test format, access to data, reporting functions, analysis process, use of information,

professional development, and alignment with curriculum. Each statement was then coded with a specific

unit of meaning and categorized by general topic such as test format, access to data, data analysis,

purposes, and uses of interim assessment data. The coded units of meaning directly related to the purpose

and use of assessment were then sorted into predictive, evaluative, and instructional categories following the principles set forth by Perie et al. (2009) and discussed within the review of literature.
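To make the mechanics of this sorting step concrete, the short sketch below (written in Python; the record structure and labels are illustrative rather than the actual coding instrument, and the example quotes are taken from the interview comments in Appendix B) shows one way coded units of meaning can be grouped under their general categories:

from collections import defaultdict

# Illustrative coded statements: (quote, general category, unit of meaning).
coded_statements = [
    ("I look for common wrong answers on frequently missed items.",
     "Analysis Process", "Use of Diagnostic Distractors"),
    ("We look at which kids can be pushed up to the next level.",
     "Use of Information", "Bubble Students"),
    ("The test material doesn't seem to be aligned with our classroom pacing.",
     "Alignment with Curriculum", "Not Aligned with Curriculum"),
]

# Group the units of meaning under each general category, mirroring the sorting described above.
themes_by_category = defaultdict(set)
for quote, category, unit in coded_statements:
    themes_by_category[category].add(unit)

for category in sorted(themes_by_category):
    print(category + ": " + ", ".join(sorted(themes_by_category[category])))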

Survey Construction 

A mining of the literature on formative assessment identified three factors likely to facilitate the formative use of interim assessment results:

1.  qualities of the instruments themselves (item formats, alignment of tasks with classroom instruction and content standards),

2.  aspects of use (communication of learning targets and quality criteria, provision of effective

feedback, and student involvement), and

3.  institutional supports likely to facilitate formative use (professional development) were

used to construct survey instruments designed to measure how schools and teachers are

using interim assessment results. (Lai, 2009)

A survey was constructed based upon these factors, combined with themes from the preliminary interview data analysis. The survey consisted primarily of Likert-scale items, with participants indicating the degree to which they agreed with a series of statements. Other questions asked how frequently participants engaged in various activities related to their use of interim assessments.
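As a rough illustration of how such responses can be summarized once exported from the survey spreadsheet, the sketch below (written in Python; the file name and item wording are hypothetical, not the actual survey columns) tallies a single Likert item into percentages:

import csv
from collections import Counter

# Likert response options used for the agreement items.
SCALE = ["Strongly Disagree", "Disagree", "Neutral", "Agree", "Strongly Agree"]

def tally(path, item_column):
    """Count responses to one Likert item and convert the counts to percentages."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            answer = row.get(item_column, "").strip()
            if answer in SCALE:
                counts[answer] += 1
    total = sum(counts.values()) or 1  # avoid division by zero on an empty file
    return {option: round(100 * counts[option] / total, 1) for option in SCALE}

# Hypothetical usage with an exported responses file and an illustrative column name:
# tally("survey_responses.csv", "I use interim assessment data to motivate students")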

A draft survey was then administered to the original five interview participants who were told to ask 

questions for clarification and make annotations next to survey items for further discussion if needed.

Changes were made to the survey to clarify questions or provide more targeted questioning.

Survey Administration

In October 2010, email invitations to complete the online survey were sent to the assessment

coordinators of each of Colorado's 186 school districts via email addresses obtained through the Colorado

Department of Education (CDE). A brief email explained the purpose of my study with a link to the

survey (Appendix ). In order to maximize the response rate, the email invitation was sent on a Monday

night with the intention of the assessment coordinators opening their email Tuesday morning. This email

can be found in Appendix .

All survey data was collected via a Google Form and maintained within a Google Document.


Figure 5. Evaluate teachers' pacing (responses from teachers, assessment coordinators, and administrators).

Twenty-five respondents reported they used

interim assessment data to review with parents at

least some of the time, leaving less than 17%

claiming they rarely or never review the assessment

results with parents (Figure 4). All interviewees also

mentioned using data to guide parent-teacher

conferences by demonstrating areas of strength and

weakness. Reviewing and communicating results with parents can arguably be considered more of an evaluative use of assessment.

Pacing 

Ensuring that teachers are staying on track is a commonly cited use of interim assessment data. Almost 39% of respondents responded with neutrality toward the statement "I use interim assessment data to: Ensure teachers are staying on track in terms of teaching the curriculum in a timely manner (i.e., pacing)," and 45% indicated disagreement with it. Interestingly, 80% of respondents indicated they used interim assessment data to evaluate how well their students had learned the material taught to date. It would seem logical that, in order to use interim assessment data to evaluate teacher pacing, the test would need to align with the classroom curriculum and instruction.

Several interviewees mentioned this alignment as being inconsistent.

Figure 4. Reviewing data with parents (responses from teachers, assessment coordinators, and administrators).


When setting goals, however, teachers appear

to focus more on the individual. In Figure 8, a

majority of 65% of respondents indicated they

used the data to set individual student goals most

or every time. This is in comparison to only 44%

falling into the most or every time categories in regard

to group goals. In fact, only 6% (2) of respondents

indicated they practiced group goal-setting every

time compared to 23% (7) for individual goals.

Teachers appear to be using data in driving more

group-oriented instruction but set goals more

individually.

This is possibly due to instruction being

conducted on more of a group level, but data

typically relates to the individual student. ―I group

the kids by ability, but it's the individual students who are on the bubble that I focus on,‖ was

mentioned in an interview with a teacher from a local charter school. He continued, ―We look at which

kids can be pushed up to the next level.‖ These ―bubble students‖ appeared as a theme throughout my

teacher interviews, as it was mentioned by four of six interviewees. Teachers are setting goals more for

individual students and setting goals more specifically for students on the threshold between proficiency levels.

Figure 8. Individualize student learning goals (reported frequency of use by teachers, assessment coordinators, and administrators).

Figure 9. Small group learning goals (reported frequency of use by teachers, assessment coordinators, and administrators).


Student involvement took several forms: students tracking their own progress, reviewing results with individual students, reviewing the test as a class, and motivating and providing feedback to students about their learning.

The survey indicated widespread student participation, with 45% of respondents indicating they went over results, including strengths and weaknesses, with individual students. An almost equal number (42%) indicated using this practice some of the time. The remaining 12% were evenly split between Rarely/Never and Not Applicable.

The data reveal that one of the major reasons for reviewing assessment data with students is to motivate them, as represented in Figure 10. Seventy-seven percent of respondents indicated they agreed or strongly agreed with the statement that they used data to motivate and provide feedback to students about their learning.

Figure 10. Motivate students (Strongly Agree 16%, Agree 61%, Neutral 7%, Disagree 16%).

Reviewing with the Whole Class 

Only one teacher reviewed strengths and weaknesses with the whole class every time, with five replying "most of the time." Ten replied "some of the time," and the remainder reported "rarely/never." This would arguably be a formative use of the data, but it does not appear to be done consistently across schools and classrooms.

Implications for Practice 

The literature review revealed the importance of using interim assessments with a formative purpose in mind in order to increase student achievement. My data revealed that while many schools and teachers set goals at the individual student level, the instructional components seem to be on the group level. The use of data for individual students

would arguably be more summative and predictive in nature. However, the research data indicates that teachers are practicing the more formative purposes for group instruction. Hence there is a gap between what the data is intended to do and what it is actually being used for. Data and research both indicate that using interim assessments at the individual student level can provide feedback and motivate students to learn.

Section 5. Conclusion 

By comparing the literature review with the findings of this action research, we can reflect upon our practices and change our approach to using data so that it is more formative and consistent with the research on formative assessment. This has implications for the education of our students and for the way school districts are investing in these assessment programs. It also has implications for professional development in school systems. The results of this research can be used to drive professional development by informing teachers of both best and common practices. The focus of professional development workshops should shift from analyzing data for group instruction to analyzing data for individual instruction.

The limitations of this study include a small sample of 31 total surveys. The respondents

were split almost evenly into thirds among teachers, assessment coordinators, and administrators.

Future studies should include more teachers, as they are the ones in the classroom using the

actual data to drive instruction.


References 

Bambrick-Santoyo, P. (2008). Data in the driver‘s seat. Educational Leadership, 65 (4), 43-46.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education:

Principles, Policy & Practice, 5(1), 7.

Bloom, B.S., Hastings, J.T., & Madaus, G.F. (1971). Handbook on formative and summative

evaluation of student learning. New York: McGraw Hill.

Borja, R. R. (2006). District initiative. Education Week, 25 (35), 24-31.

Burch, P. (2010). The Bigger Picture: Institutional Perspectives on Interim Assessment

Technologies. Peabody Journal of Education, 85, 147-162.

Brookhart, S. M. (2008). Feedback that fits. Educational Leadership, 65(4), 54-59.

Cowie, B., & Bell, B. (1999). A model of formative assessment in science education.

Assessment in Education: Principles, Policy & Practice, 6(1), 101.

Hoff, D. J. (2006). Delving into data. Education Week, 25 (35), 12-22.

Honawar, V. (2006). Tip of their fingers. Education Week, 25 (35), 38-39.

Lai, E. R. (2009). Interim assessment use in Iowa elementary schools. Theses and Dissertations, Paper 393.


Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2004). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: ASCD.

Nichols, P.D., Meyers, J.L., & Burling, K.S. (2009). A framework for evaluating and planning

assessments intended to improve instructional achievement. Educational Measurement:

Issues and Practice, 28(3), 14-23.

Olson, L. (2005). Benchmark assessments offer regular checkups on student achievement. Education Week, 24(14),

1-19.

Pellegrino, J. W., & Goldman, S. R. (2008). Beyond rhetoric: Realities and complexities of integrating assessment into classroom teaching and learning. In C. A. Dwyer (Ed.), The future of assessment: Shaping teaching and learning (pp. 7-52). New York: Lawrence Erlbaum Associates.

Popham, W. J. (2006). Phony formative assessments: Buyer beware! Educational Leadership, 64

(3), 86-87.

Perie, M., Marion, S., & Gong, B. (2009). Moving toward a comprehensive assessment system:

A framework for considering interim assessments. Educational Measurement: Issues

and Practice, 28(3), 5-13.

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144.


Scriven, M. (1996). Types of evaluation and types of evaluator. American Journal of 

Evaluation, 17 (2), 151-161.

Shepard, L.A. (2006). Classroom assessment to improve learning. Educational Leadership, 52

(5), 38.

Shepard, L. A. (2008). Formative classroom assessment: Caveat emptor. In C. A. Dwyer (Ed.), The future of assessment: Shaping teaching and learning (pp. 279-303). New York: Lawrence Erlbaum Associates.

Shepard, L.A. (2010). What the marketplace has brought us: Item-by-item teaching with little

instructional insight. Peabody Journal of Education, 85, 246-257.

Wiliam, D., & Thompson, M. (2008). Integrating assessment into learning: What will it take to make it work? In C. A. Dwyer (Ed.), The future of assessment: Shaping teaching and learning (pp. 53-82). New York: Lawrence Erlbaum Associates.

Wilson, M. & Draney, K. (2004). Some links between large-scale and classroom assessments:

The case of the BEAR assessment system. In M. Wilson (Ed.), Towards coherence between classroom assessment and accountability (pp. 132-154). Chicago: University of

Chicago Press.

Zehr, M. A. (2006). Monthly checkups. Education Week, 25 (35), 36-37.


Appendix A. Consent Form

Date:   Valid for Use Through:

Study Title: How are data from interim assessments used?

Principal Investigator: John P. Bunker

HSRC No:

Version Date: October 20, 2010

Version No: 1 

You are being asked to be in a research study. This form provides you with information about the study. A member of the researchteam will describe this study to you and answer all of your questions. Please read the information below and ask questions aboutanything you don’t understand before deciding whether or not to take part.


Why is this study being done? 

I am conducting a research study to investigate: How are data from interim (sometimes referred to as benchmark or periodic) assessments used?

You are being asked to be in this research because you work within a school which utilizes interim assessments in the Denver metropolitan area.

Up to 500 people will participate in the study.

What happens if I join this study?

If you join the study, your survey and interview responses will be collected by the investigator/researcher, John Bunker, upon completion. All comments will be de-identified after the investigator/researcher reviews them. Only individuals who consent to this study will be interviewed and have surveys collected. The only extra time on your part will be the completion of the survey, either online or by pen and paper, which takes 5-10 minutes. Any interviews will take a maximum of 45 minutes and will be scheduled at your convenience outside of your work schedule.

As part of the data for my study, interviews will consist of questions related to your personal experience using data from interim assessments given periodically at your school. I will correspond with you outside your work day at your convenience to set up and conduct the interview.

Your participation will be confidential and pseudonyms will be used in all documentation, analyses, and written findings. I am not associated with any school, district, state, or federal agencies, and participation will not have any influence on performance evaluations or licensing.

What are the possible discomforts or risks?

Discomforts you may experience while in this study include social embarrassment if your identity is revealed and you do not like how your comments are analyzed. As the researcher, I will be the only individual reviewing and analyzing any of the individual comments.


What are the possible benefits of the study?

This study is designed for the researcher to learn more about how data from interim assessments is used to impact student learning. This data will be reported to professional educational journals and research-based presentations with the intention to inform best practices. This study may not benefit you personally, but hopefully will benefit future students' learning through the use of interim assessments.

Is my participation voluntary?

Taking part in this study is voluntary. You have the right to choose not to take part in this study. If you choose to take part, you have the right to stop at any time. If you refuse or decide to withdraw later, you will not lose any benefits or rights to which you are entitled.

Who do I call if I have questions?

The researcher carrying out this study is John Bunker. You may ask any questions you have now. If you have questions later, you may call John Bunker at 720-989-1501.

You may have questions about your rights as someone in this study. You can call John Bunker with questions. You can also call the Human Subject Research Committee (HSRC). You can call them at 303-556-4060.

Who will see my research information?

We will do everything we can to keep your records a secret. It cannot be guaranteed.

Both the records that identify you and the consent form signed by you may be looked at by others. They are:

Federal agencies that monitor human subject research

Human Subject Research Committee

Regulatory officials from the institution where the research is being conducted who want to make sure the research is safe

The results from the research may be shared at a meeting. The results from the research may be in published articles. Your name will be kept private when information is presented.

Agreement to be in this study

I have read this paper about the study or it was read to me. I understand the possible risks and benefits of this study. I know that being in this study is voluntary. I choose to be in this study. I will get a copy of this consent form.

Signature: Date:

Print Name:

Consent form explained by: Date:

Print Name:

Investigator: Date:


Appendix B. Interview Data

Categories and Emerged Themes

Test Format: Scantron; Computerized; Few Constructed Response; District Created; Standardized; Commercially Produced/Textbook Companies; Validity Questioned; High Time Requirements

Access to Data: Easily Accessible; No Access to Individual Test Items; Unknown Availability of Results; Distributed Report is Too General; Large Amount of Data

Reporting Functions: Data Collection System Only; Validity Questioned; Unrealistic Standards; Cross-School Comparison; Standards Alignment; Item Analysis by Student; Use of Diagnostic Distractors; Report Needs; Large Amount of Data; User-friendly

Analysis Process: Gaps in Proficiency; Use of Diagnostic Distractors; Validity Questioned; "Bubble Students"

Use of Information: Student Feedback; Motivate Students; Ability Grouping; Too Much Information; High Time Requirements; Curriculum Pacing Prevents Re-teaching

Professional Development: Grading Time; System Expert with Training in Building; Low Priority of Administration; Need for Helpful Training

Purpose of Assessment: Unsure of Purpose; Should Guide Instruction, But Doesn't

Alignment with Curriculum: Not Aligned with Curriculum


Comment

Scantron style test with spaces in test booklet for constructed response items (approximately 5 constructed response items per 25

item test with the rest being 4 option multiple choice)

District created

―If the kids aren't used to a format of a test, it is tricky for them. Sometimes I think the students have the knowledge but are thrown off by the format of the question.‖

―If I take a test, I practice the test. That's why there are prep guides, so you know what to expect.‖

Test takes 2-3 days during week

Online through teacher portal

Access to data for all students taught regardless of subject area (Example: I have a student for reading, but he has my teammate for math. I receive his math data in addition to his reading.)

Access to test copy with individual items for 3 weeks – ―It would be nice to be able to reference the individual items anytime to see their style and content. It could help me figure out why students missed the items they did.‖

―The last time I logged on looking for item analysis, it wasn‘t up.‖ 

School secretary distributes general data showing proficiency level of individual students by subject and standard about 1 week following the test

 Never notified of data availability. ―Guess and check.‖ 

Can access data from prior year for students

Acuity is used for data collection and reporting but not for the actual test items

Standards linked to individual test items – ―The standards are not realistic to what I teach. It's hard to figure out what test question types are going to cover what standard.‖

Report shows comparisons of school, region, and district

―This report shows each individual item number, which standard it correlates to, the correct response, and individual student responses when incorrect.‖

―It would be nice to have a report that identified why the student might have gotten it incorrect. I have to look at each individual test item and try to figure out why they might have selected B instead of C.‖

―There is just so much information here to mine through, but the system is pretty user-friendly.‖

Look for gaps in achievement by identifying standards with higher numbers of students below proficient under each standard

―I notice the trends in the type of questions students missed.‖

―A lot of my kids missed this one.‖ 

―I look for common wrong answers on frequently missed items and notice trends in the type of questions students miss.‖ 


Principal does not make it a priority to use the system

Had 3 1-hour training sessions for all teachers last year, ―but that wasn‘t helpful at all.‖ 

―I‘m not sure if the test covers higher order thinking skills, but I‘m not sure if that is the idea behind the test.‖ 

―It should be used to guide instruction but I‘m not sure what it actually does.‖ 

―I don‘t understand what is being tested. Is it a pre-test, a post-test, because it doesn‘t align with what we are doing in class? Itwould be nice if we knew what was actually being tested.‖ 

―Writer‘s workshop doesn‘t cover some of the important skills they are being tested on such as punctuation and grammar.‖ 

―The test material doesn‘t seem to be aligned with our classroom pacing.‖ 


Appendix C. Survey

(The survey consisted primarily of Likert-scale items built on the stem "I use interim assessment data to …," with response options ranging from Strongly Disagree to Strongly Agree, plus items asking how frequently respondents engage in various data-use activities.)