
Pennsylvania Student Learning Objective: Math (Grade 1)

May 2013


Table of Contents

Introduction
    What Is an SLO?
    What Is an Annotated SLO?
    How to Use This Document
Pennsylvania Contextual Information
Student Learning Objective: Math (Grade 1)
    Element List
    Teacher Information
    Content Area
    Student Learning Objective
    Data and Targets Used to Establish the SLO
    Assessment/Performance Task
    Administration of the Assessment/Performance Task
    Evidence of Student Achievement
    Strategies/Actions to Achieve the SLO
    Teacher Effectiveness Measure
    Teacher Effectiveness Ratings
    Implementation Timeline
    SLO Process Dimension
    Overview of Pennsylvania Math (Grade 1)
Appendix: Tool for Comparing SLO Elements Across Jurisdictions


Introduction

What Is an SLO?

As States and school districts implement educator evaluation systems that include measures of student growth, one of the challenges they face is identifying measures for non-tested grades and subjects. The use of student learning objectives (SLOs) is one promising approach to addressing this challenge. Structurally, an SLO consists of several “elements” that describe a specific learning objective for a particular student population as well as a specific, systematic process for how an educator can identify and implement strategies to track progress toward that goal and achieve it.

What Is an Annotated SLO?

The Reform Support Network (RSN) has developed a series of annotated SLOs to orient readers to their structure, provide analysis and suggest specific actions to strengthen each SLO’s quality. Each annotated SLO, such as the one in this document, provides analysis and suggestions for improvement for each individual element within the SLO as well as for the SLO as a whole. States, school districts, colleges, universities and others can use the RSN’s collection of annotated SLOs, the “SLO Library,” to prepare teachers and administrators to develop high-quality SLOs or to improve SLOs that they have already developed.

The SLO Library is not a collection of exemplary SLOs. The RSN designed the library as a teaching tool, so most of the jurisdictions intentionally provided the library with SLOs that vary in quality. They also vary in their subject areas and grade levels. Each SLO review identifies and discusses both strengths and areas for improvement. It is up to the reader, then, not to mimic the SLOs found in the library but to extrapolate lessons learned from them to produce new, original and high-quality SLOs.

How to Use This Document

The RSN intends for the SLO Library to support any stakeholder actively engaged in learning about or implementing SLOs: State departments of education, school districts and schools, teachers implementing SLOs, administrators leading an SLO process and colleges of education interested in adding SLO coursework to their teacher or administrator preparation programs.

Each annotated SLO begins with contextual information for the jurisdiction that produced the SLO and then presents each element of the SLO in sequence. Each element begins with the jurisdiction’s actual description of it, which is followed by the text of “an author” from the jurisdiction. Think of the author as the teacher(s) or school district administrator(s) who actually wrote the SLO. The language from the jurisdiction’s description comes from the jurisdiction’s SLO template or other guidance materials. The author’s text comes from the SLO provided by the jurisdiction. Both sections are unedited.

The subsequent section, “Review of the Author’s Text and Potential Improvements,” is the focus of the library and should be of greatest interest to the reader. This section analyzes the text written by the author from the jurisdiction and provides considerations for improving the quality of the individual element.

An overall summary of the entire SLO follows the presentation of the elements and concludes the review of the SLO.

The appendix contains what the RSN calls an “element comparison tool,” which links the name of the element used by this jurisdiction to the standardized term used in the SLO Library. The comparison table intends to provide readers with the means to compare elements across SLOs, even if they are called by different names.


Pennsylvania Contextual Information

SLO Implementation Timeline

School year the jurisdiction piloted or plans to pilot SLOs without stakes for teachers:¹ 2013–2014

School year the jurisdiction piloted or plans to pilot SLOs with stakes for teachers:² Schools may choose to implement SLOs in 2013 but are required to implement them in 2014–2015.

School year the jurisdiction began or plans to begin large-scale implementation: 2014–2015

SLO Development and Approval

Who develops SLOs? Individual teachers, grade- or content-level teams of teachers and school curriculum administrators

Are collectively developed SLOs permitted (for example, by teams of teachers and administrators)? Yes

Who approves SLOs? The local educational agency (LEA), most likely the school administrator, decides.

SLO Use in Evaluation

Are SLOs required or optional for use in evaluating educators? Required

Are SLOs the sole measure of student growth in the evaluation system? If not, what other measure(s) does the jurisdiction use? For teachers without eligible data from the Pennsylvania Value-Added Assessment System (PVAAS), yes. For teachers with eligible PVAAS data, no; these data will be used.

Does the jurisdiction use SLOs to determine educator compensation? The LEA decides.

What weight does the SLO carry in determining the summative rating for teachers in the jurisdiction’s evaluation system? For teachers without eligible PVAAS data, 35 percent. For teachers with eligible PVAAS data, 20 percent.

What weight does the SLO carry in determining the summative rating for administrators in the jurisdiction’s evaluation system? 20 percent

SLO Implementation

How many SLOs are required for most teachers? TBD

How many SLOs are required for most school administrators? TBD

Which teachers and administrators are required to use SLOs? All teaching and non-teaching certified professional employees; all principals

SLO Assessment

Who selects which assessments are used for SLOs? Individual teachers, grade- or content-level teams of teachers, and school curriculum administrators

Are there standards or required development processes for assessments created by teachers, schools, or districts? If so, what are they? Yes. An assessment literacy process is being developed, as well as content-specific models of SLOs and accompanying student performance measures and scoring tools.

What types of assessments are permitted? District-designed measures and examinations, nationally recognized standardized tests, industry certification examinations, student projects pursuant to local requirements and student portfolios pursuant to local requirements

Are performance or portfolio-based assessments permitted for SLOs? Yes

Are commercially available assessments permitted for SLOs? Yes

¹ SLOs will not be used in educator evaluations.
² SLOs may be used in educator evaluations.


Student Learning Objective: Math (Grade 1)

Element List

Teacher Information
Content Area
Student Learning Objective
Data and Targets Used to Establish the SLO
Assessment/Performance Task
Administration of the Assessment/Performance Task
Evidence of Student Achievement
Strategies/Actions to Achieve the SLO
Teacher Effectiveness Measure
Teacher Effectiveness Ratings
Implementation Timeline
SLO Process Dimension


Teacher Information

Standardized Name: Other Information

JURISDICTION’S DESCRIPTION OF THE ELEMENT

The jurisdiction left this section blank.

AUTHOR’S TEXT FOR THE ELEMENT

1. Teacher Information

Teacher Name: (See #12: SLO Process Dimension)
School Name: School Name Removed
District Name: District Name Removed

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

This element asks for the identities of teacher, school and school district and has been left blank intentionally.


Content Area

Standardized Name: Learning Content

JURISDICTION’S DESCRIPTION OF THE ELEMENT

The jurisdiction left this section blank.

AUTHOR’S TEXT FOR THE ELEMENT

2. Content Area

Course Title or Content Area: Math
PA Standards Subject Area: Math
Grade Level(s): 1
Brief Description of the course/content area: Self-contained classroom setting where math is a significant component of instruction.
Number of students per class/session: 25
Frequency of classes/sessions: Daily
Number of minutes per class/session: 30-45
Total number of classes/sessions: 180

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

This section identifies key information about the course and is quite specific about the total amount of time available for the SLO.


Student Learning Objective

Standardized Name: Rationale

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• Learning objectives that can be validly measured to document student learning over a defined period of time.
• Why is this objective important and meaningful to your students?
• How is your student learning objective measurable and clearly understood by students in this grade/course?
• Based upon this objective, how will students independently demonstrate their learning?
• What are the big ideas, essential questions, concepts and competencies (as identified by PA’s Curriculum Framework) to be measured? www.pdesas.org
• Based upon the big ideas, essential questions, concepts and competencies selected, what content standards are aligned to this objective?

AUTHOR’S TEXT FOR THE ELEMENT

3. SLO (Student Learning Objective)

Student Learning Objective:

Students will learn to successfully compute in the base-ten number system and identify and apply patterns that exist within a set of numbers.

SLO Rationale: This objective is measured using the AIMSweb M-Comp and TEN (Test of Early Numeracy) probes, which require students to demonstrate their learning independently.

Standards Alignment:

2.1.1.A: Demonstrate the relationship between numbers and quantities, including place value, one-to-one correspondence, rote counting, counting by twos to 20, counting by tens and fives and comparing values of whole numbers up to 100.

2.1.1.D: Apply place value concepts and base-ten numeration to order and compare whole numbers up to 100.

2.2.1.A: Apply concepts of addition and subtraction to solve problems up to ten.

2.2.1.B: Demonstrate strategies for addition and subtraction in order to solve single- and double-digit addition and subtraction problems.

2.8.1.C: Recognize, describe, extend, replicate and transfer number and geometric patterns.

CC2.1.1.B.1: Extend the counting sequence to read and write numerals to represent objects.

CC2.1.1.B.2: Use place value concepts to represent amounts of tens and ones and to compare two-digit numbers.

CC2.1.1.B.3: Use place value concepts and properties of operations to add and subtract within 100.

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The SLO identifies a general goal and an assessment and lists a set of standards. The author identifies multiple essential competencies (for example, students must apply concepts of addition and subtraction and use place value concepts). These standards address the demands of the Pennsylvania Curriculum Framework and the Common Core State Standards.

Explaining why this content is critical to student learning, now and in the future, would strengthen this element.


Data and Targets Used to Establish the SLO

Standardized Name: Student Growth Targets

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• What, if any, are the course or grade prerequisites?

• What do you know about the students in this class at the beginning of the learning process?

• What may typical and not-so-typical student progress look like on the way to achieving the learning objective?

• What is the expected outcome or goal by the end of the designated instructional period (i.e., year, semester, course length, etc.)?

• Using student baseline data, are differentiated targets/goals needed? If yes, what are those targets/goals?

• Are the targets ambitious yet realistic, and how will you monitor progress along the way?

AUTHOR’S TEXT FOR THE ELEMENT

4. Data and Targets Used to Establish the SLO

Student Preparedness/Baseline Data: Most students will have experienced Kindergarten math and AIMSweb protocols. Prior data show that 80% of the students met the end-of-year Kindergarten AIMSweb TEN math benchmarks.

Targets: Students should be at benchmark by the end of first grade as demonstrated through the AIMSweb M-COMP and TEN assessment probe scores.

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The SLO provides a general statement about past performance of the identified group of students on kindergarten tests. It identifies the target as being “at benchmark by the end of first grade.” It is unclear what “being at benchmark” means. The SLO neither establishes baselines for individual students nor differentiates targets for them.

To strengthen this element, the author might explain the meaning of “at benchmark.” Must all item scores be at benchmark? Should a set number of items across the assessments average to a certain level? The answers to these questions are currently unclear. Including a roster with specific student baselines would add clarity and nuance to the 80 percent figure and allow for more individualized target-setting if appropriate, particularly if the teacher wants to set growth as opposed to mastery goals. (However, the author says in a later element that the SLO establishes a mastery goal and therefore she does not need to provide baseline data.) Gathering and analyzing additional performance data for student learning in kindergarten courses would help justify targets for growth goals and provide valuable information to the teacher about how she might need to differentiate instruction as students grow toward specific goals or toward mastery.


Assessment/Performance Task

Standardized Name: Assessments

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• Who is the developer of the assessments/performance task used (e.g., teacher-made, district-developed, commercial, etc.)?

• What is a description of the assessment/performance task that will be used to measure the student learning objective (SLO)?

• Are there any products or artifacts that will be gathered as part of the data collection process?

• Describe how the assessment/performance task authentically reflects the student learning objective (SLO).

• How does this assessment/performance task measure student mastery and/or growth toward the PA standards?

• How do the assessment/performance task outcomes inform instruction?

AUTHOR’S TEXT FOR THE ELEMENT

5. Assessment/Performance Task

Name of the Assessment/Performance Task: AIMSweb M-COMP and TEN

Description of the Assessment/Performance Task: The AIMSweb assessment protocols are commercially designed benchmark assessments.

Assessment/Performance Task Objectives Rationale: The tasks inherent in the probes relate to skills found in both the PA and Common Core Standards for first grade math. Mastery is assessed, and additional probes are available to monitor progress and inform focused instruction.

Growth or Mastery (check one):
[ ] Growth (change in student achievement across two or more points in time)
[x] Mastery (attainment of a defined level of achievement)
[ ] Growth and Mastery

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The SLO identifies the Mathematics Computation (M-COMP) and Test of Early Numeracy (TEN) assessments and points out a connection between the skills they require students to demonstrate and State and Common Core Standards. These are commercially developed assessments.

The author might consider clarifying whether teachers will use the first grade M-COMP and TEN as both pre- and post-assessments or whether the student’s end-of-year kindergarten test will provide the teacher with the baseline data (keeping in mind that, in a later element, the author says that baseline data for the SLO are not necessary). Identifying and describing these assessments would help teachers and evaluators track student progress toward mastery. Finally, the author might consider identifying the additional probes that are available and when the teacher will administer these probes to monitor progress. This would help teachers and evaluators determine whether the probes are in fact aligned with relevant standards.


Administration of the Assessment/Performance Task

Standardized Name: Assessments

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• How often and when is this assessment/performance task administered?

• If measuring growth, are multiple assessment windows in place?

• What unique or specific equipment, technologies, or resources are needed to complete this assessment/performance task?

• What assessment/performance task adaptations are needed to assist diverse learners and/or students with disabilities?

• Can this assessment/performance task be administered by an equivalent peer (educator in a similar content area)? If not, please explain.

• Does a district policy exist with regard to assessment/performance task administration?

AUTHOR’S TEXT FOR THE ELEMENT

6. Administration of the Assessment/Performance Task

Frequency of Assessment/Performance Task Administration: Probes for both the M-COMP and TEN assessments are done in the fall and spring, with a mid-year probe and several progress-monitoring probes available.

Resources Required: The school district must purchase the AIMSweb materials.

Adaptations for Diverse Learners and/or Students with Disabilities: Adaptations found in student IEP or 504 accommodations will be administered.

Personnel: This assessment can be administered by an equivalent peer.

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The author identifies three-month intervals (fall and spring) when the teacher can administer probes to students. According to the text, teachers will administer the accommodations for students with IEPs and 504 plans.

To improve this element, the author might explain the adaptations for students with special needs. Also, the author might consider being more specific about when students will complete the probes. Is the fall probe a potential pre-test, or does it measure skills developed up to that point in time? If it is a pre-assessment, the author could include information about it in other parts of this SLO, particularly element 5. Finally, the author might be more specific about when to administer the mid-year and other progress-monitoring probes so that administrators know when they might discuss the teacher’s effectiveness and support changes as necessary.


Evidence of Student Achievement

Standardized Name: Assessments

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• How will individual student growth or mastery be determined (defined and scored) using this assessment/performance task? Include the specific rubric/scoring scale that will be used.

• Does the rubric and/or scoring scale correlate with the assessment/performance task?

• In what format will data be collected (e.g., database, graphed, portfolio, etc.)?

• Is a pre-post test being used? (If so, please describe.)

• How frequently will data be collected?

• How was baseline data collected? (If baseline data was not collected, please explain.)

• Can baseline data be compared with the results of this assessment/performance task?

• What evidence will be presented to principal/evaluator to support the teacher effectiveness measure?

• How will data be presented to the principal/evaluator (e.g., database, graphed, portfolio, individual student artifacts, etc.)?

• How can the assessment/performance task results be interpreted in the same way across equivalent peers?

• Is there a reliable and valid scoring and interpretive process (e.g., state developed, district-based, commercial, standardized, etc.) that is associated with the assessment/performance task? If so, please describe.

AUTHOR’S TEXT FOR THE ELEMENT

7. Evidence of Student Achievement

Rubrics/Scoring Scales: Scores from the probes are compared to the benchmark score for that probe and assigned a “color” to describe proficiency in achieving that benchmark.

Data Collection: Probes are teacher-scored, and data from all probes administered will be input into a database.

Scoring Student Progress: Since the assessments describe mastery, no baseline data are required. However, the design of the assessment system is one of ongoing assessment, data collection and subsequent progress monitoring.

Formative Assessment Information: AIMSweb protocols are designed as formative assessment/progress monitoring protocols.

Data Presentation: Evidence will be presented through aggregated scores from the database.

Data Analysis and Interpretation: AIMSweb M-COMP and TEN are valid and reliable commercially developed assessments.


REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

In response to a prompt from the jurisdiction’s text, the author claims no baseline data are required because the SLO sets a goal of mastery, not growth. The author plans to assess student work on a regular basis, which will allow adjustments in practice where appropriate. The description of how probes are scored and proficiency determined is unclear, however. For instance, the author does not identify the specific “scoring” colors or correlate them with specific levels of performance. Nor does the author include the rubrics used for scoring, making it difficult to determine how instructors will evaluate student performance or the extent to which scoring will be consistent.

The jurisdiction does not always require baseline data, and the author makes the case that mastery goals do not require them. However, as students are still growing toward mastery, baseline data might help determine individual student needs as they start on their journey toward mastery. More explanation for each of the subelements and the addition of a scoring rubric also would strengthen this SLO.
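To illustrate the kind of explicit scoring scale the review calls for, here is a minimal sketch in Python. The color names and cut points below are invented for the example; they are not drawn from AIMSweb or from this SLO, and the author would need to substitute the actual definitions.

```python
# Hypothetical sketch only: these colors and cut points are invented for
# illustration. The SLO would need to state the actual AIMSweb definitions.

def color_for(score: float, benchmark: float) -> str:
    """Assign a proficiency color by comparing a probe score to its benchmark."""
    if score >= benchmark:
        return "green"   # at or above benchmark
    if score >= 0.8 * benchmark:
        return "yellow"  # approaching benchmark (within 80% of it)
    return "red"         # well below benchmark


# Example: a student scoring 34 against a benchmark of 40 falls in the
# "yellow" band under these invented cut points.
print(color_for(34, 40))  # -> yellow
```

Publishing the actual rule, whatever the real colors and cut points are, would let evaluators and equivalent peers interpret probe results the same way, which the jurisdiction’s own prompts ask about.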


Strategies/Actions to Achieve the SLO

Standardized Name: Instructional Strategies

JURISDICTION’S DESCRIPTION OF THE ELEMENT

• What formative assessment information lets you know if your instructional practices will lead to successful completion of the SLO?

• Based upon reflection, what instructional practices would you like to change or strengthen?

• What professional learning and/or other type of support will help you to achieve this SLO?

AUTHOR’S TEXT FOR THE ELEMENT

8. Strategies/Actions to Achieve the SLO

Assessment for Learning: AIMSweb protocols are designed as formative assessment/progress monitoring protocols. They are also diagnostic by design, helping teachers to better attend to gaps in student understanding of concepts.

Alignment with the Danielson Framework for Teaching: The teacher will develop and implement a bank of strategies to address differentiated learning as demonstrated through AIMSweb progress monitoring probes. (Danielson 1C)

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The use of formative assessment protocols aids in identifying gaps in learning and making informed mid-course corrections to better address student needs. The SLO does not identify key instructional strategies beyond the administration of progress-monitoring probes, however. To improve this element, the author might provide a more detailed list of strategies that the teacher can deploy and explain how these strategies will advance the learning goal (which may include evidence from research or experience).


Teacher Effectiveness Measure

Standardized Name: Scoring

JURISDICTION’S DESCRIPTION OF THE ELEMENT

How will the aggregated scores of the “Evidence of Individual Student Achievement” results be used to define teacher effectiveness?

AUTHOR’S TEXT FOR THE ELEMENT

9. Teacher Effectiveness Measure

Classroom Objective: How will the aggregated scores of the “Evidence of Individual Student Achievement” results be used to define teacher effectiveness?

Failing (few students achieve content mastery or growth): Using the AIMSweb M-Comp assessment, less than 64% of students will meet or be above benchmark (or green). Using the AIMSweb Early Numeracy assessments, less than 64% of students will meet or be above benchmark (or green).

Needs Improvement (less than a significant number of students achieve content mastery or growth): Using the AIMSweb M-Comp assessment, 65%-79% of students will meet or be above benchmark (or green). Using the AIMSweb Early Numeracy assessments, 65%-79% of students will meet or be above benchmark (or green).

Proficient (a significant number of students achieve content mastery or growth): Using the AIMSweb M-Comp assessment, 80%-89% of students will meet or be above benchmark (or green). Using the AIMSweb Early Numeracy assessments, 80%-90% of students will meet or be above benchmark (or green).

Distinguished (an exceptional number of students achieve content mastery or growth): Using the AIMSweb M-Comp assessment, 90%-100% of students will meet or be above benchmark (or green). Using the AIMSweb Early Numeracy assessments, 90%-100% of students will meet or be above benchmark (or green).

Targeted Student Population Objective: How will the mastery or growth of targeted student populations be described and used to define teacher effectiveness?

Failing: Did not meet goal; little to no student mastery or growth.

Needs Improvement: Did not fully meet goal but showed some student mastery or growth.

Proficient: Met goal or otherwise demonstrated significant student mastery or growth.

Distinguished: Surpassed goal or otherwise demonstrated significant student mastery or growth.

Targeted Population: Students who begin first grade below benchmark.

Mastery and/or growth goal: All targeted students will demonstrate growth toward or continued maintenance of proficiency from their beginning baseline “color” as demonstrated on the AIMSweb M-Comp and Early Numeracy assessments.


REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The SLO partially explains how outcomes on the two assessments will be evaluated for the classroom objective (that is, to be counted as proficient, students must reach the benchmark for both measures). It is not clear how “split results” translate to the teacher-effectiveness measure, however. For example, what if 80 percent of students reach the benchmark for one assessment, but 60 percent of students reach the benchmark for the other assessment? What overall rating would the teacher earn?

The classroom objective sets high expectations for students; however, the targeted objective lacks rigor and specificity. It indicates students are expected to grow, but does not indicate the amount of growth expected. To address this problem, the author might consider identifying the exact amount of growth expected for the targeted student population. When using multiple assessments, the author might specify how the combined performance of students on both measures will affect the teacher-effectiveness measure.
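To make that specification concrete, here is a minimal sketch in Python of one possible decision rule: score each assessment against the author’s published bands and let the lower of the two results determine the overall rating. The band edges are taken from the author’s M-Comp thresholds; the conservative “lower rating wins” rule is an assumption made for illustration, not something the SLO states.

```python
# Hypothetical illustration: the SLO does not say how to combine the two
# assessments, so the "lower rating wins" rule below is an assumption.

RATINGS = ["Failing", "Needs Improvement", "Proficient", "Distinguished"]

def rating_for(percent_at_benchmark: float) -> str:
    """Map the percentage of students at or above benchmark ("green") to a
    rating using the author's bands: 90-100 Distinguished, 80-89 Proficient,
    65-79 Needs Improvement, below that Failing. (The author's bands leave
    64-65 unassigned; this sketch lets that gap fall to Failing.)"""
    if percent_at_benchmark >= 90:
        return "Distinguished"
    if percent_at_benchmark >= 80:
        return "Proficient"
    if percent_at_benchmark >= 65:
        return "Needs Improvement"
    return "Failing"

def combined_rating(m_comp_pct: float, ten_pct: float) -> str:
    """Assumed rule: the overall rating is the lower of the two ratings."""
    ratings = [rating_for(m_comp_pct), rating_for(ten_pct)]
    return min(ratings, key=RATINGS.index)

# The review's split-results example: 80% on one measure, 60% on the other.
print(combined_rating(80.0, 60.0))  # -> Failing under this assumed rule
```

Whether the author settles on the minimum, an average or a weighted combination, stating the rule explicitly would let an evaluator assign a single, consistent rating to split results.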


Teacher Effectiveness Ratings

Standardized Name: Scoring

JURISDICTION’S DESCRIPTION OF THE ELEMENT

What were the results of the assessments/tasks and how do they relate to the classroom and targeted objectives?

AUTHOR’S TEXT FOR THE ELEMENT

10. Teacher Effectiveness Ratings

What were the results of the assessments/tasks and how do they relate to the classroom and targeted objectives?

Classroom Objective:
[ ] Failing   [ ] Needs Improvement   [ ] Proficient   [ ] Distinguished
Notes/Explanation:

Targeted Objective:
[ ] Failing   [ ] Needs Improvement   [ ] Proficient   [ ] Distinguished
Notes/Explanation:

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The implementation timeline for this SLO identifies a completion date of June 1, at which point this element should be complete.


Implementation Timeline

Standardized Name: Other Information

JURISDICTION’S DESCRIPTION OF THE ELEMENT

The jurisdiction left this section blank.

AUTHOR’S TEXT FOR THE ELEMENT

11. Implementation Timeline

Date SLO is due to principal: September
Date(s) for Assessment and Data Collection: September, January, May
Dates to complete Data Interpretation: May 31st
Date to present Teacher Effectiveness Measure: June 1

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The interval appears to be appropriate for the learning content addressed. To strengthen this element, the author might consider including specific beginning and end dates for the teaching period. For example, September 1 – June 1 would allow for nine months of instruction, whereas September 30 – June 1 would allow for eight. This level of specificity would play an important role in helping teachers and evaluators determine whether the interval matches the intended learning experiences and the related targets.


SLO Process Dimension

Standardized Name: Other Information

JURISDICTION’S DESCRIPTION OF THE ELEMENT

The jurisdiction left this section blank.

AUTHOR’S TEXT FOR THE ELEMENT

12. SLO Process Dimension

SLO – Assessment Developers – Expertise: 2 developers’ names removed
Grain Size: Small (assesses only the math component of the first grade curriculum)
Exemplars of Student Work:

REVIEW OF AUTHOR’S TEXT AND POTENTIAL IMPROVEMENTS

The SLO indicates that the grain size is small and provides a brief explanation for the designation. What are the implications for teachers if the grain size of their SLO is small?

Overview of Pennsylvania Math (Grade 1)

This mathematics SLO addresses significant learning content aligned with Pennsylvania State Standards and the Common Core. It highlights the importance of monitoring student progress over time and provides general achievement data from the previous year for the same students. Certain elements within the SLO would benefit from greater explanation and clearer language. For instance, the “evidence of student achievement” element would be clearer with more explanation of the color code and inclusion of the actual scoring rubric. Finally, while the author suggests that baseline data are not essential to this SLO, the document provides them (the kindergarten data in element 4) or hints at their collection. (In the implementation timeline, element 11, the author refers to a September data collection date.) The kindergarten data and September assessment provide the opportunity to disaggregate the data by student and use the baseline information to inform instruction, even if the jurisdiction does not require baseline data for mastery.


Appendix: Tool for Comparing SLO Elements Across Jurisdictions

Pennsylvania Element Name | Standardized Name
Teacher Information | Other Information
Content Area | Learning Content
Student Learning Objective | Rationale
Data and Targets Used to Establish the SLO | Student Growth Targets
Assessment/Performance Task | Assessments
Administration of the Assessment/Performance Task | Assessments
Evidence of Student Achievement | Assessments
Strategies/Actions to Achieve the SLO | Instructional Strategies
Teacher Effectiveness Measure | Scoring
Teacher Effectiveness Ratings | Scoring
Implementation Timeline | Other Information
SLO Process Dimension | Other Information

An earlier version of this document was developed under the auspices of the Reform Support Network, with funding from the U.S. Department of Education under contract #GS-23F-8182H. This publication features information from public and private organizations and links to additional information created by those organizations. Inclusion of this information does not constitute an endorsement by the U.S. Department of Education of any products or services offered or views expressed, nor does the Department of Education control its accuracy, relevance, timeliness or completeness.