Balancing Assessments Module Two 11.7.13 Plymouth Church


Today’s Agenda

I. Welcome & Outcomes for our Work

II. Balanced Assessment System

III. Defining Data Teams

IV. Rigor Analysis

V. Classroom Formative Assessment

VI. Assessment Feedback

VII. Wrap Up

REMINDERS AND LOGISTICS

The Purpose of this Module

This module will provide rationale and practice in formative assessment design and use, summative assessment design and use, and the interaction between assessments and instruction.

• Why do we assess?

• What do we do after we assess?

• Are we asking the right questions?

• What is classroom formative assessment?

• How do I give classroom assessment feedback?

Norms

• Be present
– Limit side conversations
– Self-monitor use of electronics
– Avoid working on other tasks; stay focused on the topic at hand

• Be respectful of your peers and the facilitator

• Ask Questions

• Participate!

WELCOME AND INTRODUCTIONS

Introductions

Share:

• Name

• School

• Position Title

• Previous Experience

BALANCED ASSESSMENT SYSTEM

WHY DO WE ASSESS?

DMPS Balanced Assessment System

Classroom Formative Assessments
• Purpose: To inform instruction within and between lessons for both student and teacher
• Examples of Practice: Student response systems, whiteboards, Writing to Learn, student–teacher conferences
• Formative or Summative: Very formative
• Responsibility for Creation: Classroom teachers
• Reported to: Student

Classroom Summative Assessment
• Purpose: To give a grade
• Examples of Practice: Final exams, final projects, performance-based tasks
• Formative or Summative: More summative
• Responsibility for Creation: Classroom teachers
• Reported to: Student and parents

Common Formative Assessments
• Purpose: To determine if students have learned the materials and how to respond instructionally
• Examples of Practice: Learner objectives assessed with rubrics, short quizzes, Writing to Learn, and Journeys assessment materials
• Formative or Summative: Very formative
• Responsibility for Creation: Collaborative teams at each school (Data Teams)
• Reported to: Grade level teams (Data Teams)

District Interim/Benchmark Assessments
• Purpose: To support building and district teams in assessing curriculum, instructional strategies, and pacing; to serve as a predictor of success on External Summative Assessments
• Examples of Practice: District Benchmark Assessments, Writing Assessments, Basic/Analytical Reading Inventory (BRI/ARI), Scholastic Reading/Math Inventory (SRI/SMI)
• Formative or Summative: More summative
• Responsibility for Creation: District teams of representative teachers
• Reported to: District

External Summative Assessments
• Purpose: To support building and district teams in determining whether curriculum, instructional strategies, and pacing were appropriate
• Examples of Practice: Iowa Assessment, ACT, PLAN, AP Exams, PA Profile (Kindergarten), Technology Assessment (Grade 8)
• Formative or Summative: Summative
• Responsibility for Creation: An external group of experts
• Reported to: District and State

Each successive type of assessment, in the order listed above, requires a more significant investment of time, resources, and collaboration to prepare, administer, and garner useful data. Additionally, the direct impact on classroom instruction decreases as the assessment type moves further from classroom formative assessment.

Copy Provided

Rick Wormeli – Video 1

http://www.youtube.com/watch?v=rJxFXjfB_B4

Quick Write: Compare/contrast formative and summative assessment.

Balanced assessment system

• Formative Assessment by Jakicic and Bailey
– pp. 19–23

• While you’re reading, identify:
– 3 connections you made.
– 2 questions you have about what you’re reading.
– 1 “ah-ha” moment.

DEFINING DATA TEAMS

WHAT DO WE DO AFTER WE ASSESS?

Data Teams

WHAT THEY ARE:

• Meetings with agendas and minutes

• Collaborative efforts

• Analysis of common data from a common assessment

• Teams that use a strategic process (DDDM)

WHAT THEY ARE NOT:

• Grade level team meetings

• Discussions about individual students and their needs UNLESS the need is related to the prioritized goal

• Times to discuss grade or building events (field trips)

• Collectors of every kind of data

Data Teams Time Allocation

• Step 1: Collect and Chart Data (5%)
• Step 2: Analyze Data and Prioritize Needs (30%)
• Step 3: SMART Goal (5%)
• Step 4: Select Common Instructional Strategies (30%)
• Step 5: Monitor Results (30%)

Where does this fit?

• We have provided one pack of Assessment cards to each table.

• At your table, arrange these cards in the order in which they should occur within a unit in your classroom.

• There may be some cards in your pack that you don’t use!

• Each table will need one representative to explain the table’s chosen sequence of assessments.

Annotating Curriculum Guide

• Classroom formative (green)

• Common formative (yellow)

• Summative (pink)

10 min Break!

RIGOR ANALYSIS

ARE WE ASKING THE RIGHT QUESTIONS?

Purpose for Rigor Analysis

1. Align instruction and assessment with standards.

2. Evaluate an assessment.
– Are we raising rigor?
– Do our instruction and assessment match the rigor of the standards?

3. Create formative assessments.


Alignment

• Intended Curriculum
• Enacted Curriculum
• Assessed Curriculum


Assessing Academic Rigor – Based on SREB Learning-Centered Leadership Program and the Wallace Foundation

Revised Taxonomy Table

Cognitive Process Dimension (columns 1–6):

1. Remember: Recognizing, Recalling
2. Understand: Interpreting, Exemplifying, Classifying, Summarizing, Inferring, Comparing, Explaining
3. Apply: Executing, Implementing
4. Analyze: Differentiating, Organizing, Attributing
5. Evaluate: Checking, Critiquing
6. Create: Generating, Planning, Producing

Knowledge Dimension (rows A–D), crossed with the cognitive processes to label each cell:

A. Factual Knowledge: A1 A2 A3 A4 A5 A6
B. Conceptual Knowledge: B1 B2 B3 B4 B5 B6
C. Procedural Knowledge: C1 C2 C3 C4 C5 C6
D. Meta-Cognitive Knowledge: D1 D2 D3 D4 D5 D6

KNOWLEDGE DOMAIN


The Knowledge Dimension

A. Factual Knowledge

B. Conceptual Knowledge

C. Procedural Knowledge

D. Metacognitive Knowledge


A. Factual Knowledge

Factual Knowledge: Basic elements students must know to be acquainted with a discipline or solve problems in it.

• Examples:
– William Shakespeare
– 1812
– 4 x 3 = 12
– >


B. Conceptual Knowledge

• Conceptual Knowledge: The interrelationships among the basic elements within a larger structure that enable them to work together.

• In other words, a category or group of things with features (attributes).


What is the difference between facts and concepts?

• Conceptual knowledge has to be taught by defining the attributes and with multiple examples and non-examples (some of which are near-misses); can be abstract or concrete.

• Examples:
– Table
– Love
– Justice
– Equal parts


C. Procedural Knowledge

Procedural Knowledge: How to do something: methods of inquiry, and criteria for using skills, algorithms, techniques, and methods

Examples:
– In math, algorithms for performing long division
– In science, methods for designing experiments
– In English/Language Arts, procedures for spelling words


D. Metacognitive Knowledge

Metacognitive Knowledge: Knowledge of cognition in general as well as awareness and knowledge of one's own cognition (thinking about your thinking)


Examples:
– Knowing when to use mnemonic strategies, paraphrasing, summarizing, questioning, note-taking, or outlining to attain a learning goal.
– Realizing that your study session will be more productive if you work in the library rather than at home.

So Let’s Practice!

Identify the knowledge dimension:
– Example 1: In 1492, Columbus crossed the ocean.
– Example 2: What steps are used in scientific inquiry?
– Example 3: Describe the thinking that led you to your answer.
– Example 4: Compare analysis with evaluation.


THE COGNITIVE DOMAIN


The Cognitive Process Dimension

1. Remember

2. Understand

3. Apply

4. Analyze

5. Evaluate

6. Create


1. Remember

• Retrieving relevant knowledge from long term memory (verbatim, unchanged by student)

• Remembering is essential for meaningful learning and problem-solving, and is used in more complex tasks.


2. Understand

• Constructing meaning from instructional messages, including oral, written, and graphic communication.
– More cognitive processes are associated with this category than any other.
– Most represented category in state standards.
– Critical for all further learning.


3. Apply

• Carry out or use a procedure in a given situation.


4. Analyze

• Break material into its constituent parts and determine how the parts relate to one another and to an overall structure or purpose.


5. Evaluate

• Make judgments based on criteria and standards.


6. Create

• Put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure.

What is Each Task’s Intent?

1. Julio’s younger brother is learning how to read a thermometer and asks, “Why does the red stuff in the thermometer go up when it gets hot outside?” What is the correct explanation that Julio can give his brother?

2. Seatbelts save lives during car crashes. Use Newton’s first law to explain how seatbelts work.

3. Poison dart frogs live in the rainforest and are very brightly colored. Create a list of adaptations that would help the frog survive.

Evaluate the Intended Curriculum and an Assessment

• Using Bloom’s Revised Taxonomy of Educational Objectives, evaluate the curriculum unit/standards and place in matrix (green highlighter)

• Repeat the process for the assessment, evaluate each item and place in matrix (pink highlighter)

• Do the intended curriculum and assessment align?

Quick Whip

• What did you discover through this process?

Lunch!

Hattie Activity

• On the table in front of you, arrange the teaching component cards in order from least meaningful to student success (on the left) to most meaningful to student success (on the right).

• Before we tell you anything about how these ACTUALLY rank, look at where you placed “Teacher salary” and “Teacher access to vending machines.” Interestingly enough, these matter to teachers far more than they matter to student achievement. Remove these cards.

Revelations!

• The impact of each teaching component on student achievement was measured by researcher John Hattie in his book Visible Learning for Teachers (2012). He gave each teaching component a score (a statistical measure called an effect size) and ranked them.

• How did your rankings compare to his findings?
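An effect size of this kind is essentially a standardized mean difference (Cohen's d): the gap between two group means divided by their pooled standard deviation, so results from very different tests can be compared on one scale. Here is a minimal sketch of the calculation; the scores and the feedback scenario are invented for illustration, not data from Hattie's research:

```python
from statistics import mean, stdev

def effect_size(treatment, control):
    """Cohen's d: difference in group means divided by the
    pooled sample standard deviation of the two groups."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical unit-test scores for students who did and did not
# receive regular feedback (illustrative numbers only):
with_feedback = [78, 85, 90, 72, 88, 95, 81]
without_feedback = [70, 75, 82, 68, 79, 85, 73]
print(round(effect_size(with_feedback, without_feedback), 2))  # → 1.15
```

On this scale, 0 means no difference between groups; Hattie treats roughly 0.40 and above as a larger-than-typical year of growth, which is why components like feedback (0.75) rank so high.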

The Evens

• At #10, with an effect size of -0.13: RETENTION
• At #8, with an effect size of 0.17: MATCHING TEACHING WITH LEARNING STYLES
• At #6, with an effect size of 0.22: INDIVIDUALIZING INSTRUCTION

The Evens, Continued

• At #4, with an effect size of 0.59: DIRECT INSTRUCTION
• At #2, with an effect size of 0.75: FEEDBACK

• Look at the five teaching components we haven’t revealed yet. Where would you place them NOW?

The Odds

• At #9, with an effect size of 0.12: ABILITY GROUPING/TRACKING
• At #7, with an effect size of 0.21: REDUCING CLASS SIZE
• At #5, with an effect size of 0.52: HOME ENVIRONMENT

The Odds, Continued

• At #3, with an effect size of 0.72: TEACHER-STUDENT RELATIONSHIPS
• At #1, with an effect size of 0.90: TEACHER CREDIBILITY IN EYES OF STUDENTS

• Look at the whole continuum. What makes perfect sense to you? What challenges your thinking?

CLASSROOM FORMATIVE ASSESSMENT

WHAT IS CLASSROOM FORMATIVE ASSESSMENT?

Classroom Formative Assessment

FEEDBACK

HOW TO GIVE FEEDBACK ON CLASSROOM FORMATIVE ASSESSMENTS

Feedback

Recommended