ESSENTIAL STRATEGIES TO STRENGTHEN YOUR PROGRAM THROUGH COMMON ASSESSMENTS Dawn Carney, Brookline Public Schools; Tim Eagan, Wellesley Public Schools; Tiesa Graf, South Hadley Public Schools

  • Slide 1
  • ESSENTIAL STRATEGIES TO STRENGTHEN YOUR PROGRAM THROUGH COMMON ASSESSMENTS. Dawn Carney, Brookline Public Schools; Tim Eagan, Wellesley Public Schools; Tiesa Graf, South Hadley Public Schools. MaFLA, October 24, 2014
  • Slide 2
  • Agenda: Objectives; A message from the DESE (Craig Waterman, Assessment Coordinator); A few reminders!; Warm up; Developing and strengthening common assessments; Using protocols to analyze student work; Using assessment data to identify feedback for student growth; Learning how the process can work for you; Takeaways and next steps
  • Slide 3
  • Objectives This workshop will help you to: Develop and strengthen DDMs (or common assessments) Use protocols to examine student work Use assessment data to identify feedback for student growth Identify professional learning to incorporate into SMART goals (and evaluation process) Learn how this process can strengthen your program and work for you!
  • Slide 4
  • A message from DESE Craig Waterman Assessment Coordinator Department of Elementary & Secondary Education
  • Slide 5
  • A few reminders! Where we are (MaFLA member survey) DDM = common assessments = valuable! Design an action plan Action plan should inform department vision How can these tools help my department with SMART goals/eval process? Essential to know ACTFL Standards, Guidelines and Descriptors
  • Slide 6
  • Warm up Review the workshop objectives Write down a goal for yourself ~ In terms of where you are with the DDM/common assessment process What do you need from today and what do you need for the future? Share with a partner 3 minutes
  • Slide 7
  • Developing & Strengthening Common Assessments Improving Teaching & Learning
  • Slide 8
  • Assessment is an ongoing process of setting clear goals for student learning and measuring progress toward those goals.
  • Slide 9
  • Performance vs. Proficiency. Performance: reliant on instruction, practiced, controlled environment. Proficiency: separate from instruction, unrehearsed, across a wide range of topics and settings. ACTFL Performance Descriptors for Language Learners, ACTFL, Inc., 2012
  • Slide 10
  • ACTFL Performance Descriptors: a roadmap for teaching and learning; help teachers create performance tasks targeted to the appropriate performance range; challenge learners to use strategies from the next higher range. ACTFL Performance Descriptors for Language Learners, ACTFL, Inc., 2012
  • Slide 11
  • Key Concepts Principles of Assessment
  • Slide 12
  • Practicality. Resources: funding, materials, # of staff, # of students, knowledge of students & of assessment, time, other
  • Slide 13
  • Impact: washback (positive and negative). Stakeholders: students, parents, administrators, teachers, other
  • Slide 14
  • Validity Does the assessment measure what it is supposed to measure? Is it used for its intended purposes on the intended population?
  • Slide 15
  • Four questions to ask after looking at selected assessments and student work samples. Building a Validity Argument
  • Slide 16
  • Validity Argument Question 1: Does the assessment measure what it is supposed to measure/what I thought it would measure?
  • Slide 17
  • Validity Argument Question 2: Does the assessment reflect real-life language uses? Is it authentic?
  • Slide 18
  • Validity Argument Question 3: Do the students and the teachers take the assessment seriously?
  • Slide 19
  • Validity Argument Question 4: Is the assessment consistent with instruction, both the what and the how?
  • Slide 20
  • Harvard Data Wise: We used to think validity was a property of a test. Modern validity takes a different approach: Inferences based on test scores cannot be perfectly valid, but they can be valid enough to be useful.
  • Slide 21
  • What impact does this validity question have on your action plan? Where is your department on this? Do your departmental assessments reflect real-life language uses? Are they authentic, and therefore valid? 2 minutes
  • Slide 22
  • Developing DDMs: Best practice is a wide range of item difficulties and rigor so that examinees of all abilities can show some capability... Technical Guide A: Considerations Regarding District-Determined Measures
  • Slide 23
  • Presentational Writing Prompt Example Modified Pretest/Post-test Model DDM Examples: Wellesley High School
  • Slide 24
  • Modern Languages, Year 3: A friend of yours has a problem with someone. Describe the problem, explore various solutions, and then advise your friend on a specific course of action. Use all appropriate verb tenses learned. What other information do you need in order to apply the Validity Argument to this prompt? 1 minute.
  • Slide 25
  • Performance Targets: Year 3 Students (Typically Sophomores) Intermediate-Low/Mid Proficiency ACTFL Performance Descriptors for Language Learners ACTFL, Inc., 2012
  • Slide 26
  • Performance Targets: Year 4 Students (Typically Juniors) Intermediate-Low/Mid Proficiency ACTFL Performance Descriptors for Language Learners ACTFL, Inc., 2012
  • Slide 27
  • Narrowing the Focus Developing Scoring Guides
  • Slide 28
  • WPS Presentational Speaking & Writing Task Rubric:
  • Slide 29
  • One domain only: Language Control. Describes the level of control the learner has over certain language features or strategies to produce or understand language. ACTFL Performance Descriptors for Language Learners, ACTFL, Inc., 2012
  • Slide 30
  • Why one narrow domain? Practicality: Build assessment literacy & capacity to talk about student work. Stay small and focused. Learn to be Specific and Objective. Define & refine our performance targets based on data.
  • Slide 31
  • Take 2 minutes and discuss: How can you use the principle of Practicality to support you in building an action plan? For example
  • Slide 32
  • One participant to share with group.
  • Slide 33
  • USING PROTOCOLS TO ANALYZE STUDENT WORK / LOOKING FOR GROWTH. Reliability: consistency of the results of an assessment
  • Slide 34
  • Be reliable! Items (questions/prompts should assess the same skill/knowledge) Administration (consistency) Rating (two different scorers should get the same result/rating/score)
  • Slide 35
  • Keep it simple! Select a core course objective: goals specific to your curriculum (example). Determine the communicative mode to assess (*hint: presentational!). Develop the prompt. Select the domain from your rubric for the area of growth. Items: developing DDMs
  • Slide 36
  • DDM Template from South Hadley inspired by Tim!
  • Slide 37
  • Where are you? Have you developed common assessments? Do your common assessments focus on core course objectives? Have you determined departmental rubrics? Have you determined domains to focus on to determine growth? SMART goal? 2 minutes to discuss
  • Slide 38
  • Departmental decisions about the prompt for each level: If listening, how many times? If speaking, prompt in TL or English? Prompt provided ahead of time? If writing, time/length expectations, etc. Administration reliability: protocol
  • Slide 39
  • Example from South Hadley
  • Slide 40
  • Scoring Sheet (SHHS)
  • Slide 41
  • Where are you?
  • Slide 42
  • How do you look at student work to determine quality/growth? Rating: examining student work
  • Slide 43
  • Effective Analysis requires trust and commitment How do you build trust and encourage teamwork in your department?
  • Slide 44
  • Developing norms and ACE Habits of Mind: Clear meeting objectives. Develop norms for working together as a group. Focus on evidence and observations, not on judgments
  • Slide 45
  • Reliable ratings: Know your rubric **Develop/Select your rubric
  • Slide 46
  • Select the domain for growth: language control, language function, vocabulary use, text type, level of discourse, etc.
  • Slide 47
  • Determine evidence: rating What does quality look like? What are you looking for in a sample? How do we work together to calibrate?
  • Slide 48
  • Develop a protocol for analysis
  • Slide 49
  • Practice! Calibrate: select student samples for each department meeting/PD meeting time. Split by language; mix levels. Rate samples and discuss
  • Slide 50
  • How to use for SMART goals Item development Assessment administration Ratings and protocols
  • Slide 51
  • Where are you? Item development. Assessment administration. Ratings and protocols. Impact on action plan and/or SMART goals? What's your next step? One response from group to share
  • Slide 52
  • One participant to share with group.
  • Slide 53
  • Using Assessment Data to Identify Feedback for Student Growth Connecting learning and teaching
  • Slide 54
  • What should my students be able to do? Performance targets
  • Slide 55
  • What should students be able to do? Proficiency targets
  • Slide 56
  • Novice: Respond to simple questions. Use: isolated words, lists of words, memorized phrases, some personalized re-combinations of words or phrases. Ask memorized, formulaic questions. Satisfy immediate needs. WORD level
  • Slide 57
  • Intermediate: Conversation partner in simple, direct conversations. Describe and narrate. Ask and answer simple questions. Handle survival language. Create with language. SENTENCE level
  • Slide 58
  • Advanced: Participate actively in conversations in formal and informal settings. Describe & narrate in major time frames with good control. Use a variety of communicative devices to deal effectively with unanticipated complications. Sustain communication with paragraph-length connected discourse, accuracy, and confidence. Satisfy demands of work/school situations. PARAGRAPH level
  • Slide 59
  • Proficiency Families. All three levels perform Intermediate Mid tasks: Novice High (most of the time), Intermediate Low (all of the time), Intermediate Mid (with ease & confidence). All three levels perform Advanced Mid tasks: Intermediate High (most of the time), Advanced Low (all of the time), Advanced Mid (with ease & confidence)
  • Slide 60
  • PSB Proficiency Targets
  • Slide 61
  • Take 2 minutes and discuss with your partner. Does your department have a shared understanding of performance and proficiency? Do you have a shared understanding of the performance targets for your program?
  • Slide 62
  • What do the assessment results tell me about my students? How do I help students progress along the proficiency continuum? Feedback
  • Slide 63
  • Slide 64
  • Individual feedback: directs students to the intended learning; points out what the student is doing well; offers specific information to guide improvement; is given only when students have at least some understanding; doesn't do the thinking for students; limits corrective information to what the student can act on. "How Am I Doing?" by Jan Chappuis in Educational Leadership, September 2012 (Vol. 70, #1, pp. 36-40), www.ascd.org
  • Slide 65
  • Internal assessment: rubric as feedback
  • Slide 66
  • Slide 67
  • Feedback: self-assessment & goal setting Can-do statements Student goals Teacher goals
  • Slide 68
  • Take 2 minutes and discuss with your partner. How might the use of common assessments & analysis of student work inform instruction and curriculum in your department?
  • Slide 69
  • External Assessment
  • Slide 70
  • External Assessment: STAMP 4S
  • Slide 71
  • External assessment: feedback for growth
  • Slide 72
  • External Assessment: AAPPL
  • Slide 73
  • Slide 74
  • Common assessments, analysis and protocols, feedback for growth, oh my! Application: Making it work for you
  • Slide 75
  • Takeaways: Reflect on the initial goal that you developed during the warm up and consider these questions: What are your next steps? What's your action plan? What do you want/need to learn more about? SUMMER ACADEMY reminder and handout