DAC Academy 2012
August 23, 2012
Colorado Department of Education, Assessment Unit
Welcome!
• Introductions
• Assessment News and Reviews
• Introduction to Partnership for Assessment of Readiness for College and Careers (PARCC)
- 15 Minute Break -
• Introduction of New Science and Social Studies Assessments
• Update on the Colorado Content Collaboratives
- Lunch -
• Afternoon Break-Out Sessions
Meet the Assessment Leadership and Support Team
• Joyce Zurkowski, Executive Director of Assessment
• Margo Allen, Business Process Manager
• Christina Wirth-Hawkins, Assistant Director of Assessment
Meet the Assessment Team
• Jason Clymer, TCAP
• Glen Sirakavit, New Assessment
• Christine Deines, CO ACT
• Angela Norlander, Content Collaboratives
• Pam A. Sandoval, NAEP Coordinator
• Mira Monroe, Special Ed.
• Heather Villalobos Pavia, ELL
Meet the Assessment Data Team
• Jasmine Carey, Psychometrician
• Jessica Allen, Data
Additional CDE Staff
• Linda Lamirande, Special Ed.
• Bill Bonk, Accountability
TRANSITIONAL COLORADO ASSESSMENT PROGRAM (TCAP)
Jason Clymer
TCAP and Summative Assessment Timeline
• 2013: TCAP and CoAlt continue; field test new social studies and computer-based science items
• 2014: TCAP and CoAlt Reading, Writing, and Math continue; first operational year of the new social studies and science assessments
• 2015: New Reading, Writing, and Math assessments (PARCC); second operational year of the new social studies and science assessments
TCAP 2012
• Congratulations on a successful first year of TCAP!
The State of Reading
• Grades 3, 4, 6 and 7 demonstrate upward trends in reading proficiency
[Chart: CSAP/TCAP Reading Percent Proficient and Advanced, 2005-2012, grades 3-10]
The State of Writing
• Grades 5, 7 and 8 have higher proficiency levels than 2005 levels
[Chart: CSAP/TCAP Writing Percent Proficient and Advanced, 2005-2012, grades 3-10]
The State of Mathematics
• All grade levels have higher proficiency levels than 2005 levels
[Chart: CSAP/TCAP Mathematics Percent Proficient and Advanced, 2005-2012, grades 3-10]
The State of Science
• All grades show improvement in proficiency compared to the 2008 scores
[Chart: CSAP/TCAP Science Percent Proficient and Advanced, 2008-2012, grades 5, 8 and 10]
TCAP 2012 Issues
• Parent “Opt Out”
  – All students must test
  – Letter addressing opt out will be updated and re-released in 2013
• Multiple Misadministrations
  – Wrong session given: Reading/Writing
  – Using old items for test preparation
  – Students discussing items
Clarifying Procedures
• New test security: forthcoming
• Procedures Manual update: coming in September
• Frequently Asked Questions (FAQ)
• Reading after the test: updated script in Test Proctor’s Manual
• Supplementary training: Oral Scripts, Teacher Read Directions, and other topics
TCAP 2013
• Will still assess the same content as noted in the TCAP Frameworks
• Will still be administered at the same time and in the same manner
• Training schedule will remain the same
• Scoring and reporting will remain on the same timeline
ACCOMMODATIONS
Mira Monroe
Accommodations: What’s New?
• Format:
  – Includes instructions on all accommodations
  – Tables appear to have more restricted accommodations – but…
• iPad: Not allowed
• Verification of Removal Form
• ACCESS for ELLs®

Accommodations
• September: Training » Statewide
• 10/31 – 11/23: Order Special TCAP » Navigator
• November 15: Extra Special TCAP due » Mira
• December 15: Non-standard accommodations » Mira
Accommodations: State Monitoring Visits
• Coordinating visits with Title programs
• Mira and Heather will be doing visits
COLORADO ALTERNATE (CoAlt)
Mira Monroe and Linda Lamirande
Colorado Alternate Assessment (CoAlt)
• October 10 – 24: Order materials
• November: Administration training
• February 6 – March 22: Test window
• March 26: Schedule pick-up

Eligibility is determined by the IEP team:
1st: Determine academic standard
2nd: Determine assessment
Questions? Contact:
Linda Lamirande, Exceptional Student Services Unit, [email protected]

Alternate Standards and Assessment Eligibility Criteria Worksheet
COLORADO ACT (CO ACT)
Christine Deines
Colorado ACT: College Entrance Exam
Accepted by U.S. colleges, universities, military academies and the NCAA.

Colorado ACT (COACT)
• State test date: April 23, 2013
• Make-up test date: May 7, 2013
• 2013 test date falls on a TUESDAY
• Accommodations testing dates: April 23 – May 7, 2013
• All 11th grade students by law

11th Grade Alternate (for students eligible to take CoAlt)
• Managed by the Exceptional Student Services Unit
• Testing window: April 1 – April 26, 2013
• Contact Linda Lamirande, ESSU, 303-866-6863
Colorado ACT
[Diagram: contract and communication relationships between the COACT Test Supervisor (TS) and the DAC]
• DACs may need to develop a communication process with Test Supervisors for Accountability.*
*Test Supervisors must develop a communication plan with Back-up Test Supervisors and Test Accommodations Coordinators.
CO ACT Updates: Online Schools
• New for online schools: two national test date options, for online students ONLY:
  – February 9, 2013 and April 13, 2013
• Students can choose either option
• Students who registered for but missed the February 9, 2013 test can pay $20 to take the April 13, 2013 test
ACT Graduating Class Report
http://www.cde.state.co.us/assessment/documents/coact/data/DifCOACTProfileReport_GradClassReport.pdf
• ACT Profile Report
  – Results of all Colorado public schools’ spring testing population
  – Results for the state-mandated test
• ACT Graduating Class Report
  – Most recent test date for each student in the most recent graduating class of a Colorado high school (both private and public)
ACT Graduating Class Report

State         % Grads  Avg        % Meeting  % Meeting  % Meeting  % Meeting
              Tested   Composite  English    Reading    Math       Science
Illinois      100      20.9       65         47         44         30
North Dakota  100      20.7       64         49         45         30
Utah          97       20.7       64         54         40         29
Colorado      100      20.6       62         47         41         31
Louisiana     100      20.3       68         46         35         22
Wyoming       100      20.3       60         46         38         28
Michigan      100      20.1       59         45         36         26
Kentucky      100      19.8       59         44         31         22
Tennessee     100      19.7       59         43         29         21
Mississippi   100      18.7       53         34         21         14
National      52       21.1       67         52         46         31
NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS (NAEP)
Pam A. Sandoval

National Assessment of Educational Progress (NAEP)
2013: national and state samples of grades 4 & 8 and a national sample of grade 12 (close to 17,000 schools and 795,000 students nationwide)

NAEP 2013 Assessment Window
• Each selected student is tested in one subject only.
National Assessment of Educational Progress (NAEP)
• Participating schools are selected by national NAEP statisticians
  – Most schools were identified last May
  – 99% of participating NAEP districts have received initial notification from the NAEP State Coordinator
  – Districts will receive state and national results for grades 4 and 8 in reading and math in fall of 2013
  – A few districts will also take the TEL (technology & engineering literacy) test, a computer-based field test
  – NAEP does not provide disaggregated results for districts or individual schools; it is not designed for this
National Assessment of Educational Progress: NAEP Roles
• National NAEP Office
• NAEP State Coordinator (NSC)
• Contracted assessment team: Westat
• School & Community

NAEP: Relationship Between NAEP & the School Community
• NAEP School Coordinator
• NAEP State Coordinator
• School & Community
• District Contact
  – Assists the NSC in school communications
  – Works with the NSC and the Supervisor to oversee the process

Key responsibilities:
• Confirms the assessment date
• Provides schools with info for parental notification
• Responds to questions
• Works with district/school personnel to ensure a smooth process
• Reports the results
[Map: 2011 NAEP Math Grade 4 average scale scores by state: higher / not significantly different / lower]
[Map: 2011 NAEP Reading Grade 4; *significantly different (p < .05) from 2011]
[Map: 2011 NAEP Math Grade 8 average scale scores by state: higher / not significantly different / lower]
[Map: 2011 NAEP Reading Grade 8; *significantly different (p < .05) from 2011]
NAEP – The Nation’s Report Card® public web site: http://nces.ed.gov/nationsreportcard/
NAEP – The Questions Tool: http://nces.ed.gov/nationsreportcard/itmrls/
DATA OPERATIONS
Jessica Allen
DATA OPERATIONS
• CDE’s role is to support districts in data collection activities for TCAP/CoAlt, CO ACT and ACCESS for ELLs.
• This presentation provides a brief overview of major data and logistics activities.
• A handout with dates and resources specific to each activity and assessment is posted on the website.
Data Operations: Essentials for DACs
• Materials ordering
• Automatic Data Exchange (ADE) collections
  – Collect accurate data for test book labels (Pre-Coded Labels)
  – Verify student biographical data after testing (Student Biographical Data (SBD))
• Logistics
  – Handling of testing materials before, during, and after testing
• Final assessment results
Data Operations: Ordering Materials
• TCAP
  – October Count is used for the initial order; December/January Pre-Coded Labels are used to adjust the list
• CoAlt
  – Online enrollments via CTB Navigator
• ACCESS for ELLs
  – Online via MetriTech’s website
• CO ACT
  – Order online; email sent via ACT
Data Operations: Pre-Coded Labels (PCL)
• A label applied to the test booklet that links student information (e.g. name, gender) and eliminates having to ‘bubble’ this information on the test booklet.
• TCAP/CoAlt and CO ACT student data sources
  – October Student Count and the December/January PCL Collection
• ACCESS for ELLs
  – October Student Count information
Data Operations: Logistics
Receiving, processing, and shipping test materials
• Training posted on the assessment website in November.
• Topics will include:
  – Recording test invalidations and accommodations at time of testing
  – Creating the School Group List (SGL)
  – Tracking the number of tests returned by content area, grade, and school
  – Procedures for home-schooled students
  – N Count on Navigator
Data Operations: Student Biographical Data (SBD) Review
• Opportunity to review and verify SBD data.
• Training will be online in late February 2013.
• Optional, but necessary for any accountability appeals that use assessment data.
Data Operations: N Counts – TCAP and CoAlt Only
• Review and verify the number of test booklets submitted to CTB.
• Districts work directly with CTB.
Data Operations: Final Assessment Results
• Districts receive information directly from the testing companies.
• All results are embargoed from public distribution until a specific date.
Data Operations: ADE
• CDE system
• Each collection requires registration
• Assessment collections
  – Pre-Coded Labels
  – Student Biographical Data Collection (two stages)
    • Stage 1: Download, edit and upload an approved file
    • Stage 2: Review of approved file
Data Operations: ADE
https://cdeapps.cde.state.co.us/
• Link to system. Requires password. Data only available during the review window.
• Link to support documents. Separate section for each ADE collection.

Data Operations: ADE Documentation
https://cdeapps.cde.state.co.us/doc_toc.htm
Data Operations: Final Remarks
• DAC emails provide information about upcoming data activities, availability of support documents and other information as needed.
• Each assessment is unique.
NEW ENGLISH LANGUAGE LEARNER ASSESSMENTS
W-APT™ and ACCESS FOR ELLS®
Heather Villalobos Pavia
Placement Test: W-APT
Purpose of the W-APT
• Identify students who may be candidates for ELL programming
• Administer upon enrollment to determine the English language proficiency level of students new to the school or school system, in order to provide ELL programming
• The W-APT is NOT used for program exit decisions
Characteristics of the W-APT
• Aligned to the English Language Proficiency (CELP) Standards
• 5 grade-level cluster forms: K, 1-2, 3-5, 6-8, 9-12
• Results in scores from proficiency levels 1-6
• Speaking is individually administered; Listening, Reading and Writing are administered individually or in small groups
  – First-semester kindergarten only assesses speaking and listening
Annual Measure: ACCESS for ELLs
Purpose of ACCESS for ELLs
• To monitor students’ progress in acquiring (academic) English
• One component in the body of evidence used when making program exit decisions
Characteristics of ACCESS for ELLs
ACCESS for ELLs test items are written from the model performance indicators of the five English Language Proficiency (CELP) standards:
• Social & Instructional Language
• Language of Language Arts
• Language of Mathematics
• Language of Science
• Language of Social Studies
Characteristics (continued)
• Test forms are available in three overlapping tiers for each grade-level cluster
  – Tier A: Proficiency levels 1-3
  – Tier B: Proficiency levels 2-4
  – Tier C: Proficiency levels 3-5
• Test administrator scripts are different for each test form
• Administered in groups of up to 22 students
Notable Differences with ACCESS for ELLs
• Kindergartners are tested one on one
• Test administrators must be district employees
• Listening is not on CD
• Do NOT order overage; MetriTech calculates an automatic 5% overage
PARTNERSHIP FOR ASSESSMENT OF READINESS FOR COLLEGE AND CAREERS (PARCC)
Christina Wirth-Hawkins
Reading, Writing and Mathematics
• Recent legislation:
  – Requires Colorado to participate as a Governing Board member in a consortium of states that focuses on the readiness of students for college and careers
  – Requires the Board to rely upon the assessments developed by the consortium, expected to be ready for spring 2015
  – Encourages the Board to conduct a fiscal and student achievement benefit analysis of Colorado remaining a Governing Board member, starting on or before January 1, 2014
PARCC
• Colorado joined PARCC as a governing member in August 2012
• English Language Arts and Mathematics in grades 3-11
• Computer-based / paper-pencil
• First operational assessment: spring 2015
PARCC States
PARCC Governing States
• Approve test specifications, priorities for content assessed on each component, and the recommended scoring model
• Develop long-term sustainability plans for the consortium and assessment system, including through design of the tests and the ability to refresh them over time
• Approve solicitations and select vendors for PARCC procurements
• Determine the highest-priority model instructional tools for PARCC to develop
PARCC Governing States (continued)
• Build and expand cadres of K-12 educators and postsecondary faculty leading CCSS implementation and PARCC assessment development
• Ensure the assessment results provide the data needed to support state accountability mechanisms and educator evaluation models
  – Participate in technical & policy working groups on accountability to help identify solutions to pressing accountability transition challenges and new approaches to accountability through ESEA waivers
PARCC Assessments
• In English Language Arts/Literacy, whether students:
  – Can read and comprehend complex literary and informational text
  – Can write effectively when analyzing text
  – Have attained overall proficiency in ELA/literacy
• In Mathematics, whether students:
  – Have mastered fundamental mathematical concepts
  – Can apply that knowledge and those skills in novel situations
PARCC Assessment Design

2 Optional Assessments / Flexible Administration
• Diagnostic Assessment: early indicator of student knowledge and skills to inform instruction, supports, and PD; non-summative
• Mid-Year Assessment: performance-based; emphasis on hard-to-measure standards; potentially summative

Required Components
• Performance-Based Assessment (PBA): extended tasks; applications of concepts and skills
• End-of-Year Assessment: innovative, computer-based items
• Speaking and Listening Assessment: locally scored; non-summative, required
PARCC Goal: Build a Pathway to College and Career Readiness for All Students
• K-2: formative assessment aligned to the PARCC system
• Grades 3-8: timely student achievement data showing students, parents and educators whether ALL students are on track to college and career readiness; ongoing student supports/interventions
• High school: a college readiness score to identify who is ready for college-level coursework; targeted interventions & supports (12th-grade bridge courses, PD for educators); success in first-year, credit-bearing postsecondary coursework
Tools & Resources

Model Content Frameworks
• Purpose: Support implementation of the CCSS; support development of assessment blueprints; provide guidance to state-, district- and school-level curriculum leaders in the development of aligned instructional materials
• Audience: State and local curriculum directors (primary audience); teachers
• URL: http://www.parcconline.org/parcc-model-content-frameworks

Model Instructional Units

Draft Policy and Descriptors
• Purpose: Public review of two draft policies and PLDs
• Audience: Broad audience: teachers, schools, districts, states (for CCSS implementation and PARCC assessment preparation)
• Timeline: Feedback by September 21, 2012
• URL: http://www.parcconline.org/crd-pld-survey
Launching Item Development
• Item Development Contracts
  – Contracts with 2 consortia of vendors
  – Item development officially launched in June 2012
  – After 50% of the work is complete, PARCC will evaluate quality, rigor and innovation and re-award contracts to the vendor(s) who meet the threshold for completing the work
• Item Review Process
  – Core Leadership Review Teams from across PARCC states / Operational Working Groups
  – Bias & Sensitivity Review Team
  – Local Educator Review Teams
PARCC Sample Items: English Language Arts
• Grade 3
• Grade 7
• Grade 10

PARCC Sample Items: Mathematics
• Grade 3: The Field
• Grade 6: Cake Weighing (Dana Center)
• High School: Golf Balls in Water
PARCC Timeline Through 2011-12 (Fall 2011 – Fall 2012)
Milestones across the PARCC Tools & Resources and Assessment Implementation tracks:
• Model Content Frameworks released (Nov 2011)
• Educator Leader Cadres launched
• Sample summative assessment items released
• Item development begins
• Updated Model Content Frameworks released (Fall 2012)
Timeline Through First PARCC Administration in 2014-2015 (Spring 2013 – Spring 2015)
PARCC Tools & Resources milestones:
• College-ready tools released
• Partnership Resource Center launched
• Professional development modules released
• Diagnostic assessments released
• K-2 formative tools released
• Optional diagnostic and midyear PARCC assessments
PARCC Assessment Implementation milestones:
• Pilot/field testing begins
• Expanded field testing, including of the diagnostic assessment
• Summative PARCC assessments (2014-15 SY)
• Standard setting in summer 2015
NEW COLORADO SCIENCE AND SOCIAL STUDIES ASSESSMENTS
Glen Sirakavit
Science and Social Studies Assessments
• Based on the Colorado Academic Standards
• Grades:
  – Science: grades 5, 8 and 11
  – Social Studies: grades 4, 7 and 11
• Timeline:
  – Field test administration in spring 2013
  – Operational administration in spring 2014
Science and Social Studies Assessments
• Attain balance:
  – Innovation with technical soundness and feasibility
  – Breadth with depth
• Take advantage of technology:
  – Development: item types (SR, CR, simulation/performance-based)
  – Administration: computer-based
  – Scoring: automated and artificial intelligence
Science and Social Studies Assessments - CoAlt
• Attain balance:
  – Innovation with technical soundness and feasibility
  – Appropriate for students with significant cognitive disabilities
• Take advantage of technology:
  – Development: item types (SR and supported performance tasks)
  – Administration: test examiner
  – Scoring: test examiner scoring with score input online
Science and Social Studies Assessments
Examples of Technology-Enhanced Items
Opportunities for District Involvement in Development
• Item writing (for 2013 FT) – Fall 2012
• Item review – Late Fall 2012
• Cognitive labs – Early Spring 2013
• Field testing – Late Spring 2013
• Anchor paper selection – Early Summer 2013
• Data review – Early Summer 2013
• Item writing 1 (for 2014 FT) – Spring 2013
• Item writing 2 (for 2014 FT) – Early Summer 2013
Opportunities for District Involvement in CoAlt Development
• Item writing (for 2013 FT) – August 2012
• Item review – November 2012
• Field testing – April 2013
• Data review – Early Summer 2013
• Item writing 1 (for 2014 FT) – Spring 2013
• Item writing 2 (for 2014 FT) – Early Summer 2013
Paper to Online Assessments
• A recurrent theme in next-generation assessment strategies
• Leverages advances in technology for greater efficiency, flexibility, and potential cost savings
• Benefits increasingly apparent:
  – Opportunities for more effectively assessing student understanding and performance
  – Faster turnaround of scores
  – Improved security model
  – More efficient method of test delivery
  – Student motivation
• Moving online offers greater opportunity to integrate/align instruction and assessment
• But… how to make such a large, complex transition?

From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Three Levels of “Readiness” for Online Testing
• School
  – Students: training, practice, familiarity
  – Teachers, administrators & technology staff: close partnerships, training, policy administration
  – Network & infrastructure: setup, computer/lab logistics & load planning
• District
  – Coordination, especially between assessment & technology organizations
  – Network-wide capacity planning
• State
  – Policies, transition planning, & decision making

From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Example: Managing Assessment Data Load
When testing begins, multiple streams of identical, redundant data can clog up and overwhelm the district or school network. When properly used, caching or proxy solutions can reduce the load that online assessments place on network capacity.
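The effect of a caching proxy can be shown with a rough back-of-the-envelope calculation. All figures here (300 concurrent testers, a 25 MB test form) are illustrative assumptions, not Pearson or CDE specifications:

```python
# Illustrative estimate of WAN load with and without a local caching proxy.
# The student count and form size below are hypothetical assumptions.

def wan_load_mb(students: int, form_size_mb: float, cached: bool) -> float:
    """Megabytes pulled across the district's WAN link when testing starts."""
    if cached:
        # A caching proxy downloads the form once; students then
        # fetch their copies over the local network instead.
        return form_size_mb
    # Without caching, every student's machine streams its own copy.
    return students * form_size_mb

students, form_size = 300, 25.0  # 300 concurrent testers, 25 MB per form
print(wan_load_mb(students, form_size, cached=False))  # 7500.0 MB
print(wan_load_mb(students, form_size, cached=True))   # 25.0 MB
```

Under these assumptions the proxy cuts the external traffic at test start by a factor equal to the number of concurrent testers, which is why vendors such as Pearson have districts install proctor caching before the test window.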
Five Step Roadmap for Transitioning to Online Assessments
1. Conduct a Needs Analysis
2. Develop a Realistic Transition Strategy & Plan
3. Ensure Interoperability
4. Communicate Proactively
5. Plan for Ongoing Change
The full roadmap and additional resources are available online at: www.PearsonAssessments.com/NextGenRoadmap
From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Highlights from the Roadmap – Steps 1 & 2
• Step 1 – Conduct a Needs Analysis
  – Start with content & assessment design requirements
  – Conduct a “readiness” survey of district and school technology
• Step 2 – Develop a Realistic Transition Plan
  – Focus on a multi-year, graduated strategy and schedule
  – Success is built on providing districts/schools with online testing experience prior to ramping up testing for all students in all content areas
Highlights from the Roadmap – Steps 3-5
• Step 3 – Ensure Interoperability
  – Standards are jointly agreed-upon limitations and constraints
  – Very important to have engagement from both the technology and assessment communities
• Step 4 – Communicate Proactively
  – Find or create a forum for engaging personnel across districts, both within and across states
  – Build training for both assessment and tech staff
• Step 5 – Plan for Ongoing Change
  – Unlike paper assessments, the technology running online assessments will continue to change
  – Plan for recurring readiness checks, & build state-district communication into planning
The Technology Readiness Tool
• Pearson contracted to develop the tool
• Both national assessment consortia will provide the tool to the states to deploy in six data collection windows between 2012 and 2014
• Will collect local data to determine technology readiness for online assessments, and provide a gap analysis
• Will use data to support local/state/national planning for the transition
Measuring Local Readiness
Readiness for online assessments has multiple dimensions:
1. Computers & other devices: minimum system requirements
2. Ratio of devices to test-takers: including testing window and session scheduling
3. Network and infrastructure: bandwidth, network utilization, size of content
4. Personnel: staffing & training
Measuring Local Readiness
• Mid-July through August
  – Superintendents appoint District Technology Coordinators (DTC)
• August through December
  – Pearson/CDE notify districts of the survey window, provide web-based training and provide access to the Survey & Readiness Tool
  – Pearson/CDE identify/confirm field test participants
  – Field test districts with technology challenges will receive extra support to complete an action plan
• November through December
  – Pearson/CDE conduct trainings and open training centers
• January through March
  – Districts install proctor caching, configure PearsonAccess & TestNav
  – Districts complete the certification checklist
• May
  – Feedback from districts on the test administration
Online Testing District Readiness Process
1. Identify DTCs
  • Distribute letter to districts
  • Identify District Technology Coordinators
2. Execute Survey & Readiness Tool
  • Provide training and access to the Survey & Readiness Tool
  • Execute the Survey & Readiness Tool and capture results
3. Determine Field Test Districts
  • Recruit field test districts
  • Provide training sessions for participating District Technology Coordinators
4. Open Online Training Center
  • Verify the district’s online test environment
  • Teachers & students have access to a “sandbox” online test system
Special Notes for Technology Staff
• Technology leadership & support is critical
  – Providing readiness data for the statewide transition and gap analysis
  – Guidance & support for local, state and national planning
• Assessment and technology issues are intertwined; solutions require cross-disciplinary understanding
  – Statewide equity and comparability issues
  – Security of test content
  – Importance of cross-training local personnel, and evaluating support for the entire assessment solution
  – Forward planning for the future
Lessons Learned from Other States about Transitioning to Online Assessments
• Phased approach
  – Start small and build online capacity
  – Initially there may be problems, difficulties and challenges before reaching stability
• Communicate and promote the advantages/benefits
  – Students are engaged
  – Interactivity of technology-enhanced items
  – Testing interface is user-friendly and accessible
  – Reduced administrative burden (no need to inventory test materials and risk losing them)
  – Scores are returned sooner
  – Built-in online accommodations (oral scripts would not require additional proctors/testing environments)
Lessons Learned from Other States about Transitioning to Online Assessments
• Challenges
  – Might need more proctors/administrators in each test environment
    • e.g. 2 test proctors walking around and observing students directly, and 1 test administrator watching the computer monitor to ensure students are on task
  – Distinction between instructional technology and assessment technology
    • With the abundance of high-tech consumer products, our constituencies may expect the transition to be much quicker
  – Setting testing windows
    • Tension between longer testing windows to better manage load and the timeline that most schools/districts prefer
  – Collaboration between testing personnel and technology personnel
    • At the school, district, and state level, know who is responsible for what and who to contact when there is a problem
Lessons Learned from Other States about Transitioning to Online Assessments
• Recommendations
  – Certification
    • Require schools to self-certify that they have met the guidelines
    • Validate what was reported via self-certification by using software or an outside company to provide independent certification
  – Test the system
  – Train all new users
  – Develop an emergency plan (direct access to key staff at the vendor)
  – Conduct surveys and special studies to get feedback from district administrators, teachers, and students
    • Compare test administrators’ perceptions with what students actually thought
Colorado Content Collaboratives
Overview and Update
August 2012
Angela Norlander
colorado content collaboratives cde
The Right Question
• For the student
• For the teacher
What does mastery look like?
How Colorado Will Determine Student Learning
Quality Criteria for One Measure
Multiple Measure Design Principles for Combinations of Measures
Growth Measure Development
Content Collaboratives: Cohorts

Cohort One (February–May 2012)
• Dance
• Drama & Theatre Arts
• Music
• Reading, Writing & Communicating
• Social Studies
• Visual Arts

Cohort Two (June–December 2012)
• Comprehensive Health
• Mathematics
• Physical Education
• Science
• World Languages
• Career and Technical Education
2012 Purpose
• The objective is to identify an initial bank of high-quality student academic measures which can be used to determine, in part, the effectiveness of an educator.
• Sample measures in each grade for each subject will establish the beginning of the ongoing “build out” of the bank.
• Over time, the Content Collaboratives will focus on developing instructional resources, creating performance tasks, and continuing to populate the bank with multiple measures that represent both student learning and educator effectiveness.
What goes in the bank?
• Identification of assessments districts can use
• Multiple modes of actual assessments
• Future tasks and items which may become eligible
• Protocol for eligibility
Cohort I & II: Flow Chart of Work
• National researchers (I: Jan-Mar 2012; II: Jun-Aug 2012): gather existing fair, valid and reliable measures for consideration.
• Technical Steering Committee (I & II: Feb-Dec 2012): creates frameworks and design principles for the collaboratives to use in reviewing and creating measures; reviews the recommendations of the collaboratives.
• Colorado Content Collaboratives (I: Feb-May 2012; II: July-Nov 2012): use a protocol to review the researchers’ measures for feasibility, utility and gaps; prepare to fill gaps; provide recommendations to the Technical Steering Committee.
• Pilot then peer review (I & II: Aug 2012-Aug 2013): piloting and peer review of measures (Cohort I: Aug 2012-Aug 2013; Cohort II: January 2013-Aug 2013).
• Bank (I & II: Nov 2012-Aug 2013): measures placed in the online Education Effectiveness Resource Bank for voluntary use.

Who is helping us?
• Researchers
• Technical Steering Committee
• Center for Assessment (NCIEA)
• Pilot districts
• Peer reviewers
• Other states and districts
High Quality Assessment Content Validity Review Tool
A high quality assessment should be:
• Aligned
• Scored using clear guidelines and criteria
• FAIR and UNBIASED
• One that increases OPPORTUNITIES TO LEARN
High Quality Assessment Content Validity Review Tool
• Training modules
• Definitions
• Examples
• Release in November 2012
Reading, Writing & Communicating (August 15, 2012)
Cohort I: Next Steps
• Continue to review assessments
• Connect & collaborate with national partners to fill gaps
• Advanced assessment literacy training
• State Model Curriculum development
• Performance task development
Cohort II
• Work began in Pueblo on July 23rd-24th
• Will meet with researchers on September 19th in Loveland
• Review work will continue on September 20th in Loveland, October 24th & 25th in Aurora, and November 14th in Golden
Technical Steering Committee
How Colorado Will Determine Student Learning
Quality Criteria for One Measure
Multiple Measure Design Principles for Combinations of Measures
Growth Measure Development
Technical Steering Committee: Met August 2nd
• Committee members discussed combination and growth strategies
• Representatives from 5 districts participated as respondents
• Agenda and notes posted on the Content Collaboratives website
Technical Steering Committee: Next Steps
React to drafts of:
• Practical guidelines for districts regarding the combination of multiple measures
• Glossary of terms for use in the guidelines
• Examples of how to plot growth
• Approaches currently being used in Colorado districts
Next in-person meeting: Wednesday, December 12, 2012, in Denver
2012-2015 Work of Content Collaboratives

2012
• Researchers offer assessments for consideration to the Content Collaboratives
• Cohorts I & II of the Content Collaboratives review/recommend assessments for piloting
• Cohort I assessments begin piloting in fall 2012 to determine their utility within educator effectiveness evaluations
• Guiding principles and criteria posted on the website for designing and vetting assessments to be used in educator effectiveness evaluations
• Begin populating the Resource Bank with Cohort I assessments in November 2012

2013
• Continue piloting of Cohort I assessments & begin peer review of assessments in terms of how the assessments function for the purposes of educator effectiveness evaluation
• Begin piloting of Cohort II assessments in January 2013
• Begin populating the Resource Bank with Cohort II assessments in winter 2013
• Content Collaboratives, using identified measures, begin working on curriculum and instructional designs aligned to the Colorado Academic Standards

2014
• Continue to refine and build the Resource Bank
• Build out sophisticated instructional lessons that respond to gaps in student learning

2015
• Continue to refine and build the Resource Bank
• Continue to build statewide capacity
• Continue build-out of the bank with regard to instructional practices
Colorado Content Collaboratives
Contact: Angela Norlander, Office of Assessment, Research & Evaluation, [email protected]
Website: http://www.cde.state.co.us/contentcollaboratives/