McLeod 2007 MN-SDC PowerPoint


2007 Minnesota Staff Development Council Annual Forum. May 16, 2007. Dr. Scott McLeod, CASTLE, www.scottmcleod.net.


USING DATA TO MAKE DECISIONS: Results from the Minnesota Statewide DDDM Readiness Study

Dr. Scott McLeod
Dr. Karen Seashore

University of Minnesota

Get this presentation

See the RESOURCES section of your handout!

[Diagram: 9 essential elements of data-driven PLCs — frequent formative assessments; professional learning communities rooted in student information; making instructional changes; measurable instructional goals; good baseline data; data safety; data transparency; technology; alignment for results]

Respondents

• Teachers (n = 3,135 / 11,120?) (28%?)

• Principals (n = 791 / 1,770) (45%)

• Superintendents (n = 202 / 351) (58%)

• District technology coordinators (n = 139 / 351) (40%)

4,267 Minnesota educators

Awesome!
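To make the response-rate arithmetic on the respondents slide explicit, here is a minimal sketch; the counts are taken from the slide, and the teacher denominator and rate (marked "?" in the source) are treated only as given:

```python
# Response rates for each respondent group, as reported on the slide.
# The teacher denominator and rate are marked "?" in the source deck,
# so treat that row as approximate.
groups = {
    "Teachers": (3135, 11120),
    "Principals": (791, 1770),
    "Superintendents": (202, 351),
    "District technology coordinators": (139, 351),
}

total_respondents = 0
for name, (n, pool) in groups.items():
    total_respondents += n
    print(f"{name}: {n}/{pool} = {n / pool:.0%}")

print(f"Total respondents: {total_respondents}")  # 3135 + 791 + 202 + 139 = 4267
```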

Respondents by gender, race / ethnicity

96% White

Respondents by urbanicity

Respondents by level

Respondents by AYP status

Assessment Intensity

[Diagram: 9 essential elements of data-driven PLCs (recap)]

I receive state assessment results each year [teachers]

I receive other yearly assessment results each year [teachers]

Teachers collaborate to create and use common periodic assessments for student progress monitoring [teachers]

Teachers use other (not teacher-created) periodic assessments for student progress monitoring [teachers]

Summary

• Lots of teachers are NOT intersecting with yearly data

• Some differences between secondary subject areas

• Clear, consistent downward gradient from elementary to secondary

Let’s Recap

Beliefs About Types of Assessments

[Diagram: 9 essential elements of data-driven PLCs (recap)]

Assessments are aligned with state curriculum standards

Assessment results are easy to understand and interpret

Assessment results are detailed enough to adequately inform teachers’ instruction

Assessment results are timely enough to adequately inform teachers’ instruction

Summary

• Weak agreement that assessments are aligned with standards

• Non-state assessments are:
  – easier to understand
  – more detailed
  – much more timely

Let’s Recap

Other Components of the Core

[Diagram: 9 essential elements of data-driven PLCs (recap)]

Measurable instructional goals

Teacher teams (PLCs) that meet regularly

Making instructional changes

Summary

• Administrators less positive about teacher behavior

• Teachers feel collaboration time is inadequate

• Clear, consistent downward gradient from:
  – elementary to secondary
  – AYP to No AYP

Let’s Recap

Supporting Conditions

[Diagram: 9 essential elements of data-driven PLCs (recap)]

Data access and transparency

Data safety

Technology

Alignment for results

Summary

• Teachers less positive about supporting conditions

• Clear, consistent downward gradient from:
  – elementary to secondary
  – AYP to No AYP

Other Factors

Leadership and support

Professional development

Beliefs

Summary

• Teachers less positive about:
  – administrator support
  – staff development

• Teachers more likely to believe achievement is out of their control

• Clear, consistent downward gradient from:
  – elementary to secondary
  – AYP to No AYP

Let’s Recap

A Few Last Things

Teachers most likely to agree that…

1. They have the knowledge and skills to improve student learning

2. They can significantly affect students’ achievement levels by trying different teaching methods

3. If they constantly analyze what they do and adjust to get better, they will improve

4. District goals were focused on student learning

5. They feel some personal responsibility when school improvement goals are not met

Teachers most likely to disagree that…

1. They are given adequate time for collaborative planning

2. State assessments are timely enough to adequately inform instruction

3. They have significant input into data management and analysis practices

4. State assessments are detailed enough to adequately inform instruction

5. They have received adequate training to effectively interpret and act upon yearly state assessment results

Miscellaneous comments

Our success as educators should be determined primarily by our impact upon student learning

Our success or failure in teaching students is primarily due to factors beyond our control rather than to our own efforts and ability

Overall summary of descriptive statistics

• State test data aren’t very useful

• Teachers feel less positively about school and district DDDM activity than do administrators

• Significant percentages of teachers are not intersecting with DDDM

• Clear, consistent differences between:
  – elementary and secondary
  – AYP and No AYP

Let’s Recap

[Diagram: 9 essential elements of data-driven PLCs (recap)]

Next steps = more sophisticated statistics

• Factor analysis

Example

P34 (goals) + P41 (transparency) + P43 (technology) + P47 (prof devt) + P51 + P53 + P54 + P55 (alignment) = ADMIN BEHAVIOR
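As a rough illustration of that step, the sketch below fits a single factor to the listed survey items, assuming responses sit in a pandas DataFrame with hypothetical column names P34, P41, P43, P47, P51, P53, P54, and P55 and a hypothetical file dddm_survey.csv; it is not the study’s actual analysis pipeline.

```python
# Hypothetical sketch: checking whether items P34, P41, P43, P47, P51,
# P53, P54, and P55 load onto a single "ADMIN BEHAVIOR" factor.
# Column names and the CSV file are illustrative, not from the study.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = ["P34", "P41", "P43", "P47", "P51", "P53", "P54", "P55"]

df = pd.read_csv("dddm_survey.csv")      # hypothetical survey export
X = df[items].dropna()                   # listwise deletion of missing responses

fa = FactorAnalysis(n_components=1, random_state=0)
scores = fa.fit_transform(X)             # one factor score per respondent

# Loadings show how strongly each item contributes to the factor
loadings = pd.Series(fa.components_[0], index=items, name="loading")
print(loadings.sort_values(ascending=False))

df.loc[X.index, "ADMIN_BEHAVIOR"] = scores[:, 0]
```

In practice the number of factors and the item assignments would be checked (e.g., with scree plots and reliability estimates) rather than fixed in advance.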

Next steps = more sophisticated statistics

• Regression, SEM, maybe HLM
  – dependent variables
    • DDDM study results (including factors)
    • MDE attendance / mobility
    • MDE enrollment
    • MDE languages
    • MDE licensed staff
    • NCES Common Core of Data
  – independent variables
    • DDDM study results (including factors)
    • MDE achievement (AYP status, MCAs)
    • MDE dropouts / graduation
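As one possible shape for the regression step, here is a minimal OLS sketch assuming a merged district-level dataset; the file name, column names, and the choice of which variables serve as outcome versus predictors are illustrative assumptions, not the study’s specification. SEM and HLM would need dedicated tooling (e.g., mixed-effects models via statsmodels’ MixedLM).

```python
# Hypothetical sketch of the regression step: relating a DDDM factor score
# and district characteristics to an achievement measure. The CSV file and
# all column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("district_merged.csv")   # hypothetical merge of survey + MDE / NCES data

model = smf.ols(
    "mca_proficiency ~ admin_behavior_factor + enrollment + pct_licensed_staff",
    data=df,
).fit()

print(model.summary())
```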

Wrap-up

• Questions?

• Reactions?

• Implications for action?
