
Creating an Evaluation Framework for Data-Driven Instructional Decision-Making


Sponsored by the National Science Foundation


Contact Information

Ellen Mandinach

EDC Center for Children and Technology

96 Morton Street, 7th Floor

New York, NY 10014

(212) 807-4207

[email protected]


Project Staff

• Ellen Mandinach
• Margaret Honey
• Daniel Light
• Cricket Heinze
• Hannah Nudell
• Luz Rivas
• Cornelia Brunner, special advisor


Overarching Objective

The project will bring together complementary evaluation techniques, using systems thinking as the primary theoretical and methodological perspective, to examine the implementation and use of data-driven applications in school settings.


What We Promised

• To create an evaluation framework based on the principles of systems thinking.

• To examine technology-based tools for data-driven decision-making.


Goal 1

We will build a knowledge base about how schools use data and technology tools to make informed decisions about instruction and assessment.


Goal 2

We will develop an evaluation framework to examine the complexities of dynamic phenomena, one that will inform the field and serve as a knowledge-building enterprise.


Overarching Issues

• The use of the methodological framework to examine data-driven decision-making.

• The development of a systems model for the use of data and the technology-based tools for the participating districts.

• Validation of the models by scaling to a second set of sites.

• Examination and validation of the theoretical and structural frameworks.


Selected Applications

• Handheld diagnostic tools (e.g., Palm Pilots)

• The Grow Network

• Data warehouse


Why Selected

These projects have been selected for three reasons:

1. We have existing relationships with both the developers and the school systems in which they are being implemented.

2. Through our current research we have developed a baseline understanding of how the systems are used.

3. While these initiatives focus on improving student performance, they use different information sources and strategies in supporting data-driven decision-making. Variability in focus and implementation is particularly relevant to the design of an evaluation framework that can generalize.


Handheld Diagnostics

• Ongoing diagnostic assessment in early literacy and mathematics learning.

• Teachers assess student learning using the handhelds.

• Teachers upload information from the handhelds to a web-based reporting system where they can obtain richer details about each student.

• They can follow each student’s progress along a series of metrics, identify the need for extra support, and compare each student’s progress to the entire class.

• Produces customized web-based reports.
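The kind of class-level comparison these reports support can be pictured with a minimal sketch. The metric names, scores, and support threshold below are illustrative assumptions, not the actual handheld reporting system.

```python
from statistics import mean

# Hypothetical records uploaded from the handhelds: one score per
# student per metric (metric names and scores are invented).
uploads = [
    {"student": "S01", "metric": "letter_naming", "score": 42},
    {"student": "S02", "metric": "letter_naming", "score": 31},
    {"student": "S03", "metric": "letter_naming", "score": 55},
    {"student": "S01", "metric": "phoneme_segmentation", "score": 18},
    {"student": "S02", "metric": "phoneme_segmentation", "score": 27},
    {"student": "S03", "metric": "phoneme_segmentation", "score": 22},
]

def class_report(records, support_threshold=0.8):
    """Compare each student's score on each metric to the class mean
    and flag students who may need extra support."""
    by_metric = {}
    for r in records:
        by_metric.setdefault(r["metric"], []).append(r)

    report = []
    for metric, rows in by_metric.items():
        avg = mean(r["score"] for r in rows)
        for r in rows:
            report.append({
                "student": r["student"],
                "metric": metric,
                "score": r["score"],
                "class_mean": round(avg, 1),
                "needs_support": r["score"] < support_threshold * avg,
            })
    return report

for row in class_report(uploads):
    print(row)
```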


Grow Network

• A data reporting system with print and online components.

• Provides customized reports for administrators, teachers, and parents.

• Reports are grounded in local or state standards of learning.

• The categories of reporting and instructional materials explain the standards that inform the test.

• The data that are reported and the recommendations that are made are aligned to encourage the thoughtful use of data.
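As a rough illustration of a standards-grounded report, the sketch below rolls item-level results up to the standards they assess. The standard codes, items, and responses are invented for the example; they are not drawn from the Grow Network.

```python
from collections import defaultdict

# Invented item-to-standard mapping and item-level results.
item_standards = {"Q1": "MATH.3.1", "Q2": "MATH.3.1", "Q3": "MATH.3.4"}
results = [
    {"student": "S01", "item": "Q1", "correct": True},
    {"student": "S01", "item": "Q2", "correct": False},
    {"student": "S01", "item": "Q3", "correct": True},
    {"student": "S02", "item": "Q1", "correct": False},
    {"student": "S02", "item": "Q2", "correct": False},
    {"student": "S02", "item": "Q3", "correct": True},
]

def standards_report(results, item_standards):
    """Aggregate item-level results into the share of correct
    responses per standard, one plausible standards-based view."""
    totals = defaultdict(lambda: {"correct": 0, "attempted": 0})
    for r in results:
        standard = item_standards[r["item"]]
        totals[standard]["attempted"] += 1
        totals[standard]["correct"] += int(r["correct"])
    return {s: round(t["correct"] / t["attempted"], 2)
            for s, t in totals.items()}

print(standards_report(results, item_standards))
# {'MATH.3.1': 0.25, 'MATH.3.4': 1.0}
```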


Data Warehousing

• Locally grown initiative that enables school improvement teams, administrators, teachers, and parents to gain access to a broad range of data.

• Varied data available to multiple stakeholders in several formats for use in various contexts.

• The underlying principle is that ready access to data enables educators to interpret the information and make informed decisions.
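A minimal sketch of the underlying idea, several stakeholder views drawn from one shared store; the SQLite table, school names, and scores below are invented and do not reflect the actual warehouse schema.

```python
import sqlite3

# Toy in-memory "warehouse" with a single fact table of test scores.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE test_scores (
    school TEXT, grade INTEGER, student TEXT, subject TEXT, score REAL)""")
con.executemany(
    "INSERT INTO test_scores VALUES (?, ?, ?, ?, ?)",
    [("Lincoln", 3, "S01", "reading", 71),
     ("Lincoln", 3, "S02", "reading", 64),
     ("Lincoln", 4, "S03", "reading", 80),
     ("Roosevelt", 3, "S04", "reading", 58)],
)

# A district administrator's view: average reading score by school.
for row in con.execute(
        "SELECT school, ROUND(AVG(score), 1) FROM test_scores GROUP BY school"):
    print("district view:", row)

# A school improvement team's view: the same data, sliced by grade.
for row in con.execute(
        """SELECT grade, ROUND(AVG(score), 1) FROM test_scores
           WHERE school = 'Lincoln' GROUP BY grade"""):
    print("school view:", row)
```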


Year One Sites

• Handhelds - Albuquerque Public Schools

• Grow Network - New York City Public Schools

• Data Warehouse - Broward County Public Schools


Year Two Validation Sites

• Handhelds - Mamaroneck Public Schools

• Grow Network - Chicago Public Schools

• Data Warehouse - Tucson Unified School District


Three Frameworks

• Methodological - Systems Thinking

• Theoretical - In the Service of Focused Inquiry, Transforming Data to Information to Knowledge

• Structural - Tool Characteristics


Methodological Framework: Systems Thinking

The need to recognize:

• The dynamic nature of school systems.

• The interconnections among variables.

• The levels of stakeholders within school systems.


A Conceptual Framework



Structural Functionality Framework

• Accessibility

• Length of Feedback Loop

• Comprehensibility

• Flexibility

• Alignment

• Links to Instruction


From Salomon & Almog, 1998

“A paradox gradually became evident: The more a technology, and its usages, fits the prevailing educational philosophy and its pedagogical application, the more it is welcome and embraced, but the less of an effect it has. When some technology can be smoothly assimilated into existing educational practices without challenging them, its chances of stimulating a worthwhile change are very small.”


What does it mean to say: Does it work?

What is the “it”?

How do we operationalize “work”?


Different Views, Different Results


Methodological Implications for Technology-Based Educational Reform Efforts

• Longitudinal Design

• Multiple Methods

• Hierarchical Analysis

• System Dynamics


Evaluation

• Should be meaningful and constructive. The results and information should benefit the students, teachers, school, and district.

• Should not be punitive.

• Should be informative, providing information on what is going on, how to improve, or other important questions.

• Should account for contextual factors.

• Should use measurable components.

• Should be flexible.


How to Evaluate the Use of Technology: Everyone Wants to Write an NSF Proposal


Preliminary Findings from the Sites

• New York City - Grow

• Broward - Data Warehouse

• Albuquerque - Handhelds

• Chicago - Grow

• Tucson - Data Warehouse

• Mamaroneck - Handhelds (forthcoming)