
Creating an Evaluation Framework for Data-Driven Instructional Decision-Making


Presentation Transcript


  1. Creating an Evaluation Framework for Data-Driven Instructional Decision-Making Sponsored by the National Science Foundation

  2. Contact Information Ellen Mandinach EDC Center for Children and Technology 96 Morton Street, 7th Floor New York, NY 10014 (212) 807-4207 Emandinach@edc.org

  3. Project Staff • Ellen Mandinach • Margaret Honey • Daniel Light • Cricket Heinze • Hannah Nudell • Luz Rivas • Cornelia Brunner, special advisor

  4. Overarching Objective The project will bring together complementary evaluation techniques, using systems thinking as the primary theoretical and methodological perspective, to examine the implementation and use of data-driven applications in school settings.

  5. What We Promised • To create an evaluation framework based on the principles of systems thinking. • To examine technology-based tools for data-driven decision-making.

  6. Goal 1 We will build a knowledge base about how schools use data and technology tools to make informed decisions about instruction and assessment.

  7. Goal 2 We will develop an evaluation framework that examines the complexities of dynamic phenomena, informs the field, and serves as a knowledge-building enterprise.

  8. Overarching Issues • The use of the methodological framework to examine data-driven decision-making. • The development of a systems model for the use of data and the technology-based tools for the participating districts. • Validation of the models by scaling to a second set of sites. • Examination and validation of the theoretical and structural frameworks.

  9. Selected Applications • Handheld diagnostic tools (e.g., Palm Pilots) • The Grow Network • Data warehouse

  10. Why Selected These projects have been selected for three reasons: • 1. We have existing relationships with both the developers and the school systems in which they are being implemented. • 2. Through our current research we have developed a baseline understanding of how the systems are used. • 3. While these initiatives focus on improving student performance, they use different information sources and strategies to support data-driven decision-making. Variability in focus and implementation is particularly relevant to the design of an evaluation framework that can generalize.

  11. Handheld Diagnostics • Ongoing diagnostic assessment in early literacy and mathematics learning. • Teachers assess student learning using the handhelds. • Teachers upload information from the handhelds to a web-based reporting system where they can obtain richer details about each student. • They can follow each student’s progress along a series of metrics, identify the need for extra support, and compare each student’s progress to the entire class. • Produces customized web-based reports.
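  The handheld-to-web workflow above is essentially a small data pipeline: collect scores, upload, summarize per student, compare to the class. A minimal sketch of that reporting step follows; the record layout, metric name, and support-flag rule are hypothetical stand-ins for illustration, not the actual system's design.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical uploaded records: (student, metric, score), in chronological order.
records = [
    ("Ana", "letter_naming", 42), ("Ana", "letter_naming", 51),
    ("Ben", "letter_naming", 38), ("Ben", "letter_naming", 40),
    ("Cam", "letter_naming", 55), ("Cam", "letter_naming", 60),
]

def progress_report(records):
    """Report each student's first-to-latest gain per metric and flag
    students whose latest score falls below the class average."""
    by_student = defaultdict(list)           # (student, metric) -> scores
    for student, metric, score in records:
        by_student[(student, metric)].append(score)
    latest_by_metric = defaultdict(list)     # metric -> each student's latest score
    for (student, metric), scores in by_student.items():
        latest_by_metric[metric].append(scores[-1])
    for (student, metric), scores in sorted(by_student.items()):
        avg = mean(latest_by_metric[metric])
        gain = scores[-1] - scores[0]
        flag = "  <- may need extra support" if scores[-1] < avg else ""
        print(f"{student} | {metric}: latest={scores[-1]}, gain={gain:+d}, "
              f"class avg={avg:.1f}{flag}")

progress_report(records)
```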

  12. Grow Network • A data reporting system with print and online components. • Provides customized reports for administrators, teachers, and parents. • Reports are grounded in local or state standards of learning. • The categories of reporting and instructional materials explain the standards that inform the test. • The data that are reported and the recommendations that are made are aligned to encourage the thoughtful use of data.

  13. Data Warehousing • Locally grown initiative that enables school improvement teams, administrators, teachers, and parents to gain access to a broad range of data. • Varied data are available to multiple stakeholders in several formats for use in various contexts. • The underlying principle is that ready access to data enables educators to interpret the information and make informed decisions.
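  To make the "varied data, multiple stakeholders" principle concrete, here is a minimal sketch using an in-memory SQLite table: one store, two views of it. The schema, school names, and queries are all hypothetical, not the district's actual warehouse design.

```python
import sqlite3

# A toy "warehouse": one table of assessment scores.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE scores (school TEXT, grade INTEGER, student TEXT,
                         subject TEXT, score REAL);
    INSERT INTO scores VALUES
        ('Lincoln', 3, 'Ana', 'math', 71), ('Lincoln', 3, 'Ben', 'math', 64),
        ('Lincoln', 4, 'Cam', 'math', 80), ('Tyler',   3, 'Dee', 'math', 58);
""")

# Teacher view: individual students in one class.
teacher_view = conn.execute(
    "SELECT student, score FROM scores WHERE school='Lincoln' AND grade=3"
).fetchall()

# Administrator view: averages aggregated by school.
admin_view = conn.execute(
    "SELECT school, AVG(score) FROM scores GROUP BY school"
).fetchall()

print("teacher:", teacher_view)
print("admin:  ", admin_view)
```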

  14. Year One Sites • Handhelds - Albuquerque Public Schools • Grow Network - New York City Public Schools • Data Warehouse - Broward County Public Schools

  15. Year Two Validation Sites • Handhelds - Mamaroneck Public Schools • Grow Network - Chicago Public Schools • Data Warehouse - Tucson Unified School District

  16. Three Frameworks • Methodological - Systems Thinking • Theoretical - In the Service of Focused Inquiry, Transforming Data to Information to Knowledge • Structural - Tool Characteristics

  17. Methodological Framework: Systems Thinking The need to recognize: • The dynamic nature of school systems. • The interconnections among variables. • The levels of stakeholders within school systems.

  18. A Conceptual Framework

  19. Structural Functionality Framework • Accessibility • Length of Feedback Loop • Comprehensibility • Flexibility • Alignment • Links to Instruction
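  One way to operationalize these tool characteristics is as a simple rubric that profiles a tool on each dimension. The sketch below assumes a hypothetical 1-5 scale and made-up ratings; only the six dimension names come from the framework above.

```python
# The six dimensions are from the framework; everything else is illustrative.
DIMENSIONS = ["accessibility", "feedback_loop_length", "comprehensibility",
              "flexibility", "alignment", "links_to_instruction"]

def profile(tool_name, ratings):
    """Print a structural-functionality profile for one tool, given a
    1 (weak) to 5 (strong) rating on each dimension."""
    assert set(ratings) == set(DIMENSIONS), "rate every dimension"
    print(tool_name)
    for dim in DIMENSIONS:
        print(f"  {dim:22s} {'#' * ratings[dim]} ({ratings[dim]}/5)")

# Hypothetical ratings for a hypothetical reporting tool.
profile("Example reporting tool", {
    "accessibility": 4, "feedback_loop_length": 2, "comprehensibility": 5,
    "flexibility": 3, "alignment": 4, "links_to_instruction": 3,
})
```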

  20. From Salomon & Almog, 1998 “A paradox gradually became evident: The more a technology, and its usages, fits the prevailing educational philosophy and its pedagogical application, the more it is welcome and embraced, but the less of an effect it has. When some technology can be smoothly assimilated into existing educational practices without challenging them, its chances of stimulating a worthwhile change are very small.”

  21. What does it mean to say: Does it work? What is the “it”? How do we operationalize “work”?

  22. Different Views, Different Results

  23. Methodological Implications for Technology-Based Educational Reform Efforts • Longitudinal Design • Multiple Methods • Hierarchical Analysis • System Dynamics
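  Of these four, system dynamics may be the least familiar, so a toy example can help. The sketch below simulates a hypothetical one-stock diffusion model (teachers adopting data-driven practice) with simple Euler integration; the model form and every parameter are assumptions for illustration, not project findings.

```python
def simulate(adopters=5.0, pool=95.0, contact_rate=0.6, years=10, dt=0.25):
    """Euler integration of a one-stock diffusion model: the flow of new
    adopters depends on interaction between current adopters and the
    remaining pool, so growth feeds back on itself instead of being linear."""
    t = 0.0
    total = adopters + pool
    history = [(t, adopters)]
    while t < years:
        flow = contact_rate * adopters * pool / total   # adopters per year
        adopters += flow * dt
        pool -= flow * dt
        t += dt
        history.append((round(t, 2), round(adopters, 1)))
    return history

for t, a in simulate()[::8]:      # sample every two years
    print(f"year {t:4.1f}: {a:5.1f} teachers using data")
```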

  24. Evaluation • Should be meaningful and constructive; the results and information should benefit the students, teachers, school, and district. • Should not be punitive. • Should be informative, addressing what is going on, how to improve, and other important questions. • Should account for contextual factors. • Should use measurable components. • Should be flexible.

  25. How to Evaluate the Use of Technology: Everyone Wants to Write an NSF Proposal

  26. Preliminary Findings from the Sites • New York City - Grow • Broward - Data Warehouse • Albuquerque - Handhelds • Chicago - Grow • Tucson - Data Warehouse • Mamaroneck - Handhelds (forthcoming)
