MEIM's 15th Annual Technology Conference & Expo Dec. 2, 2005 Data Informed Decision Making that Improves Teaching and Learning
Why are educators so fired up about data? Superintendents ask • How do we know if teachers are teaching our curriculum? • How do we maximize the value of dollars spent for assessment and data management? • Are all of our students achieving at acceptable levels?
Professional learning communities ask • What is it we want our students to know and be able to do? • How will we know when they have learned it? • What will we do when students are not learning?
Why are educators so fired up about “data?” Improving Student Achievement Is The Reason.
Creating some common language about data in schools What are the major systems? How are they related? What have districts done? Where do we want to go?
4 Major Data & Technology Systems in Schools • Assessment systems • Student information systems • Data analysis systems • Data warehouses
Data analysis process (from Matt Stein, Making Sense of the Data: Overview of the K-12 Data Management and Analysis Market, Eduventures, Inc., Nov. 2003).
What is a Student Information System? • Registers new students • Demographic information (address, emergency contacts, etc.) • Attendance • Scheduling of classes • Achievement data • Examples include: CIMS, Skyward, Chancery, Pentamation, Zangle, etc. It does not track what is going on in classrooms.
What is an Assessment System? A tool for gathering achievement information • Some deliver item banks • Benchmark by NCS Pearson • MAP by the Northwest Evaluation Association • Some deliver intact tests • Assess2Learn by Riverside • EdVision by Scantron • Homeroom by Princeton Review • Most are web-based It assesses what is going on in classrooms.
Who needs what data? A single assessment cannot meet all needs. • Administrators, the public, and legislators need large-grain-size data for evaluation, accountability, and long-range planning (e.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than they were doing last year?) • Teachers, parents, and students need fine-grain-size, very specific achievement information for diagnosis, prescription, placement, and short-range planning (e.g., Who understood this concept? Why is Becky having trouble reading?)
What is a “data analysis system?” • The vendor maps your data to their system • Predefines the kinds of analyses staff will do • Allows the user to create answers to questions • Lots of nice graphs, lists, etc. Examples: AMS by TurnLeaf, SAMS by Executive Intelligence, QSP, STARS by SchoolCity, Pinnacle by Excelsior, Inform by Pearson. FileMaker lets districts invent their own system. D’Tool and TestWiz are “sort of” data analysis systems.
What is a data warehouse? • It brings all the various sets of data together • Financial data • Personnel data • Building infrastructure data • Student demographic information • Student program information • Student achievement information • Example: Center for Educational Performance and Information’s Michigan Education Information System. (80% of work is data cleansing.)
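The warehouse idea above can be sketched with a small relational example. The tables, columns, and IDs here are hypothetical, not CEPI’s actual schema; the point is that most of the effort goes into cleansing (here, a malformed student ID) before data from different systems can be joined at all:

```python
import sqlite3

# Hypothetical warehouse tables: student demographics from the SIS,
# scores from an assessment system. Not CEPI's real schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE students (student_id TEXT PRIMARY KEY, grade INTEGER, building TEXT);
CREATE TABLE scores   (student_id TEXT, test TEXT, scale_score REAL);
INSERT INTO students VALUES ('001', 4, 'Elm'), ('002', 4, 'Oak');
INSERT INTO scores   VALUES (' 001 ', 'MEAP Math', 420), ('002', 'MEAP Math', 395);
""")

# Data cleansing: normalize the student ID that arrived with stray
# whitespace, so the join below actually matches the records.
con.execute("UPDATE scores SET student_id = TRIM(student_id)")

# Once the sources share a clean key, one query answers a cross-system
# question: average score by building.
rows = con.execute("""
    SELECT s.building, AVG(sc.scale_score)
    FROM students s JOIN scores sc ON s.student_id = sc.student_id
    GROUP BY s.building ORDER BY s.building
""").fetchall()
print(rows)
```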
What’s in CEPI’s data warehouse? • School Code Master • School Infrastructure Database (SID) • Single Record Student Database (SRSD) • Financial Information Database (FID) • Registry of Educational Personnel (REP) • Student Test and Achievement Repository (STAR): MEAP, ACT, SAT
Why some things aren’t in a warehouse… • Easier to ignore • Hoarding • Not sure what it is or how to measure it • Overlooked • Stray
How are these things related? • You can have a Student Info System and nothing else. • You can have an assessment system and nothing else (but most assessment systems “depend” on data from the SIS). • There is no point in having a data analysis system unless you have data. • If you have a SIS & an assessment system, you’ll probably want a data analysis system. • The State of Michigan is creating a data warehouse. • A data analysis system could also use data from the warehouse. • A data analysis system can bring the pieces together without a warehouse.
Oakland Schools Board of Education agreed to spend up to $1,600,000 in 2005-06 to make Pearson Benchmark “Lite” & Inform available to all districts.
What we are trying to do: Provide Technology that Will Help • Improve teaching and increase learning for all • Useful reports for teachers, principals and district administration • Common assessments tied to GLCEs • Item banks tied to GLCEs • Multiple district on-ramps
Project Planning Process • Fall 2003 – Meetings with focus groups • Fall 2004 – RFP created • Oct 2004 – Meeting with Assessment, Curriculum and Technology directors from Oakland districts to discuss requirements • Dec 2004 – RFP sent out to bid • Jan 2005 – 10 responses received • May 2005 – Committee selects products • July 2005 – Oakland Schools BOE approval
Oakland & LEA Members Only (N = 15) (Items are arranged by “Importance” rating.)
Benchmark Test Results – By Test • This view displays one or all tests that the selected student population has taken. • Student scores are plotted across a proficiency scale. • The view displays the percentage of students who scored within the range of each level on the proficiency scale.
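The proficiency-band report described above amounts to bucketing scale scores by cut scores. A minimal sketch follows; the cut scores, level names, and student scores are all made up, not Pearson Benchmark’s real values:

```python
from bisect import bisect_right

# Hypothetical proficiency scale: three cut scores separating four levels.
cuts = [400, 425, 450]          # scores below 400 fall in the lowest level
levels = ["Not Proficient", "Partially Proficient", "Proficient", "Advanced"]

def level_of(score):
    """Map a scale score to its proficiency level via the cut scores."""
    return levels[bisect_right(cuts, score)]

scores = [380, 405, 425, 460]   # a small, made-up student population
counts = {lvl: 0 for lvl in levels}
for s in scores:
    counts[level_of(s)] += 1

# Percentage of students who scored within the range of each level.
percentages = {lvl: 100 * n / len(scores) for lvl, n in counts.items()}
```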
Benchmark Test Results – By Standard • This view displays each assessed standard and graphs the percentage of students who mastered and did not master the standard on each assessment. • Selecting a single test displays detailed results by standard for that test. • Selecting all tests displays student performance on the standards over time.
Benchmark Test Results – By Individual: View Mastery Details • This view displays all mastery records for the given student, sorted by standard. • This represents a detailed running record of a student’s mastery across all benchmark tests.
Benchmark Test Results – Item Analysis • Click on the question number to see the question itself. • Click on the icon next to the question number to see a breakdown of the item’s performance by demographic category.
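The per-item breakdown described above is classical item analysis: the proportion of students answering each item correctly (the item “p-value”), overall and within each demographic category. A minimal sketch with made-up response data:

```python
from collections import defaultdict

# Hypothetical responses: (demographic_group, item_number, answered_correctly).
responses = [
    ("A", 1, True), ("A", 1, True), ("B", 1, False), ("B", 1, True),
    ("A", 2, False), ("A", 2, True), ("B", 2, False), ("B", 2, False),
]

def p_values(records):
    """Proportion answering each item correctly (the classical item p-value)."""
    totals, rights = defaultdict(int), defaultdict(int)
    for _group, item, correct in records:
        totals[item] += 1
        rights[item] += correct        # bool counts as 0 or 1
    return {item: rights[item] / totals[item] for item in totals}

overall = p_values(responses)
# The same breakdown restricted to each demographic category.
by_group = {
    g: p_values([r for r in responses if r[0] == g])
    for g in {r[0] for r in responses}
}
```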
Benchmark Test Results – Frequency Distribution • This view plots a line-dot graph based on the test frequency distribution and calculates the range, mean, standard deviation, and standard error. • In addition to this baseline data, you can choose to plot up to four graphs for particular demographic groups. • The sample displays the distribution of female scores compared to the overall baseline. • The view also displays how the scores fall along the selected proficiency scale.
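The four statistics this view reports are standard descriptive measures; a quick sketch with made-up scale scores:

```python
import statistics

# Made-up scale scores for one benchmark test administration.
scores = [380, 395, 402, 410, 410, 425, 433, 440]

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)           # sample standard deviation
rng = max(scores) - min(scores)            # range
std_error = stdev / len(scores) ** 0.5     # standard error of the mean
```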
Pearson Benchmark: Benchmark “Lite” ends here
Pearson School Systems *** School District Self-Guided Product Tour Please see comments in Notes Section, using “Notes Page” view.
All users can run queries and reports(Teachers, principals, counselors, etc.)
Oakland Schools Support • Models defined to support diverse needs of districts and multiple on-ramps • Monetary support • Oakland Schools resources aligned • Curriculum, Item Banks, and Assessments delivered to all districts
Professional Development for LEAs • Using data to inform instruction • Using Benchmark & Inform for grouping and differentiation • Using Benchmark with common assessments • Using Benchmark for classroom assessments • Administrator use of Inform • SIP planning using both products
Early successes • Lake Orion High School • 5 departments • 14 courses • 36 teachers (about 25%) • 72 sections • Over 2200 scan sheets
Phase I (Sept-Nov) • Meet individually with department heads • Review exams with course teams • Create answer keys • Verify data • Distribute results to participating teachers • Review detailed results with participating teachers • All-staff professional development (11-11-05)
Impact of Phase I • Improved dialogue between participating teams • Discussion and modification of the course assessment schedule • Question issues • Assessment design • Increased participation • Improved teacher comfort level with common assessment procedures
Phase 2 (Nov-Jan) • Try online testing • Try using rubrics • Additional course benchmarks • Build new tests • Identify & train department experts
Phase 3 (Jan-March) • Initiate middle school implementation • Benchmarks • Create common assessments for core courses • Collaborate with high school departments • Coach high school teams
Phase 4 (March – August) • Create and administer benchmark assessments in all high school courses • Administer common assessments in middle schools • Design/modify instructional practices based on data
Inform • Create a structure for naming/filing queries for principals and teachers • Create a consistent set of queries for each • Teach all principals to run their own queries • Get additional test data into Inform
“Favorite Queries/Reports” To Facilitate Initial Pearson Inform Training
Depending on An Individual’s Access Permissions … “Favorite Queries” Can Be Viewed At the District, School and Class Levels