Student Assessment and Data Analysis
Oakland Schools
MAEDS 2005
Tammy L. Evans
Why are educators so fired up about data?
Superintendents ask…
• How do we know if teachers are teaching our curriculum?
• How do we maximize the value of dollars spent for assessment and data management?
• Are all of our students achieving at acceptable levels?
Professional learning communities ask…
• What is it we want our students to know and be able to do?
• How will we know when they have learned it?
• What will we do when students are not learning?
Why are educators so fired up about “data”?
Improving student achievement!
Creating some common language about data in schools
• What are the major systems?
• How are they related?
• What have districts done?
• Where do we want to go?
4 Major Data & Technology Systems in Schools
Oakland Schools’ focus is on assessment and analysis.
• Assessment systems
• Student information systems
• Data warehouses
• Data analysis systems
(See the data warehouse PowerPoint on the CD.)
SAS-DAT Purpose: Student Assessment System & Data Analysis Tool
• Improve teaching and increase learning for all
• Useful reports for teachers, principals, and district administration
• Common assessments tied to GLCEs (Grade Level Content Expectations)
• Item banks tied to GLCEs
• Multiple district on-ramps
What is an Assessment System?
• A tool for gathering achievement information
• It assesses what is going on in classrooms.
Who needs what data? A single assessment cannot meet all needs.
Large grain size – administrators, public, legislators
• Evaluation
• Accountability
• Long-range planning
• e.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than last year?
Fine grain size – teachers, parents, students
• Diagnosis
• Prescription
• Placement
• Short-range planning
• Very specific achievement information, e.g., Who understood this concept? Why is Becky having trouble reading?
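Purely as an illustration of the grain-size idea (this sketch is not part of the original presentation; the student names, field names, and the 70% mastery cut are hypothetical), the same item-level results can be rolled up to both views with a few lines of Python:

    # Hypothetical item-level results: one record per student per concept.
    # All names, values, and the 70% mastery cut are illustrative only.
    results = [
        {"student": "Becky", "concept": "fractions", "correct": 3, "possible": 5},
        {"student": "Omar",  "concept": "fractions", "correct": 5, "possible": 5},
        {"student": "Becky", "concept": "decimals",  "correct": 4, "possible": 5},
        {"student": "Omar",  "concept": "decimals",  "correct": 2, "possible": 5},
    ]

    # Fine grain (teacher view): which students struggled with which concept?
    for r in results:
        if r["correct"] / r["possible"] < 0.7:
            print(f'{r["student"]} needs help with {r["concept"]}')

    # Large grain (district view): what percent of students met the standard overall?
    totals = {}
    for r in results:
        correct, possible = totals.get(r["student"], (0, 0))
        totals[r["student"]] = (correct + r["correct"], possible + r["possible"])

    met = sum(1 for c, p in totals.values() if c / p >= 0.7)
    print(f"{100 * met / len(totals):.0f}% of students met the standard")

The point is only that fine-grain and large-grain reports are different summaries of the same underlying item data.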
Oakland Schools’ Path to Student Achievement
• Fall 2004 – Meetings with focus groups; create the RFP
• Oct 2004 – Meeting with Assessment, Curriculum, and Technology directors from Oakland districts to discuss requirements, including multiple “on-ramps”
• June 2005 – Deadline
The RFP
• Input gathered from LEA focus groups in Curriculum, Assessment, Instruction, and Technology
• RFP authored at Oakland Schools through a collaboration among Career Focused Education, Learning Services, Research Evaluation and Assessment, Purchasing, School Quality, and Technology Services
• Draft copy provided to LEA Technology and Assessment Directors for input
• Click here for details of the RFP
• Click here for details of the vendor pricing submitted
The Committee
• OCSA charged Oakland Schools and the LEAs to move forward on acquisition of an assessment and analysis system.
• The RFP evaluation committee was formed, consisting of ISD and LEA staff representing Assessment, Curriculum, and Technology.
• Representatives came from OCREAC, the Teaching and Learning Council, the Oakland County Technology Directors, and the OCSA Instruction & Technology subcommittee.
• Committee members were from Berkley, Huron Valley, Lamphere, Lake Orion, Troy, Novi, South Lyon, Walled Lake, and West Bloomfield.
ISD Collaboration
• Jan 2005 – Oakland Schools and Wayne RESA met to review strategic goals around assessment and data analysis.
• A joint RFP was created.
• Wayne RESA joined the RFP evaluation committee.
• Wayne RESA and Oakland Schools kept scoring and recommendations separate to fit their individual needs and approval processes.
The evaluation begins
• 10 vendors responded to the RFP.
• The committee met to review the responses.
• The committee chose three vendors for demonstrations.
• Click here for the Debriefing Voting Results.
The demonstrations
• Vendors were asked to cover specific points.
• Half-day demonstrations for each vendor were held at the Farmington Training Center on March 10 & 11, 2005.
• All Oakland Schools LEAs were invited to send representatives to the demonstrations.
• Over 100 participants reviewed the products and were asked to complete a survey.
• Click here for the Survey results.
Further evaluation
• After the demonstrations, the committee met to discuss the products and created a pros/cons list for each vendor.
• Using an audience response system, the group prioritized the functionality of the products and rated each vendor on those functional areas. (See the SAS-DAT PowerPoint on the CD for the full presentation.)
• Click here for the Functionality Summary.
Vendor References
• A subcommittee was formed to conduct reference interviews.
• It included committee members from Huron Valley, South Lyon, Walled Lake, West Bloomfield, and Oakland Schools.
• Plato – two references; EduSoft – two references; Pearson – three references.
• Click here for the Reference Questions.
• The reference information was synthesized and presented to the committee on April 11.
• Click here for the Reference Call Summary.
Further Analysis
• Reviewed the goals of the RFP
• Reviewed priority and ranking from the vendor demonstrations
• Reviewed vendor reference calls
• Reviewed pricing
The Evaluation
• Committee members filled out evaluation sheets.
• Click here for the Evaluation Form.
• Results tallied:
  • Plato: 4680
  • EduSoft: 4350
  • Pearson: 5720
Site Visit
• May 4, 2005 – Putnam City Schools, OK
• Met with the Curriculum Director and principals to review the product in use.
Facilitated Product Demonstration
• May 5, 2005 – Oakland Schools
• SAS-DAT Committee members were invited to participate in a test drive of Benchmark and Inform.
Oakland Schools Support
• Models defined to support the diverse needs of districts and multiple on-ramps
• Monetary support
• Curriculum, item banks, and assessments delivered to all districts
The Partnership
• Created Benchmark “Lite”
  • Hosts Oakland Schools’ standard curriculum, units/lesson plans, and assessments
  • MCF – Michigan Curriculum Framework
  • Common assessments tied to GLCEs
  • Item banks tied to GLCEs
  • Allows districts to create assessments
• Benchmark “Full”
  • Administers tests (scanned or web-based)
  • Reports scoring
• Inform
  • Analyzes test responses down to the individual student
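To make “tied to GLCEs” more concrete, here is a minimal, hypothetical sketch of how an item bank and a common assessment could be linked to GLCE codes. The class names, fields, and GLCE-style codes are illustrative assumptions, not the actual Benchmark or Inform data model:

    from dataclasses import dataclass, field
    from typing import List, Set

    # Illustrative data model only; not the actual Benchmark/Inform schema.

    @dataclass
    class Item:
        item_id: str
        prompt: str
        glce_code: str  # Grade Level Content Expectation the item measures

    @dataclass
    class Assessment:
        name: str
        items: List[Item] = field(default_factory=list)

        def glces_covered(self) -> Set[str]:
            """GLCEs a common assessment built from these items would measure."""
            return {item.glce_code for item in self.items}

    # A district assembles a common assessment from the shared item bank
    # (the item text and GLCE-style codes below are made up for the example).
    item_bank = [
        Item("M-001", "Add two fractions with like denominators", "N.MR.04.19"),
        Item("M-002", "Compare decimals to the hundredths place", "N.ME.04.15"),
    ]
    quarterly_math = Assessment("Grade 4 Math, Quarter 1", items=item_bank)
    print(sorted(quarterly_math.glces_covered()))

This kind of linkage is what lets a district build its own assessments from the shared item bank and still report results against the GLCEs.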
Where we are now…
• Conversion for 27 of 29 districts
• Training
• Implementation! August 2005 and beyond
• Sharing experience with other MI districts
• The contract allows for state purchase; increased participation reduces cost for all
The MACUL 2006 presentation will cover…
• Success stories
• Lessons learned
• Examples of classroom assessment
• Examples of analysis
• Website and demonstration
Questions