How to Use SPBQ Data to Improve Student Learning • VATD Conference, October 29, 2008 • Presented by Virginia Beach City Public Schools: Department of Technology, Department of Curriculum and Instruction, Department of Research, Evaluation, and Assessment
How to Import SPBQ Data • Meta Data • Test Table • Test Category Table • Student Table • Location Table • Import SOL Results • Importing Preliminary SOL Results • Matching Records to Students in Student Information System • Pushing the SOL Results to the Student Information System • Importing Final SOL Results • Import SOL SPBQ Results • Importing SPBQ Item Descriptors • Importing SPBQ Item Responses • Updating the SOL Results Test Cores • SPBQ Reports • Web Reporting System (WRS) • SPBQ Report by School, Division • SPBQ Report by School, Division – 3 Year • SPBQ Report by Teacher, School, Division • SPBQ Report by Student, School, School Level, Division
Importing SOL Test Results – Setting up Meta Data • The SOL tests are imported into a database using a process that reads the SOL data file and creates a record in a SQL table for each test. • There are several tables that we initially populate with information that allows us to identify each test. These tables normally do not change but will require a change if a new test is added or if information about an existing test changes.
Test Table • This table contains a record for each SOL test. The information is used to identify, group, and sort tests internally for our imports and SOL reports. Below is a sample of the information for the Writing tests. The structure of this table is shown in Appendix A.
Test Category Table • This table contains a record for each category on each SOL test. The information is also used internally for our SOL reports. Below is a sample of the information for the EOC Algebra I test. The structure of this table is shown in Appendix A.
Student Table • This table contains a record for each student in the Student Information System. The information is used during the matching process when the SOL test is imported. Below is a sample of the information for several students (names and IDs not shown). The structure of this table is shown in Appendix A.
Location Table • This table contains a record for each school. The information is used internally for our SOL reports. Below is a sample of the information for several schools. The structure of this table is shown in Appendix A.
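As a rough illustration of how these lookup tables fit together, the sketch below models the four meta data tables as simple Python records. The field names are hypothetical; the actual column layouts are shown in Appendix A.

```python
# A minimal sketch of the meta data lookup tables described above.
# Field names are hypothetical; the actual structures are in Appendix A.
from dataclasses import dataclass

@dataclass
class Test:                    # one record per SOL test
    test_key: str              # internal key used by imports and reports
    level: str                 # e.g. "EOC"
    subject_code: str
    description: str           # e.g. "EOC Writing"

@dataclass
class TestCategory:            # one record per reporting category on a test
    test_key: str
    category_number: str       # e.g. "001"
    description: str

@dataclass
class Student:                 # one record per student in the SIS
    student_key: str
    last_name: str
    first_name: str
    birth_date: str
    student_id: str

@dataclass
class Location:                # one record per school
    location_key: str
    school_name: str
```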
Importing SOL Test Results • The Test Results Management System – or TERMS – is an application developed to allow test results to be imported into our database. • The SOL Results text file – for example, vaa_wrfall07_d128_stud_wrt_ext.txt – is selected as the file to import. • The text file is read – one record at a time – and the data is imported into a temporary import table. The data is extracted based on the field position in the record description. When the format changes, we adjust our tables and import process to use the new format.
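The following sketch illustrates the general idea of the position-based read described above. The field names and (start, end) offsets are hypothetical; the real offsets come from the published record description.

```python
# Sketch of reading an SOL Results text file one record at a time and
# slicing fields by position. The (start, end) offsets below are
# hypothetical; the real offsets come from the record layout.
FIELDS = {
    "last_name":    (0, 25),
    "first_name":   (25, 45),
    "birth_date":   (45, 53),
    "student_id":   (53, 63),
    "subject_code": (63, 66),
}

def read_import_file(path):
    """Yield one dict per test record, ready for the temporary import table."""
    with open(path, "r") as f:
        for line in f:
            yield {name: line[start:end].strip()
                   for name, (start, end) in FIELDS.items()}

# Example: rows = list(read_import_file("vaa_wrfall07_d128_stud_wrt_ext.txt"))
```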
Importing SOL Test Results (cont.) • Currently, we use the Document Information (Level, Subject Code, Year of Standard, and Revision) to identify the test. • We use the Student Name, Birth Date, and Student ID to identify the student. • If the student cannot be identified using this data, an intermediate process allows us to manually match the student to one in our Student Information System. The process is similar to the EIMS Resolution process.
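A minimal sketch of the student match, assuming hypothetical lookup dictionaries built from the Student table; records that do not match fall through to the manual resolution step described above.

```python
# Sketch of the student match: exact match on Student ID, then on name and
# birth date, against lookups built from the Student table. Field names
# and dictionary keys are hypothetical.
def match_student(record, students_by_id, students_by_name_dob):
    """Return a student key, or None if the record needs manual matching."""
    key = students_by_id.get(record["student_id"])
    if key is not None:
        return key
    return students_by_name_dob.get(
        (record["last_name"].upper(),
         record["first_name"].upper(),
         record["birth_date"])
    )
```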
Importing SOL Test Results (cont.) • The TERMS system allows the preliminary SOL Results file to be imported so we can begin reporting on data prior to receiving the final SOL Results data. Several preliminary imports can be performed with new records being added to the SOL Results table. • Changes to the student information on the test records will not be performed until the final SOL Results data are imported – the final import process will be described following the preliminary import steps.
Importing SOL Test Results (cont.) • The following shows the SOL Results Import screen with several tests that have been imported into the temporary import table. The first two tests do not match students in the Student table.
Importing SOL Test Results (cont.) • The unmatched students can be manually matched since the tests are in a temporary import table. The ‘Suggest Student Key’ button will display a list of students that are potential matches.
SOLResults Table • After the unmatched students have been matched, the data will be accepted into the SOLResults table. This table contains a record for each SOL test. The SOLResults information is used with the meta data described earlier to generate the SOL reports. Below is a sample of the information for the Writing tests shown in the import process on the previous slides. The structure of this table is shown in Appendix A. • We currently have SOL tests loaded for the past 10 years (since spring 1998) with approximately 1,782,293 tests in the SOLResults table.
SASI Extended Test History • The SOL tests are pushed to SASI into Extended Test History. There is another set of tables and a push application that perform this process. • Screenshots: list of tests in Extended Test History; details for a single test
Importing SOL Results Final Data • The TERMS application contains a process to import an SOL Final file. This process will only update existing SOL data. • The SOL Results Final text file – for example, vaa_wrfall07_d128_stud_wrt_ext.txt – is selected as the file to import. • The text file is processed in a similar manner as the preliminary file – one record at a time – and a list of changes to the tests is created. • We use the Test Number, Test Administration, and Unique Identification Number (UIN) to match the test on the import file to an existing SOL Results test.
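A minimal sketch of how the final-file match and change list could work, assuming hypothetical record and key structures; only existing SOL Results tests are updated.

```python
# Sketch of matching a final-file record to an existing test in SOLResults
# using Test Number, Test Administration, and UIN, and collecting the
# field-level changes for review. Structures are hypothetical.
def build_change_list(final_records, existing_tests):
    """existing_tests is keyed by (test_number, administration, uin)."""
    changes = []
    for rec in final_records:
        key = (rec["test_number"], rec["administration"], rec["uin"])
        current = existing_tests.get(key)
        if current is None:
            continue                     # the final import only updates existing tests
        for field, new_value in rec.items():
            if current.get(field) != new_value:
                changes.append((key, field, current.get(field), new_value))
    return changes                        # reviewed and approved before being applied
```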
Importing SOL Results Final Data • The report below shows the list of changes for review. Once the changes are approved, they are applied to the data.
Importing SOL SPBQ Results • The TERMS application contains an import process for the SOL SPBQ data. • We do not import all the SPBQ data fields since the majority of the fields are duplicated on the SOL Results data. For that reason, the SOL Results data for the test administration must be imported first.
Item Descriptor Import • The SPBQ data is delivered in two files – one file contains the Item Descriptors and the other contains the Item Responses. The first step of the SPBQ import process is importing the Item Descriptors. • The Item Descriptor data is downloaded in a spreadsheet with a worksheet for each test type. Our import process uses text files, so this data is saved as a text file prior to import.
Item Descriptor Import (cont.) • The spreadsheet data and the text file for the Writing Item Descriptors are shown below.
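The "save as text" step is done in Excel; as a rough sketch, the same conversion could be automated with pandas, assuming a hypothetical file layout with one worksheet per test type and a hypothetical output naming scheme.

```python
# Sketch of converting the Item Descriptor spreadsheet (one worksheet per
# test type) into tab-delimited text files for the import. In practice this
# is a manual "save as text" step in Excel; file names are hypothetical.
import pandas as pd

def spreadsheet_to_text(xlsx_path):
    sheets = pd.read_excel(xlsx_path, sheet_name=None)   # read all worksheets
    for sheet_name, frame in sheets.items():
        frame.to_csv(f"{sheet_name}_item_descriptors.txt",
                     sep="\t", index=False)
```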
Item Descriptor Import (cont.) • The Item Descriptor Import writes the data to the SOLItemDescriptor table. This information is used to identify each item response with a code and description. • The description for the Item Code can change. The ‘Active’ field is used to indicate the active description for the current test administration. • On the next slide, Item Code 0007 has two descriptions with the second description set as the active description. This allows us to print the old description on reports for prior test administrations.
Item Descriptor Table • Below is a sample of the information for the Writing tests. The structure of this table is shown in Appendix A.
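A minimal sketch of the Active-flag update described above, assuming a hypothetical in-memory structure for the SOLItemDescriptor rows: the prior wording is kept as an inactive row so reports for prior administrations still print the old description.

```python
# Sketch of updating an item descriptor: the previous active row is flagged
# inactive and the new description is added as the active one. The row
# structure is hypothetical.
def update_descriptor(descriptors, item_code, new_description):
    """descriptors: list of dicts with item_code, description, active."""
    for row in descriptors:
        if row["item_code"] == item_code and row["active"]:
            if row["description"] == new_description:
                return                       # wording unchanged, nothing to do
            row["active"] = False            # keep old wording for prior reports
    descriptors.append({"item_code": item_code,
                        "description": new_description,
                        "active": True})
```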
Item Responses Import • The second SOL SPBQ file is the SPBQ Item Response file. This file contains the student level records with the responses for each SOL test. Since the majority of the information is already on the SOL Results, we do not import the duplicated information. • The SPBQ Item Response text file – for example, vaa_wrfall07_d128_spbq_wrt.txt – is selected as the file to import. The text file is read – one record at a time – and imported into a temporary import table. The data is extracted based on the field position in the record description. When the format changes, we adjust our tables and import process to use the new format.
Item Responses Import (cont.) • We use the Document Information (Level, Subject Code, Year of Standard, and Revision) to identify the test. After the test is identified, we use Test Number, Test Administration, UIN, and Form Number to match the SPBQ to an existing SOL Results test. • If a matching test is not found, we attempt another match using Test Number, Test Administration, Student, and Form Number. • If the responses cannot be matched using this data, an intermediate process allows us to manually match the test. It's very rare that a matching test is not found since the SPBQ data is imported after the SOL Results final data.
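A minimal sketch of the two-step match, assuming hypothetical lookup dictionaries keyed the same way as the match rules above; anything that still fails goes to the manual matching step.

```python
# Sketch of the two-step SPBQ match: first by Test Number, Administration,
# UIN, and Form Number; then by Test Number, Administration, Student, and
# Form Number. Keys and field names are hypothetical.
def match_spbq(record, tests_by_uin, tests_by_student):
    key = (record["test_number"], record["administration"],
           record["uin"], record["form_number"])
    test = tests_by_uin.get(key)
    if test is not None:
        return test
    fallback = (record["test_number"], record["administration"],
                record["student_key"], record["form_number"])
    return tests_by_student.get(fallback)    # None -> manual matching
```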
Item Responses Import (cont.) • The following shows the SOL SPBQ Item Responses Import screen with several tests that have been imported into the temporary import table.
Item Responses Import (cont.) • After all the test records have been matched, the data will be accepted into the SOLItemResponse table. Multiple response records are inserted into the SOLItemResponse table for each test record on the import table. • One record is inserted for each item descriptor/item response. The responses are contained in the Item Response field – each descriptor/response is eight characters:
• 3 characters – Category Number (example: 001, 002)
• 4 characters – Item Descriptor (example: 0009, 1355)
• 1 character – Response (0 = Incorrect, 1 = Correct, N = No Response or Multiple Marks)
Item Responses Import (cont.) • There can be up to 100 responses for each test. The four-character Item Descriptor is used to look up the item on the Item Descriptor table that was imported earlier. • We currently have SOL Item Responses loaded for the past eight years (since fall 2000) with approximately 58,342,384 responses in the SOLItemResponse table.
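A minimal sketch of splitting the Item Response field into its eight-character blocks, based on the layout above (3-character category number, 4-character item descriptor, 1-character response).

```python
# Sketch of parsing the Item Response field: each eight-character block is
# three characters of category number, four of item descriptor, and a
# one-character response (0, 1, or N). Up to 100 blocks per test.
RESPONSE_MEANING = {"0": "Incorrect", "1": "Correct",
                    "N": "No Response or Multiple Marks"}   # report labels

def parse_item_responses(item_response_field):
    """Return one (category, item_descriptor, response) tuple per block."""
    responses = []
    for i in range(0, len(item_response_field), 8):
        block = item_response_field[i:i + 8]
        if len(block) < 8 or not block.strip():
            break
        responses.append((block[0:3], block[3:7], block[7]))
    return responses

# Example: parse_item_responses("0010009100213550")
#   -> [("001", "0009", "1"), ("002", "1355", "0")]
```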
SOLItemResponses Table • Below is a sample of the information for Category 001 for one of the EOC Writing tests shown in the import screen. The structure of this table is shown in Appendix A.
SOL Test Cores • The SPBQ Reports described in the next section assume a single set of responses for each test core. We found that certain form numbers – Braille, Large Print, and Read Aloud forms – have a different set of responses for the same core. • For example, for the Spring 2007 EOC Chemistry test, the Read Aloud Alternate form (S3026) has a different set of responses than the main form numbers. Both tests have the same test core. • A process is run after the SOL Results and the SPBQ Responses have been loaded to identify the tests that have a different set of responses. The item descriptors are compared – if they differ, the test core on the test is changed in our internal table. This ensures that a separate report page is generated for the alternate form numbers.
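A minimal sketch of that comparison, assuming hypothetical data structures and an arbitrary "-ALT" suffix for the reassigned core; the actual process updates our internal table.

```python
# Sketch of the post-load check that gives alternate forms (Braille, Large
# Print, Read Aloud) their own core when their item descriptors differ from
# the main forms'. Structures and the "-ALT" suffix are hypothetical.
def split_alternate_cores(tests, descriptors_by_form):
    """tests: list of dicts with form_number, test_core, main_form flag."""
    main_items = {}
    for test in tests:                               # descriptors on the main forms
        if test["main_form"]:
            main_items.setdefault(test["test_core"], set()).update(
                descriptors_by_form[test["form_number"]])
    for test in tests:
        if test["main_form"]:
            continue
        items = set(descriptors_by_form[test["form_number"]])
        if items != main_items.get(test["test_core"], items):
            test["test_core"] += "-ALT"              # gets its own report page
    return tests
```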
WRS SPBQ Reports • Our Web Reporting System (WRS) allows users to log in via our intranet and print reports based on their user profile. • The SPBQ Reports are available to the Department of Research, Evaluation and Assessment, C & I, and school level personnel such as the Testing AP, Principal, SIS, and the Data Support Specialist. • The front end of the reporting system is a web-based application with table driven selection screens. The reports are written using a reporting tool that allows the reports to be generated using the web interface. • Each report calls a process on the SQL Server where the SOL and SPBQ data is stored – the process generates the report data and the report can be viewed online, generated as a PDF or saved to an Excel spreadsheet.
WRS SPBQ Reports (cont.) • The following shows the report selection screen used for the majority of the SPBQ reports.
SPBQ Report by School, Division • This report displays the SPBQ data for one session for a selected school. The report shows the number of correct responses and percent correct for the school and for the division. The report displays the Item Descriptors in each category and the totals and percents for each item.
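A minimal sketch of the per-item aggregation behind this report, assuming hypothetical response rows; counting "N" responses in the item total (but not as correct) is an assumption.

```python
# Sketch of computing number correct and percent correct per item
# descriptor at the school and division levels. Input rows are hypothetical
# dicts with school, item_code, and response.
from collections import defaultdict

def summarize(counts):
    """counts: item -> [correct, total]; returns item -> (correct, pct)."""
    return {item: (correct, round(100.0 * correct / total, 1))
            for item, (correct, total) in counts.items()}

def percent_correct(response_rows, school):
    school_counts = defaultdict(lambda: [0, 0])
    division_counts = defaultdict(lambda: [0, 0])
    for row in response_rows:
        correct = 1 if row["response"] == "1" else 0
        division_counts[row["item_code"]][0] += correct
        division_counts[row["item_code"]][1] += 1
        if row["school"] == school:
            school_counts[row["item_code"]][0] += correct
            school_counts[row["item_code"]][1] += 1
    return summarize(school_counts), summarize(division_counts)
```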
SPBQ Report by School, Division – 3 year • This report displays the SPBQ data over a 3-year span for a selected school. • The report shows the number of correct responses and percent correct for the school and for the division for the selected session and for the same session for the prior two years. • For example, if spring 2007 is selected, totals will be shown for spring 2005, spring 2006, and spring 2007. The report will show the item responses for all years even if they do not match an item response in the year selected. Items that exist in multiple sessions will show the totals on one line.
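A minimal sketch of deriving the three sessions from the selected one; the session label format used here is an assumption.

```python
# Sketch of the 3-year session selection: the selected session plus the
# same session for the prior two years.
def three_year_sessions(session):
    term, year = session.split()
    return [f"{term} {int(year) - offset}" for offset in (2, 1, 0)]

# three_year_sessions("Spring 2007") -> ["Spring 2005", "Spring 2006", "Spring 2007"]
```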
SPBQ Report by Teacher, School, Division • This report displays the SPBQ data for one session for a selected school. The report is in the same format as the School, Division report. The report shows the number of correct responses and percent correct for each teacher (group name) as well as the school and the division.
SPBQ Report by Student, School, School Level, Division • This report displays the SPBQ data for a selected student for one session – it can be run for all tests or a single test. The report is in the same format as the School, Division report. The report shows the responses for the student as well as the number of correct responses and percent correct for the school, the school level (middle or high), and the division.
Contact • For additional information: • Chris Bruno Chris.Bruno@vbschools.com • Chris Huggins Chris.Huggins@vbschools.com • Virginia Beach City Public Schools Department of Technology (757) 263-1100