
Academic Support Grantee Data Training Department of Education

Hoagland-Pincus Center (UMass), Shrewsbury, MA. January 13, 2005.




Presentation Transcript


  1. Academic Support Grantee Data Training Department of Education Hoagland-Pincus Center (UMass) Shrewsbury, MA January 13, 2005

  2. FY04 School Year Programs Highlights (all fund codes) • 248 districts, high schools, community colleges, and partnering organizations served 8,653 students in the classes of 2003-2006. • Participating students each received an average of 27.3 hours of MCAS remediation service. • 4,234 of the students served earned their competency determination (CD) after participation in the program. • The CD rate was 24 percentage points higher among students participating in AS programs than among non-participants (60% vs. 36%).

  3. FY04 School Year Programs - Highlights (all fund codes) • 58% of programs took place during the school day, 37% during extended time (before or after school/evening), and 5% during the weekend. • 70% of programs focused on mathematics; 30% on English language arts. • 18% of students served were from vocational schools. • 50% of participating students received individual (1 teacher:1 student) or smallest-group instruction (1:2-5), 45% received small-group instruction (1:6-10), and 5% received an "other" type of instructional model.

  4. How Does the Department Use Academic Support Data? • Included in reports for the legislature and Board of Education • Viewed as a reflection of program effectiveness • Influences technical assistance (TA) provided to grantees • Considered when determining funding amounts for grant programs and individual grants, based on: • Cost per pupil • Numbers served • Completion rates • Gains made on MCAS tests • Dosage effect (effect of increased participation hours)

  5. What Type of Data is Collected/Selected? • SASID • Student Name, Grade, School • Fund Code (School Year / Summer) Allocation Grants: • 632/625 – Districts/Approved SPED Schools Competitive Grants: • 596/597 – Work and Learning Programs • 598/593 – Community Colleges/Partners • 619/592 – Districts/Partners

  6. What Type of Data is Collected/Selected? • Project Info. (e.g. ELA3D – HS English during day)

  7. What Type of Data is Collected/Selected? • Instructional Model (Teacher:Student) • 1:1 (Individual) • 1:2-5 (Smallest Group) • 1:6-10 (Small Group) • Other (*Note: Must explain in narrative evaluation) • Hours of Service per Student • Pre/Post MCAS Scores (*Note: Entered by Department)

  8. What Type of Data is Collected/Selected? • Student Status • Completed Program: Participated in at least 75% of possible hours of service • Withdrew: Participated in fewer than 75% of possible hours of service • Enrolled, but did not attend: Initially enrolled; zero hours of service • Participated until receipt of 220+ score: Attended program until s/he learned (during the program) that s/he passed the most recent MCAS test; hours of service > zero • Transferred out of school/district: Student was enrolled and may have attended, but s/he did not complete the program due to transferring out of the school/district.
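The status categories above reduce to a simple classification on hours attended versus possible hours. A minimal sketch in Python, assuming a per-student record; the function name and field names are illustrative, not the security portal's actual schema:

```python
def student_status(hours_attended, possible_hours,
                   passed_mcas_mid_program=False, transferred=False):
    """Classify an enrolled student's status per the slide's definitions.

    Hypothetical helper for illustration only; the portal defines the
    authoritative rules and labels.
    """
    if possible_hours <= 0:
        raise ValueError("program must offer a positive number of hours")
    if transferred:
        return "Transferred out of school/district"
    if hours_attended == 0:
        return "Enrolled, but did not attend"
    if passed_mcas_mid_program:
        return "Participated until receipt of 220+ score"
    if hours_attended >= 0.75 * possible_hours:
        return "Completed Program"
    return "Withdrew"
```

Note the ordering: transfer and zero-attendance checks come first, since a transferred student "may have attended" yet still does not count as completed or withdrawn.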

  9. How to Use Data? • Inform narrative evaluations and future proposals • Improve your program • Do we need to increase outreach efforts? • Is Day vs. Extended better for attendance and outcomes? • Are more service hours correlated with greater MCAS gains? • Was there more success with ELA vs. mathematics – why? • What was the most important factor for MCAS score gains? • Build support for your program (e.g. present data to school board or to other potential funders)
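One of the questions above, whether more service hours track with greater MCAS gains, can be checked with a plain Pearson correlation on your own records. A sketch using made-up numbers (the hours and gains below are invented for illustration; real figures would come from your program's data):

```python
# Hypothetical data: per-student hours of service and pre/post MCAS gain.
hours = [10, 18, 25, 27, 33, 40]
gains = [2, 4, 6, 5, 9, 11]

# Pearson r computed from first principles (no third-party libraries).
n = len(hours)
mean_h = sum(hours) / n
mean_g = sum(gains) / n
cov = sum((h - mean_h) * (g - mean_g) for h, g in zip(hours, gains)) / n
std_h = (sum((h - mean_h) ** 2 for h in hours) / n) ** 0.5
std_g = (sum((g - mean_g) ** 2 for g in gains) / n) ** 0.5
pearson_r = cov / (std_h * std_g)
print(f"Pearson r = {pearson_r:.2f}")
```

A value near +1 would suggest a dosage effect worth highlighting in a narrative evaluation, though correlation alone does not establish that the extra hours caused the gains.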

  10. How to Improve Data Collection? • Appoint a central person to plan collection and input data into the security portal. • Make sure the central person has access to the security portal and is aware of deadlines. • Have the central person share with program instructors the data that needs to be collected, their role in the collection process, and why it is collected.

  11. How to Improve Data Collection? Suggestions: • Maintain data in an Excel spreadsheet before inputting it into the security portal. (http://www.doe.mass.edu/as/data/as_template.xls) • Start entering data into the security portal at least two weeks before the due date to account for issues that may arise and so that the data can be used to inform your narrative. (http://www4.doemass.org)
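Any tabular tool works for the interim working file. As one sketch, Python's standard `csv` module can maintain a plain-text version with the kind of fields the earlier slides list; the column names here are assumptions for illustration, and the official template at the URL above defines the real layout:

```python
import csv

# Hypothetical working rows mirroring fields from the slides
# (SASID, name, grade, hours of service, status).
rows = [
    {"SASID": "1000000001", "Name": "Sample Student", "Grade": "10",
     "HoursOfService": "28", "Status": "Completed Program"},
]

# Write a CSV working file to review before portal entry.
with open("as_working_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```

Keeping such a file current makes the two-weeks-early portal entry above largely a copy-over exercise rather than a scramble.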

  12. When Are the Data Submission Due Dates? Session Times: Data Due Dates • Fall (Sept. to mid-Nov.): 1/31/05 * • Winter (late Nov. to early Mar.): 4/30/05 • Spring (Feb./Mar. to May/June): 7/31/05 ** • Summer (July to August): 9/30/05 ** * Mid-year narrative evaluation & data reports due for fund code 598. ** Narrative evaluations due for all fund codes (& for FC 598/593 data report).

  13. Questions? Contact Allison Ward, Academic Support Data Specialist Tel: 781-338-3232 Email: ACsupport@doe.mass.edu Website: http://www.doe.mass.edu/as/data
