AY 2008 Comprehensive & Annual Program Review
Data Delivery Plan • Data Description • Process – Timeline
Rev. 11-24-08
Purpose • The primary purpose of this presentation is to clarify for all instructional programs and non-instructional units how the data is calculated for both our local Comprehensive and the system-required Annual Program Reviews. • We have been asked to produce an annual program review for each and every one of our instructional programs and units. They are required of each system community college and will be taken to the UH Board of Regents for their review. • So who has to do a program review this year? If you are normally scheduled to do a comprehensive review, or are "jumping," you will do a comprehensive review this year. Additionally, every instructional and non-instructional program will do an Annual Program Review this year. • Not sure if you're scheduled for a comprehensive review? Click here for the Comprehensive Program-Unit Schedule.
What’s a Jumper? • A Jumper is a locally defined term that is used to describe an instructional program or non-instructional program (unit) that has decided to jump out of their normally scheduled slot for their comprehensive program reviews and into this years cycle. • Why would anyone want to do that? Jumping into this years comprehensive cycle means that you will have an opportunity to be considered for any budgetary decisions that will be made in this years budget process. • Jumpers will still have to do their comprehensive review on their next scheduled review. Jumping does not affect the existing schedule—you are voluntarily doing an extra review to be considered in this budget cycle.
I belong to an Instructional Program… which template do I use? • A datasheet has already been provided to all instructional programs for use in both the comprehensive and annual reviews. • There are 3 templates available to instructional programs this year. Simply click on the template below that applies to you, paste in the completed datasheet where needed, and then write in your review: • Instructional Comprehensive Program Review Template (use this template ONLY if you are scheduled for a comprehensive program review this year or are jumping) • AY 2008 Annual Instructional Program Review Template (ALL instructional programs will need to complete this—even if you've completed a comprehensive review) • AY 2008 Annual Instructional Program Review Coversheet (ALL instructional programs will need to complete this—even if you've completed a comprehensive review)
I belong to a non-instructional unit… which template do I use? • All units are required to submit an annual review. Additionally, if you are on the schedule for a comprehensive review or are jumping, you will also need to complete a comprehensive review. • If you are on the schedule for a comprehensive unit review or are jumping, you will use the comprehensive unit review template below: Comprehensive Unit Review Template • For annuals, if you are writing your unit review in support of Administrative Services, you will use: Administrative Services Template • For annuals, if you are writing your unit review in support of Student Support, you will use: Student Support Annual Coversheet and Student Support Annual Report of Program Data (provided by IR). An Executive Summary is also required for submission. • For annuals in all other units, you will use: Annual Unit Review Template
Instructional Program Courses • Over the course of the summer, our Interim Assistant Dean of Instruction, Noreen Yamane, and IT Specialist Marv Kitchen worked on assigning program courses to the appropriate programs. • The listing of program courses with the associated program owners was returned to the system office for use in this year's program review. The program courses sent in are roughly equivalent to what we called our program paid courses in last year's program review—they are the courses your program is responsible for. • Having a single table that keeps all of the program courses for all of our system-wide community colleges is a HUGE improvement to our program review process. • Please be sure to click on the link below and ensure that your courses have been appropriately assigned to your program. Remember that the program review only looks at Fall data, so you will not see any Spring courses listed. If need be, we can send an update to the system office for next year's review cycle. • Not sure what courses are listed for your program? Check the Program Course Listing
AY 2008 Comprehensive-Annual Instructional Program Review Data Elements • The system office delivered roughly 67% of all of the data provided to you this year. Their primary concern was getting all of the Fall 2007 annual data to us, so we had to fill in the blanks in order to get you all of the data you need for all 3 years. • Wherever possible, previous-year data was copied directly in from last year's program review. This was done to reduce the confusion that might arise if this year we gave you data that differs from what was reported last year. • The system office reported data for all 3 years using today's routines, which may vary from what was reported last year. This is because the system office is using "improved" routines over the ones used last year. • The data tables you will be receiving cover the last 3 full academic years, reported as of Fall 05 through Fall 07, and list 29 different data elements from a variety of sources. • Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing). • The following slides explain in detail what data has been provided to you for your comprehensive-annual instructional program review write-ups and how it has been calculated.
#1 Annual new and replacement positions in the state. • This data element represents the combined new and replacement jobs for the State of Hawaii in your trade, projected for the period 2005-2011 for Fall 05 and 06, and 2008-2014 for Fall 07. This year's program review has been corrected to reflect just the annual value for all 3 years. • Economic Modeling Specialists Inc. (EMSI) compiles data based on Standard Occupational Codes (SOC) that the college has linked to the instructional program. • From their website, "…EMSI specializes in reports that analyze and quantify the total economic benefits of community and technical colleges in their region, and also creates data-driven strategic planning tools that help colleges maximize their impact through labor market responsiveness…"
#2 Annual new and replacement positions in the County. • This data element represents the combined new and replacement jobs for the County of Hawaii in your trade, projected for the period 2005-2011 for Fall 05 and 06, and 2008-2014 for Fall 07. This year's program review has been corrected to reflect just the annual value. • Economic Modeling Specialists Inc. (EMSI) compiles data based on Standard Occupational Codes (SOC) that the college has linked to the instructional program. • From their website, "…EMSI specializes in reports that analyze and quantify the total economic benefits of community and technical colleges in their region, and also creates data-driven strategic planning tools that help colleges maximize their impact through labor market responsiveness…"
#3 Number of Majors • This is the total number of students registered at Hawaii Community College for credit classes within the declared major. • This is as of Fall semester at census.
#4 SSH Program majors in Program Classes • This is the sum of fall student semester hours taken by program majors in our locally defined program courses. • Excludes Directed Studies (99 series). • Includes Cooperative Education.
#5 SSH Non-Majors in Program Classes • This is the sum of fall student semester hours taken by non-program majors in our locally defined program courses. • Excludes Directed Studies (99 series). • Includes Cooperative Education.
#6 SSH in All Program Classes • The sum of fall student semester hours taken by all students in our locally defined program courses. • Excludes Directed Studies (99 series). • Includes Cooperative Education.
#7 FTE Enrollment in Program Classes • This is the sum of fall student semester hours taken by all students in program classes (#6), divided by 15 to convert to full-time equivalents. • This is as of Fall semester at census.
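To make the arithmetic concrete, here is a minimal sketch with made-up numbers (the 465 SSH figure is hypothetical, not from any program's datasheet):

```python
# Hypothetical illustration of data element #7.
program_ssh = 465                    # data element #6: fall SSH, all students, program classes
fte_enrollment = program_ssh / 15    # 15 credit hours = one full-time-equivalent student
print(fte_enrollment)                # 31.0 FTE
```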
#8 Number of Sections Taught • This is the number of program courses (actual sections) taught in the program. • This is as of Fall semester at census. • The number of classes taught excludes Directed Studies courses (99, 099, 199, 299) but includes Cooperative Ed courses (93, 093, 193, 293)
Determination of program's health based on demand (Healthy, Cautionary, or Unhealthy) • This year Doug has asked the IR Office to calculate health calls for all instructional programs. In the following years the programs will be asked to calculate their own health calls. • Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2). • The following benchmarks are used to determine health for majors per new and replacement position in the county: Healthy: 1.5 – 4.0; Cautionary: 0.5 – 1.49 or 4.1 – 5.0; Unhealthy: < 0.5 or > 5.0 • Finally, an Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • The overall category health score is used to evaluate the overall health of your program.
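A minimal sketch of that demand call (a hypothetical helper, not official IR code; the published ranges leave small gaps such as 1.49–1.5, which this sketch closes by treating the boundaries as continuous):

```python
def demand_health_score(majors, county_positions):
    """Demand health call: majors (#3) per annual new & replacement county position (#2).

    Returns the category score: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.
    """
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:
        return 2                                   # Healthy: 1.5 - 4.0
    if 0.5 <= ratio < 1.5 or 4.0 < ratio <= 5.0:
        return 1                                   # Cautionary: .5 - 1.49 or 4.1 - 5.0
    return 0                                       # Unhealthy: < .5 or > 5.0

print(demand_health_score(majors=60, county_positions=25))  # ratio 2.4 -> 2 (Healthy)
```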
#10 Average Class Size • Average class size is total student registrations in program classes divided by the total number of classes taught (#8). • This excludes Directed Studies courses (99, 099, 199, 299) but includes Cooperative Ed courses (93, 093, 193, 293)
#11 Class Fill Rate • Class fill rate is total student registrations in program classes (number of seats filled) divided by the maximum enrollments (number of seats offered). • This is as of Fall semester at census.
#12 FTE BOR Appointed Faculty • FTE of BOR Appointed Program Faculty is the sum of appointments to your program (1.0, 0.5, etc) excluding Lecturers and other non-BOR appointees. • Remember that these are positions that were appointed to your program—whether or not these people are actually teaching classes. • Data is as of 10-31-2007 • This information now comes directly from system HR. If this is not correct for your program we can work to get it updated in the HR system.
#13 Majors / FTE BOR Appointed Faculty • The number of majors (from data element #3) divided by the number of BOR Appointed Program Faculty (from data element #12) • Student Majors are taken as of Fall semester at census. • Note: this is not all students taking your classes…just the majors.
#14 Majors / Analytic FTE Faculty • Total number of fall majors (from data element #3) divided by FTE faculty. • FTE faculty is a workload measure not to be confused with BOR Appointments. It is calculated by taking all of the semester hours taught in Program classes and dividing them by 15 credits for full load. • Student Majors are taken as of Fall semester at census.
#14a Majors / Analytic FTE Faculty @ 12 cr. • Starting in Fall 2007, Doug has indicated an interest in adding a data element that roughly maps to programs that operate on contact hours instead of credit hours. For these programs we calculate workload faculty FTE by dividing credits taught by 12 instead of 15. • If you do not have a #14a in the data set provided to you, please ignore this additional detail. It was only added to the data sheets for programs using 12 credits for FTE. • This new calculation for Faculty FTE is used in data elements 14a, 15, 16, and 29. • Programs identified as using 12 credits for full-time are: AG, ABRP, AMT, CARP, DISL, DMA, EIMT, ET, FSER, MWIM, NURS, PRCN, CHO, and TEAM.
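A short sketch of the analytic FTE denominator, using invented numbers (120 majors, 90 SSH taught); the only difference between #14 and #14a is the 15- versus 12-credit divisor:

```python
def analytic_fte_faculty(ssh_taught, full_load=15):
    """Workload-based faculty FTE: semester hours taught / full-load credits.

    full_load is 15 for most programs and 12 for the contact-hour programs listed
    above (hypothetical sketch, not the official IR routine).
    """
    return ssh_taught / full_load

majors, ssh_taught = 120, 90                           # made-up example values
print(majors / analytic_fte_faculty(ssh_taught))       # #14:  120 / 6.0 -> 20.0
print(majors / analytic_fte_faculty(ssh_taught, 12))   # #14a: 120 / 7.5 -> 16.0
```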
#15 Program Budget Allocation • Program budget allocation includes: personnel, supplies and services, and equipment. • Personnel costs + B-budget costs (supplies, services, and equipment) = the program budget allocation. • With the exception of the 3 Nursing programs, we calculate personnel costs by multiplying the Faculty FTE (workload), which includes lecturers, by the AY 2008 UHPA faculty rank 4 rate per credit hour of $1612, then by 30 credits. • For the 3 Nursing programs, we calculate personnel costs by multiplying the Faculty FTE (workload), which includes lecturers, by the AY 2008 UHPA faculty rank 5 rate per credit hour of $1808, then by 30 credits, for all 3 reporting years.
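A worked sketch of the allocation formula: the 6.0 workload FTE and the $25,000 B-budget are hypothetical inputs, while the dollar rates are the ones quoted on this slide.

```python
RANK4_RATE = 1612      # AY 2008 UHPA rank 4 rate per credit hour (non-Nursing programs)
RANK5_RATE = 1808      # AY 2008 UHPA rank 5 rate per credit hour (the 3 Nursing programs)
ANNUAL_CREDITS = 30    # credits used to annualize the per-credit-hour rate

def program_budget_allocation(workload_fte, b_budget, rate=RANK4_RATE):
    """Personnel costs (FTE x rate x 30 credits) plus B-budget costs."""
    personnel = workload_fte * rate * ANNUAL_CREDITS
    return personnel + b_budget

print(program_budget_allocation(6.0, 25_000))  # 290,160 + 25,000 = 315160.0
```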
#15 Program Budget Allocation cont. • For programs identified as using 12 credits for full-time, starting in Fall 2007 your program budget is based on this new calculation. • For all programs except Nursing, your program budget for 2005 and 2006 was taken directly from what was reported last year. • The only real change for program budget allocations is for Fall 2007 in programs identified as using 12 credits for full-time.
#16 Cost per SSH • This is the cost to run your program based on student semester hours. • Costs come from Program Budget Allocation (data element #15) divided by the student semester hours for all students in program classes (data element #6). • It should be noted that we are dividing annual expenses by student semester hours from the Fall semester only, so this data element more accurately represents "Annual Cost per Fall SSH". • For programs identified as using 12 credits for full-time equivalency, starting in Fall 2007 your cost per SSH reflects the new calculation, since it is derived from the program budget allocation.
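Continuing the hypothetical numbers from the #15 sketch above (a $315,160 allocation and 465 fall SSH), the cost per SSH falls out as a single division:

```python
budget_allocation = 315_160.0   # data element #15 (annual, hypothetical)
fall_ssh = 465                  # data element #6 (fall only, hypothetical)
print(round(budget_allocation / fall_ssh, 2))   # 677.76 -> "annual cost per fall SSH"
```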
#17 Number of Low Enrolled (<10) Sections • This is the number of Program classes (actual sections) taught (#8) with 9 or fewer students. • These are registered students in fall semester as of census. • Excludes Directed Studies (99 series)
Determination of program's health based on efficiency (Healthy, Cautionary, or Unhealthy) • This year Doug has asked the IR Office to calculate health calls for all instructional programs. In the following years the programs will be asked to calculate their own health calls. • Program Efficiency is calculated using 2 separate measures: Class Fill Rate (#11) and Majors/FTE BOR Appointed Faculty (#13). • The following benchmarks are used to determine health for Class Fill Rate: Healthy: 75 – 100%; Cautionary: 60 – 74%; Unhealthy: < 60% • An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • The overall category health score is used to evaluate the overall health of your program.
Determination of program's health based on efficiency (Healthy, Cautionary, or Unhealthy) continued… • The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty (note: per Doug, HawCC is not using the mandated enrollment capacity method given in the system rubric): Healthy: 15 – 35; Cautionary: 30 – 60 or 7 – 14; Unhealthy: 61 or more, or 6 or fewer • An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • Finally, an average of the 2 overall health scores for Class Fill Rate and Majors/FTE BOR Appointed Faculty is determined using the following rubric: 1.5 – 2 = Healthy; 0.5 – 1 = Cautionary; 0 = Unhealthy
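A sketch of how the two efficiency measures could combine into one call (hypothetical helper functions; where the published Majors/FTE ranges overlap, e.g. 30–35, this sketch lets Healthy win):

```python
def fill_rate_score(fill_rate_pct):
    # Class Fill Rate benchmarks (#11): 75-100% Healthy, 60-74% Cautionary, < 60% Unhealthy
    if fill_rate_pct >= 75: return 2
    if fill_rate_pct >= 60: return 1
    return 0

def majors_per_bor_fte_score(ratio):
    # Majors / FTE BOR Appointed Faculty benchmarks (#13), Healthy checked first
    if 15 <= ratio <= 35: return 2
    if 7 <= ratio <= 14 or 30 <= ratio <= 60: return 1
    return 0

def efficiency_health(fill_rate_pct, majors_per_bor_fte):
    average = (fill_rate_score(fill_rate_pct) + majors_per_bor_fte_score(majors_per_bor_fte)) / 2
    if average >= 1.5: return "Healthy"
    if average >= 0.5: return "Cautionary"
    return "Unhealthy"

print(efficiency_health(82, 18))   # hypothetical: 82% fill, 18 majors/FTE -> Healthy
```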
#19 Persistence Fall to Spring • The number of fall majors (#3) who, at the subsequent spring semester, are enrolled and are still majors in the program at the time of census. • Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = .6774 or 67.74%
#20a Number of Degrees Earned (Annual) • The number of degrees conferred in the fiscal year. • We are counting student major outcomes such as AA, AAS, and AS. • The count is of credentials, so duplicate credentials received by the same student are counted separately. • When you are looking at the data chart, keep in mind that the fiscal year is roughly equivalent to the academic year. So the column labeled 2006 is AY2006, which is Fall 05 and Spring 06.
#20b Number of Certificates of Achievement Earned (Annual) • The number of certificates earned is counted as of the fiscal year. • We are only counting the student major outcome CA, Certificate of Achievement. • Neither Academic Subject Certificates (ASC) nor Certificates of Completion (CC) are counted for this particular data element. • When you are looking at the data chart, keep in mind that the fiscal year is roughly equivalent to the academic year. So the column labeled 2006 is AY2006, which is Fall 05 and Spring 06.
#20c Number of Certificates of Completion Earned (Annual) • Doug has asked that we add the number of CCs earned to program data sheets where the ONLY major outcome is a Certificate of Completion. If a CC is not your program's only outcome, you will not have a #20c in your data sheet. • The number of certificates earned is counted as of the fiscal year. • We are only counting the student major outcome CC, Certificate of Completion. • When you are looking at the data chart, keep in mind that the fiscal year is roughly equivalent to the academic year. So the column labeled 2006 is AY2006, which is Fall 05 and Spring 06.
#21 Number Transferring (to UHM, UHH, UHWO) • This is the number of program majors (#3) who were enrolled in the 07-08 academic year at UH Manoa, UH West Oahu, or UH Hilo for the first time… • And were not enrolled at UHM, UHWO, or UHH at any time in the previous academic year… • And had an enrollment at HawCC at any time in that same previous academic year… • And their cumulative earned student semester hours were greater than or equal to 12 as of Spring that year.
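A rough Python sketch of that four-condition filter; the record fields and values below are invented for illustration and are not actual ODS column names:

```python
FOUR_YEAR = {"UHM", "UHH", "UHWO"}

def counts_as_transfer(rec):
    """Hypothetical filter for data element #21 (not the official IR query)."""
    campuses_0708 = set(rec["campuses_0708"])   # campuses attended in AY 07-08
    campuses_0607 = set(rec["campuses_0607"])   # campuses attended the previous year
    return (
        bool(campuses_0708 & FOUR_YEAR)         # first enrolled at UHM/UHH/UHWO in 07-08
        and not (campuses_0607 & FOUR_YEAR)     # ...and at none of them the prior year
        and "HawCC" in campuses_0607            # ...but was at HawCC that prior year
        and rec["cumulative_ssh"] >= 12         # >= 12 cumulative earned SSH as of spring
    )

student = {"campuses_0708": ["UHH"], "campuses_0607": ["HawCC"], "cumulative_ssh": 27}
print(counts_as_transfer(student))              # True
```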
#22 Perkins Core Indicator: Academic Attainment (1P1) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Academic Attainment is calculated as: [Concentrators who have a cumulative GPA >= 2.0 in academic courses and who have stopped program participation in the year reported] divided by [Concentrators, with academic courses, who have stopped program participation in the year reported]
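Each Perkins indicator on this and the following slides reduces to a numerator divided by a denominator; here is the pattern for 1P1 with made-up counts:

```python
# Hypothetical counts for 1P1 (Academic Attainment); the same division applies to 1P2-4P2.
attained = 42    # concentrators with cumulative GPA >= 2.0 in academic courses
                 # who stopped program participation in the year reported
stopped = 55     # all concentrators, with academic courses, who stopped participation
print(f"1P1 = {attained / stopped:.1%}")   # 76.4%
```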
#23 Perkins Core Indicator: Technical Skill Attainment (1P2) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Technical Skill Attainment is calculated as: [Concentrators who have a cumulative GPA >= 2.0 in vocational courses and who have stopped program participation in the year reported] divided by [Concentrators who have stopped program participation in the year reported]
#24 Perkins Core Indicator: Completion Rate (2P1) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Completion rate is calculated as: [Concentrators who have received a vocational degree or certificate and who have stopped program participation in the year reported] divided by [Concentrators who have stopped program participation in the year reported]
#25 Perkins Core Indicator: Placement in Employment, Education and Military (3P1) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Placement in employment, education, and military is calculated as: [Completers in the year reported (previous Perkins year) who have stopped program participation and who transferred or are employed within one employment quarter following program completion] divided by [Completers in the year reported (previous Perkins year) who have stopped program participation]
#26 Perkins Core Indicator: Retention in Employment (3P2) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Retention in employment is calculated as: [Students from the 3P1 numerator who were retained in employment for a second quarter, plus continuing transfers] divided by [Completers in the year reported (previous Perkins year) who have stopped program participation and who transferred or are employed within one employment quarter following program completion (the numerator of 3P1)]
#27 Perkins Core Indicator: Non-Traditional Participation (4P1) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Non-Traditional participation is calculated as: [Participants in under-represented gender groups who participated in nontraditional programs in the year reported] divided by [Participants in nontraditional programs in the year reported]
#28 Perkins Core Indicator: Non-Traditional Completion (4P2) • Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review actually represents the previous academic year. • Note the corresponding Perkins State Standard. • Non-Traditional completion is calculated as: [Completers in under-represented gender groups in nontraditional programs in the year reported] divided by [Completers in nontraditional programs in the year reported]
Determination of program's health based on effectiveness (Healthy, Cautionary, or Unhealthy) • This year Doug has asked the IR Office to calculate health calls for all instructional programs. In the following years the programs will be calculating their own health calls. • Program Effectiveness is calculated using 3 separate measures: (Degrees Earned (#20a) + CA Certificates Earned (#20b)) / Majors (#3); (Degrees Earned (#20a) + CA Certificates Earned (#20b)) / Annual new and replacement positions in the county (#2); and Persistence Fall to Spring (#19). • The following benchmarks are used to determine health for Degrees and CA Certificates earned per major: Healthy: > 20%; Cautionary: 15 – 20%; Unhealthy: < 15% • An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • The overall category health score is used to evaluate the overall health of your program.
Determination of program's health based on effectiveness (Healthy, Cautionary, or Unhealthy) continued… • The second measure used to determine health is (Degrees Earned (#20a) + CA Certificates Earned (#20b)) / Annual new and replacement positions in the county (#2). • The following benchmarks are used for this measure: Healthy: 0.75 – 1.5; Cautionary: 0.25 – 0.75 or 1.5 – 3.0; Unhealthy: < 0.25 or > 3.0 • An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • The overall category health score is used to evaluate the overall health of your program.
Determination of program's health based on effectiveness (Healthy, Cautionary, or Unhealthy) continued… • The third measure used to determine health is Persistence Fall to Spring (#19). • The following benchmarks are used for this measure: Healthy: 75 – 100%; Cautionary: 60 – 74%; Unhealthy: < 60% • An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy • The overall category health score is used to evaluate the overall health of your program.
Determination of program's health based on effectiveness (Healthy, Cautionary, or Unhealthy) continued… • You should now have a value of zero, one, or two for each of the 3 effectiveness measures. Determining the Effectiveness health call score takes 3 steps: • Step #1: Add up all 3 Overall Category Health scores for the effectiveness measures (the zeros, ones, and twos you assigned earlier). • Step #2: Determine the effectiveness category health call range, where: 5 – 6 = Healthy; 2 – 4 = Cautionary; 0 – 1 = Unhealthy • Step #3: Now use the scoring rubric below to determine the effectiveness health call score (for example, if you had a healthy 5 in the previous step, you would assign it a healthy 2 here): 2 = Healthy; 1 = Cautionary; 0 = Unhealthy
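A compact sketch of those three steps (hypothetical helper; the three inputs are the 0/1/2 category scores from the three effectiveness measures above):

```python
def effectiveness_health_score(per_major, per_position, persistence):
    """Combine the three effectiveness category scores (each 0, 1, or 2)."""
    total = per_major + per_position + persistence     # Step 1: sum the three scores
    if total >= 5:   call = ("Healthy", 2)             # Steps 2-3: 5-6 -> Healthy -> 2
    elif total >= 2: call = ("Cautionary", 1)          # 2-4 -> Cautionary -> 1
    else:            call = ("Unhealthy", 0)           # 0-1 -> Unhealthy -> 0
    return call

print(effectiveness_health_score(2, 1, 2))   # total 5 -> ('Healthy', 2)
```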
Determination of program's overall health (Healthy, Cautionary, or Unhealthy) • You should now have a value of zero, one, or two for each of the 3 program health calls: Demand, Efficiency, and Effectiveness. Simply add those 3 values together and use the scoring range rubric below to determine the overall health of your program: 5 – 6 = Healthy; 2 – 4 = Cautionary; 0 – 1 = Unhealthy
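The overall call applies the same rubric to the three category scores; a minimal sketch with hypothetical scores:

```python
def overall_health(demand, efficiency, effectiveness):
    """Sum the Demand, Efficiency, and Effectiveness scores (each 0, 1, or 2)."""
    total = demand + efficiency + effectiveness
    if total >= 5: return "Healthy"      # 5 - 6
    if total >= 2: return "Cautionary"   # 2 - 4
    return "Unhealthy"                   # 0 - 1

print(overall_health(2, 2, 1))           # total 5 -> Healthy
```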
#29 Faculty FTE Workload • FTE Faculty Workload is determined by summing the semester hours that instructors earn by teaching a program course, and then dividing by 15 for the full time equivalent value. • For programs that consider 12 credits full time load the summed semester hours are divided by 12. • This is for Fall semester only. • FTE Faculty Workload includes both Faculty and Lecturers.
Comprehensive & Annual Program Review Timeline • Deliver data for instructional programs • Collect and deliver data to Student Support Services • Develop presentation and training materials • Provide Annual Program Review Training to Campus • Build out 2008 program review on the assessment website • All instructional program and unit reviews (Comprehensive and Annual) due to VCAA by email by EOB Wednesday, November 26th • Package and delivery of 43 Program & Unit Reviews to System Office • Post 43 Program & Unit Reviews to Assessment Website (when reviews are complete) • Key dates from the timeline: Oct 10, Oct 13-16, Oct 16, Oct 17, Oct 20-23, Nov 26, Dec 15
AY2007 Comprehensive & Annual Review Process • Step 1: Write your instructional program or unit review using the appropriate template. • Step 2: Send your documents to VCAA Doug Dykstra by email no later than end of business, Wednesday, November 26th. • Step 3: The VCAA will ensure that all reviews and coversheets have been received and that they are adequate. • Step 4: The VCAA will forward all approved reviews to Shawn Flood in IR for further processing. • Step 5: The Annual reviews will be appropriately packaged and sent to the System Office for review by the UH Board of Regents. Comprehensive reviews will be forwarded along as appropriate. • Step 6: All reviews will finally be converted to PDF and posted to the Assessment Web Site.
Questions? • The intention of this presentation was to provide one source for all of the documentation related to the Comprehensive & Annual Program Review process. I have linked all of the documents you should need directly into this presentation. • If you need more information on this process, please feel free to contact me: Shawn Flood, 974-7512. Mahalo!