Accountability Reporting for California Community Colleges Patrick Perry Vice Chancellor of Technology, Research, & Info. Systems CCC Chancellor's Office
Data Preamble • “Information is the currency of democracy.” -Thomas Jefferson • “Get your facts first, then you can distort them as you please.” -Mark Twain • “In the twenty-first century, whoever controls the screen controls consciousness, information and thought.” -Timothy Leary
The CCC System • 109 campuses, 72 districts, all locally governed • 2.6 million students (annual unduplicated) • 1.1 million FTES (annual) • 35% white; half over age 25; 70% part-time • No admissions requirements • $20/unit; 40% get fees waived • Highest participation rate of any CC system in US; 25% of all CC students are CCC
CCC Chancellor’s Office • Weak authority; powers vested locally • Unitary MIS data collection (1992) • Student, faculty, course, section, session, grade level detail • Data collected end of term, 3x/yr • Used for IPEDS, apportionment, accountability, research, online data mart
History of CCC Accountability • Simple reporting, fact books until 1998 • 1998: State provides $300m ongoing in exchange for accountability reporting • “Partnership for Excellence” was born • CCC developed report in isolation • CCC allowed to determine “adequate progress” • “Contingent funding” never triggered • Used 5 metrics to measure system and college-level performance
PFE Metrics • Annual volume of transfers to CSU/UC • Annual volume of awards/certificates • Rate of successful course completions • Annual volume of Voc. Ed. course completions • Annual volume of basic skills improvements (lower to higher level) • 4 of 5 are volume metrics; only 1 is a rate
The State Said: • Your metrics allow for no adequate college comparisons • Your method of determining “adequate progress” is suspicious • You only look good because you are growing • Partnership over (2001), but keep reporting (until 2004) • We have to spend your money buying energy from Enron
What Happened Next • Gov. Gray Davis: recalled for spending money buying energy from Enron • Replaced by “The Governator”
The Governator • Likes Community Colleges • Comes from a country that has European “academic bifurcation” (Austria): university vs. trade paths • Attended Santa Monica Community College • Took ESL, PE, bookkeeping, micro/macroeconomics • Transferred to U. Wisconsin-Superior
And Arnold Said: • We shall haves deez accountabeelity seeztem for de community collegez. • A bill was passed to create the framework, which was eventually enacted. • Named: Accountability Reporting for Community Colleges (ARCC).
Arnold Said: • There shall be no pay for performance, but there will be the ability to compare performance.
We Said: • Some metrics will be system only; others will be at college-level • College metrics will be rates (to mitigate size for comparison) • No rankings—we will compare colleges against their “peers” • No $$$ = ARCC is a “dashboard” accountability report.
Arnold Said: • Colleges need to address their performance annually to the State.
We Said: • Colleges are more responsive to their local district Board; annual requirement to take local ARCC results to local Board and submit minutes to State • Colleges must submit 500 word response, which becomes a part of the final report.
Arnold Said: • The report shall be done in collaboration with the State, not in isolation.
We Said: • The Dep’t of Finance, Leg Analyst, and Secretary of Education shall be a part of the technical advisory committee (along with CCC researchers and stakeholders). • We will either succeed or fail together. • This was a really smart move.
ARCC • The Model: • Measures 4 areas with 13 metrics: • Student Progress & Achievement-Degree/Certificate/Transfer • Student Progress & Achievement-Vocational/Occupational/Workforce Dev. • Pre-collegiate improvement/basic skills/ESL • Participation • “Process” is not measured
Student Prog. & Achievement: Degree/Cert/Xfer • College: • Student Progress & Achievement Rate(s) (SPAR) • “30 units” Rate for SPAR cohort • 1st year to 2nd year persistence rate • System: • Annual volume of transfers • Transfer Rate for 6-year cohort of FTFs (first-time freshmen) • Annual % of BA/BS grads at CSU/UC who attended a CCC
Student Prog. & Achievement: Voc/Occ/Wkforce Dev • College: • Successful Course Completion rate: vocational courses • System: • Annual volume of degrees/certificates by program • Increase in total personal income as a result of receiving degree/certificate
Precollegiate Improvement/Basic Skills/ESL • College: • Successful Course Completion rate: basic skills courses • ESL Improvement Rate • Basic Skills Improvement Rate • System: • Annual volume of basic skills improvements
Participation • College: • None yet…but coming. • System: • Statewide Participation Rate (by demographic)
Major Advancements of ARCC • Creating a viable alternative to the GRS Rate for grad/transfer rate. • Finding transfers to private/out of state institutions. • Doing a wage study. • Geo-mapping district boundaries. • Creating peer groups.
Defining Grad/Transfer Rate • Student Progress & Achievement Rate (SPAR Rate) • IPEDS-GRS for 2-yr colleges stinks: • No part-timers • How do you define degree-seeking? • Tracking period too short • Outcomes counting methodology terrible • AA/AS/Cert counted before transfer • Transfer to 2-yr college is counted
SPAR Rate • Defining the cohort: • Scrub “first-time” by checking against past records (CCC, UC, CSU, NSC)
SPAR Rate • Define “degree-seeking” behaviorally for CC populations • Not by self-stated intent; this is a poor indicator • Behavior: did student ever attempt transfer/deg-applicable level math OR English (at any point in academic history) • Students don’t take this for “fun”
Defining Degree-Seeking Behaviorally • Separates out remedial students not yet at collegiate aptitude • Measure remedial progression to this threshold elsewhere • Creates common measurement “bar” of student aptitude between colleges • Same students measured = viable comparison
SPAR Rate: Unit Threshold • CCC provides a lot of CSU/UC remediation • Many students take transfer-level math/English and then leave, or enroll only for a summer term • Should not count these as successes or as “our” students • Set a minimum completed-unit threshold (12) for cohort entrance • Any 12 units in 6 years, anywhere in the system
SPAR Denominator: • First-Time (scrubbed) • Degree-seeking (at any point in 6 years, attempt transfer/degree applicable math or English) • 12 units (in 6 years) • This represents about 40% of students in our system
SPAR Numerator • Outcomes the State wants: • Earned an AA/AS/certificate; OR • Transferred to a 4-yr institution; OR • Became “transfer-prepared” (completed 60 transferable units); OR • Became “transfer-directed” (completed both transfer-level math AND English) • No double-counting, but any outcome counts • SPAR Rate = 51%
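A minimal sketch of the SPAR cohort and rate logic described on the last few slides, in Python/pandas. The DataFrame layout and column names are assumptions for illustration only; this is not the Chancellor's Office code.

```python
# Sketch of the SPAR cohort definition and rate, assuming one row per student
# with hypothetical boolean/numeric summary columns over the 6-year window.
import pandas as pd

def spar_rate(students: pd.DataFrame) -> float:
    """Compute a SPAR-style rate from per-student summary flags.

    Hypothetical columns:
      first_time              scrubbed first-time flag (checked vs CCC/UC/CSU/NSC)
      attempted_deg_math_eng  ever attempted transfer/degree-applicable math OR English
      units_earned            total units earned anywhere in the system
      earned_award            AA/AS/certificate earned
      transferred_4yr         transferred to any 4-year institution
      transferable_units      transferable units completed
      passed_xfer_math, passed_xfer_eng  transfer-level math/English completed
    """
    # Denominator: first-time, behaviorally degree-seeking, >= 12 units in 6 years
    cohort = students[
        students["first_time"]
        & students["attempted_deg_math_eng"]
        & (students["units_earned"] >= 12)
    ]

    # Numerator: any one of the outcomes counts, with no double-counting
    transfer_prepared = cohort["transferable_units"] >= 60
    transfer_directed = cohort["passed_xfer_math"] & cohort["passed_xfer_eng"]
    success = (
        cohort["earned_award"]
        | cohort["transferred_4yr"]
        | transfer_prepared
        | transfer_directed
    )
    return success.mean()  # share of the cohort achieving any outcome
```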
Tracking Transfers • SSN-level matches with CSU, UC • Nat’l Student Clearinghouse for private, proprietary, for-profit, out of state • Match 2x/yr, send all records since 1992 • Update internal “xfer bucket” • Works great for cohort tracking • Needed method for “annual volume”
Tracking Transfers • Annual Volume of Transfers • CSU/UC: they provide these figures based on their criteria • We didn’t want to redefine this • Private/Out of State: NSC “cross-section” cut method • Validated against CSU/UC xfers from NSC source • Added another 30% to annual volumes
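A rough sketch of maintaining the internal “xfer bucket” from the twice-yearly match files and of a cross-section cut for annual private/out-of-state volume. The DataFrames, column names, and the exact cut rule are assumptions, not the actual CCCCO match process.

```python
# `matches` has one row per matched enrollment (student_id, sector, term_start
# as a datetime); `bucket` is the running table of known transfers.
import pandas as pd

def update_transfer_bucket(bucket: pd.DataFrame, matches: pd.DataFrame) -> pd.DataFrame:
    """Append new matched records and keep each student's earliest 4-year enrollment."""
    combined = pd.concat([bucket, matches], ignore_index=True)
    combined = combined.sort_values("term_start")
    # One row per student: the first 4-year enrollment ever observed
    return combined.drop_duplicates(subset="student_id", keep="first")

def annual_private_oos_volume(bucket: pd.DataFrame, year: int) -> int:
    """Cross-section cut: NSC-sourced transfers whose first 4-year term falls in `year`."""
    in_year = bucket["term_start"].dt.year == year
    non_public_ca = ~bucket["sector"].isin(["CSU", "UC"])  # private/out-of-state only
    return int((in_year & non_public_ca).sum())
```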
Wage Study • What was the economic value of the degrees (AA/AS/certificate) we were conferring? • Required data match with EDD • Had to pass a bill changing EDD code to allow match
Wage Study • Take all degree recipients in a given year • Subtract out those still enrolled in a CCC • Subtract out those who transferred to a 4-yr institution • Match wage data 5 years before/after degree
Wage Study • Separate out two groups: • Those with essentially zero wages before the degree • Those with pre-degree wages greater than $0 • The result: The Smoking Gun of Success
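A sketch of the wage-study filtering and before/after comparison described above. Column names, the medians, and the merged layout are assumptions for illustration; the real work depends on the statutorily enabled EDD match.

```python
# `recipients` has one row per degree/certificate recipient in a given year,
# already merged with EDD wage data (hypothetical columns below).
import pandas as pd

def wage_gain_by_group(recipients: pd.DataFrame) -> pd.DataFrame:
    """Median pre/post wages for completers, split by pre-degree earnings.

    Hypothetical columns:
      still_enrolled_ccc   enrolled in any CCC after the award year
      transferred_4yr      transferred to a four-year institution
      wage_pre, wage_post  annual wages ~5 years before / after the award
    """
    # Keep only completers who left the system without transferring
    exited = recipients[
        ~recipients["still_enrolled_ccc"] & ~recipients["transferred_4yr"]
    ].copy()

    # Split into the two groups described on the slide
    exited["group"] = exited["wage_pre"].apply(
        lambda w: "no prior wages" if w <= 0 else "prior wages > $0"
    )
    return exited.groupby("group")[["wage_pre", "wage_post"]].median()
```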
Mapping Districts • CC Districts in CA are legally defined, have own elections, pass own bonds • We did not have a district mapping for all 72 districts • So we couldn’t do district participation rates
Mapping Project • Get a cheap copy of ESRI Suite • Collect all legal district boundary documents • Find cheap labor—no budget for this
Peer Grouping • “Peers” historically have been locally defined: • My neighbor college • Other colleges with similar demography • Other colleges with similar size
Peer Grouping • Taking peering to another level: • Peer on exogenous factors that predict the accountability metric’s outcome • Thus leaving the “endogenous” activity as the remaining variance • Cluster to create groups • We picked 6 clusters, with a min of 3 in a cluster • Each metric produces different factors, peers, clusters
Peer Grouping: Example • Peering the SPAR Rate: • 109 rates as outcomes • Find data for all 109 that might predict outcomes/explain variance • Perform regression and other magical SPSS things • See how high you can get your R²
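A rough Python sketch of the peering idea for one metric: regress the 109 college rates on exogenous predictors (check the R²), then cluster colleges on those same predictors so the remaining within-cluster variance is largely endogenous. The use of scikit-learn, standardization, and k-means here are stand-ins (the actual analysis was done in SPSS), and the minimum cluster size of 3 would need post-hoc handling.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def peer_groups(colleges: pd.DataFrame, predictors: list[str], outcome: str, k: int = 6):
    """Return R² of the exogenous model and a peer-group label per college."""
    X = colleges[predictors].to_numpy()
    y = colleges[outcome].to_numpy()

    # How much variance do the exogenous factors explain?
    r2 = LinearRegression().fit(X, y).score(X, y)

    # Cluster on standardized exogenous factors to form peer groups
    # (does not enforce the "minimum of 3 per cluster" rule by itself)
    Xz = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Xz)
    return r2, pd.Series(labels, index=colleges.index, name="peer_group")
```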
Finding Data • What might affect a grad/transfer rate on an institutional level? • Student academic preparedness levels • Socioeconomic status of students • First-gen status of students • Distance to nearest transfer institution • Student age/avg unit load
Finding Data • We had to create proxy indices for many of these (142 tried) • GIS system: geocode student zipcode/ZCTA • Census: lots of data to be crossed by zip/ZCTA • Create college “service areas” based on weighted zip/ZCTA values • Different from the districts’ legal boundaries
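A sketch of one kind of proxy index: an enrollment-weighted census value per college “service area,” built from where students actually live (by ZCTA) rather than from the district’s legal boundary. The DataFrame layouts and column names are hypothetical.

```python
import pandas as pd

def service_area_index(enrollments: pd.DataFrame, census: pd.DataFrame, var: str) -> pd.Series:
    """Enrollment-weighted census value per college.

    enrollments: one row per (college, zcta) with a 'students' headcount
    census:      one row per zcta with the census variable `var`
    """
    merged = enrollments.merge(census, on="zcta", how="left")

    def weighted_mean(g: pd.DataFrame) -> float:
        # Weight each ZCTA's census value by that college's headcount there
        return (g[var] * g["students"]).sum() / g["students"].sum()

    return merged.groupby("college").apply(weighted_mean).rename(f"{var}_service_area")
```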