The Rocky Road to OBE. Florida, November 2002
Need for Success
• Administrative and management tools
• Senior-level buy-in
• State Library and field training
• Nurturing approach for those who apply for grants
Florida’s Long-Term Outcomes
• Library staff are able to show the impact their projects have on their patrons
• Library staff are able to create the systems needed to measure and use the data they collect during a project
Florida’s Implementation Plan
• Development of outcome model and standardized outcomes
• Marketing of the outcomes evaluation process
• Planning for staff training and technical assistance
Evaluation Personnel
• State and Federal Grants Office
  • One LSTA Coordinator; one unit chief
• Research Office
  • One unit chief, 1998-1999; one consultant added in 2000
• Private consultants hired as needed
First Year Expectations
• We would develop a statewide outcomes evaluation program soon
• State Library staff would develop a basic understanding of outcomes
• Statewide Born to Read (BTR) program would be our model program
• BTR would be the core of our 5-year evaluation
Outcomes Timeline: FY98
• Research Office established
• Born to Read grant application written to accommodate evaluation component
• LSTA administration grant written to accommodate activities for outside evaluator
• Private consultant hired
• Outcomes workshop presented to selected State Library staff
Implementation Costs FY98
Activity             Cost     % of DLIS budget
Consultant/Workshop  $1,500   <1%
Lessons Learned FY98
• Implementation was going to happen slowly and needed to be planned
Second Year Expectations
• We would lay the administrative groundwork to implement OBE
  • Develop a planning document
  • Change state grant application and guidelines to match new requirements
  • Phase in the requirements for outcomes evaluation
  • Develop and field-test a training manual
  • Adapt existing library measurement and planning methods, tools, and documents to OBE
Second Year Expectations
• We would design the 5-year LSTA evaluation
• We would initiate a new exemplary projects program
• Division staff would become trainers
• We would get the word out
• The field would be resistant
Outcomes Timeline: FY99
• Presented OBE at Directors’ Conference
• IMLS selected Florida as one of five states for pilot project
• Revised rules, guidelines and forms
• Private consultants hired to develop outcomes model
• Born to Read pilot project developed
• LSTA Advisory Council received overview of outcomes evaluation model
Outcomes Timeline: FY99, continued
• Planning began for Exemplary Projects and Outcomes Evaluation Workbook
• Outcomes evaluation implementation plan written
• First Born to Read evaluation completed
• OPPAGA review favorable toward outcome concept
• Florida’s “LSTA Outcome Evaluation Plan” published
Implementation Costs FY99
Activity              Cost      % of DLIS budget
BTR Evaluation, Yr 1  $25,000   <1%
Plan/Workbook         $29,900   1%
Lessons Learned FY99
• Resistance to OBE was more than anticipated
• First inkling that BTR could not be used as the backbone of our LSTA evaluation
• We needed to revise our Logic Model
• Measuring long-term outcomes seems to require expertise library staff don’t have
• Field demanded less theory, more application
Lessons Learned FY99
• We need to provide grant recipients with sample survey instruments
• We need to standardize outcomes
• When hiring outside experts, ask questions
• Librarians’ views on confidentiality differ from outside evaluators’
• Need to enable Advisory Councils, Library Boards and Friends groups to communicate about OBE
Third Year Expectations
• We would be able to use the Born to Read projects as the core of our 5-year evaluation
• Standardized outcomes would help clarify OBE and simplify the process
• The field would understand the difference between outputs and outcomes
Third Year Expectations
• We would be able to get the field to focus on the impact their projects would have on their patrons
• A series of Capacity Building Workshops would enable grant recipients to work with outcomes
Outcomes Timeline: FY00
• Outcomes evaluation program presented at annual Library Directors’ conference
• State Library consultants trained
• Born to Read Capacity Building Workshop (outcomes plans developed)
• Developed standardized outcomes for six additional sub-grant categories
Outcomes Timeline: FY00, continued
• Revised exemplary projects program initiated
• Workshops on outcomes-based evaluation presented around the state
• Second Born to Read evaluation completed
• “Workbook: Outcome Measurement of Library Programs” published
Implementation Costs FY00
Activity              Cost      % of DLIS budget
Grant Workshops       $6,000    <1%
BTR Evaluation, Yr 2  $25,000   <1%
Consultant/Training   $5,000    <1%
Lessons Learned FY00
• Having six LSTA sub-grant categories was cumbersome
• Need to develop a means to roll up evaluations at the state level
• Field staff need training in how to apply outcomes to their specific projects
• We need to provide continuous training
• Need to take “baby steps”
Lessons Learned FY00
• 2nd BTR evaluation inconclusive
• We would not be able to use BTR for 5-year evaluation
Fourth Year Expectations
• Grant recipients would be able to refine logic models
• We would identify data gathering tools that could be shared among grant recipients
• Grant recipients would describe their results in outcomes terminology in the annual report
Fourth Year Expectations
• State Library would begin the Exemplary Project Recognition Program using new state-level guidelines
• LSTA evaluation would tell us how the OBE implementation was going
Outcomes Timeline: FY01
• Panel discussion of outcomes at annual Library Directors’ Conference
• LSTA sub-grant categories changed from 6 to 2; new standardized outcomes created
• LSTA Capacity Building Workshop (outcome plans refined)
• LSTA grantwriting workshop featured new guidelines
Outcomes Timeline: FY01, continued
• LSTA applications required outcome plan (not scored)
• First sub-grants using outcomes model awarded
• Six evaluation workshops conducted
• LSTA program evaluation
Standardized State Outcomes
• Characteristics
  • Broad scope, relevant
  • Easily understandable
  • Focus on skills/behaviors
  • Can include quantitative and qualitative indicators
  • Tailored to local programs
Standardized State Outcomes
• Access for persons having difficulty using libraries
  • Persons having difficulty using library services use services or information that were not previously available
• Library Technology Connectivity and Services
  • Public uses technology to get information
  • Public learns to use technology
Access for persons having difficulty using libraries
• Golden Gateways Library Family Learning Centers
• Small Business Information Resource Center
• REACH: REmote ACcess to the Homebound
• Parents Plus
• Born to Read
• Juniors to Seniors: Hillsborough Remembers
• Library Elderly Outreach (LEO) Project
Library Technology Connectivity and Services
• Development of the Everglades Information Center
• Electronic Library: Community Training and Outreach
• Seniors Connect @ Jacksonville Public Library
• Mobile Training Lab
• Osceola Internet Improvement
• Library Automation: Moving Legal Information Toward the Public
• Mi Servicio de Biblioteca
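To make the structure of these standardized outcomes concrete, here is a minimal Python sketch of how a category, its outcome statement, and mixed quantitative/qualitative indicators could be recorded. The class names and the sample indicators are hypothetical illustrations, not part of Florida's actual model or tooling.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """A measurable sign that an outcome occurred; quantitative or qualitative."""
    description: str
    quantitative: bool

@dataclass
class StandardizedOutcome:
    """A broad state-level outcome that local projects tailor and report against."""
    category: str
    statement: str
    indicators: list[Indicator] = field(default_factory=list)

# One of the two standardized outcome categories, with invented local indicators
technology = StandardizedOutcome(
    category="Library Technology Connectivity and Services",
    statement="Public learns to use technology",
    indicators=[
        Indicator("Number of patrons completing an Internet training class", quantitative=True),
        Indicator("Patrons describe new skills in follow-up interviews", quantitative=False),
    ],
)
```

Because every sub-grant in a category reports against the category's shared outcome statement, results expressed this way can later be summed across projects.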
Evaluation of Capacity Building Workshop FY01
• General presentations perceived as covering familiar ground
• Small-group sessions grouping similar grants energized attendees
• Attendees revised their outcome plans in the way preferred by the breakout session presenter
• Some attendees were restricted from changing their outcomes plan
Evaluation of Capacity Building Workshop FY01
• 1/5 of those suggesting future topics mentioned evaluation “how to” topics
• Library staff seem to need help with:
  • Identifying indicators of success
  • Methods of analyzing data
Outcomes Plan Review FY01
• 63 total projects/plans
• Grant review looked at the outcome statement(s)
• 56 (89%) scored 8 out of 10 points in grant review
• 7 (11%) scored lower than 8 points in grant review
• 5 (7%) incorrect plans
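As a quick check on how the percentages above derive from the counts (a throwaway helper, not part of the presentation):

```python
def share(count: int, total: int) -> str:
    """Express count as a rounded whole-number percentage of total."""
    return f"{count}/{total} = {round(100 * count / total)}%"

print(share(56, 63))  # 56/63 = 89%
print(share(7, 63))   # 7/63 = 11%
print(share(5, 63))   # 5/63 rounds to 8%; the slide truncates it to 7%
```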
Implementation Costs FY01
Activity                      Cost      % of DLIS budget
LSTA Evaluation               $63,980   2%
Grant Workshops               $3,000    <1%
Capacity Building Workshop    $17,604   <1%
Evaluation Methods Workshops  $12,200   <1%
Lessons Learned FY01
• Phased-in approach seems to be working
• Field library staff want:
  • Examples, especially of indicators and simple data collection tools
  • To see how outcomes work
  • To know how much data is needed
  • To know how to report the results of their projects
  • Less theory, more application
Lessons Learned FY01
• We need to make connections between nationally recognized work on outcomes evaluation and what is going on in the field
• LSTA evaluation did not tell us how well our OBE program was being implemented
Fifth Year Expectations
• The field would:
  • Submit logic models using the provided format
  • Set targets for continuing LSTA projects
  • Integrate evaluation planning into projects
  • Describe achievement of outcomes in the annual report
Fifth Year Expectations
• Exemplary projects would demonstrate achievement of outcomes
• We would:
  • Simplify the standardized outcomes
  • Be able to roll up results for IMLS
  • Be able to write an outcomes-based strategic plan
  • Plan the next 5-year LSTA evaluation
Outcomes Timeline: FY02
• “Counting on Results” program presented at Library Directors’ conference
• Some LSTA annual reports reflected outcomes-based evaluation
• LSTA Capacity Building Workshop (evaluation strategies and tools)
• LSTA grantwriting workshop included section on outcomes
• On-demand evaluation workshops conducted
Outcomes Timeline: FY02, continued
• LSTA applications required outcomes plan (scored); evaluation plan completed after grants awarded
• Developed outcomes-based strategic plan
• Consultants worked individually with libraries
• National Leadership Grant awarded to develop web-based outcomes evaluation training material
Assessment of Capacity Building Workshop FY02
• Non-specific requests for future topics
• Attendees wanted continued opportunities to get together and learn how others were implementing grants
• Attendees wanted more hands-on opportunities
Outcomes Plan Review FY02
• 39 total projects/plans
• Grant review looked at the outcome statement(s) and indicators
• 26 (67%) scored 13 out of 15 points in grant review
• 13 (33%) scored lower than 13 points in grant review
• 6 (15%) incorrect plans
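Running the same `share` helper sketched after the FY01 review reproduces these figures: 26/39 rounds to 67%, 13/39 to 33%, and 6/39 to 15%.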
Implementation Costs FY02
Activity                    Cost        % of DLIS budget
Capacity Building Workshop  $6,520      <1%
Evaluation/Grant Workshops  $10,000     <1%
Long Range Plan             Staff time  <1%
Lessons Learned FY02
• Training not always offered when needed
• Most programs can’t establish targets in the first year
• Having only 2-3 standardized outcomes means many different types of programs use the same outcomes
• Need to find a way to roll up outcomes data to report to IMLS
Lessons Learned FY02
• Training in aspects of evaluation is more meaningful when tied to specific project types
• Field understands outcomes, but struggles with indicators and data collection
Where Do We Go From Here?
• We will roll up outcomes data into the annual report for IMLS as a test site for the new annual report form (see the sketch below)
• We will find out what other states are doing so that we can borrow good ideas
• We will develop relevant web-based OBE training materials
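The roll-up mentioned in the first bullet amounts to summing each sub-grantee's indicator counts under the shared standardized outcomes. Here is a minimal sketch under that assumption, with invented report data and field names; this is not the actual IMLS annual report format.

```python
from collections import defaultdict

# Hypothetical sub-grantee reports: (standardized outcome category,
# patrons achieving the outcome, patrons measured)
reports = [
    ("Library Technology Connectivity and Services", 120, 150),
    ("Library Technology Connectivity and Services", 45, 60),
    ("Access for persons having difficulty using libraries", 80, 100),
]

# Because all projects in a category report against the same standardized
# outcome, the counts can simply be summed per category.
totals = defaultdict(lambda: [0, 0])
for category, achieved, measured in reports:
    totals[category][0] += achieved
    totals[category][1] += measured

for category, (achieved, measured) in totals.items():
    print(f"{category}: {achieved}/{measured} ({100 * achieved // measured}%)")
```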
Where Do We Go From Here?
• We will diversify the support provided to library staff
• We will find out whether library staff are receiving the support they need
• We will simplify the OBE process for the field
Where Do We Go From Here?
• We will promote/pilot the outcomes model for other library programs
• We will collect information about how libraries use outcomes in Florida
• Sub-grantees will continue to submit outcome plans using the provided format