Resource Database Assembly: The Next Generation. Part One.
How We Got Here
• 2013 AIRS Conference sessions in Portland
• Common thread emerged in discussions about best practices, potential metrics, staffing models, etc.
• Opened the group to volunteers via the AIRS Networker Open Forum in June 2013
• Put together a new survey for resource work (the last one was in 2008)
Today’s Presenters
• John Allec, Findhelp Information Services, Toronto, ON
• Sue Boes, United Way for Southeastern Michigan, Detroit, MI
• Marioly Botero, United Way of Greater Atlanta
• Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA
• Cathleen Dwyer, CDK Consulting, New York, NY
• Steve Eastwood, 2-1-1 Arizona, Community Information and Referral Services, Phoenix
• Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
• Lindsay Paulsen, 2-1-1, United Way of the Midlands, Omaha, NE
• Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL
Additional Group Members
• Matthew Finley, United Way Services, Cleveland, OH
• Jan Johnson, Council of Community Services, 2-1-1 Virginia SW Region, Roanoke, VA
• Clive Jones, AIRS
• Vicki Lofton, Heart of Texas Council of Governments, Waco, TX
• Tamara Moore, United Way of Central Maryland, First Call for Help, Baltimore, MD
• Georgia Sales, 2-1-1 LA County, San Gabriel, CA
Survey Respondents
145 agencies from 41 states and provinces (breakdown by agency type shown in chart)
Where We’re Going
Discussions of the group this last year + feedback today = recommendations for:
• Staffing
• Metrics
• Database update percentage requirements
• Etc.
How Much Wood Could a Woodchuck Chuck if a Woodchuck Could Chuck Wood…
In other words: realistically, how many records can a resource specialist keep updated?
Record Complexity
Sue Boes, United Way for Southeastern Michigan, Detroit, MI
Record Complexity
• A way to measure the degree of difficulty as it relates to the time/cost required to manage a set number of records
• Application of a consistent formula that “weights” individual database elements and scores them
• A tool for determining where to most effectively apply staff time and resources
Record Complexity – Method
• Assign points to database elements
• Develop a scale
• Determine average work hours
• Consider variables
• Create a formula
• Review possible outcomes
Record Complexity – Determine Average Work Hours
• Simple entries: 1–5 hours (2.5 hours average)
• Moderate entries: 6–10 hours (7.5 hours average)
• Difficult entries: 11–20 hours (15 hours average)
• Complex entries: 21–40+ hours (30 hours average)
Time should include research.
Record Complexity – Variables
• Skill set of data entry staff (learning curve)
• Variance in point spread – additional agencies, sites or services add time
• Time required to research and validate information
• Ability to verify information – agency cooperation, returned phone calls, URL, etc.
• Availability of standardized infrastructure to manage consistent data entry parameters
• Current implementation of best practice protocols and AIRS standards
Record Complexity – Formula
Multiply the average hours for each level of difficulty by the number of records at that level. Totaled across all levels, this gives the sum of hours required to manage the database.
Record Complexity – Calculations
• 1412 records x 2.5 hours (average) = 3530 hours
• 269 x 7.5 (average) = 2017.5 hours
• 102 x 15 (average) = 1530 hours
• 43 x 30 (average) = 1290 hours
Total hours for all tiers ≈ 8367
Record Complexity – Apply Formula
• Sum of hours for all tiers of complexity ≈ 8367
• That is the annual effort required to maintain a database of this complexity make-up and size
• At 1950 hours per FTE (37.5 hours per week x 52 weeks), 8367 / 1950 ≈ 4.3, rounded up to approximately 4.5 FTE
• ≈ 405 records per FTE (1826 records / 4.5)
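The tier-hours formula above is straightforward to automate. A minimal sketch in Python, using the tier averages and record counts from the slides (the function name and 1950-hour FTE year are taken from the presentation; everything else is illustrative):

```python
# Sketch of the record-complexity staffing formula from the slides.
# Assumes a 37.5-hour week x 52 weeks = 1950 annual hours per FTE.

HOURS_PER_FTE = 37.5 * 52  # 1950 hours

# Average maintenance hours per record, by complexity tier (from the slides)
TIER_HOURS = {"simple": 2.5, "moderate": 7.5, "difficult": 15, "complex": 30}

def staffing_estimate(record_counts: dict) -> dict:
    """Total maintenance hours and FTE for a database broken out by tier."""
    total_hours = sum(TIER_HOURS[tier] * n for tier, n in record_counts.items())
    fte = total_hours / HOURS_PER_FTE
    total_records = sum(record_counts.values())
    return {
        "total_hours": total_hours,
        "fte": round(fte, 2),
        "records_per_fte": round(total_records / fte),
    }

# The example database from the slides: 1826 records in all.
print(staffing_estimate(
    {"simple": 1412, "moderate": 269, "difficult": 102, "complex": 43}))
```

Exact division gives about 4.3 FTE and roughly 426 records per FTE; the slides round staffing up to 4.5 FTE, which yields the quoted figure of about 405 records per FTE.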
Record Complexity – Practical Applications
• Database management projections
• Parity projects
• FTEs required to manage the database
• Equitable assignments
• Evaluating new initiatives
Record Complexity – Application to Staffing Plans
• Define resource management tasks
• Define “other than” resource management tasks
• Account for the percentage of staff time spent on both
• Apply the complexity formula to the database
• What percentage of staff time is required to meet established database management goals?
Record Complexity – Database Management Tasks
• Formal and informal updates
• New agency development
• Style guide adherence
• Application of AIRS standards and best practices
• Taxonomy upkeep
• Quality measures
Record Complexity – “Other” Tasks
• Organizational projects and meetings that support organizational agendas
• Professional development (e.g., StrengthsFinder)
• Outreach/presentations to the community
• Mailing of promotional materials
• Vendor liaison
• Availability to the contact center (time on phones)
• Data and reporting (quarterly and annual reports)
• Volunteer management
Record Complexity – Job Task Analysis
http://airsnetworker.airs.org
Record Complexity – “Other” Tasks Survey Results
Results from the survey as posted to the Networker by Clive Jones, 2/7/2014
Record Complexity – Food for Thought
• Is there a pattern to the complexity of databases? A very small sample indicates 2% Complex, 5% Difficult, 15% Moderate, and 78% Simple.
• Could that pattern be used to help define a reasonable number of records per FTE as an industry standard?
Record Complexity – Apply Formula
• 25 x 30 = 750 hours (2% Complex)
• 63 x 15 = 945 hours (5% Difficult)
• 188 x 7.5 = 1410 hours (15% Moderate)
• 977 x 2.5 = 2442.5 hours (78% Simple)
• Total hours ≈ 5547
• Translates to approximately 2.85 FTE (5547 / 1950 hours), or about 440 records per FTE
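The "industry pattern" idea above can be sketched the same way: given only a total record count, apply the sample distribution to estimate staffing. The percentages and tier averages are from the slides; the function name and approach are illustrative, not an AIRS-endorsed method:

```python
# Sketch: estimate FTE from a total record count, assuming the sample
# complexity distribution (78% simple, 15% moderate, 5% difficult,
# 2% complex) holds for the database in question.

HOURS_PER_FTE = 1950  # 37.5 hours/week x 52 weeks

# (share of database, average hours per record) for each tier
PATTERN = {"simple": (0.78, 2.5), "moderate": (0.15, 7.5),
           "difficult": (0.05, 15.0), "complex": (0.02, 30.0)}

def fte_from_pattern(total_records: int) -> float:
    """Estimated FTE, assuming the sample complexity distribution."""
    hours_per_record = sum(share * hrs for share, hrs in PATTERN.values())
    return total_records * hours_per_record / HOURS_PER_FTE

# 1253 records, as in the worked example above
print(round(fte_from_pattern(1253), 2))
```

The blended rate works out to about 4.4 hours per record, or roughly 440 records per FTE, consistent with the figure on the slide; small differences from the slide's 2.85 FTE come from per-tier rounding in the slide arithmetic.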
Update Percentages
Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
Update Percentages
Results from the survey as posted to the Networker by Clive Jones, 2/7/2014
Update Percentages
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March 2013
Standard 12: Database Maintenance – The I&R service has procedures to ensure that information in the resource database is accurate and complete. At a minimum, these include an annual survey of all organizations in the database and interim updates of records throughout the year as new information becomes available.
Update Percentages
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March 2013
Use materials submitted by the agency or gathered elsewhere:
• Website
• Questionnaire
• Social media
• Pamphlets
• Newspaper articles
• Telephone directories
Update Percentages
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March 2013
“Once the I&R service (that means YOU as a trained resource specialist) is satisfied that it has obtained the best information possible…it is permissible to mark the agency as having its annual review.”
Update Percentages
• Process improvement: maybe your procedures aren't working? Do you even have written procedures in place?
• Are we expecting too much per FTE? Have you looked at the complexity of your database to ensure you have enough FTEs in place to do the work?
• Are you including other tasks and not leaving enough time for database development and maintenance work?
• Are additional benchmarks in place to evaluate the work of Resource Specialists? Are they doing their jobs?
Update Percentages
• Is there update fatigue among agencies, given the increased demand on them to do more with less?
• Does the percentage of those living below poverty within the service area impact our work?
• Do the credibility of the overarching agency and competing agendas play a role?
• Are we seeing changes in the way service providers share and exchange information that no longer look like the usual ways of networking?
Survey
https://www.surveymonkey.com/s/NS3Z9YD
Paper copies also available
Coming Up In Part Two

Fun!
Answer: This quiz show debuted 50 years ago, on March 30, 1964.
Question: What is Jeopardy?
http://www.211arizona.org/jeopardy
A woodchuck would chuck as much wood as a woodchuck could chuck if a woodchuck could chuck wood!
SO… Are we doing as much as we possibly can, and are we doing it correctly?
Resource Database Standards
Cathleen Dwyer, AIRS Database Reviewer, CRS, CIRS, CDK Consulting, New York, NY
Resource Database Standards
AIRS requires that six database standards be met for accreditation (Standards 7–12):
• Inclusion/Exclusion Criteria
• Database Elements
• Classification System/Taxonomy
• Content Management/Indexing
• Database Search Methods
• Database Maintenance
Resource Database Standards
Standard 8 – Data Elements
• Are all required data elements accommodated by your software?
• When the software does not include a required data element, have you developed a “work-around”?