“Timeliness Isn’t Everything…” Larita L.M. Paulsen Residential Care Services State of WA
The Performance Context • Staff: • Are highly professional, well-educated, & mature in their careers • Show extreme pride of ownership over work products • Have a tendency to stop data collection once their point has been “proved”; not always willing to peel back the onion
Performance (cont.) • Staff: • Get frustrated when management questions them • Do not do a good job of policing themselves as peers, because that would mean questioning someone else’s professional judgment; it is often easier to blame management.
Most importantly… • Critical thinking is very hard to teach to professionals • Some say you either “have it” or you don’t
Leaders can’t just demand • Simply demanding improved performance by assigning targets doesn’t mean that it will happen • Leaders have to provide everyone in the organization with the “system” or the “road map” • Leaders must do whatever it takes to create the operational capacity necessary to achieve the targets • Robert Behn
Another thought… • Whether you are developing managers to be better leaders & coaches, or developing employees to improve customer service & critical thinking… • You must instill accountability to transform learning into performance and performance into results. • Mark Samuel
Leaders must ask… • “What is our most consequential performance deficit?”
Performance Measurement • To date, most performance measurement around complaint investigations (at both the state and federal levels) has focused on timeliness of investigation (which we usually meet) • Based on feedback and observations, the management team had to conclude that “we didn’t know what we didn’t know”
It appeared that… • A number of investigations were taking a long time to complete • Complainants periodically voiced concerns that the investigator had not effectively investigated the issues they were most concerned about • In some high-profile cases, it appeared that investigators based conclusions on assumptions or incomplete data
The management team… • Adopted the following approach to begin attacking the problem • Had to ask ourselves, “What would better performance look like?”
Designed QA Strategy… • Monitoring & reporting progress frequently, personally, & publicly • Building operational capacity • Taking advantage of small wins to reward success • Creating “esteem opportunities”
Performance Management • Performance measurement is not performance leadership • Performance measurement is a passive activity & easily delegated • Performance leadership requires ceaseless, active engagement by leadership • So we decided to take it on!
The Project • Started with extensive training for all staff and managers in June of ’06 • Developed Complaint/Incident Guidance as our definition of a thorough investigation • Trained staff on detailed investigative protocols • Updated all of our operational principles and procedures
Goals of the QA project were to… • Develop a consistent QA process in which local managers provide staff feedback from a similar framework • Increase communication between peer managers, and have them assume responsibility for issues that impact regional QA results • Make each Regional Administrator responsible for developing and implementing a QA plan in response to unique issues identified for the local region
Goals (cont.) • Staff are recognized and rewarded for producing improvements • “Everyone can win” in the project design
Phase I Pilot • Tested QA tools & methods at the regional level prior to statewide implementation • At the conclusion of the pilot, results were analyzed for trends, needed changes to tools & methods, identification of new training issues, & identification of individual performance issues
QA Tool Development • Created a QA tool to look at the elements that define the thoroughness of an investigation • “Deceptively simple” because most questions can’t be answered “yes” or “no” • Managers and staff had to apply both critical thinking and judgment to answer the elements on the tool
The Review Process… • Headquarters (HQ) pulled a sample list of complaints/incidents; a random selection of higher-priority (2-day, 10-day) cases • QA reviews were conducted by regional Field Managers and a parallel review panel at HQ; HQ program managers provided the “any man” perspective because they are not surveyors
Review (cont.) • Field Managers reviewed the work of a peer manager’s unit; peer-review responsibility rotated each month • At the outset of the project, we defined the responsibilities of each party involved • The investigative guidance described the operational policies for complaint/incident investigation
Review (cont.) • The worksheet identified key elements from each of these operational policies • Not all of the key elements were expected to be formally documented, but there had to be some evidence to support positive scoring • Managers were encouraged to ask investigators questions to clarify what the thinking had been or whether information was missing
Review (cont.) • Both field managers and HQ completed the same worksheet and looked at identical packets of information for each investigation (including a review of working papers) • Field managers discussed their review findings with the peer manager, then with the Regional Administrator.
Review (cont.) • HQ did both quantitative and qualitative analyses of the results; these were discussed at various management meetings, and further QA action plans resulted • Quantitative results were reported on the agency intranet so that staff could view progress frequently.
Results…a work in progress • We learned a lot! • Never assume, because you will be surprised • Staff response to project • Other lessons learned • Next steps