Data Gathering Basics of Safe Patient Movement and Handling Research Design
William F. Wieczorek, Ph.D., Buffalo State Center for Health and Social Research
Chaitali Ghosh, Ph.D., Mathematics Department, Buffalo State College
2nd Annual Safe Patient Handling in Health Care Conference, Niagara Falls, NY, October 1-2, 2008
This project is supported through a New York State Legislative Initiative (#C070232) sponsored by Senator George Maziarz.
Session Overview (1) • Basics of Research and Evaluation Design • Defining Meaningful Objectives • Identifying data relevant to patient handling/movement program implementation
Session Overview (2) • Methods of collecting data • Getting to Outcomes: analyzing and presenting results from your data • Examples from the New York State Safe Patient Movement and Handling Demonstration Project
Defining Terms • Research means a systematic investigation, including development, testing, and evaluation, designed to develop or contribute to generalizable knowledge (45 CFR 46.102(d)) • This is the definition of research used by the US government (e.g., for Institutional Review Boards and human subjects protections)
Defining Evaluation • Program evaluation is defined as follows: A systematic assessment of the results or outcomes of program efforts to measure actual outcomes against the intended outcomes of the program; to discover achievement and results; to discover deviations from planned achievements; to judge the worth of the program; to identify unintended consequences; and to recommend expansion, contraction, elimination, or modification of the program. (DOJ) • Evaluation is a specialized form of research
Rationale for Evaluation • Broad options • Summative: outcomes • Formative: processes, implementation issues • Long-term sustainability • Make specific to your project and needs • Be inclusive in planning the evaluation • Get feedback from array of participants/constituencies
Defining Meaningful Objectives • Scope of the project • How large (1 facility, 100 facilities, number of residents/clients) • Type of facility (acute, long-term, home care) • Type of clients/residents/patients • Number of staff
Purpose of Data Collection • Write down your program objectives • Share them with others (get feedback) • Link the objectives to specific data • Project milestones and specific dates • Worker injury data • Staff retention • Lost time
Access to Relevant Data • Simple but not easy! • Can you truly get the information you need? • Reasonableness as an initial assessment of access (Does it exist? Can it be accessed? In what format?)
Quantitative/Qualitative Data • Quantitative data: something that is measurable as a number • Such as a count of events, rate of injury, costs, or scale scores • Instrument-based questions • Usually analyzed using statistical techniques (see the rate sketch below) • Focus is on outcomes/summative evaluation
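As one concrete illustration of turning raw counts into a comparable quantitative measure, the short Python sketch below computes an injury incidence rate per 100 full-time-equivalent workers. The 200,000-hour denominator is the common OSHA convention (100 workers × 40 hours × 50 weeks); the counts and hours shown are hypothetical.

```python
# Minimal sketch: convert raw injury counts into a rate per 100 FTE workers.
# The 200,000-hour denominator is the usual OSHA convention; all figures
# below are hypothetical.

def injury_incidence_rate(injury_count: int, hours_worked: float) -> float:
    """Injuries per 100 full-time-equivalent workers per year."""
    return injury_count * 200_000 / hours_worked

# Hypothetical facility: 12 recordable handling injuries, 180,000 staff hours.
rate = injury_incidence_rate(injury_count=12, hours_worked=180_000)
print(f"Incidence rate: {rate:.1f} injuries per 100 FTE")  # 13.3
```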
Quantitative/Qualitative Data • Qualitative data: non-numerical measures • Verbal feedback, narratives, open-ended feedback • Analyzed from a context viewpoint • Usually informs process evaluation • Mixed methods evaluations utilize quantitative and qualitative data
Data for Safe Patient Handling Projects • Workplace data • Occupational injuries (back, shoulder, strains, etc.) • Time off/lost time • Worker retention/turnover • Patient data • Injuries, satisfaction • Systems/environment data • Ergonomic/safety committees • Training/updates
Data Collection Methods • Record data • Injuries/OSHA/DOL logs, sheets, etc. • HR data • Business office data • Patient injury data • May be more challenging than anticipated! • Need a specific protocol for each type of record data (see the sketch below) • Issues with validity/data quality
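A minimal sketch of what a record-data protocol might look like in practice, assuming the injury log can be exported as a CSV file. The file name and column names (date, unit, injury_type, days_lost) are hypothetical; the point is to flag data-quality problems before any analysis.

```python
# Hypothetical sketch of a record-data protocol: load an injury log export
# and flag the data-quality problems that need follow-up before analysis.
# File name and column names are assumptions, not a real facility format.
import pandas as pd

log = pd.read_csv("injury_log_export.csv")  # assumed columns: date, unit, injury_type, days_lost

# Basic quality checks: missing fields, unparseable dates, negative lost days.
missing_by_column = log.isna().sum()
log["date"] = pd.to_datetime(log["date"], errors="coerce")
bad_dates = log["date"].isna().sum()
negative_days = (log["days_lost"] < 0).sum()

print("Missing values per column:\n", missing_by_column)
print(f"Unparseable dates: {bad_dates}, negative days-lost entries: {negative_days}")
```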
Data Collection Methods • Surveys/Questionnaires • Self-report of injuries to compare with record data • Staff assessment of program • Implementation issues • Key knowledge assessment • Staff satisfaction/Patient satisfaction • Open-ended feedback
Data Collection Methods • Key informants • Specific questions regarding the program • Interview with persons knowledgeable about program • Can be done with group of informants • Focus groups • Group process • Questions with a group of similar people
Data Quality Issues • Validity and reliability • Validity is whether something is an accurate measure • Reliability is whether the measure is stable (test-retest, internal consistency for scales; see the sketch below) • Data quality is variable • Biases • Need for convergent information from multiple sources
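For survey-based measures, the two reliability checks named above can be computed directly. The sketch below uses small, invented data sets to show a test-retest correlation and Cronbach's alpha, the usual internal-consistency statistic for a multi-item scale.

```python
# Illustrative sketch (hypothetical data): two common reliability checks for
# a staff survey scale -- test-retest correlation and internal consistency
# (Cronbach's alpha).
import numpy as np

# Test-retest: the same 6 staff answer the same item two weeks apart.
time1 = np.array([4, 3, 5, 2, 4, 3])
time2 = np.array([4, 3, 4, 2, 5, 3])
test_retest_r = np.corrcoef(time1, time2)[0, 1]

# Internal consistency: 5 respondents x 4 items on a safe-handling attitude scale.
items = np.array([
    [4, 4, 3, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
    [4, 5, 4, 4],
])
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"Test-retest r = {test_retest_r:.2f}, Cronbach's alpha = {alpha:.2f}")
```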
Getting to Outcomes • Methods of analyzing your data • Simple non-statistical techniques • Comparisons, interpretations of qualitative data, assessment of milestones • Statistical methods • Allow you to assess the probability that observed changes are real rather than due to chance • Usually require training in methods
Getting to Outcomes • Statistical considerations • Need for adequate sample size and valid measures • Need to use statistical methods appropriate for the data sources • Pre-/post-tests, trends over time, comparison of local data with state and national data (see the pre/post sketch below) • Descriptive statistics (frequencies, means) often used for survey data
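The sketch below illustrates a simple pre/post comparison on invented data: annual handling-injury counts for the same eight units before and after program implementation, summarized with descriptive statistics and a paired t-test. With small samples or skewed counts, a nonparametric alternative such as the Wilcoxon signed-rank test may be more appropriate.

```python
# Hypothetical pre/post sketch: handling-injury counts for the same eight
# units before and after program implementation, compared with a paired
# t-test plus simple descriptive statistics. Data are invented for illustration.
import numpy as np
from scipy import stats

pre = np.array([6, 4, 7, 5, 8, 3, 6, 5])   # injuries per unit, baseline year
post = np.array([4, 3, 5, 4, 5, 2, 4, 3])  # injuries per unit, program year

t_stat, p_value = stats.ttest_rel(pre, post)
print(f"Baseline mean = {pre.mean():.1f}, program mean = {post.mean():.1f}")
print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the drop is unlikely to be chance alone, but with
# small or skewed samples consider a nonparametric test, e.g. stats.wilcoxon.
```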
Getting to Outcomes • Match your data analysis and presentation to the needs of your audience!! • Scientific publication • Internal reports • Funding agencies
Example • Safe Patient Handling and Movement Project • NYS demonstration project • Three long-term care facilities • Two hospitals • Evaluate impact (summative eval) • Assess issues that impede/assist implementation (formative eval)
Data sources • Record data • Worker injuries • Days lost • Type of injury • Calculated costs • For a period of three years prior to the project and three years during the project • Adding patient injury data
Lessons and Recommendations • Getting data is harder than it sounds! • Key staff turnover • Data quality is a major issue for record data • Potential biases, changes in record-keeping methods over time • Even “required” data may not be readily available
Lessons and Recommendations • Need to build rapport/relationships • With your team • With key administrators • With union reps • With key nursing and training staff • Be persistent • Develop procedures for contacts and follow-ups
Lessons and Recommendations • Baseline data provide useful insights • Tremendous variability in some measures (e.g., days lost) • Small populations mean that a single major injury can skew the data (see the mean-vs-median sketch below) • Adds an extra degree of difficulty for single sites and demonstration projects • Formative evaluation (processes, implementation issues) remains important
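A tiny illustration, using made-up days-lost figures for a small facility, of how one serious injury pulls the mean sharply upward while the median barely moves, which is why reporting both can be useful for small sites.

```python
# Hypothetical days-lost figures for a small facility: adding one major
# injury shifts the mean dramatically but leaves the median nearly unchanged.
import statistics

days_lost = [0, 2, 3, 1, 0, 4, 2]          # typical year at a small facility
days_lost_with_major = days_lost + [120]   # one serious injury added

for label, data in [("typical year", days_lost), ("with major injury", days_lost_with_major)]:
    print(f"{label}: mean = {statistics.mean(data):.1f}, median = {statistics.median(data)}")
```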
Lessons and Recommendations • Surveys and staff feedback are critically important • Showed that staff do not recognize the importance of ergonomic teams • Differences in perceptions of the program between floor staff and administrators • Injury reports from the survey indicate that many injuries and near misses are not reported