Presentation Transcript


  1. Program Theory Driven Evaluation of the Rationalized Upgrading of DOST Regional Standards and Testing Laboratories in Region X. Engr. Romela N. Ratilla. 1st M&E Network Forum, Crowne Plaza Manila Galleria, Quezon City, November 8, 2011

  2. Outline of the Presentation • Problems • Evaluation Approach • Findings • Recommendations • Lessons Learned and Insights

  3. I. Problems • What is the program theory (PT) of the RURL project? • How is the analytical testing service delivered? • What is the profile of the customer-enterprises of DOST-X RSTL in terms of assets, number of employees, market, testing needs, sector, and presence of a QA system and laboratory? • How do the customer-enterprises rate the services of DOST-X RSTL in terms of sustainability, relevance, timeliness, cost and quality? • Is there a significant difference in the delivery of DOST-X RSTL’s testing services across different customer categories?

  4. Problems 3. Were the desired outcomes accomplished, i.e., was the RURL project effective? • To what extent did the testing services of DOST-X RSTL help the customer-enterprises in terms of quality of products, income, employment opportunities, development of new products, and compliance with external regulations? 4. Is the PT of RURL valid? • What is the trend of DOST-X RSTL’s delivery of testing services three and a half (3.5) years before and after the implementation of the project in terms of income from testing fees, number of enterprises served, and number of tests rendered? Is there a significant difference in the trends? • What is the counterfactual of the project in terms of income from testing services, number of enterprises served and number of tests rendered? • What is the impact of the project at the 24th month, i.e., after two years of implementation?

  5. Problems 4. Is the PT valid? • Is there a significant relationship between the enterprise’s volume of patronage and extent of final outcome in terms of quality of products, income, employment opportunities, development of new products, and compliance with external regulations? • With service delivery as moderator, is there a significant relationship between the enterprise’s volume of patronage and extent of final outcomes? • With customer-enterprise characteristics as moderator, is there a significant relationship between the volume of patronage and extent of final outcome on the same dimensions?

  6. II. Evaluation Approach • Mixed methods integrated within the PT-driven evaluation framework of Chen (2005) • Quasi-experimental design • Interrupted time series • Counterfactual estimation • Descriptive • Correlational/cross-sectional/ex post facto evaluation • Empirical-analytical approach (Leeuw, 2003)
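The slide only names the interrupted time series design; the sketch below shows, under stated assumptions, how a segmented regression on one of the monthly RSTL indicators might be set up. The intervention month, coefficients and simulated data are illustrative, not the study's actual figures or code.

```python
# Minimal sketch of an interrupted time series via segmented regression.
# All names, the intervention month and the simulated series are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = pd.period_range("2004-01", "2010-12", freq="M")   # 2004-2010 span
n = len(months)
t = np.arange(n)
cut = 42  # assumed month the RURL upgrade took effect (illustrative)

# Hypothetical monthly count of tests conducted: slight decline before the
# project, level jump and steeper slope after it, plus noise.
y = 120 - 0.5 * t + np.where(t >= cut, 10 + 1.2 * (t - cut), 0) + rng.normal(0, 5, n)

df = pd.DataFrame({
    "tests": y,
    "time": t,                                # overall trend
    "post": (t >= cut).astype(int),           # level change after intervention
    "time_after": np.clip(t - cut, 0, None),  # slope change after intervention
})

# Segmented regression: separate pre/post level and slope.
model = smf.ols("tests ~ time + post + time_after", data=df).fit()
print(model.summary())
```

In this setup the `post` coefficient estimates the immediate level change and `time_after` the change in slope, which maps onto the deck's question of whether the pre- and post-project trends differ significantly.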

  7. Approach • Program Theory Framework (Chen, 2005) as roadmap • Two causal theories or models as logic • Rubin’s counterfactual model for causal analysis, often referred to as the Potential Outcomes Framework (as cited in Morgan and Winship, 2009). • The sufficient component cause model proposed by Rothman in 1998 (Rothman, Greenland and Lash, 2008). • INUS condition (Mackie, 1974, as cited in Shadish et al., 2000; VanderWeele and Robins, 2006): an “insufficient but non-redundant part of an unnecessary but sufficient condition”, for example a lighted match as a cause of a forest fire.
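Since the slide only names Rubin's counterfactual model, a compact statement of the potential outcomes notation it refers to may help; the symbols below follow standard usage rather than anything shown in the deck.

```latex
% Standard potential outcomes (counterfactual) notation
% Y_i(1): outcome of enterprise i with the upgraded laboratory services
% Y_i(0): outcome of enterprise i without them (the counterfactual)
\[
\tau_i = Y_i(1) - Y_i(0), \qquad
\text{ATE} = \mathbb{E}\left[ Y(1) - Y(0) \right]
\]
% Only one potential outcome is observed per unit, so the counterfactual
% must be estimated, e.g. by projecting the pre-project trend forward.
```

This is the sense in which the deck's "counterfactual of the project" questions can only be answered by estimation, not direct observation.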

  8. Approach: Sampling, Respondents and Instruments • To construct the PT • Project documents • SQ-I • Validation by FGD • Sampling frame: 18 elements • 16 responses (87%) • Interrupted Time Series (determinants to intermediate outcome) • Time series (monthly) data 2004-2010 • Income generated by DOST-X RSTL • Number of enterprises served • Number of tests conducted • Descriptive (intermediate outcome to final outcome) • SQ-II • Sampling frame: 231 elements (customer-enterprises from 2004-2010) • 52 responses (22.5%)
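The deck asks for the project's counterfactual and its impact at the 24th month but does not show how these were computed. One plausible approach, sketched below under assumptions, is to fit the pre-project trend to one of the monthly series listed on this slide and project it past the intervention; function and variable names are illustrative, not the study's.

```python
# Sketch of one way a counterfactual and a 24th-month impact could be
# estimated from a monthly series such as income from testing fees.
# The projection approach and all names are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

def impact_at_month(y, cut, month_after=24):
    """Observed value minus the pre-project trend projected to month
    cut + month_after, for a monthly outcome series y."""
    t = np.arange(len(y))
    pre_fit = sm.OLS(y[:cut], sm.add_constant(t[:cut])).fit()  # pre-project trend
    target = cut + month_after
    counterfactual = pre_fit.predict([[1.0, float(target)]])[0]
    return y[target] - counterfactual

# Usage with a hypothetical 84-month income series and an assumed
# intervention at month 42:
# print(impact_at_month(income, cut=42, month_after=24))
```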

  9. III. Findings • Chain of assumptions: RURL Program Theory Framework • Good service delivery • Strength: quality and relevance • Weakness: timeliness and cost • Ratings • Trends in intermediate outcome • Decreasing trends before the project • Increasing trends after the project • Trends

  10. Findings • Relationships • Volume of patronage on its own is not a significant predictor of the extent of final outcomes. • Volume of patronage is a significant predictor of the extent of outcome in terms of income of the enterprise, employment generation and development of new products when service delivery is better. • Volume of patronage is a significant predictor of the extent of outcome in terms of quality of products for MSMEs but not for large enterprises. • Volume of patronage is not a significant predictor of the extent of final outcome in terms of compliance with regulations under any of the factors included in the study. • Summary of Linear Regression Results
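In regression terms, the findings above correspond to an interaction between volume of patronage and the service-delivery rating. The sketch below illustrates that setup only; the column names and model form are assumptions, not the study's actual specification.

```python
# Minimal sketch of the moderation analysis these findings describe: an
# interaction between volume of patronage and service-delivery rating.
# Column names are assumptions; the 52-response survey data is not reproduced.
import pandas as pd
import statsmodels.formula.api as smf

def moderated_fit(survey: pd.DataFrame):
    # A significant patronage:service_delivery coefficient would indicate
    # that the effect of patronage on enterprise income depends on how well
    # the service was delivered, mirroring the second bullet above.
    return smf.ols(
        "enterprise_income ~ patronage * service_delivery",
        data=survey,
    ).fit()

# Usage (hypothetical survey dataframe):
# print(moderated_fit(survey_df).summary())
```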

  11. IV. Recommendations • Short term • Computerize sample receiving and simplify the forms to reduce transaction time to 15 minutes. • Explore rapid and less costly testing methods. • Conduct an orientation for staff on service quality dimensions to improve their understanding of customer expectations of quality service. • Keep an updated database of customer profiles at the RSTL. • Provide more working hours for microbiological testing to reduce waiting time. • Reach out to micro enterprises and make the services relevant to them. • Thank respondents and inform them of the results and actions taken. • Open regular positions for chemists or microbiologists.

  12. Recommendations • Long term • Include an explicit program theory in projects of DOST and of government in general. • The program theory must be part of the conceptualization, implementation and evaluation of government interventions. • Service delivery assessment must also be a component of every government project outcome evaluation. • Use program theory driven evaluation to determine the effectiveness of government projects.

  13. V. Lessons Learned (Obstacles and Help) • Design (i.e., simple, effective and readily available outcome indicators) • Evaluation approach (practical and less costly) • Use of office databases and project documents • Involve project planners, implementers and other stakeholders • Getting responses (response rate) • Follow up • Make respondents understand that they are stakeholders and would benefit from the results • Inform them of the results of the survey and actions taken

  14. V. Lessons Learned (Importance and Application) • Importance • Determine strengths and weaknesses in project implementation • Identify points for improvement • Improve project effectiveness • Better decisions for projects of similar PT • Good governance • Application • Outcome or impact evaluation of projects/programs

  15. V. Lessons Learned (Good Outcomes and Tips) • PT of RURL found valid • RSTL delivered the services well • RURL is effective; the desired outcomes were accomplished • Points for improvement were identified and actions immediately implemented • DOST might consider a new project (Phase II) with the same PT • Maintain accurate databases in the offices • Constantly solicit feedback from customers • Make implementers understand the importance of service delivery to project effectiveness • Evaluate service delivery as part of personnel performance evaluation. THANK YOU FOR YOUR KIND ATTENTION! melrats@yahoo.com
