
Department of State Program Evaluation Policy Overview



  1. Department of State Program Evaluation Policy Overview, Spring 2013

  2. Overview of Policy and Status
  • The policy was approved by the Secretary on March 1, 2012
  • This is the first State Department evaluation policy
  • The policy defines evaluation as: a systematic and objective assessment of an ongoing or completed project, program, or policy. Evaluations are undertaken to (a) improve the performance of existing interventions or policies, (b) assess their effects and impacts, and (c) inform decisions about future programming. Evaluations are formal analytical endeavors involving systematic collection and analysis of qualitative and quantitative information.

  3. Policy Application
  • The policy applies to evaluating the Department’s diplomatic, management, and development programs, projects, and activities
  • Evaluation is essential to the Department’s ability to:
    • Measure and monitor program performance
    • Make decisions about programmatic adjustments and changes
    • Document program impact
    • Identify best practices and lessons learned
    • Assess return on investment
    • Provide inputs for policy, planning, and budget decisions
    • Assure accountability
  • The policy is intended to:
    • Provide a coordinated strategy and work plan for conducting evaluations
    • Respond to stakeholder demands for transparency in decision-making
    • Work in concert with existing and pending Department policies, strategies, and operational guidance

  4. The Department of State
  • Is very large and diverse
  • It has 51 major bureaus and offices covering a huge array of fields, from nuclear disarmament to health to law enforcement to human trafficking to visas and passports
  • It has over 200 posts overseas
  • Our evaluation policy covers not only programs but also management and diplomacy

  5. APPLICABILITY & PURPOSE
  • Effective March 1, 2012, the evaluation policy applies to all State Bureaus and Major Offices such as S/GAC
  • The policy proposes a framework for implementing evaluations and is intended to provide:
    • Clarity about the purposes of evaluation
    • Evaluation requirements
    • Types of evaluation
    • An approach for conducting, disseminating, and using evaluations
  • Evaluation at the Department has two primary purposes:

  6. EVALUATION REQUIREMENTS
  • The policy requires that “all large programs, projects, and activities be evaluated at least once in their lifetime or every five years, whichever is less.”
  • Programs, projects, and activities are broadly defined so that bureaus can apply the policy at the level where most of their funding is used
  • Beginning in FY 2012, bureaus are required to evaluate 2-4 programs, projects, and activities within a 24-month period. Posts will be added in FY 2013
  • Bureaus must also ensure that implementing organizations carry out evaluations consistent with the policy’s guidelines

  7. EVALUATION TYPES & STANDARDS
  • Types: Performance Evaluations vs. Impact Evaluations
  • The expectation is that most evaluations conducted will be Performance Evaluations
  • Whether Performance or Impact, all evaluations must be context-sensitive, independent, and methodologically sound

  8. EVALUATION TYPES & STANDARDS
  Standards: Methodological Rigor; Independence and Integrity
  • Methodological Rigor
    • Evaluations should be “evidence based,” meaning they rest on verifiable data and information gathered using professional evaluation standards
    • Data must be reliable and valid
    • Qualitative and quantitative data are both acceptable
  • Independence and Integrity
    • Bureaus must ensure that evaluators are free from pressure or bureaucratic interference
    • Bureau staff and managers should be actively engaged during the evaluation process, but that engagement must not improperly influence the outcome

  9. EVALUATION USE & BUREAU EVALUATION PLANS
  EVALUATION USE
  • Evaluation findings must be integrated into decisions about strategic plans, program priorities, project design, planning, and budget formulation
  BUREAU EVALUATION PLANS
  • Bureaus must develop and submit an annual Bureau Evaluation Plan (BEP)
  • The BEP will cover a three-year period and will be updated annually

  10. EVALUATION RESPONSIBILITY
  AGENCY LEVEL
  • F and BP will work closely with the Performance Improvement Officer (PIO) to assist bureaus in implementing the policy
    • F coordinates evaluations of Foreign Assistance-funded programs, projects, and activities
    • BP coordinates evaluations of State Operations-funded programs, projects, and activities
  • F and BP will coordinate tools, technical support, funding, and evaluation training to assist bureaus in implementing the policy
  BUREAU LEVEL
  • Management responsibility: it is the bureau’s responsibility to ensure that evaluations are planned, budgeted, and conducted
    • Bureaus are asked to budget for evaluations in their 2014 BRR
  • Implementation responsibility: it is the bureau’s responsibility to implement and manage evaluations

  11. BUREAU POINT OF CONTACT & EVALUATION RESOURCES
  BUREAU POINT OF CONTACT
  • Each bureau must identify a senior staff person to serve as the Bureau Point of Contact for evaluation
    • Should be a Deputy Assistant Secretary or their designee
    • Will be the main point of contact in the bureau on evaluation
    • Will interact with the PIO, F, and BP on the bureau’s evaluation efforts and compliance with the evaluation policy
  EVALUATION RESOURCES
  • Performance monitoring and evaluation are allowable Foreign Assistance program costs
  • Evaluation costs will vary by program, so no set amount is prescribed
  • Based on international professional standards, program managers should identify resources of 3-5% of program funds for evaluation activities

  12. SUPPORT AND TECHNICAL ASSISTANCE
  • Indefinite Delivery/Indefinite Quantity (IDIQ) contracts: five IDIQ contracts facilitate contractual services for evaluations: Dexis Consulting Group; Development and Training Services; DevTech Systems; Social Impact, Inc.; and Weidemann Associates
  • Evaluation guidance: covers planning for evaluations, statements of work (SOWs), data collection methods, evaluation reports, using evaluation information, confidentiality, and the role of the Bureau Evaluation Coordinator
  • Training: the Department is providing training and technical assistance (TA&T) to bureau staff and will coordinate TA&T with Bureau Evaluation Coordinators

  13. DOCUMENTING & SHARING EVALUATION REPORTS
  • Bureaus and posts must maintain an official copy of completed evaluation reports
  • Final evaluation reports will be sent to a new internal website
  • Bureaus and posts are required to post copies of their evaluation reports on their OpenNet or ClassNet website homepage
  • Summaries of completed evaluations will be reported in the Department’s Annual Performance Report

  14. IMPLEMENTATION EFFORTS AND STATUS
  • The first task was to build staff capacity and a culture that values evaluation
  • We have an active IDIQ contract with 5 firms
  • We have two professional training programs in operation and have trained over 100 staff thus far
  • We have a very active (100-member) Community of Practice that meets monthly to exchange and share information
  • We have 45 ongoing evaluations and over 100 planned this year
  • We are actively involved with the DAC and AEA, as well as colleagues around the world
