
WAP Monitoring & Quality Control


Presentation Transcript


  1. WAP Monitoring & Quality Control Thursday, September 23, 2010

  2. Agenda • DOE Monitoring Approach • IG/GAO Trends • DOE Field Implementation • Quality Assurance • Grantee Monitoring

  3. DOE Monitoring Approach • Monthly • Timeliness • Accuracy • Production • Planned vs. Actual • Previous Month vs. Current Month • Expenditures • Planned vs. Actual • Previous Month vs. Current Month
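
For illustration only, the monthly planned-vs-actual comparison could be sketched as follows; the field names, sample figures, and the percent-variance formula are assumptions for the example, not DOE's actual reporting system.

```python
# Minimal sketch of a monthly planned-vs-actual check (illustrative only).

def variance_pct(planned: float, actual: float) -> float:
    """Percent by which actual exceeds (+) or falls short of (-) the plan."""
    return 100.0 * (actual - planned) / planned

# Hypothetical figures for one grantee's month.
report = {
    "production":   {"planned": 120, "actual": 95, "previous_actual": 80},
    "expenditures": {"planned": 450_000, "actual": 430_000, "previous_actual": 390_000},
}
for metric, m in report.items():
    print(f"{metric}: {variance_pct(m['planned'], m['actual']):+.1f}% vs. plan, "
          f"{m['actual'] - m['previous_actual']:+,} vs. previous month")
```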

  4. DOE Monitoring Approach • Quarterly • Completeness • OMB 1512 • DOE 504.3 Program Report • SF-425 Federal Financial Report • Timeliness • Accuracy of Data • Performance Targets • Production • Expenditures • Monitoring • Training

  5. DOE Monitoring Approach • On-Site Visits (Minimum frequency/Year) • Grants over $100 Million – Quarterly • Grants between $40 Million and $100 Million – Three visits • Grants under $40 Million – Semi-Annual • On-Site Visit Content • Program Administrative Management • Program Technical Management • Program Financial Management
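
The visit tiers above amount to a simple threshold rule. A minimal sketch, assuming frequency is encoded as visits per year; the function name, encoding, and boundary handling are illustrative, not DOE tooling:

```python
def minimum_onsite_visits_per_year(grant_amount_usd: float) -> int:
    """Map a grant's size to the minimum DOE on-site visits per year (assumed encoding)."""
    if grant_amount_usd > 100_000_000:   # over $100M: quarterly
        return 4
    if grant_amount_usd >= 40_000_000:   # $40M to $100M: three visits
        return 3
    return 2                             # under $40M: semi-annual

assert minimum_onsite_visits_per_year(150e6) == 4
assert minimum_onsite_visits_per_year(60e6) == 3
assert minimum_onsite_visits_per_year(10e6) == 2
```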

  6. IG/GAO Trends • Quality of Oversight of Programs, Contractors, and Program Staff • Substandard installation of energy saving materials • Davis-Bacon non-compliance • Low spending rates – grant recipients fell short of goals to weatherize homes

  7. IG/GAO Trends • Inadequate systems capable of tracking overall performance by individual contractors and local agencies • Insufficient Internal Controls to Prevent Fraud, Waste, and/or Abuse of Funds: • Poor workmanship • Incomplete client files • Lack of qualified staff • Incomplete vehicle and equipment inventories • Inadequate onsite monitoring and/or inspection

  8. A Snapshot of IG Conclusions and Observations The State X CAA had not implemented financial and reporting controls needed to ensure Weatherization Program funds are spent effectively and efficiently. Specifically, State X CAA had not: • Performed on-site financial monitoring of any of its sub-grantees under the Recovery Act; • Reviewed documentation supporting sub-grantee requests for reimbursements to verify the accuracy of amounts charged; • Periodically reconciled amounts paid to sub-grantees to the actual cost to weatherize units; • Maintained vehicle and equipment inventories as required by Federal regulations and state and Federal program directives; and, • Accurately reported Weatherization Program results to the Department.

  9. DOE Field Implementation • Grantee Level • State Plan Implementation • Grantee Policy and Procedures Manual and Field Standards • Reporting Systems (Financial, Administrative, and Programmatic) • Training and Technical Assistance • Subgrantee Level • Program and Financial Management • Production Planning and Quality Control Process • Procurement • File Management

  10. DOE Field Implementation • Dwellings • Training Requirements • Application Process • Initial Inspection/Audit • Health and Safety • Technical Standards and Installation • Quality Control • Job Costs and Funding Sources

  11. Federal – QA Reviews • All agencies to be visited by contractors • Contractors may NOT be “weatherizers” • Home inspectors that are BPI-credentialed, RESNET/HERS-certified, or weatherization-trained • Captures information for the PO to determine where attention needs to be focused • Strategy • “SMS version” – hit the state with 3-6 staff, disrupt operations for a week in lots of agencies, then get out • “QA Contractor” – PO schedules reviews; depending on need, could be continuous (xx% a week) • Results • Aggregated results/discussion will lead to focused attention by the Project Officer

  12. The Process is a Cycle

  13. Quality Assurance Reviews • The unit review form has 3 major sections: • Agency/unit identification • File review • On-site work assessment • Attic • Sidewalls/kneewalls • HVAC • Subspace • Windows/doors • Other measures • Additional attention needed?
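
Purely as an illustration of the form's shape, the three sections could be modeled like this; every field name below is an assumption, not the actual DOE form:

```python
from dataclasses import dataclass, field

@dataclass
class UnitReview:
    # Section 1: agency/unit identification
    agency: str
    unit_id: str
    # Section 2: file review findings
    file_review_notes: list = field(default_factory=list)
    # Section 3: on-site work assessment, one verdict per area
    onsite_assessment: dict = field(default_factory=dict)
    additional_attention_needed: bool = False

review = UnitReview("Example CAA", "unit-001")
for area in ("attic", "sidewalls/kneewalls", "HVAC", "subspace",
             "windows/doors", "other measures"):
    review.onsite_assessment[area] = "not yet assessed"
```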

  14. Unit identification and file review • The unit identification and desk file review sections should be completed at the State or sub-grantee office prior to the site visit. • We usually spend half a day reviewing 8-12 files, then a day and a half in the field • 45 minutes for the first file; 35 minutes for the next; 15-20 minutes for the rest… (we are trainable!) • About 45 minutes (on average) going through a unit in the field

  15. Quality Assurance Example: FILE REVIEW • Unit Assessed Using (select one): Energy Audit / Priority List / None Evidenced • Energy Audit or Priority List is included and aligns with expected audit/list (approved by DOE) • Energy Audit or Priority List is included, but does NOT align with approved audit/list • No clear indicator of use of site-specific audit or approved priority list in assessment • Work Order/Building Weatherization Report: Yes / No • DOE Investment: $_________________________ • Total Job Investment: $_____________________ • Yes, work order is included in file and appears to follow the audit/list • Yes, work order is included but does NOT follow approved audit/list • No, work order is not in file

  16. Trends in Quality within State • With reviews of 4-6 different agencies in a state • We look for isolated issues for specific agencies • Agency may be mis-entering information in the audit tool (e.g., door = infiltration measure reduction) • Missed opportunities for savings – sometimes the “low hanging fruit” (e.g., no refrigerator replacements allowed) • We look at what may be an area for the state to effect change (peer-to-peer, training needs)

  17. Electric Base Load Measures

  18. Regional Trends and Opportunities • With reviews of 4-6 different states in a region • We look for isolated issues for specific states • Statewide concerns (e.g., windows sans justification) • We look for trends that might be housing stock or climate based • Air sealing challenges based on a specific housing stock • How solar screens are being implemented in the program in some of the hot climates • We think about how best to pass information from one state to another

  19. Window Sunscreens

  20. National Trends and Opportunities • After seeing about 20 states, you start to think about national opportunities/conversations • Audits and Priority Lists • Clarifying when states with a priority list need to run a site-specific audit • Multi-family misunderstandings – what states need to do if they do not have an approved multifamily protocol in place • Health and Safety • What does that mean, and how does it get implemented

  21. Passing along “Best Practices” • We see opportunities to share with others • Colorado – Passport for crew advancement • Iowa – Agency photo files were superb • Idaho – Best lead-safe weatherization (LSW) documentation we’ve seen • North Carolina – Agency has amazing file documentation, should be replicated • Vermont – Great use of two-part foam • Mississippi, Alabama, Illinois, Georgia, New (all)…

  22. And, building up our “can you believe” library • Cardboard box used for fan enclosure

  23. Can you figure this one out??

  24. Personal Favorite • Agency assured us this is not their typical method

  25. What do we know • We know: • 30,000 houses are going to be reviewed • To date, 2,000 have been done • Every local agency is going to be visited • Maybe multiple times, depending on production levels • Local agencies are going to be asked to go with the contractor on reviews (Grantee is welcome) • At its discretion, an agency can send the contractor out without local agency staff

  26. Grantee/Subgrantee Scheduling is Not Random • Factors: • Levels of Grantee and mandatory visits/year and % requirements • Last (most recent) review dates • Current state of actual production (units weatherized) vs. planned • Number of units to be inspected per Subgrantee based on the “inherited” schedule from the Grantee – even considers some Subgrantees with disproportionately higher numbers of units than others in the state • The net number of reviews done already • Need to balance to about 2,000-2,200 units/month over several years • Scheduling Approach: • Schedule 3 months in advance • Advise POs and the Contractor • POs can confirm schedules with Grantees/sub-grantees • Problems with timing or availability can be accommodated with schedule adjustments (which, when) • Vacations can be worked around • Contractor can identify resources as required • Finalize schedule about 2 weeks prior to the start of the next month
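
As a rough illustration of how those factors might combine, here is a minimal scheduling sketch; the scoring weights, field names, and the greedy fill are all assumptions, since the deck does not give the actual method. Only the 2,000-2,200 units/month balance target comes from the slide.

```python
from dataclasses import dataclass

@dataclass
class Subgrantee:
    name: str
    months_since_last_review: int
    actual_units: int        # units weatherized to date
    planned_units: int       # units planned to date
    reviews_done: int        # net reviews already completed
    units_to_inspect: int    # from the schedule "inherited" from the Grantee

def priority(s: Subgrantee) -> float:
    """Higher score = schedule sooner (assumed weighting)."""
    behind_plan = max(0.0, 1.0 - s.actual_units / max(1, s.planned_units))
    return s.months_since_last_review + 10 * behind_plan - 2 * s.reviews_done

def schedule_month(candidates, cap=2100):
    """Greedily fill the month, highest priority first, up to ~2,100 units."""
    chosen, total = [], 0
    for s in sorted(candidates, key=priority, reverse=True):
        if total + s.units_to_inspect <= cap:
            chosen.append(s)
            total += s.units_to_inspect
    return chosen
```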

  27. What we DON’T Know • We don’t know: • Who – contract hasn’t been awarded • When – when is predicated on determining who… • What – (I just threw this in since I was on a roll) • Where – DOE hasn’t prioritized which states will go first • Will it be states that haven’t been visited by SMS? • Will it be large states with high production? • How – training of contractors STILL must occur, even after award • Let’s make sure they know what they are looking at

  28. We’re all in this together • Weatherization Plus goal • The quality of work would be the same regardless of locale • Evaluation… Quality Assurance Reviews… PO increased monitoring efforts… • These equal lots of opportunities to get it right and effect change… together!

  29. Questions/Comments?
