
USAID MISSP: Metrics for Information Systems Security Program

Learn about USAID's Model Information Systems Security Program (MISSP) and the importance of metrics in managing and improving information systems security. Presented by James P. Craft, USAID IA Program Manager.



Presentation Transcript


  1. Metrics and the USAID Model Information Systems Security Program (MISSP). Presented by James P. Craft, USAID IA Program Manager, jcraft@usaid.gov, 202/712-4559. “If something can’t be measured, it can’t be managed!”

  2. Agenda • USAID’s Information Systems Security Challenge • USAID Response to Challenge • Model Information System Security Program (MISSP) • Micro- and Macro-Metrics at USAID • Results • What is a “Good” Metric • Conclusion

  3. ISS Challenge at USAID • ISS identified as two material weaknesses • IG audits highlighted ISS as a major agency problem • No existing ISS program • Major financial system needed significant improvement • Critical systems were not certified/accredited • More funding needed to address critical ISS issues • No information systems security culture • No measures -- no accountability

  4. USAID Response • Recruit Agency ISSO • Set ISS vision and goals and develop USAID ISSP plan: • Security Process Framework - Program Areas with Life Cycle • Infuse Best Security Practices • Leverage external on-going ISS resources and activities e.g. MISSP • Create MISSP to identify, collect, and implement Best Practices in standard format

  5. MISSP Vision • Freely provide a tested, complete model ISS program • Rational functional & process frameworks • At least one best practice per essential ISS activity, with practices tested in a global enterprise; designed to be dynamically scalable to the available resource level using alternative best practices • Modular design including policy, process, training, metrics, software, hardware, and tools - the MISSP Best Security Practice (BSP) package • Use economies of scale & leverage national initiatives • Bring Total Quality Leadership to ISS • Rapid prototype approach using multiple paradigms • Continual process improvement • Metrics vision - highlight good and bad behavior

  6. MISSP Implementation [Slide diagram: USAID’s MISSP “Rosetta Stone”, mapping the SSE-CMM-based MISSP baseline against the USAID reference implementation (USAID ISSPP, PRIME CONOP, WBS), each mapped to the ISS life cycle. MISSP program areas: Security Program Management; Personnel Security; Security Training; Physical Security; Contingency Planning; Technical Security; Incident Response; Risk Management; Certification and Accreditation; Customer Security Support. Program area managers: Policy & Planning; Security Training & Awareness; Assessment; Operations. ISS life-cycle activities (cycles within the life cycle): security policy; security design planning; security architecture; assign security countermeasures; validate security specification; risk and threat assessment; vulnerability analysis; penetration; consequences; C&A planning; needs prioritization; specify security needs; safeguard selection, implementation, and testing; monitor and respond; administer security controls. Mapped SSE-CMM process areas: PA01 Administer Security Controls; PA02 Assess Impact; PA03 Assess Security Risk; PA05 Assess Vulnerability; PA06 Build Assurance Argument; PA08 Monitor Security Posture; PA11 Verify and Validate Security. Products: detailed security procedures; system security configuration. Other diagram labels: theoretical/analytical properties; empirical testing; pragmatic function, performance, and administration; SSE-CMM process appraisal.]

  7. USAID and MISSP • MISSP Best Security Practice (BSP) approach is accelerating efforts to eliminate USAID security material weaknesses • MISSP feeds and draws from USAID security efforts • To USAID: security process framework (SPF), BSP package format, candidate BSPs, e.g. security tools • From USAID: OIG-reviewed BSPs for risk/vulnerability assessment, security training, intrusion detection, ISSPP, C&A • MISSP approach has been adopted by the Federal CIO Council as its initiative (bsp.cio.gov)

  8. Micro-Metrics • Technical and management measures • Tactical plan measures map down to BSP-level metrics • Implementation goals • Tool results work great - Hydra, CyberCop • Cost-benefit analysis - a tool that failed • Failure of the ISS dashboard icon - will revisit • TQL use focuses on schedule and cost for repeated application • Leadership dynamic - democratization of metrics

  9. USAID ISS Dashboard Icon [Slide graphic: the dashboard icon, showing this month’s and last month’s status (blue, green, yellow, red, or white) and the resources used for security (% funds, % staff).] Language for inclusion into all contracts states: • A risk assessment will be conducted (type and level of detail approved by the ISSO) • An individual will be designated to represent security concerns of the contract or task • A security plan will be developed for the contract, its systems, and applications • Security mechanisms will be implemented and tested (certification) • Formal approval will be obtained to operate systems and applications (accreditation) • Security status will be reported monthly using the ISS Dashboard Icon
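As a rough illustration of how a mission’s state might map onto the dashboard icon’s five colors, here is a minimal Python sketch. It is an assumption-laden reading of the legend on the risk-assessment slides below (White is explicit there; Green, Blue, and Red are inferred from the slide-14 change log; Yellow is assigned by elimination), and the record fields are hypothetical, not part of any USAID tool.

```python
from enum import Enum

class Status(Enum):
    WHITE = "no AIDNET-connected computing resources"       # explicit in the legend
    GREEN = "on-site risk assessment conducted"             # inferred from "GREEN => 74"
    BLUE = "automated scan: only minor vulnerabilities"     # inferred from "Red => Blue"
    RED = "automated scan: high-to-medium vulnerabilities"  # inferred from "Red => Red"
    YELLOW = "no risk assessment activity conducted"        # assigned by elimination

def classify(mission: dict) -> Status:
    """Map one mission's (hypothetical) risk-assessment record to a color."""
    if not mission["has_systems"]:
        return Status.WHITE
    if mission["on_site_assessed"]:
        return Status.GREEN
    if not mission["scanned"]:
        return Status.YELLOW
    if mission["worst_finding"] in ("high", "medium"):
        return Status.RED
    return Status.BLUE

# Example: a scanned mission whose worst finding is "minor" reports BLUE.
print(classify({"has_systems": True, "on_site_assessed": False,
                "scanned": True, "worst_finding": "minor"}))
```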

  10. Macro-Metrics • Program-wide and Agency-wide success • Rollup of micro-metrics - bottom up approach • Need to be simple to understand • Material weakness and audit finding measures • “Roadmap” progress coordinated with OIG • Threat of shrinking budget, metrics as budget battle ammunition
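A bottom-up rollup of those per-mission colors into one simple agency-wide figure might look like the sketch below; the choice of “% of in-scope missions at Green or Blue” as the headline number is an illustrative assumption, not USAID’s actual rollup rule.

```python
from collections import Counter

def rollup(mission_statuses: dict[str, str]) -> dict:
    """Aggregate per-mission dashboard colors into agency-wide counts
    plus one easy-to-understand percentage."""
    counts = Counter(mission_statuses.values())
    acceptable = counts["GREEN"] + counts["BLUE"]
    in_scope = sum(n for color, n in counts.items() if color != "WHITE")
    return {
        "by_color": dict(counts),
        "pct_acceptable": 100.0 * acceptable / in_scope if in_scope else 0.0,
    }

# Three missions roll up to one number a budget audience can read at a glance.
print(rollup({"Angola": "GREEN", "Benin": "RED", "Peru": "WHITE"}))
# {'by_color': {'GREEN': 1, 'RED': 1, 'WHITE': 1}, 'pct_acceptable': 50.0}
```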

  11. Risk Assessment 12/7/99 [Slide graphic: world map of numbered missions, color-coded by risk-assessment status. Legend (color swatches lost in transcription): White identifies that no computing resources exist; on-site risk assessment activity conducted; automated scan conducted and only minor vulnerabilities exist; automated scan conducted and high-to-medium vulnerabilities exist; no risk assessment activity conducted.]

  12. Risk Assessment 12/17/99 [Slide graphic: the same color-coded mission map. Legend as before, except White now identifies no AIDNET-connected computing resources.]

  13. Risk Assessment 1/7/00 [Slide graphic: the same color-coded mission map; legend as on the previous slide.]

  14. Risk Assessment 2/17/00 [Slide graphic: the same color-coded mission map, now including 37a (Kosovo); legend as on the previous slides.] CHANGES SINCE LAST REPORT: Conducted on-site risk assessment: Green => 74. Initial scan results: Red => 37a. Follow-up scan results: Red => Blue: 31, 45; Red => Red: 34, 35, 55, 63, 67.
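The “changes since last report” block is essentially a diff between two monthly status snapshots. A minimal sketch of that comparison, using hypothetical snapshot data keyed by the mission numbers from the map:

```python
def changes_since(last: dict[str, str], this: dict[str, str]) -> list[str]:
    """List mission-status transitions between two monthly snapshots,
    including missions reporting for the first time (e.g. 37a)."""
    report = []
    for mission, status in sorted(this.items()):
        prev = last.get(mission)
        if prev is None:
            report.append(f"new mission {mission} => {status}")
        elif prev != status:
            report.append(f"{mission}: {prev} => {status}")
    return report

# Hypothetical snapshots echoing the slide's change log.
january = {"31": "RED", "45": "RED", "74": "YELLOW"}
february = {"31": "BLUE", "45": "BLUE", "74": "GREEN", "37a": "RED"}
print(changes_since(january, february))
# ['31: RED => BLUE', 'new mission 37a => RED', '45: RED => BLUE', '74: YELLOW => GREEN']
```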

  15. Map key - missions by region:
Africa: 1. Angola; 2. Benin; 3. Botswana; 4. Democratic Republic of Congo; 5. Eritrea; 6. Ethiopia; 7. Ghana; 8. Guinea; 9. Kenya; 10. Liberia; 11. Madagascar; 12. Malawi; 13. Mali; 14. Mozambique; 15. Namibia; 16. Niger; 17. Nigeria; 18. Cote D’Ivoire; 19. Rwanda; 20. Senegal; 21. South Africa; 22. Sudan; 23. Tanzania; 24. Uganda; 25. Zambia; 26. Zimbabwe.
Europe and Eurasia: 27. Albania; 28. Armenia; 29. Azerbaijan; 30. Bosnia and Herzegovina; 31. Bulgaria; 32. Croatia; 33. France; 34. Georgia; 35. Hungary; 36. Italy; 37. Kazakhstan; 37a. Kosovo; 38. Kyrgyzstan; 39. Latvia; 40. Lithuania; 41. FYR of Macedonia; 42. Moldova; 43. Poland; 44. Romania; 45. Russia; 46. Slovakia; 47. Switzerland; 48. Tajikistan; 49. Turkmenistan; 50. Ukraine; 51. Uzbekistan.
Asia and the Near East: 52. Bangladesh; 53. Cambodia; 54. Egypt; 55. India; 56. Indonesia; 57. Israel; 58. Jordan; 59. Lebanon; 60. Mongolia; 61. Morocco; 62. Nepal; 63. Philippines; 64. Sri Lanka; 65. West Bank and Gaza.
Latin America and the Caribbean: 66. Bolivia; 67. Brazil; 68. Colombia; 69. Dominican Republic; 70. Ecuador; 71. El Salvador; 72. Guatemala; 73. Guyana; 74. Haiti; 75. Honduras; 76. Jamaica; 77. Mexico; 78. Nicaragua; 79. Panama; 80. Paraguay; 81. Peru.

  16. Specific Results • Engineered stronger systems security controls for the major system (NMS) and groupware (Lotus Notes) • Developed USAID’s security risk assessment/site support BSPs - exporting ISS to USAID missions worldwide, and now to developing countries’ central banks • Integrated security into USAID’s Target Information Technology Architecture • Increased support throughout the Agency

  17. Specific Results • Maintaining security under escalating attacks • Strengthened USAID’s WAN perimeter safeguards (Firewall, RAS) • Speeding the fielding of standard ISS products Agency-wide (anti-virus, RAS, PKE, IDS, etc.) and cutting costs • Total Quality Leadership principles creating a process-driven security culture • Now moving to support international development work

  18. Planned & Actual Results [Slide chart: planned vs. actual progress over time, where 100% = program fully A-130 compliant.]
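Scoring progress toward “100% = fully A-130 compliant” implies checking off a list of required program elements. One plausible way to compute that percentage, with a hypothetical checklist drawn from the contract-language slide:

```python
def compliance_pct(checklist: dict[str, bool]) -> float:
    """Percent of required program elements marked complete."""
    done = sum(1 for satisfied in checklist.values() if satisfied)
    return 100.0 * done / len(checklist) if checklist else 0.0

status = {
    "risk assessment conducted": True,
    "security representative designated": True,
    "security plan developed": True,
    "mechanisms implemented and tested (certification)": False,
    "formal approval to operate (accreditation)": False,
    "monthly dashboard reporting": True,
}
print(f"{compliance_pct(status):.0f}% compliant")  # 67% compliant
```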

  19. What is a “Good” Metric • Easy to collect • Maps to business goals and strategy • Useful input to a decision • Identifies who it is aimed at • Fits into an accepted model for the using organization • Highlights good and bad behavior • Flexible but objective
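Read as requirements, the checklist above suggests that each metric should be defined with these attributes attached. A small sketch of such a record; all field names and the example values are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    """Carries the 'good metric' attributes from the checklist above."""
    name: str
    collection_method: str   # easy to collect?
    business_goal: str       # maps to goals and strategy
    decision_supported: str  # useful input to a decision
    audience: str            # who it is aimed at
    reporting_model: str     # accepted model in the using organization

# Hypothetical example tied to the risk-assessment maps.
vuln_metric = MetricDefinition(
    name="missions with high/medium scan findings",
    collection_method="automated scanner output (e.g. CyberCop)",
    business_goal="eliminate ISS material weaknesses",
    decision_supported="where to send on-site assessment teams",
    audience="Agency ISSO and program area managers",
    reporting_model="ISS dashboard icon / monthly map",
)
print(vuln_metric.name)
```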

  20. Conclusion • USAID is benefiting from a TQL, metrics-based approach and believes that others can also • Leverages federal and national-level initiatives • Program investments are showing measurable returns • Helping USAID respond to staff and budget cuts • Focuses on a leadership paradigm • Pragmatic metrics use (bottom-up), tailored to the culture, seems to offer the best results • Using the Federal CIO Council Best Security Practice initiative to help support metrics use in government and industry
