IT Governance Capability Maturity within Government
Vernon John, SITA
Topics
• Preamble
• Brief overview of COBIT
• Overall COBIT Framework
• IT Governance Capability Maturity Assessment Framework
• Assessment Approach
• Assessment Results: Importance and Performance
• General observations
• Conclusion

Enterprise Governance → IT Governance: Capability performance management + Risk Management = Optimal delivery of IT services (business value)

References: Control Objectives for Information and related Technology (COBIT)
Preamble
• Objective: gauge IT Governance capability maturity levels within government
• This presentation provides insight into:
  • The IT Governance Capability Maturity Assessment Framework and assessment approach
  • Measurement outcomes
• 13 government institutions were measured:
  • 4 x National Departments
  • 4 x Provincial Departments
  • 5 x Municipalities
• The IT Governance Capability Maturity Assessment Framework and its templates (assessment and reports) were developed with reference to:
  • Board Briefing on IT Governance, 2nd Edition, ITGI
  • COBIT 4.1® Management Guidelines
  • COBIT Implementation Guide
  • IT Governance Implementation Guide, ITGI
  • Maturity Measurement - Fit the Purpose, Then the Method, Guldentops E., ISACA, 2003
Brief overview of COBIT
• A set of accepted best practices for IT management and guidance materials for IT Governance
• Developed by the Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI)
• According to ISACA, "COBIT is an IT governance framework and supporting toolset that allows managers to bridge the gap between control requirements, technical issues and business risks. COBIT enables clear policy development and good practice for IT control throughout organizations. COBIT emphasizes regulatory compliance, helps organizations to increase the value attained from IT, enables alignment and simplifies implementation of the COBIT framework."
• Structure: Domains (4) → Processes (34) → Control Objectives (> 200) → Control Test Statements (> 800)
Overall COBIT Framework
• Business objectives and Governance objectives drive the framework; IT resources (Applications, Information, Infrastructure, People) are applied for achieving business processes
• Information criteria: Effectiveness, Efficiency, Confidentiality, Integrity, Availability, Compliance, Reliability
• Plan and Organise (PO):
  • PO1 Define a strategic IT plan
  • PO2 Define the information architecture
  • PO3 Determine technological direction
  • PO4 Define the IT processes, organisation and relationships
  • PO5 Manage the IT investment
  • PO6 Communicate management aims and direction
  • PO7 Manage IT human resources
  • PO8 Manage quality
  • PO9 Assess and manage IT risks
  • PO10 Manage projects
• Acquire and Implement (AI):
  • AI1 Identify automated solutions
  • AI2 Acquire and maintain application software
  • AI3 Acquire and maintain technology infrastructure
  • AI4 Enable operation and use
  • AI5 Procure IT resources
  • AI6 Manage changes
  • AI7 Install and accredit solutions and changes
• Deliver and Support (DS):
  • DS1 Define and manage service levels
  • DS2 Manage third-party services
  • DS3 Manage performance and capacity
  • DS4 Ensure continuous service
  • DS5 Ensure systems security
  • DS6 Identify and allocate costs
  • DS7 Educate and train users
  • DS8 Manage service desk and incidents
  • DS9 Manage the configuration
  • DS10 Manage problems
  • DS11 Manage data
  • DS12 Manage the physical environment
  • DS13 Manage operations
• Monitor and Evaluate (ME):
  • ME1 Monitor and evaluate IT performance
  • ME2 Monitor and evaluate internal control
  • ME3 Ensure compliance with external requirements
  • ME4 Provide IT governance
IT Governance Capability Maturity Assessment Framework
• Phases: Raise awareness → Envision solution (Assess current capability maturity → Determine target capability maturity → Analyse gaps and identify improvement initiatives) → Plan solution
• Applied across all COBIT processes: PO1..POn, AI1..AIn, DS1..DSn, ME1..MEn
• COBIT maturity attributes assessed (per the COBIT 4.1 Maturity Attribute Table): Awareness and Communication; Policies, Plans and Procedures; Tools and Automation; Skills and Expertise; Responsibility and Accountability; Goal setting and Measurement
• Per-process information gathered: Accountable, Responsible, Audited, Control Weaknesses, Technology Used, Vulnerabilities (Technology)
• Importance rating scale: 1 - Not at all; 2 - Can survive without it if need be; 3 - Makes things easier; 4 - Very significant; 5 - Critical
• Performance rating scale: 1 - Some aspects rarely; 2 - Some aspects sometimes; 3 - All aspects sometimes; 4 - Parts are always done well; 5 - All is always done well
Note: Assessment results excluded from this presentation
Assessment approach
• SITA facilitated a two-day work-session with IT representatives
• During the work-session the following was done:
  • Created an awareness of IT Governance and of our assessment framework and approach
  • Presented the 34 COBIT processes and control objectives. Thereafter, the representatives were given an opportunity to:
    • Provide information related to each IT process, such as Accountability, Responsibility and whether or not the process has been audited
    • Rate test statements for control objectives in terms of Importance and Performance
    • Rate the maturity attributes per IT process in terms of how well they perceived they are currently performing and where they would like to perform. The facilitator probed participants to ensure that they understood the processes and control objectives, supporting a more informed scoring
• The ratings were used to calculate the overall maturity levels
• The SITA assessment team requested a sample of evidence from the Department representatives to support the ratings provided
• The assessment outcomes were analysed, and initiatives to improve IT Governance were identified and prioritised in a report
• Given the short duration of the exercise, the assessment was not done at a very low level of detail, but it was sufficient to provide a sense of the IT Governance maturity level and to identify areas for improvement
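The calculation step above (ratings → overall maturity level) can be sketched as follows. The equal weighting of the six COBIT maturity attributes and the sample ratings are illustrative assumptions, not the documented SITA method.

```python
# Sketch: derive a per-process maturity level from attribute ratings.
# Assumption (illustrative only): overall maturity is the simple average
# of the six COBIT maturity attribute ratings.

ATTRIBUTES = [
    "Awareness and Communication",
    "Policies, Plans and Procedures",
    "Tools and Automation",
    "Skills and Expertise",
    "Responsibility and Accountability",
    "Goal setting and Measurement",
]

def process_maturity(ratings: dict) -> float:
    """Average the six attribute ratings into one maturity level."""
    return sum(ratings[a] for a in ATTRIBUTES) / len(ATTRIBUTES)

# Hypothetical current vs. target ratings for a single process (e.g. PO1)
current = dict(zip(ATTRIBUTES, [1, 2, 1, 2, 1, 1]))
target = dict(zip(ATTRIBUTES, [3, 3, 2, 3, 3, 2]))

gap = process_maturity(target) - process_maturity(current)
print(round(process_maturity(current), 2))  # 1.33
print(round(gap, 2))                        # 1.33
```

The gap per process then feeds the "analyse gaps and identify improvement initiatives" phase: the larger the gap, the stronger the candidate for an improvement initiative.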
Assessment results: Importance and Performance per Domain
Legend:
Importance (Imp): 1 - Not at all; 2 - Can survive without it (if need be); 3 - Make things easier; 4 - Very significant; 5 - Critical
Performance (Perf): 1 - Some aspects rarely; 2 - Some aspects sometimes; 3 - All aspects sometimes; 4 - Parts are always done well; 5 - All is always done well
Assessment results: Importance and Performance per Domain (PO, AI, DS, ME)
Assessment results: Average Importance and Performance per Process, per Domain
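The per-domain averages charted on these slides can be reproduced from per-process scores with a simple grouping. The scores below are illustrative placeholders, not the actual assessment data.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-process (Importance, Performance) scores on the 1-5 scales.
scores = {
    "PO1": (5, 2), "PO9": (4, 1), "AI2": (4, 2),
    "DS5": (5, 2), "DS8": (4, 3), "ME4": (4, 1),
}

# Group by domain: the domain is the two-letter process prefix (PO/AI/DS/ME).
by_domain = defaultdict(list)
for process, pair in scores.items():
    by_domain[process[:2]].append(pair)

# Average Importance and Performance per domain.
for domain, pairs in sorted(by_domain.items()):
    imp_avg = mean(p[0] for p in pairs)
    perf_avg = mean(p[1] for p in pairs)
    print(f"{domain}: Imp {imp_avg:.1f}, Perf {perf_avg:.1f}")
```

Plotting Imp and Perf side by side per domain reproduces the bar-chart view the slide describes.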
Assessment results: Very Significant Processes (17); Processes with highest Performance (17)
Assessment results: Very Significant Processes (17); Processes with highest "Differences" (17)
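The "Differences" view ranks processes by the gap between how important a process is rated and how well it is performed. A minimal sketch, again with illustrative scores only:

```python
# Rank processes by the Importance - Performance gap (illustrative scores).
scores = {
    "PO9": (5, 1),  # Assess and manage IT risks
    "DS5": (5, 2),  # Ensure systems security
    "PO1": (4, 2),  # Define a strategic IT plan
    "ME4": (4, 1),  # Provide IT governance
}

gaps = sorted(
    ((proc, imp - perf) for proc, (imp, perf) in scores.items()),
    key=lambda item: item[1],
    reverse=True,
)
for proc, gap in gaps:
    print(proc, gap)  # largest gaps first: PO9 4, DS5 3, ME4 3, PO1 2
```

Processes at the top of this list (very significant but poorly performed) are natural priorities for improvement initiatives.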
Overall average
• The overall average level was between level 1 and level 2. According to the COBIT Generic Maturity Model, the level 1 and level 2 descriptions are as follows:
• "1 Initial/Ad Hoc - There is evidence that the enterprise has recognised that the issues exist and need to be addressed. There are, however, no standardised processes; instead, there are ad-hoc approaches that tend to be applied on an individual or case-by-case basis. The overall approach to management is disorganised."
• "2 Repeatable but Intuitive - Processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures, and responsibility is left to the individual. There is a high degree of reliance on the knowledge of individuals and, therefore, errors are likely."
Observations
• Participants gave their full cooperation and were receptive to the final reports
• There was awareness of IT Governance at a conceptual level, but limited knowledge of the details as stipulated in COBIT or of IT Governance implementation
• Participants understood the importance of IT Governance and acknowledged that they have a key role to play in its implementation. However, in many instances "operational responsibilities" were treated as a higher priority than IT Governance responsibilities
• Some participants were not able to indicate clearly who was accountable and responsible for the execution of IT processes
• Very few had explicit IT Governance and IT process frameworks
• Some formal IT policies, processes, procedures and plans have been instituted; however, this was not done in the context of an overall IT Governance framework, and periodic reviews were limited
• Some IT processes underwent auditing, albeit in some cases on an ad hoc basis
• Limited tools are used in support of executing the IT processes. Desktop productivity tools are primarily used, and these have limited functionality to support effective and efficient execution of the IT processes
• Unavailability of funds
Conclusion
• COBIT is a very comprehensive IT Governance framework, and there is a need to simplify the implementation of COBIT-based IT Governance within Government departments. This could be done by:
  • Establishing a "minimum" IT Governance framework
  • Compiling an implementation method for the "minimum" IT Governance framework
  • Compiling and making available generic policies and processes that are aligned to the "minimum" framework and that could easily be adapted
  • Initiating IT Governance practitioner training
  • Conducting periodic assessments