Comparison of NREN Service Level Agreements Life Cycle and Portfolio Management Workshop Terena, 23/11/05 Ann Harding HEAnet Network Operations Manager
Topics • Why • Scope • Deliverables • Actions • Resources • Case study - HEAnet
Why • Internal factors • Technical management e.g. upgrades • Technical considerations • External factors • European and international projects • Operating environment • Client demand • Better controlled introduction than hasty imposition
Scope • “Comparison of NREN Service Level Agreements” • Narrow? • Achieveable? • Useful? • Internal or external? • Need to include analysis and recommendations or processes?
Deliverables • Taxonomy of service types? • List of NRENs with SLAs for each service type? • Framework for investigating need for SLA provision? • Framework for defining agreed service levels? • Identify shared services which may need a shared SLA?
Actions • Today • Finalise scope, deliverables • Identify interested parties • Before end 2005 • Identify individual actions for each deliverable • Start today? • Assign deliverable priorities • Assign work!!!
Resources • WI 1 “Comparison of NREN service portfolios” • Access to appropriate contacts in NRENs • Access to appropriate contacts in Terena/GN2/Other • ... • Our time
Case Study: HEAnet Strategic Objective CS1: Monitoring Service Levels to Ensure Excellence.
Actions • Identify an appropriate set of operational benchmarks and service metrics to measure our performance for clients. • Benchmark against other NRENs. • Benchmark against ‘competitors’. • Benchmark operational performance on an ongoing basis.
Deliverables • Definitions of types of measurements and performance thresholds. • List of client requirements. • Comparison of proposed benchmarks and metrics. • Recommendations for tools. • Methods for communicating data to clients. • Actions in the event of failure to meet thresholds.
Resources • HEAnet teams, Schools & NOC. • Client Contacts. • NRENs & Terena. • HEAnet CTO & Admin team. • Data from Contracts & CfTs. • Data and analysis from existing monitoring tools.
Work in Progress • Example list of Key Performance Indicators (KPIs) assembled. • Analysis of JANET SLA document • New monitoring tools under trial. • Initial rough availability stats compiled for Q1 2005. • First run of draft client questions compiled. • Other resources such as Sonas report and OS1 work identified.
Initial KPIs • Client accessibility, both IPv4 and IPv6. • Outbound access to Géant2 & general Internet. • Service availability • Mean Time Between Failures (MTBF). • RTTs to clients and other sites. • Throughput on connections. • NOC response times • Per service type • Per request type
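Availability and MTBF from the KPI list above can be derived from a log of outage intervals. A minimal sketch, assuming a hypothetical `Outage` record type and function name (neither is part of HEAnet's actual tooling):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Outage:
    start: datetime
    end: datetime

def availability_and_mtbf(outages, period_start, period_end):
    """Return (availability %, MTBF) over a measurement period.

    Availability = uptime / total period; MTBF = uptime / number of failures.
    """
    total = period_end - period_start
    downtime = sum((o.end - o.start for o in outages), timedelta())
    uptime = total - downtime
    availability = 100.0 * (uptime / total)
    mtbf = uptime / len(outages) if outages else None
    return availability, mtbf
```

For example, two one-hour outages in a 30-day period give roughly 99.72% availability and an MTBF of 359 hours. Real reporting would also need to distinguish planned maintenance from unplanned failures before feeding the numbers into client-facing stats.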
Initial Questions • Questions for clients broken into ‘hard’ & ‘soft’. • Start with softer questions to gauge expectations before asking harder, direct and enumerable questions. • E.g. Soft – “Are you happy with your current levels of service with HEAnet?” • Hard – “In scenario X, what availability would you expect from HEAnet?”
New Tools • Cricket – Potential MRTG replacement. • Easier to configure. • More detailed output. • Greater capabilities. • Nagios – Potential Netsaint replacement. • Larger feature set. • More detailed reporting. • Greater compatibility with current web servers.
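To illustrate the Nagios trial, a minimal service check definition might look like the sketch below. The host name, address and thresholds are hypothetical, not taken from HEAnet's configuration; `check_ping` with warning/critical RTT and packet-loss arguments is a standard Nagios plugin command.

```
# Hypothetical client-facing router (name and address are placeholders)
define host {
    use        generic-host
    host_name  client-router1
    address    192.0.2.1
}

# Ping check: warn at 100 ms RTT / 20% loss, critical at 500 ms / 60%
define service {
    use                  generic-service
    host_name            client-router1
    service_description  PING
    check_command        check_ping!100.0,20%!500.0,60%
}
```

Checks like this would feed directly into the availability and RTT KPIs, with Nagios handling notification and reporting.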
Still to do • Expand comparison with other NRENs • Refine taxonomy for service types • Allow for future flexibility within limits • Link specific benchmarks and metrics to service types • Define operational policies