So how did the revised student HESA return actually impact on an institution? HESA: The Practitioners’ View
Aim of Session
We will:- Compare the experiences of two institutions, one using an ‘off the shelf’ student management system and the other an ‘in-house’ system. The session will aim to review the availability of resources, technical expertise and business knowledge needed within an institution; identify practical implementation issues; and describe ongoing work.
Content
The workshop will review and detail:
• How the two institutions differ.
• What worked?
• What didn’t work?
• The sharing of experiences – group work.
• Feedback and summary.
Huddersfield Size and Shape information
• 20,431 active students in 2008/09.
• 122 countries represented at Huddersfield.
• 2,338 internationally domiciled students.
• Circa 20,000 UCAS applications per year.
• Academic Staff/UG Student ratio, 1:19.
• 1,920 people employed at 31st July 2008.
• 3 university campuses.
• Lead institution for West Yorkshire LLN.
• Lead institution for PCET Consortium.
Sheffield Size and Shape information
• 24,004 active students in 2008/09.
• 131 countries represented at Sheffield.
• 4,636 internationally domiciled students.
• Circa 35,000 UCAS applications per year.
• Academic Staff/UG Student ratio, 1:14.
• 5,749 people employed at 31st July 2008.
Huddersfield Corporate Information System:- Student ‘SITS-Vision’ system with Agresso Financial and Professional Personnel HR, with data linked to the data warehouse:
• Programme and module management.
• Admissions and recruitment.
• Online student registrations and devolved (web) student personal data maintenance.
• Other web-enabled functions, e.g. results.
• Student Finance and Fees.
• Course/Module Assessment.
• Placements.
• Progress Records and Thesis Tracking.
• Ceremonies/Awards and Transcripts.
• Alumni.
• Management applications and external returns, e.g. HESES, HESA, TDA, etc.
• Agresso Financial and Professional Personnel HR.
• Corporate Data Model (data warehouse).

Sheffield Corporate Information System:- Student ‘Oracle Education System’ with SAP Financials and HR, with data linked to the data warehouse:
• Programme and module management.
• Admissions and recruitment.
• Online student registrations and devolved (web) student personal data maintenance.
• Student Finance and Fees.
• Timetabling.
• Departmental Assessment System.
• Progress Records and Thesis Tracking.
• Ceremonies/Awards and Transcripts.
• Facilities Management.
• Management applications and external returns, e.g. HESES, HESA, TTA, etc.
• SAP – Financials and HR with eRecruitment.
• Corporate Data Model (data warehouse).
HESA Related Resources at Huddersfield:-
• Business Requirement.
• Business analyst.
• Business liaison.
• HESA Specification.
• Project management.
• Data quality.
• Technical.
• XML Support.

HESA Related Resources at Sheffield:-
• Business Requirement.
• Business analysts.
• Business liaison and systems development.
• HESA Specification.
• Project Management staff.
• Data Quality and MI Team.
• Technical.
• CIS technical and data infrastructure.
• Oracle/XML/SQL programmers and developers.
Implementation at Huddersfield:-
• Project Staff.
• Project Manager/Business Analyst.
• Data Quality.
• Ad hoc requirements.

Implementation at Sheffield:-
• Project Staff.
• Project Manager – liaison.
• Business Analyst.
• Data quality work.
• Oracle programmer with XML expertise.
• Various CIS developers as required.
Implementation at Huddersfield:-
• ASIS Development Group.
• ARO and School/Service staff.
• Regular progress monitoring to Deputy Vice Chancellor and Senior Executive Officer.

Implementation at Sheffield:-
• Project Committee (PRINCE2 Project Management):-
• Policy decisions.
• Resource allocation.
• Guidance and support.
• Operational Sub-Group:-
• Acquisition of new data (admissions, student services, international office, research office).
• Changes to business processes.
• Data Quality Review.
• Technical Sub-Group:-
• Reference Data.
• SQL script design.
• CIS process changes.
• Oracle and XML outputs.
What worked? (Huddersfield)
• Internal Liaison:-
• Strengthened existing co-operation between different areas.
• Opportunity to remind operational staff of the wider impact of their work.
• Strengthened work already carried out on consistency of operations.
• Gave a ‘business case’ for certain operations.
• Additional data quality checks, leading to further improvement in data quality.
• Support from XML expert.
• Majority of data held in a single system.

What worked? (Sheffield)
• Internal Liaison:-
• Increased co-operation across the institution in addressing external data requirements.
• Greater understanding by operational staff of the wider impact of their work.
• Refocus on how the CIS student record was operated, so that there was renewed consistency in its use.
• Further agreement on data quality responsibilities across operational offices (admissions and student registrations offices).
• Improvements in overall data quality for both internal and external users.
• Support from HESA in creating a manual OS Aggregate Return in Excel, including XML conversion (see the sketch after this slide).
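The Excel-to-XML conversion mentioned above can be illustrated with a minimal sketch, shown below. This is not the converter HESA supplied; the element names, column handling and file names are assumptions for illustration only, not the actual HESA aggregate return schema.

```python
# Minimal sketch only: convert a CSV export of an aggregate return spreadsheet
# into XML. Element names, column handling and file names are illustrative
# assumptions, NOT the actual HESA aggregate return schema.
import csv
import xml.etree.ElementTree as ET

def aggregate_csv_to_xml(csv_path, xml_path):
    root = ET.Element("AggregateReturn")            # hypothetical root element
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            record = ET.SubElement(root, "Record")  # one element per spreadsheet row
            for column, value in row.items():
                # one child element per column, e.g. <COUNTRY>CN</COUNTRY>
                tag = column.strip().upper().replace(" ", "_")
                ET.SubElement(record, tag).text = (value or "").strip()
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

# aggregate_csv_to_xml("os_return.csv", "os_return.xml")
```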
What worked? (Huddersfield)
• Third party software:-
• Programming work done for us, thereby allowing us to concentrate on data quality and business process requirements.
• SITS Forum enabled help from other HEIs.
• Process documentation:-
• Forced us to sit down and improve the internal documentation we already had.
• HESA Liaison:-
• Accommodating with requests for extensions.
• Reassurance that others were in the ‘same boat’.

What worked? (Sheffield)
• CIS Developments:-
• Allocation of development resources.
• The management of new data fields and reference data changes, which impacted on the ‘live’ operational systems.
• Development of specialist algorithms, e.g. proportional load calculations (see the sketch after this slide).
• Overwriting student, programme and unit system data.
• Creation of a schema database populated with data errors, allowing easy analysis and identification of records requiring correction.
• Schema and XML – no (or very few) problems, as these were created locally (local expertise at hand).
• HESA Liaison:-
• Realistic in the way they liaised over late returns.
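As a rough illustration of the proportional load calculations referred to above, the sketch below apportions a student’s load across modules pro rata to credit value. The 120-credit full-time year, the function name and the rounding are assumptions for illustration only, not the institution’s actual algorithm.

```python
# Minimal sketch of a proportional load calculation, assuming load is split
# across a student's modules pro rata to credit value. The 120-credit
# full-time year and the rounding are illustrative assumptions only.
def proportional_loads(modules, full_time_credits=120):
    """modules: list of (module_code, credits); returns {module_code: % of FTE}."""
    total_credits = sum(credits for _, credits in modules)
    if total_credits == 0:
        return {code: 0.0 for code, _ in modules}
    # Student FTE derived from credits taken, capped at 1.0 for a full-time year
    student_fte = min(total_credits / full_time_credits, 1.0)
    return {code: round(100 * student_fte * credits / total_credits, 1)
            for code, credits in modules}

# Example: a student taking 90 credits (0.75 FTE) split across three modules
# proportional_loads([("MOD101", 40), ("MOD102", 30), ("MOD103", 20)])
# -> {"MOD101": 33.3, "MOD102": 25.0, "MOD103": 16.7}
```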
What didn’t work? (Huddersfield)
• Project Management:-
• Fixed deadlines.
• Shifting specification and late changes to business rules and validation kits.
• Slowness of HESA guidance on interpretation of the specification – not always their fault.
• A very devolved institution on the academic side – how do we ensure that all staff involved understand the importance and implications of what they are doing?

What didn’t work? (Sheffield)
• Project Management:-
• No flexibility to re-schedule or extend timescales; all fixed to a national deadline.
• Unable to maintain development schedules due to a shifting specification.
• Support resource from HESA – slow responses and continued reference back to their statutory customers for clarification of requirements; unfair on HESA Liaison and on us.
• Excessive call on the local HESA expert: revisions to the specification tied up this vital resource and interfered with the scheduled analysis of the full HESA specification for supply to programmers.
• Inappropriate lead times from HESA’s statutory customers.
• Insufficient time to check that the local specification had been created correctly.
What didn’t work? (Huddersfield)
• 3rd party software:-
• Timing of release of ‘hot fixes’ with the necessary updates for HESA processing.
• Resources required to provide the software were too reliant on certain individuals.
• The changing specification added to delivery problems.
• Lack of wildcard functionality meant we couldn’t cross-check against the HESES re-creation in the same way as in previous years.
• Data and Quality Issues:-
• UCAS data for HESA (*J) – too late and of poor quality.
• Validation/data quality issues raised by HESA post-submission (see the sketch of local pre-submission checks after this slide).

What didn’t work? (Sheffield)
• CIS amendments:-
• Reference data is at the heart of the CIS.
• Co-operation had been built with operational areas, but the CIS then had to be amended just as recruitment and online registration of students were being finalised – there was no choice.
• Operational and MI reports required rewrites (approximately 300 operational reports checked).
• Data and Quality Issues:-
• Need for unit records against all students.
• The relational structure of the HESA record does not reflect operational reality. OWNSTU as a unique student identifier?
• UCAS data for HESA (*J).
• UofA for student supervisors.
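The post-submission validation issues noted above are the kind of problem that local pre-submission checks aim to catch. The sketch below shows the general idea of running simple rule checks over student records and collecting the failures for correction; the rules, thresholds and code values are invented examples, not actual HESA business rules, and the field names only loosely follow HESA naming.

```python
# Illustrative sketch of local pre-submission data quality checks. The rules,
# thresholds and code values are invented examples, not actual HESA business
# rules; field names only loosely follow HESA naming.
from datetime import date

def check_record(student):
    """Return a list of (field, message) problems for one student record."""
    problems = []
    if not student.get("OWNSTU"):
        problems.append(("OWNSTU", "missing own student identifier"))
    start = student.get("COMDATE")
    if start and start > date.today():
        problems.append(("COMDATE", "commencement date is in the future"))
    if student.get("MODE") == "FT" and student.get("STULOAD", 0) < 50:
        problems.append(("STULOAD", "full-time student with load under 50"))
    return problems

def run_checks(records):
    """Collect problems across all records, keyed by OWNSTU, for correction."""
    errors = {}
    for record in records:
        problems = check_record(record)
        if problems:
            errors[record.get("OWNSTU", "unknown")] = problems
    return errors

# Example:
# run_checks([{"OWNSTU": "U0012345", "MODE": "FT", "STULOAD": 30,
#              "COMDATE": date(2008, 9, 22)}])
# -> {"U0012345": [("STULOAD", "full-time student with load under 50")]}
```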
What didn’t work? (both institutions)
• HESA and their Statutory Customers:-
• Too much change in a single year.
• New reference data and amendments to existing reference data.
• Changes to existing data fields.
• New data requirements.
• Different timescales of implementation between HESA and UCAS.
• The specification failed to deliver a stable requirement; too many versions were published (even past key delivery dates).
• Conflicting guidance on some key data fields between HESA and the funding council – each cross-referencing the other.
• Business rules and validation questions did not always seem logical.
Workshop Questions
Please discuss and summarize your discussion points on the supplied paper. Identify a member of the group who can feed back points at the end of the session.
Q1: What are the benefits and disadvantages to your institution of the new style HESA return?
Q2: Have there been any specific areas of the return which have made your institution change the way it operates?
Q3: In hindsight, what would have made life easier for you in making the new return?