
Transcript

So how did the revised student HESA return actually impact on an institution?

We will:

Compare the experiences of two institutions, one using an 'off the shelf' and the other an 'in-house' student management system. The session will aim to review the availability of resources, technical expertise and business knowledge needed within an institution; identify practical implementation issues; and describe on-going work.

The workshop will review and detail:

     How the two institutions differ.

What worked?

What didn’t work?

The sharing of experiences – group work.

Feedback and summary.

Size and Shape information

         20,431 active students in 2008/09 122 countries represented at Huddersfield.

2,338 internationally domiciled students Circa 20,000 UCAS applications per year Academic Staff/UG Student ratio, 1:19 1,920 people employed at 31 st July 2008 3 university campuses Lead institution for West Yorkshire LLN Lead institution for PCET Consortium

Size and Shape information

      24,004 active students in 2008/09.

131 countries represented at Sheffield.

4,636 internationally domiciled students.

Circa 35,000 UCAS applications per year.

Academic Staff/UG Student ratio, 1:14.

5,749 people employed at 31st July 2008.

Corporate Information System: Student 'SITS-Vision' system with Agresso Financial and Professional Personnel HR, with data linked to the data warehouse:

Programme and module management.

Admissions and recruitment.

Online student registrations and devolved (web) student personal data maintenance.

Other web-enabled functions, e.g. results.

Student Finance and Fees.

Course/Module Assessment.

Placements.

Progress Records and Thesis Tracking.

Ceremonies/Awards and Transcripts.

Alumni.

Management applications, external returns e.g. HESES, HESA, TDA etc.

Agresso Financial and Professional Personnel HR.

Corporate Data Model (data warehouse).

Corporate Information System: Student 'Oracle Education System', with SAP Financials and HR, with data linked to the data warehouse:

Programme and module management.

Admissions and recruitment.

Online student registrations and devolved (web) student personal data maintenance.

Student Finance and Fees.

Timetabling.

Departmental Assessment System.

Progress Records and Thesis Tracking.

Ceremonies/Awards and Transcripts.

Facilities Management.

Management applications, external returns e.g. HESES, HESA, TTA etc.

SAP – Financials and HR with eRecruitment.

Corporate Data Model (data warehouse).

HESA Related Resources at Huddersfield:

Business Requirement.

o Business analyst.

o Business liaison.

HESA Specification.

o Project management.

o Data quality.

Technical.

o XML Support.

HESA Related Resources at Sheffield:

Business Requirement.

o Business analysts.

o Business liaison and systems development.

HESA Specification.

o Project Management staff.

o Data Quality and MI Team.

Technical.

o CIS technical and data infrastructure.

o Oracle/XML/SQL programmers and developers.

Implementation:

Project Staff.

o Project Manager/Business Analyst.

o Data Quality.

o Ad hoc requirements.

Implementation:

Project Staff.

o Project Manager – liaison.

o Business Analyst.

o Data quality work.

o Oracle programmer with XML expertise.

o Various CIS developers as required.

Implementation:

ASIS Development Group.

ARO and School/Service staff.

Regular progress monitoring to Deputy Vice Chancellor and Senior Executive Officer.

Implementation:

Project Committee (PRINCE2 Project Management).

o Policy decisions.

o Resource allocation.

o Guidance and support.

Operational Sub-Group:

Acquisition of new data (admissions, student services, international office, research office).

Changes to business processes.

Data Quality Review.

Technical Sub-Group:

Reference Data.

SQL script design.

CIS process changes.

Oracle and XML outputs.
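
To give a feel for the kind of reference data and SQL checking work a technical sub-group might script, here is a minimal, hypothetical sketch in Python using an in-memory SQLite database. The table names, column names and codes are invented for illustration; they do not represent either institution's actual CIS tables or the HESA reference data.

```python
import sqlite3

# Hypothetical sketch of a reference-data check: flag student rows whose
# domicile code is missing or not in the current reference list.
# All table names, columns and codes below are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE valid_domicile (code TEXT PRIMARY KEY);
    CREATE TABLE student_instance (ownstu TEXT, domicile_code TEXT);

    INSERT INTO valid_domicile VALUES ('XF'), ('XH'), ('DE'), ('CN');
    INSERT INTO student_instance VALUES
        ('S0001', 'XF'),
        ('S0002', 'UK'),
        ('S0003', NULL);
""")

# Left join against the reference table; anything unmatched needs correction.
problems = conn.execute("""
    SELECT s.ownstu, s.domicile_code
    FROM student_instance s
    LEFT JOIN valid_domicile v ON v.code = s.domicile_code
    WHERE v.code IS NULL
""").fetchall()

for ownstu, code in problems:
    print(f"Check domicile for {ownstu}: found {code!r}")
```

The same pattern (a small script per reference field, reporting unmatched or missing codes) could be repeated for each item of reference data the sub-group was responsible for.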

What worked?

Internal Liaison:

o Strengthened existing co-operation between different areas.

o Opportunity to remind operational staff of the wider impact of their work.

o Strengthened work already carried out on consistency of operations.

o Gave a 'business case' for certain operations.

o Additional data quality checks leading to further improvement in data quality.

o Support from XML expert.

o Majority of data in a single system.

What worked?

Internal Liaison:

o Increased co-operation in addressing external data requirements from across the institution.

o Greater understanding by operational staff of the wider impact of their work.

o Refocus on how the CIS student record was operated so that there was renewed consistency in its use.

o Further agreement on data quality responsibilities across operational offices (admissions and student registrations offices).

o Improvements in overall data quality for both internal and external users.

o Support from HESA in creating a manual OS Aggregate Return in Excel, including XML conversion.
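
As a rough, hypothetical sketch of that sort of spreadsheet-to-XML conversion: the element and column names below are placeholders, not the real HESA Aggregate Offshore schema, whose structure is defined by the published XSD for the reporting year.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Rough sketch of a spreadsheet-to-XML conversion of aggregate figures.
# The column headings and element names are placeholders, not the HESA schema.
spreadsheet = io.StringIO(
    "COUNTRY,LEVEL,STUDENTS\n"
    "MY,UG,120\n"
    "CN,PG,45\n"
)

root = ET.Element("AggregateReturn", {"year": "2008-09"})
for row in csv.DictReader(spreadsheet):
    record = ET.SubElement(root, "AggregateRecord")
    for field, value in row.items():
        ET.SubElement(record, field).text = value

# In practice the output would be validated against the HESA XSD before submission.
print(ET.tostring(root, encoding="unicode"))
```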

What worked?

Third party software:

o Programming work done for us, thereby allowing us to concentrate on data quality and business process requirements.

o SITS Forum enabled help from other HEIs.

Process documentation:

o Forced us to sit down and improve what internal documentation we already had.

HESA Liaison:

o Accommodating with requests for extensions.

o Reassurance that others were in the 'same boat'.

What worked?

CIS Developments:

o Allocation of development resources.

o The management of new data fields and reference data changes, which impacted on the ‘live’ operational systems.

o Development of specialist algorithms, e.g. proportional load calculations (a rough sketch follows this list).

o Overwriting student, programme and unit system data.

o Creation of a schema database populated with data errors, allowing for easy analysis and identification of records requiring correction.

o Schema and XML – no (or very few) problems, as these were created locally (local expertise at hand).

HESA Liaison:

o Realistic in the way they liaised over late returns.
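
As an illustration of what a 'proportional load' style algorithm might involve, here is a minimal sketch, assuming the task is to apportion a student instance's load across its modules in proportion to credit value so that the reported proportions total exactly 100. The function name and module codes are hypothetical; this is one plausible reading of the calculation, not the institution's actual code.

```python
# Illustrative sketch of a "proportional load" calculation: apportion a
# student instance's load across its modules in proportion to credit value,
# so the reported proportions sum to exactly 100 after rounding.
def proportional_load(module_credits):
    total = sum(module_credits.values())
    if total == 0:
        return {m: 0 for m in module_credits}
    # Round each share to the nearest whole per cent...
    shares = {m: round(100 * c / total) for m, c in module_credits.items()}
    # ...then push any rounding difference onto the largest module so the
    # proportions still total 100.
    diff = 100 - sum(shares.values())
    largest = max(shares, key=shares.get)
    shares[largest] += diff
    return shares

# Hypothetical example: a student taking 20-, 20- and 40-credit modules.
print(proportional_load({"MOD101": 20, "MOD102": 20, "MOD205": 40}))
# -> {'MOD101': 25, 'MOD102': 25, 'MOD205': 50}
```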

What didn’t work?

Project Management:

o Fixed deadlines.

o Shifting specification and late changes to business rules and validation kits.

o Slowness of HESA guidance on interpretation of the specification – not always their fault.

o Very devolved institution on the academic side – ensuring that all staff involved understand the importance and implications of what they are doing.

What didn’t work?

Project Management:

o No flexibility to re-schedule/extend timescales. All fixed to a national deadline.

o Unable to maintain development schedules due to a shifting specification.

o Support resource from HESA – slow responses; continued reference back to their statutory customers for clarification on the requirement. Unfair on HESA Liaison and us.

o Excessive call on the local HESA expert. Revisions to the specification tied up this vital resource and interfered with the scheduled analysis of the full HESA specification for supply to programmers.

o Inappropriate lead times from HESA's statutory customers. Insufficient time to check that the local specification had been created correctly.

What didn’t work?

3rd party software:

o Timing of release of 'hot fixes' with necessary updates for HESA processing.

o Resources required to provide the software were too reliant on certain individuals.

o Changing specification added to delivery problems.

o Lack of wildcard functionality meant we couldn't cross-check to the HESES re-creation in the same way as in previous years.

Data and Quality Issues:

o UCAS data for HESA (*J) – too late and poor quality.

o Validation/data quality issues raised by HESA post-submission.

What didn't work?

CIS amendments:

o Reference data is at the heart of the CIS. Co-operation had been built with operational areas, but the CIS then had to be amended – just as recruitment and online registration of students were being finalised – and there was no choice.

o Operational and MI reports required rewrites (approx. 300 operational reports checked).

Data and Quality Issues:

o Need for unit records against all students.

o Relational structure of the HESA record does not reflect operational reality. OWNSTU as a unique student identifier?

o UCAS data for HESA (*J).

o UofA for student supervisors.

What didn’t work?

HESA and their Statutory Customers:

o Too much change in a single year:

▪ New reference data and amendments to existing.

▪ Changes to existing data fields.

▪ New data requirements.

▪ Different time scales of implementation between HESA and UCAS.

o The specification failed to deliver a stable requirement; too many versions were published (even past key delivery dates).

o Conflicting guidance on some key data fields between HESA and the funding council – each cross-referencing the other.

o Business rules and validation questions did not always seem logical.


Please discuss and summarise your discussion points on the supplied paper. Identify a member of the group who can feed back the points at the end of the session.