Articulating and Comparing Standards through Benchmarking


Transcript: Articulating and Comparing Standards through Benchmarking

Dr Sara Booth
University of Tasmania
Argument 1
Standards mean uniformity – one size fits all.

Argument 2: Implicit vs Explicit
Implicit standards in universities are self-monitoring and self-regulating.
Explicit standards mean diversity, substance, accountability and transparency. They are a basis for comparison and collaboration.

Universities need to become more explicit in the comparison of standards. To do this:
• Make explicit the definition of standards used
• Make explicit the definition of benchmarking used
• National curriculum

Five sets of sector standards (DEEWR & TEQSA): Provider Registration, Provider Category, Qualification (AQF), Information, Teaching and Learning, Research.

Sets of academic standards – a contested space, including professional (e.g. teaching) standards; quality assurance; minimum threshold (what is achieved); aspirational; and student achievement standards (Carmichael, 2010).

TEQSA’s discussion paper on Teaching and Learning Standards (July 2011): learning standards, teaching standards, the role of TEQSA and the role of universities.

The definition of benchmarking varies across the sector. Jackson and Lund (2000, cited in Stella & Woodhouse, 2007, p. 14) define benchmarking as ‘first and foremost, a learning process structured so as to enable those engaging in the process to compare their services/activities/products in order to identify their comparative strengths and weaknesses as a basis for self improvement and/or self regulation’.
Agreed points of comparison – Deakin, UOW, UTAS
• Three Cycle 1 AUQA Audits specified more benchmarking
• Comparable institutions – age, structure, regional presence, disciplines
• Benchmarking awareness and confidence at similar level
1. Early Implementation – universities need to develop and implement a benchmarking framework, processes and partnerships as part of the Quality System.
2. Further Refinement and Alignment – universities have begun to implement benchmarking processes and partnerships, but further refinement and alignment with other university processes is required.
3. Full Embedding – universities have established benchmarking frameworks, processes and partnerships across the sector and make extensive use of external reference points and benchmarking.

UOW, Deakin and UTAS – we are currently here!

Key features
• university-wide approach
• aligned to strategic priorities, data strategy, data warehouse and risk framework
• applied at unit and course level
• mechanisms for selecting appropriate institutions
• benchmarking reference groups
(Booth, 2011)
Ms Heather Sainsbury
Deakin University
Planning
• Establishing the benchmarking partnership
• Agreement on area and scope
• Planning for success

Implementation
• Communicating with faculties
• Streamlining the process
• Putting it together
Success factors
• Shared understanding of benchmarking goals
• High level of trust
• Willingness to share information and discuss successes and failures
• Similar enough to offer transferable strategies
Similarities
• All unaligned
• Compatible missions, values and goals
• Multi-campus structures
• Regional presence
• Comparable discipline areas
• Similar experience of AUQA audit cycles
Differences
• Size
• Student profiles
• Offshore presence
• Off-campus delivery
Success factors
• Comparable commitment
• Sustained commitment
• The more partners there are, the harder it gets
• Communication and flexibility are the keys to success
What to benchmark?
Catalyst for the assessment project – 2009 AUQF in Alice Springs:
• Paper by Linda Davies (Griffith Uni) on the ALTC Teaching Quality Indicators Project – external reference point
• Shared commitment to review assessment practice in the lead-up to our respective AUQA audits in 2011
• Potential to deliver significant benefits to all three universities
• Support from relevant Executive and other leaders critical
Agreement on scope
• Careful scoping through a collaborative process involving senior academic and quality leaders from each university:
− Time period
− Coverage – undergraduate but excluding Honours
− Focus on standards – assessment design not covered
• Agreement on data to be shared
• Make sure that you are talking about the same thing – different terminology is a potential barrier
• Take the time to get it right…
• Keep sight of the main objective
Agreement on methodology
• Derived from an existing successful methodology – the ACODE Benchmarking Framework (2007):
− Self-review by each partner
− Peer review
− Action plans (shared)
• Adapted indicators and measures developed through the TQIP project
• Tested against literature on good practice, expert reviewers and academic leaders at each university

Agreement reached on:
− Performance indicators
− Good practice statements
− Performance measures
− Trigger questions
Agreement on performance indicators and measures
PI #1: Assessment purposes, processes and expected standards of
performance are clearly communicated and supported by timely
advice and feedback to students
Good Practice Statement: Students receive clear and timely information on the
aims and details of assessment tasks; marking and grading practices; expected
standards of achievement; and requirements for academic integrity. They are
provided with timely feedback on their performance and supported in making
improvements.
Performance measures:
1.1 Expectations are clearly communicated
1.2 Advice and feedback are provided
Trigger questions under each measure
Agreement on self-review templates
The template has four columns:
• Performance measure – state the measure as agreed, with trigger questions to focus the self-review
• Rating – 4-level scale: 1 Yes; 2 Yes, but; 3 No, but; 4 No
• Rationale – dot points identifying practices that support the rating
• Evidence – including references to policies, documents, web references, data sources (including student feedback)
Agreement on timelines
• Build in flexibility for partners to move at slightly different speeds at different times, while still all meeting critical common dates:
− Finalising templates
− Completion of self-reviews and sharing of self-review reports
− Peer review workshops
− Contributions to shared reports
• Accommodate internal deadlines of partners wherever possible (key committee dates, AUQA deadlines)
Ms Anne Melano
University of Wollongong
Communicate with faculties
Prepare a communication plan:
• Consider the culture – e.g. UOW is very consultative, with very engaged faculty T&L chairs
• Hold a high-level briefing – establishes importance, brings faculty leaders together
• Hold informal one-on-one meetings – answer questions and address concerns
• Don’t rush – do invite comments on documents and processes – builds ownership
• Send out updates as the project progresses
• Thank/acknowledge along the way
Provide support
• Appoint a project coordinator
• Encourage faculties to identify a person to support the faculty leader
• Offer funding or admin assistance if possible
• Provide a clear guide to the process
• Provide data packs
• Offer draft emails, information sheets etc. that faculties can send to staff
• Attend faculty self-reviews – helpful, as questions of interpretation do arise
Streamline the process
Faculties are time-poor – there is a risk of backlash if the time contributed is not rewarded by benefits.
• Clear, realistic timeline and expectations
• ONE self-review meeting in each faculty – if you put the right people together, most questions can be answered
• ONE template to work through – all questions clearly set out
• Simple rating scale
• As much as possible of the template completed in that meeting
• A rating on each measure MUST be agreed by the group; otherwise there is no clear result
• A similarly streamlined process for institutional reviews and for the peer review across the three universities
But it does need rigour…
• Question design based on the Griffith ALTC project, additional work by Boud, advice from Joughin, and testing in a faculty
• Evidence:
− has to be provided to support each rationale/rating
− collecting it is a major effort for faculty leaders and their admin assistants
− a survey conducted at UTAS – valuable and can be done centrally
− all evidence checked centrally
Sharing
At each level, encourage the conversations – these can be just as important as the project outcomes. Good practice sharing, questioning and problem solving occur naturally – let them.
• Faculties aren’t mediaeval castles – encourage interaction
• UOW – each faculty leader sat in on another’s self-review
• Deakin – four Associate Deans (T&L), very collegial
• Avoid the ‘black hole of benchmarking’. Reward evidence-gathering by selecting and disseminating good practice

Putting it together – the institutional self-review
• Faculty reports combined into an institutional report
• All leaders brought together
• Agreement on institutional rating, good practice and gaps/issues
• Discussion of each measure with top issues agreed – these form the basis of an action plan for the future
Putting it together – the three-university peer review
• Face-to-face if possible
• Selection of leaders brought together
• Icebreakers, time to mingle
• Template provided to work through – each institution’s results and ratings on each measure
• Review of institutional ratings
• Discussion of good practice and gaps/issues
• Expect surprises! You may be doing better than you think…
• OR your ‘best practice’ may be just ‘ho-hum, that’s what everyone is doing’!

Ms Lynn Woodley
University of Wollongong
Using and sharpening the tools: what works and what doesn’t
• The broad indicators of the Griffith TQIP project (Davies, 2009)
• The ACODE Benchmarking Framework
• Templates – the Pollard Rating Index: "No but yeah but no but yeah but no but..."
• Killing two birds: making the most of the project
• Benchmarking logistics: checking the steps and the flight plan
• Escaping the black hole – the action plan
• Becoming a toolmaker

Collegial partnerships
• Institutional: self-review activity; cross-faculty bonds
• Cross-university: coordinators, executive and academic staff
• A mutual learning process for all involved
Assessment – Standards at work:
• The academic standards trinity: Learning Outcomes, Assessment, Graduate Qualities
• An “academic” exercise in definition or a “real world” definition – how do academics set, monitor and review standards?
• Uniformity vs Quality and Good Practice

Assessment – Good Practice and Quality Improvement:
• Insights and ideas from the practices of others
• Good practice and areas for improvement for each faculty and each university

What we do well:
• For example: Deakin – Online Unit Guide; UTAS – criterion-referenced assessment (CRA) supported by faculty champions; UOW – educative focus of the Academic Integrity Policy

What we needed to do better:
• Connecting learning outcomes, Graduate Attributes/Qualities and Assessment (the crux of academic standards)
• Staff development (incl. sessional staff)
• Marking practices for group work
• Use of best practice models
• Benchmarking at the course/program level (Oliver, 2009)
Jackson and Lund (2000): ‘first and foremost, a learning process structured so as to enable those engaging in the process to compare their services/activities/products in order to identify their comparative strengths and weaknesses as a basis for self improvement and/or self regulation’.
Did we achieve the Project Aims?
1. Compare processes within faculties, across each university and across the three universities.
2. Compare the effectiveness of Academic Boards/Senates in performing their role in policy and standards, across the three universities.
3. Identify good practice and areas where improvements can be made for the benefit of students and staff at each university.
4. Develop and share knowledge and experience between the three benchmarking partners about the process of benchmarking.

Your rating? "No but yeah but no but yeah but no but..."