Leisure, Sport and Tourism: Politics, Policy and Planning


Leisure, Sport and Tourism: Politics, Policy and Planning
A.J. Veal

Chapter 13: Performance Evaluation
CONTENTS
• Evaluation in context
• Steps in the evaluation process
• Approaches
• Goals and performance indicators
EVALUATION IN CONTEXT
• Economic evaluation – see Chapter 12
• Managerialism and privatisation
• Forms of evaluation
• Sustainability/carbon footprint/water conservation
• Triple bottom line accounting
• Effectiveness and efficiency
Managerialism and privatisation
• Managerialism: the application of formal/rational management practices
• Often used (typically negatively) in regard to applying private sector-style practices in the public sector
• Arises particularly with privatisation or ‘contracting out’ to the private sector – implying that only financial criteria are used and traditional public sector goals are being ignored
• However:
Managerialism and privatisation contd
• Contracts should include all the requirements of the public
sector
• Even public sector organisations need to think about best
use of capital
• Tourism enterprises, while commercial in nature, can also
have wide community impacts
• When services are contracted out to private companies, the
public agency (eg. council) still has responsibilities
regarding the quality of life of the community
• This also applies to single-purpose agencies (eg. arts, sport)
in their areas of concern
• Competitive tendering for public services can ‘concentrate
the mind’ of the public agency to spell out its goals
Forms of evaluation
• Routine internal – eg. annual
• Strategic – relating to strategic planning
• Accountability – inclusion in official annual
reports
• Ad hoc – one-off evaluations
• Internal vs comparative
– Internal comparison with previous years or between
units
– Comparative: comparison with external ‘benchmarks’
Sustainability/carbon footprint/water conservation
• Particular emphasis given to these items in
recent years:
– Sustainability: no decline in quality of the basic
resource
– Carbon footprint: minimisation of carbon-dioxide
etc. emissions
– Water conservation: efficient use of water
Triple bottom line
• The idea that organisations should report
annually not just on finance, but on three
criteria:
– Financial performance
– Social impact
– Environmental impact
Effectiveness and efficiency
• Effectiveness: the extent to which a project
achieves what it is intended to achieve
• Efficiency: the cost (input) per unit of output (see the worked sketch below)
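As a minimal illustration of the distinction (Python, with hypothetical figures that are not from the text):

```python
# Minimal sketch (hypothetical figures, not from the text): effectiveness
# compares achieved output with the intended target; efficiency relates
# cost (input) to output.

target_visits = 10_000      # what the project is intended to achieve
actual_visits = 8_500       # what it actually achieves
net_cost = 17_000.0         # net cost of running the service (£)

effectiveness = actual_visits / target_visits   # share of target achieved
efficiency = net_cost / actual_visits           # net cost per visit (£)

print(f"Effectiveness: {effectiveness:.0%} of target")  # 85% of target
print(f"Efficiency: £{efficiency:.2f} per visit")       # £2.00 per visit
```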
STEPS IN THE EVALUATION PROCESS
1. Identify goals
2. Specify objectives
3. Devise measures of effectiveness – Performance Indicators (PIs)
4. Devise measures of efficiency – Performance Indicators (PIs)
5. Specify data collection methods
6. Collect base-line PI data
7. Set targets
8. Collect PI data at specified times (eg. weekly, quarterly, annually)
9. Identify and obtain external bench-mark data
10. Compare base-line values of PIs with current values, targets and external bench-mark data
11. Deliver verdict
12. Consider implications
(A data-structure sketch of these steps follows below.)
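A minimal sketch (Python) of how steps 1–12 might be captured and compared; the field names and figures are hypothetical, not a published evaluation tool:

```python
# Sketch only: hypothetical representation of the evaluation steps above.
from dataclasses import dataclass

@dataclass
class PerformanceIndicator:
    goal: str            # step 1
    objective: str       # step 2
    name: str            # steps 3-4: an effectiveness or efficiency PI
    baseline: float      # step 6: base-line value
    target: float        # step 7
    current: float       # step 8: latest collected value
    benchmark: float     # step 9: external bench-mark

    def verdict(self) -> str:
        """Steps 10-11: compare current value with baseline, target and benchmark."""
        progress = self.current - self.baseline
        gap_to_target = self.target - self.current
        vs_benchmark = "above" if self.current >= self.benchmark else "below"
        return (f"{self.name}: moved {progress:+.1f} from baseline, "
                f"{gap_to_target:.1f} short of target, {vs_benchmark} benchmark")

pi = PerformanceIndicator(
    goal="Increase participation", objective="Raise adult participation",
    name="Adult participation (%)", baseline=30.0, target=32.0,
    current=31.0, benchmark=31.5)
print(pi.verdict())
```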
Relationships between strategic goals/objectives and performance indicators: examples (Table 13.2)

1. Leisure participation
Goal: Increase participation
Objectives: Increase participation to target levels among specified socio-demographic groups and in specified planning zones
Effectiveness PIs: Level of participation for target groups (A, B) and zones (C)
Efficiency PIs: Cost per additional participant
Environmental PIs: Amount of carbon emitted per 1000 customers
Data collection methods: Resident survey; administrative; special audit

PI values                             Baseline (this year)   Target, Year X
Overall participation:                30%                    32%
Participation group A:                22%                    30%
Participation group B:                18%                    30%
Participation in zone C:              15%                    30%
Avge net cost/participant-session:    £2                     £2
Carbon/participant-session:           1.5 kg                 1.2 kg
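A short sketch (Python) of the target-gap comparison (step 10 of the evaluation process) using the participation figures from Table 13.2 above:

```python
# Sketch: gap-to-target for the Table 13.2 participation PIs.
# The figures are those shown in the table above.
baseline = {"Overall": 30, "Group A": 22, "Group B": 18, "Zone C": 15}   # % this year
target   = {"Overall": 32, "Group A": 30, "Group B": 30, "Zone C": 30}   # % in Year X

for pi, base in baseline.items():
    gap = target[pi] - base
    print(f"{pi}: baseline {base}% -> target {target[pi]}% "
          f"(needs +{gap} percentage points)")
```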
Relationships between strategic goals/objectives and performance indicators: examples contd

2. Tourism development
Goal: Stimulate employment growth through tourism
Objectives: Increase no. of tourists & tourist expenditure to create additional jobs in tourism and related sectors
Effectiveness PIs: Number of additional tourists; number of jobs created
Efficiency PIs: Public sector cost per job created; total cost per job created
Environmental PIs: Amount of carbon emitted per 1000 tourists
Data collection methods: Tourist survey and industry survey; tourist survey, industry survey & administrative; special audit

PI values                               Baseline (this year)   Target, Year X
Number of tourists:                     100,000                120,000
Total tourist expenditure:              £25 million            £30 million
Jobs in tourism and related sectors:    500                    600
Council annual spend/tourism job:       £100                   £100
Carbon per tourist bed-night:           15 kg                  13 kg
APPROACHES
• Comprehensive Area Assessment (UK)
• Service quality
– The CERM PI system (Australia)
– The National Benchmarking Service (UK)
– Importance-performance analysis (generic)
• Arts, entertainment, cultural venues/events
• Sport (UK)
• Tourism (Aust.)
Comprehensive Area Assessments (CAAs)
• This book went to press in early 2010 and was published in Sept.
2010
• In May 2010 there was a change of government in the UK, from Labour to a Conservative/Liberal Democrat coalition.
• There was time for this to be reflected in Chapter 2.
• However, it was not possible to take account of a 17 Dec. 2010
Dept of Communities & Local Govt media release, which stated:
– Departments across Whitehall have been abolishing an array of
bureaucratic burdens and unjustified data demands. Comprehensive Area
Assessments have ended; the £5m Place Survey stopped; 66 pages of
guidance on how to report efficiency shredded; over 4,700 targets
scrapped including adults doing sport, neighbourhood belonging and a
self assessment on climate progress.
– www.communities.gov.uk/news/newsroom/1802440
Comprehensive Area Assessments (CAAs) contd
• The abolition of CAAs was the ‘small government’ conservative/neoliberal philosophy in action, as discussed in Chapter 2: the priority of
the new government is to reduce expenditure.
• Releasing local councils from the requirement to conduct CAAs was
part of a package of measures which enabled the national
government to reduce its financial support for local government.
• The CAA system nevertheless remains an interesting model of
performance evaluation.
• Even though councils will not be required to report to central
government, it remains to be seen to what extent councils retain
elements of the system for local use.
• A key leisure-related component of the CAA system was the large-scale annual Active People Survey, but since this is conducted by Sport England, which is partly funded by the National Lottery, it seems likely to continue.
Comprehensive Area Assessments (CAAs) contd
• Three-year agreements between central and local govt.
• 200 National Indicators (NIs): 18 ‘statutory’ and 35
selected ‘priorities’ – must be measured and reported,
and published, annually
• Main leisure/sport indicators (no tourism):
– NI8 Adult participation in sport*
– NI9 Use of libraries*
– NI10 Visits to museums/galleries*
– NI11 Engagement in the arts*
* Measured by annual Active People Survey
• No tourism NIs, but tourism indicators can be included in ‘Local Indicators’ (see Table 13.6)
• Emphasis on participation relates to U-Plan approach
(Ch. 8)
CAAs contd: Published results (Table 13.5)

NI      Area                Year      Result    Local authority          %
NI 8    Sport/recreation    2008-09   Highest   Richmond upon Thames     26.6
                            2008-09   Lowest    Newham                   13.1
NI 9    Libraries           2007-08   Highest   Rugby                    60.1
                            2007-08   Lowest    Boston                   32.6
NI 10   Museums/galleries   2007-08   Highest   Camden*                  78.2
                            2007-08   Lowest    Boston                   35.5
NI 11   Arts                2007-08   Highest   Kensington & Chelsea     65.5
                            2007-08   Lowest    Bolsover                 29.6
* includes British Museum
CAAs contd: example: City of Birmingham (Table 13.6)

                                              Baseline    Targets, %
                                              % 2007      2008-09   2009-10   2010-11
NI 8: Sport/recreation                        17.2        18.2      19.2      20.2
Local Indicator 1: Tourists who think
Birmingham is a good place to visit           65.0        66        67.5      69

Local Indicator 2: Reduce % of residents in four suburbs who have not used any cultural facilities
  Baselines (% 2007): suburb 1: 33.5; suburb 2: 41.3; suburb 3: 35.7; suburb 4: 33.4
  Targets: suburb 1: 31.6; suburb 2: 37.5; suburb 3: 31.6; suburb 4: 31.6
Service quality
• Systems which evaluate quality of customer
service, typically on an annual basis:
– The CERM PI system (Australia)
– The National Benchmarking Service (UK)
– Importance-performance analysis (generic)
CERM PI system
• Centre for Environmental and Recreation Management
(Univ. of South Australia) Performance Indicators system
• Version of SERVQUAL approach (Parasuraman et al., 1988):
– compares customer service quality expectations with service quality received (see the gap-score sketch below)
• Leisure facility managers who use the system collect
standard annual data from customer surveys
• CERM analyses data and presents results with
comparison of averages for similar facilities
(benchmarking)
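A minimal sketch (Python, with invented ratings rather than CERM data) of the SERVQUAL-style gap calculation referred to above: the gap is the difference between the service quality received (performance) and the quality expected.

```python
# Sketch with invented ratings (not CERM data): SERVQUAL-style gap scores.
# Gap = performance (service received) - expectation; negative gaps flag
# attributes where the service falls short of what customers expect.
expectation = {"cleanliness": 4.6, "value for money": 4.2, "staff friendliness": 4.4}
performance = {"cleanliness": 4.1, "value for money": 4.3, "staff friendliness": 3.9}

for attribute in expectation:
    gap = performance[attribute] - expectation[attribute]
    flag = "shortfall" if gap < 0 else "meets/exceeds expectation"
    print(f"{attribute}: gap {gap:+.1f} ({flag})")
```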
CERM PI system contd
A. Customer service: expectation and performance
  1. Safe and secure parking
  2. Facility cleanliness
  3. Value for money
  4. Suitable food & drink
  5. Staff friendliness
  6. Pool water cleanliness
  7. Behaviour of others
  8. Problem resolution
  9. Overall satisfaction
  10. Behavioural intentions
B. Organisational
  1. Expense recovery of operations (fee income as % of exp.)
  2. Promotion/marketing cost share
  3. Total visits per year
  4. Visits per square metre
  5. Water costs per visit
  6. Fit of socio-demographic profile of centre users to that of the local community
Items rated using Likert-type scales
National Benchmarking Service (Box 13.9)
• UK service similar to the CERM PI system, sponsored by Sport England and operated by the Sport Industries Research Centre (LIRC) (Sheffield Hallam University)
• http://www.questnbs.org/
Analysis of data: Importance-performance technique
• Similar to SERVQUAL, but can be used:
i. in organisational decision-making, as indicated by Harper and Balmer (1989);
ii. in relation to perceived benefits of public leisure services, as a form of consumer consultation (Sieganthaler, 1994);
iii. to measure customer satisfaction (Langer, 1997: 147); and
iv. to analyse performance data.
Importance-performance analysis: customer service
evaluation example
• 8 service items (A-H) scored by customers of one facility
in relation to:
– a. importance/expectations,
– b. performance – service actually received
• Results in Fig. 13.3
Importance-performance analysis example
5.0
Performance/service quality
4.5
B
4.0
F
G
Doing well
C
3.5
D
3.0
Wasted
resources?
2.5
2.0
H
1.5
E
1.0
Cause for
concern
A
0.5
0.0
0.0
1.0
2.0
3.0
Importance/expectations
4.0
5.0
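A minimal sketch (Python, with invented item scores rather than the Fig. 13.3 data) of how items can be assigned to the quadrants shown in the figure, splitting both scales at the midpoint; the fourth, unlabelled quadrant (low importance, low performance) is given the conventional ‘Low priority’ label:

```python
# Sketch with invented scores (not the Fig. 13.3 data): classify service items
# into importance-performance quadrants, splitting both 0-5 scales at 2.5.
items = {  # item: (importance/expectation, performance)
    "A": (4.4, 1.0), "B": (2.6, 4.1), "E": (1.2, 1.6), "F": (4.6, 4.2),
}

def quadrant(importance: float, performance: float, cut: float = 2.5) -> str:
    if importance >= cut and performance >= cut:
        return "Doing well"
    if importance >= cut and performance < cut:
        return "Cause for concern"
    if importance < cut and performance >= cut:
        return "Wasted resources?"
    return "Low priority"   # conventional label; not shown in the figure

for name, (imp, perf) in items.items():
    print(f"Item {name}: {quadrant(imp, perf)}")
```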
Importance-performance analysis: comparison
example (Table 13.11)
• The same facility over four years (4 cases): scored on
8 service quality items (could also be applied to 4
different facilities)
• Different service quality items have different levels of importance/expectation: scored by management or users
• See Table 13.11
Import-perf. analysis: comparison eg. (Table 13.11)

Service quality   a. Importance/        b. Performance scores
items             expectation scores    Case 1   Case 2   Case 3   Case 4
A                 4.3                   1.2      2        2.7      4.5
B                 3                     4.8      4.8      4.8      4.9
C                 4                     4        4        4.2      4.3
D                 2.5                   3.5      3.5      3.7      3.7
E                 1.5                   1.5      1.6      1.7      1.9
F                 4.5                   4.5      4.5      4.5      4.5
G                 1.2                   4.2      4        3.5      3.5
H                 3.5                   2        2.5      3        4.7
Total score (sum of a x b)              78.9     84       89.6     104.3
% change                                         6.4      6.7      16.1
Attendances, '000s                      50       51.5     53.3     57.5
% change                                         3        3.5      7.9
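A short sketch (Python) of the ‘sum of a x b’ totals in Table 13.11, using the Case 1 and Case 2 figures above; the computed % change (about 6.5) differs slightly from the table’s 6.4, presumably because the table was calculated from unrounded scores:

```python
# Sketch: reproduce the Table 13.11 weighted totals (sum of importance x performance)
# and the % change between cases, using the figures shown above.
importance = {"A": 4.3, "B": 3, "C": 4, "D": 2.5, "E": 1.5, "F": 4.5, "G": 1.2, "H": 3.5}
performance = {
    "Case 1": {"A": 1.2, "B": 4.8, "C": 4, "D": 3.5, "E": 1.5, "F": 4.5, "G": 4.2, "H": 2},
    "Case 2": {"A": 2,   "B": 4.8, "C": 4, "D": 3.5, "E": 1.6, "F": 4.5, "G": 4,   "H": 2.5},
}

totals = {case: sum(importance[i] * scores[i] for i in importance)
          for case, scores in performance.items()}
print(totals)  # ~78.85 and ~83.95 -> 78.9 and 84 when rounded, as in the table

change = 100 * (totals["Case 2"] - totals["Case 1"]) / totals["Case 1"]
print(f"% change Case 1 -> Case 2: {change:.1f}%")  # ~6.5% (table shows 6.4)
```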
Other examples
• Arts, cultural venues and events
– Arts items included in the CAA system, discussed above
– In 1991, the UK Audit Commission (now abolished by new
Conservative/LibDem government) produced a system of
PIs for the arts/cultural events – see Table 13.12
• Sport
– See the Sport England strategic plan in Box 7.1
– New South Wales Dept of Arts, Sport & Recreation plan – see website: Web-box07.01
• Tourism
– See Tourism NSW Masterplan: Table 13.13
Tourism NSW Masterplan: Towards 2020 (Table 13.13)
1. Sydney’s market position as Australia’s premier tourist destination is
maintained
1.1 Customer satisfaction.
1.2 Share of Australian visitation: visitors, visitor nights, visitor exp., length of stay.
1.3 Index of Sydney’s attractiveness as a destination amongst potential customers:
domestic, international.
1.4 Share of cruise ship visits.
1.5 Share of meetings, incentives, conference/exhibitions (MICE) visitation to
Australia: count, visitor nights, exp.
1.6 Index of repeat holiday visitation to Australia by international visitors.
1.7 Index of loyalty amongst domestic visitors.
2. Tourism destinations in NSW are managed sustainably (6 items)
3. A positive climate for tourism investment and enterprises in NSW (5 items)
4. Tourism in regional and rural NSW is strengthened (8 items)
5. Overall success measures for Towards 2020 (5 items)
GOALS AND PERFORMANCE INDICATORS
• Throughout the book, numerous goals of public
leisure/sport/tourism services are mentioned.
• All can be expressed/evaluated in terms of
performance indicators
• Table 13.14 lists 27
Goals and PIs: Examples from Table 13.14

Access to facilities for chosen leisure activities for all
  Effectiveness: % of population participating; % of pop’n within reach of facilities; % of population satisfied with service
  Efficiency: Net cost per visit/participant

Provision for need for all
  Effectiveness: Extent to which needs are met
  Efficiency: Net cost per visit/participant

Maintain existing provision
  Effectiveness: Qualitative measures
  Efficiency: –

Promote excellence
  Effectiveness: Excellence: medals, awards, records, etc.
  Efficiency: Costs per medal etc.

Minimise state role
  Effectiveness: Short-term: extent of privatisation; Long-term: quantity and quality of service
  Efficiency: Public/private service costs