Transcript Slide 1

2010 Customer Survey
Main Quantitative Report
Prepared For:
11th October 2010
Slide 2 – Presentation Coverage
• Introduction – slide 2
• Overview – slide 5
• Focus On DTS – slide 9
• Focus On DCUSA/SPAA – slide 42
• Considerations For Qualitative Phase – slide 69
• Appendix (sample profiles, background data) – slide 72
Slide 3 – Background
• Electralink provides a range of services to companies operating in the utilities market, including the Data Transfer Service (DTS) and the management of SPAA Ltd and DCUSA Ltd.
• This research surveys the opinions of users of each of these services. Since 2007 the survey has been managed by Researchcraft.
• Previous internally managed quantitative surveys provide comparative data from earlier years:
  – DTS Survey – carried out online from 1997 to 2006
  – SPAA Survey – carried out via telephone since 2006
  – DCUSA Survey – introduced for the first time in 2007
Slide 4 – Method: Quantitative Survey
What? A quantitative survey using a c.15-minute CATI telephone interview.
Who? A total of 120 named contacts at companies using Electralink services:
  – DTS: 48 interviews (28 Contract Managers, 31 Gateway Operations Managers)
  – SPAA/DCUSA: 72 interviews (33 SPAA, 44 DCUSA)
When? All interviews were conducted between Thursday 9th and Friday 24th September 2010.
Slide 5 – Presentation Coverage (repeat of slide 2)
Slide 6 – Rating Versus Other Organisations: Summary
Overall satisfaction, mean score out of 10 (change vs 2009), shown for the DTS sample and the SPAA/DCUSA sample.
• Electralink: DTS 8.60 (+0.54); SPAA/DCUSA 8.28 (-0.04)
• GEMSERV/MRASCO: DTS 7.07 (+0.28); SPAA/DCUSA 6.88 (+0.12)
• Elexon: DTS 7.76 (+0.67); SPAA/DCUSA 7.32 (-0.01)
• National Grid: DTS 6.00 (-0.26); SPAA/DCUSA N/A
• Joint Gas Office: DTS 6.50 (+0.50); SPAA/DCUSA 7.14 (+0.01)
• OFGEM: DTS 5.68 (-0.27); SPAA/DCUSA 5.58 (-0.22)
• Xoserve: DTS 6.15 (-0.10); SPAA/DCUSA 6.21 (+0.42)
• Gas Forum: DTS N/A; SPAA/DCUSA 6.61 (+0.36)
• iGT UNC: DTS N/A; SPAA/DCUSA 6.57 (+0.07)
Base: All who use each company (various).
Slide 7 – Satisfaction With Electralink Service: Summary
Mean score out of 10 (change vs 2009), DTS sample and SPAA/DCUSA sample.
• Overall Rating: DTS 8.60 (+0.54); SPAA/DCUSA 8.28 (-0.04)
• Overall Professionalism: DTS 8.85 (+0.61); SPAA/DCUSA 8.72 (+0.13)
• Being Responsive: DTS 8.56 (+0.80); SPAA/DCUSA 8.43 (+0.26)
• *Being Easy To Work With: DTS 8.60 (+0.62); SPAA/DCUSA 8.78 (+0.20)
• Being Highly Efficient: DTS 8.00 (+0.27); SPAA/DCUSA 8.19 (+0.11)
• Communicating Clearly: DTS 8.42 (+0.48); SPAA/DCUSA 8.26 (+0.15)
• Understanding The Service Support Requirements Of…: DTS 7.98 (+0.31); SPAA/DCUSA 8.03 (+0.24)
• Providing Valuable Expertise Resource: DTS 8.27 (+0.56); SPAA/DCUSA N/A
Base: Total sample – DTS (48), SPAA/DCUSA (72).
* Wording changed in 2010 for SPAA/DCUSA.
Slide 8 – Summary Versus Previous Years
DTS Survey:
• 2007 – base 41, average (mean score) 4.01, 29 ratings compared
• 2008 – base 46, average 4.13, 29 ratings compared
• 2009 – base 49, average 4.22, 29 ratings compared
• 2010 – base 48, average 4.26, 42 ratings compared
SPAA/DCUSA Survey:
• 2008 – base 68, average 4.30, 26 ratings compared
• 2009 – base 71, average 4.33, 26 ratings compared
• 2010 – base 72, average 4.44, 27 ratings compared
The above is a like-for-like comparison on statements scored as follows: Very Good = 5, Good = 4, Adequate = 3, Poor = 2, Very Poor = 1. Comparisons are made only across those ratings present in all years shown. Those with no experience, or not using the services/features rated, are excluded from the mean scores.
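To make the scoring rules above concrete, here is a minimal sketch of how such a mean score and a like-for-like year-on-year comparison could be computed. It is illustrative only, using made-up responses; it is not Researchcraft's actual tabulation code.

```python
# Illustrative sketch of the scoring described above (hypothetical data).
SCALE = {"Very Good": 5, "Good": 4, "Adequate": 3, "Poor": 2, "Very Poor": 1}

def mean_score(responses):
    """Mean on the 1-5 scale; respondents with no experience (None) are excluded."""
    rated = [SCALE[r] for r in responses if r is not None]
    return sum(rated) / len(rated) if rated else None

def like_for_like_change(this_year, last_year):
    """Average change computed only across statements rated in both years."""
    common = [s for s in this_year if s in last_year]
    return sum(this_year[s] - last_year[s] for s in common) / len(common)

# Two non-users are dropped from the base before averaging.
print(mean_score(["Very Good", "Good", None, "Adequate", None]))   # 4.0

# Only the statements present in both years contribute to the change figure.
print(like_for_like_change({"Helpdesk": 4.49, "Website": 4.11},
                           {"Helpdesk": 4.43, "Website": 4.01}))   # ~ +0.08
```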
Slide 9 – Presentation Coverage (repeat of slide 2)
Slide 10 – The DTS: Key Headlines (1/2)
• Overall, satisfaction with Electralink's service has improved in 2010 and is close to the peak levels reached two years ago:
  – 8-10 scores are up 14% on 2009
  – 26% ahead of the nearest competitive benchmark
  – At least 1 in 3 say Electralink is better than others overall and in terms of value for money
• This sense of improvement comes from a number of specific areas. Electralink is now seen as even more professional, responsive, easy to deal with and expert, particularly amongst Contract Managers. All of these attributes are now at their highest recorded levels.
• Of the specific services delivered, the strongest areas are the Helpdesk, Reporting tools, the DTS service and Gateway connections, all of which average scores of good or better (4+ out of 5).
Slide 11 – The DTS: Key Headlines (2/2)
• Several specific service areas are perceived to have strengthened further:
  – The website
  – Quality of service
  – Notification of scheduled downtime
  – Feedback from user groups (both Electralink & User Group Reps)
  – Some web tools – D-FLOWMASTER & RECOLLECTION tools, & The REPORTS
• Criticisms and reduced ratings are isolated, but include:
  – Some other web tools – Statistical graphs
  – Reducing costs
  – Providing more communication
  – Gateway upgrades & hardware
• On a handful of the 42 areas rated, some customers still rate the service as poor, highlighting that there is always room for improvement.
• Awareness remains a barrier to uptake of many of Electralink's new initiatives, although this barrier has been substantially reduced over the past year.
Slide 12 – DTS: Overall Rating and Value For Money (chart)
% scoring 8-10 out of 10, 2007-2010. Overall Rating: 68 (2007), 87 (2008), 71 (2009), 85 (2010). Value For Money chart data labels: 60, 27, 30, 27.
Slide 13 – DTS: Rating Versus Other Organisations (chart)
% scoring 8-10 out of 10, 2007-2010, for Electralink, Elexon, Gemserv, OFGEM and National Grid. Electralink: 68 (2007), 87 (2008), 71 (2009), 85 (2010), 26 points ahead of the nearest benchmark in 2010.
Slide 14 – How DTS Compares With Other Services
% saying slightly/much better, shown for the total DTS sample, DTS CM's and DTS GOM's, on two measures: how the facilities compare with others, and how it compares for value for money* (chart data labels: 31, 29, 45, 35, 45, 43).
* Excludes don't knows (58% for the total sample, 61% for CM's and 55% for GOM's).
Slide 15 – Main Reasons For Satisfaction / Dissatisfaction: DTS Sample
Reasons given, split by the total DTS sample (48), those scoring 1-7 (7) and those scoring 8-10 (41):
• Generally happy with service (43% of the total sample)
• Quick service / prompt turnaround of problems
• Approachable / helpful
• Service efficient / professional / accurate
• Any miscellaneous negative comments
• Knowledgeable / provide necessary info / expert
• Issues with gateway upgrades / hardware
• Communication not so good / clear
NB: Mentions by 1 person (2%) not shown.
Slide 16 – DTS: Perceived Improvement Over Past 12 Months
% saying improved a little/a lot:
• Total DTS sample: Electralink service 21, value for money 15
• DTS CM's: Electralink service 32, value for money 19
• DTS GOM's: Electralink service 11, value for money 7
Slide 17 – Main Reasons Electralink Has Improved / Stayed The Same: DTS Sample
% of the total sample (48), those saying improved a lot/a little (10) and those saying stayed the same (38):
• Have not noticed any changes: total 42, improved 0, stayed the same 53
• Service is consistent / always good: total 25, improved 0, stayed the same 32
• Communication positive: total 6, improved 30, stayed the same 0
• Better response / more proactive: total 6, improved 30, stayed the same 0
• Technology improved: total 4, improved 20, stayed the same 0
• Have little contact with them / new to post: total 4, improved 0, stayed the same 5
• Don't know / no reason: total 6, improved 0, stayed the same 8
NB: Mentions by 1 person (2%) not shown.
Slide 18 – What Would Most Like Electralink To Improve On: DTS Sample
• No improvements necessary / just stay the same: 46%
• Reduce costs: 17%
• More communication / information: 8%
Base: Total DTS sample (48).
Selected verbatim comments:
"There is nothing I could say that they can improve upon. Any improvement that we can come across, they are always willing to listen and take it onboard and pass on to the user group for discussion and possible agreement."
"I think it is a very expensive service overall for the industry and unnecessary because of the public internet."
"Costs – generally to bring them down. I do not know how they can do this realistically, but obviously the cheaper the better."
"Proactive communication – telling us about things in advance. The first time I hear about anything is when I receive an invitation to a meeting – it would be nice to know about things before it reaches this stage."
Slide 19 – Satisfaction With Electralink Service: Total, CM's and GOM's
Mean score out of 10, 2010 (change vs 2009).
• Overall Rating: Total 8.60 (+0.54); CM's 8.46 (+0.50); GOM's 8.65 (+0.35)
• Overall Professionalism: Total 8.85 (+0.61); CM's 8.86 (+0.60); GOM's 8.77 (+0.40)
• Being Responsive: Total 8.56 (+0.80); CM's 8.50 (+0.87); GOM's 8.58 (+0.58)
• Being Easy To Deal With: Total 8.60 (+0.62); CM's 8.64 (+0.60); GOM's 8.52 (+0.39)
• Being Highly Efficient: Total 8.00 (+0.27); CM's 7.96 (+0.22); GOM's 7.94 (+0.04)
• Communicating Clearly: Total 8.42 (+0.48); CM's 8.36 (+0.62); GOM's 8.39 (+0.22)
• Understanding Your Business Needs: Total 7.98 (+0.31); CM's 7.71 (+0.30); GOM's 8.19 (+0.12)
• Providing Valuable Expertise Resource: Total 8.27 (+0.56); CM's 8.39 (+1.06); GOM's 8.16 (-0.04)
Base: Total DTS sample (48), CM's (28), GOM's (31).
Slide 20 – DTS: Overall Professionalism, Valuable Expertise Resource and Understanding Business Needs (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 21 – DTS: Being Highly Efficient and Being Responsive (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 22 – DTS: Being Easy To Deal With and Communicating Clearly (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 23 – How Facilities Provided Compare With Others (chart)
% saying slightly better / much better, 2009 versus 2010, for the total DTS sample, DTS CM's and DTS GOM's. Base: total DTS sample (48 in 2010, 49 in 2009).
Slide 24 – Average Rating Of Main Service Areas: DTS Sample
Mean score out of 5 in 2010, with the number of attributes rated and the change since 2009 (change compared only on ratings asked in both 2009 and 2010).
• All ratings: 42 attributes, mean 4.26, change +0.04 (42 attributes compared)
• DTS web tools: 9 attributes, mean 4.17, change +0.25 (9 compared)
• Electralink Helpdesk: 5 attributes, mean 4.49, change +0.06 (5 compared)
• The DTS itself: 5 attributes, mean 4.32, change +0.08 (5 compared)
• Electralink reporting tools: 4 attributes, mean 4.39, change +0.06 (4 compared)
• Electralink services: 6 attributes, mean 4.18, change +0.13 (6 compared)
• Gateway connection: 5 attributes, mean 4.20, change +0.07 (5 compared)
• EDS Helpdesk: 3 attributes, mean 4.31, change -0.17 (3 compared)
• Electralink website: 5 attributes, mean 4.11, change +0.10 (5 compared)
Base: All rating each attribute.
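As a rough illustration of how the service-area rows above can be derived from attribute-level means, here is a short sketch. The attribute names and values are hypothetical examples, not the full survey data.

```python
# Hypothetical attribute-level means grouped into service areas (illustration only).
areas_2010 = {
    "Electralink Helpdesk": {"Quality of response": 4.51, "Speed of response": 4.43},
    "Electralink website":  {"Ease of use": 4.22, "Kept up to date": 3.96},
}
areas_2009 = {
    "Electralink Helpdesk": {"Quality of response": 4.41, "Speed of response": 4.44},
    "Electralink website":  {"Ease of use": 3.92},  # one attribute not asked in 2009
}

for area, scores in areas_2010.items():
    mean_2010 = sum(scores.values()) / len(scores)
    # Change is compared only on attributes rated in both 2009 and 2010.
    common = [a for a in scores if a in areas_2009.get(area, {})]
    change = sum(scores[a] - areas_2009[area][a] for a in common) / len(common)
    print(f"{area}: {len(scores)} attributes, mean {mean_2010:.2f}, "
          f"change {change:+.2f} ({len(common)} compared)")
```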
Slide 25 – DTS: Key Service Changes Since 2009
Change in mean score since 2009:
• *Web tools – The REPORTS: +1.29
• *Web tools – D-FLOWMASTER: +0.63
• *Web tools – The RECOLLECTION tool: +0.60
• Ease of use of the Electralink website: +0.30
• The overall quality of service provided: +0.30
• The content of the daily gateway reports: +0.27
• Using terminology (on the website) that is easy to understand: +0.26
• *Web tools – MPAN Search Facility: +0.26
• Administering change requests efficiently: +0.24
• The quality of response you receive from the HP helpdesk: -0.21
• *Web tools – Statistical Graphs: -0.67
NB: All other changes were less than +/- 0.20.
* CAUTION: Low base size.
Slide 26 – Rating Of Electralink Helpdesk
Mean score out of 5, 2010 / 2009 / 2008:
• The quality of response you receive: 4.51 / 4.41 / 4.31
• The speed of response: 4.43 / 4.44 / 4.22
• Overall helpfulness: 4.64 / 4.53 / 4.33
• Receiving a consistent level of service regardless of how you get in touch: 4.52 / 4.42 / 4.21
• Getting consistent information and advice regardless of how you get in touch: 4.36 / 4.35 / 4.15
Average for DTS: 4.49 / 4.43 / 4.24.
Base: All Electralink DTS Helpdesk users – (43) in 2010, (32) in 2009, (37) in 2008.
Slide 27 – Rating Of HP Helpdesk*
Mean score out of 5, 2010 / 2009 / 2008:
• The quality of response you receive: 4.23 / 4.44 / 4.04
• The quality of service provided: 4.33 / 4.50 / 4.08
• The way HP manages fault situations: 4.36 / 4.50 / 3.92
Average for DTS: 4.33 / 4.48 / 4.01.
Base: All HP/EDS Helpdesk users (DTS) – (17) in 2010, (18) in 2009, (27) in 2008.
* Prior to 2010 this was the EDS Helpdesk.
Slide 28 – Rating Of Electralink Website: DTS Sample
Mean score out of 5, 2010 / 2009 / 2008:
• Overall usefulness: 3.86 / 4.00 / 3.86
• Ease of use: 4.22 / 3.92 / 3.76
• Being kept up to date: 3.96 / 3.95 / 4.00
• Using terminology that is easy to understand: 4.18 / 3.92 / 4.05
• Being clear and easy to login as a user: 4.33 / 4.24 / 3.95
Average for DTS: 4.11 / 4.01 / 3.92.
Base: All website users – (28) in 2010, (25) in 2009, (21) in 2008.
Slide 29 – Website Features Used: DTS Sample
% using each feature, and mean rating of the feature (2010 / 2009 / 2008):
• The MPAN Search Facility: 42% used; 4.35 / 4.09 / 4.09
• The RESUBMISSION Tool: 21% used; 4.50 / 4.50 / 4.67
• The ACMT: 35% used; 4.06 / 3.92 / 4.27
• The Web Tools User Guide & Context Sensitive Help: 42% used; 3.85 / 4.00 / 3.92
• The AUDIT Facility: 54% used; 4.27 / 4.14 / 4.15
• The RECOLLECTION Tool: 21% used; 4.60 / 4.00 / 5.00
• D-FLOWMASTER: 17%** used; 3.63 / 3.00 / 4.00
• THE REPORTS: 15%** used; 4.29 / 3.00 / 4.00
• Statistical Graphs: 15%** used; 4.00 / 4.67 / N/A
Average for DTS: 4.17 / 3.92 / 4.26.
Base: Total DTS sample (48); feature users (various).
** Caution: low base size.
Slide 30 – Training Sessions For Users Of The Web Tools Applications
• 25% of all DTS users claim to have attended a Web Tools training session.
• 60% would be interested in attending similar workshops in the future.
Slide 31 – Rating Of Electralink Services
Mean score out of 5, 2010 / 2009 / 2008:
• Administering change requests efficiently: 4.32 / 4.08 / 4.17
• Managing DTS fault situations: 4.30 / 4.21 / 4.34
• The quality of written communications: 4.13 / 4.19 / 4.19
• The content & format of newsletters: 4.05 / 3.90 / 3.75
• Being proactive in suggesting improvements: 3.84 / 3.80 / 3.75
• Overall quality of service provided: 4.44 / 4.14 / 4.18
Average for DTS: 4.18 / 4.05 / 4.06.
Base: Total DTS sample – (48) in 2010, (49) in 2009, (46) in 2008.
Slide 32 – Rating Of The DTS Itself
Mean score out of 5, 2010 / 2009 / 2008:
• Quality of info in the DT Handbook: 4.28 / 4.15 / 4.05
• Being able to meet needs of current business: 4.46 / 4.45 / 4.33
• Being able to cope with needs of future business: 4.23 / 4.19 / 4.19
• Value for money provided by the DTS: 4.45 / 4.35 / 4.36
• DTS overall: 4.20 / 4.07 / 4.03
Average for DTS: 4.32 / 4.24 / 4.19.
Base: Total DTS sample – (48) in 2010, (49) in 2009, (46) in 2008.
Slide 33 – Rating Of Gateway Connection
Mean score out of 5, 2010 / 2009 / 2008:
• Quality of service from Gateway connection: 4.47 / 4.37 / 4.21
• Providing a data transfer and management service that will keep pace with technology: 4.09 / 3.93 / 3.83
• Overall capacity of the Gateway connection: 4.39 / 4.23 / 4.38
• Having flexibility to integrate Gateway with existing systems: 3.93 / 3.98 / 4.00
• Providing the latest, up to date software: 4.11 / 4.12 / 3.83
Average for DTS: 4.20 / 4.13 / 4.12.
Base: Total DTS sample – (48) in 2010, (49) in 2009, (46) in 2008.
Slide 34 – How Effective Notifications And The Notification Period For Scheduled Service Downtime Are Found
% quite effective / very effective (net effective):
• 2010: Total DTS sample 29 / 65 (94); DTS CM's 32 / 64 (96); DTS GOM's 26 / 65 (90)
• 2009: Total 49 / 45 (94); CM's 41 / 52 (93); GOM's 53 / 40 (93)
• 2008: Total 30 / 67 (97); CM's 27 / 73 (100); GOM's 29 / 67 (96)
Base: Total DTS sample.
Slide 35 – Rating Of Electralink Reporting Tools
% using each tool, and mean rating 2010 / 2009 / 2008:
• Content of monthly service reports: 56% use; 4.11 / 4.14 / 4.08
• *Quality of Electralink billing info: 21%** use; 4.50 / 4.50 / 4.17
• Content of daily gateway reports: 15% use; 4.57 / 4.30 / 4.15
• The Audit Tool: 50% use; 4.38 / 4.36 / 4.16
• None used: 21%
Average for DTS: 4.39 / 4.33 / 4.14.
Base: Total DTS sample (48); reporting tool users (various).
* Asked of CM's only (28).
** CAUTION: low base.
Slide 36 – How Well Electralink Provides Feedback On Topics Discussed And Issues Raised At The DTS User Group
25% are elected members of the DTS user group in 2010, compared to 10% in 2009 and 15% in 2008.
2010 ratings: very well 42%, quite well 46%, not very well 10%, not at all well 2%.
Base: Total DTS sample – (48) in 2010, (49) in 2009, (46) in 2008.
Slide 37 – How Well The User Group Representative Provides Feedback On Topics Discussed And Issues Raised At The DTS User Group
• 2010: very well 42%, quite well 38%, not very well 13%, not at all well 8%
• 2009: very well 20%, quite well 53%, not very well 20%, not at all well 6%
• 2008: very well 26%, quite well 46%, not very well 24%, not at all well 4%
Base: Total DTS sample – (48) in 2010, (49) in 2009, (46) in 2008.
Slide 38 – Awareness & Claimed Uptake Of Initiatives Introduced In The Past Two Years
For each initiative, respondents were classed as already using, aware & planning to use, aware but not planning to use, or unaware. Initiatives covered:
• Bulletin board on Web Tools
• Ability to restore routing data through ACMT
• Ability to extract routing info. in CSV format from the ACMT tool
• Ability to display a list of all MPANs in a single data file
• Ability to deliver acknowledgement files to a different directory
• Introduction of 'Admin' accounts on Web Tools
• New 'STATS' graphs on Web Tools
• Flexible filenaming
Slide 39 – Claimed Uptake Of Initiatives Introduced In The Past Two Years
% already using each of the initiatives listed on slide 38, 2010 versus 2009.
Slide 40 – Awareness Of Initiatives Introduced In The Past Two Years
% aware of each of the initiatives listed on slide 38, 2010 versus 2009.
Slide 41 – DTS: Suggested Improvements / Changes
• A number of areas have either been directly raised as criticisms, or have seen lower ratings than last year:
  – Some of the web tools – Statistical graphs
  – The need to reduce costs further
  – Providing more communication
  – Gateway upgrades & hardware
• There is still room to extend awareness and uptake of all of the recent new initiatives.
• There are several areas where 4% or more rate the service as poor:
  – Keeping the website up to date
  – Using terminology on the website that is easy to understand
  – Being proactive in suggesting improvements
  – Quality of written communications
  – Being able to cope with the business needs of the future
  – Having the latest, up to date software for Gateway connections
  – Having the flexibility to integrate the Gateway with existing systems
Slide 42 – Presentation Coverage (repeat of slide 2)
Slide 43 – DCUSA/SPAA: Key Headlines (1/2)
• The high satisfaction levels already reached have been maintained again this year, with a strong sense that service is improving – particularly the staff, website and communications:
  – Electralink's 8+ score is still 29% ahead of the nearest benchmark (Elexon)
  – The value for money score is up to 47% (but still below the 2007 peak of 61%)
• Customers see your strengths as the helpful, efficient and prompt service from staff. Ratings of responsiveness and understanding of service support requirements have improved further.
• These are well aligned with what they see as the necessary qualities of a code administrator:
  – Quality of service
  – Knowledge
  – Quality of written work
  Demonstrating industry influence and being easy to work with matter much less.
Slide 44 – DCUSA/SPAA: Key Headlines (2/2)
• At an average rating of 4.4 out of 5, customers already rate Electralink highly on the 29 specific service areas covered. This is most strongly supported by the helpdesk and the finance & auditing activities.
• There are no major changes in these 29 ratings since 2009.
• Of these, only two (both related to the website) receive more than 3% rating the service poor:
  – Ease of using the website (6%)
  – Overall usefulness of the website (4%)
Slide 45 – SPAA/DCUSA: Overall Rating and Value For Money (chart)
% scoring 8-10 out of 10, 2007-2010. Overall Rating: 79 in 2010 (other years' chart labels: 84, 75, 77). Value For Money: 61 (2007), 38 (2008), 38 (2009), 47 (2010).
Slide 46 – SPAA/DCUSA: Rating Versus Other Organisations (chart)
% scoring 8-10 out of 10, 2007-2010, for Electralink, Elexon, Gemserv and OFGEM. In 2010 Electralink scores 79, 29 points ahead of the nearest benchmark (Elexon, 50).
Slide 47 – Main Reasons For Satisfaction / Dissatisfaction: SPAA/DCUSA Sample
% of the total sample (72), those scoring 1-7 (16) and those scoring 8-10 (62):
• Service efficient / professional / accurate: total 43, score 1-7: 13, score 8-10: 50
• Generally happy with service: total 40, score 1-7: 60, score 8-10: 35
• Quick service / prompt turnaround of problems: total 23, score 1-7: 12, score 8-10: 26
• Approachable / helpful: total 23, score 1-7: 13, score 8-10: 26
• Knowledgeable / expert: total 8, score 1-7: 0, score 8-10: 10
• Negative website comments: total 8, score 1-7: 13, score 8-10: 6
• Lack of knowledge / some staff not qualified: total 6, score 1-7: 20, score 8-10: 3
• Communication good / clear: total 5, score 1-7: 0, score 8-10: 6
NB: Mentions by 1 person (1%) not shown.
Slide 48 – SPAA/DCUSA: Perceived Improvement Over Past 12 Months
% improved a little/a lot: Electralink service 22, value for money 4. % got worse a little/a lot: 3 and 1 respectively.
No single reason for improvement was given by more than 1-2 people.
Slide 49 – Main Reasons Electralink Has Improved / Stayed The Same: SPAA/DCUSA Sample
% of the total sample (72), those saying improved a lot/a little (16) and those saying stayed the same / got worse (61):
• Have not noticed any changes: total 31, improved 0, stayed the same 39
• Service is consistent / always good: total 25, improved 6, stayed the same 30
• Have little contact with them / new to post: total 17, improved 0, stayed the same 21
• Staff – positive: total 8, improved 31, stayed the same 2
• Website improved: total 5, improved 25, stayed the same 0
• Communications – positive: total 3, improved 13, stayed the same 0
• Easy to work with: total 3, improved 6, stayed the same 2
• Staff – negative and don't know: 2% or less of the total sample
NB: Mentions by 1 person (1%) not shown.
Slide 50 – What Would Most Like Electralink To Improve On: SPAA/DCUSA Sample
• No improvements necessary / just stay the same: 43%
• Improve website / non-user friendly: 18%
• Increase knowledge: 9%
• Smaller mentions (4-9%): improve documentation; be a spokesman for the industry / improve influence; increase the speed of putting things through / updating change documents
Base: Total SPAA/DCUSA sample (72).
Selected verbatim comments:
"Nothing, just to maintain their standards and not lower them at all."
"The website. I believe everything is there, but it is just trying to find it. So to improve on the navigation."
"Knowledge within individuals – when something has to be dealt with quickly & effectively you want someone with knowledge. This is regarding SPAA & the meetings. Sometimes you leave uncertain about something, and feel that you have got to go and find out more about it."
"They need to blow their own trumpet a bit louder and get out there a bit more – maybe a spokesman."
"Try to help parties develop better variations – better documentation."
Slide 51 – SPAA/DCUSA: Overall Professionalism and Understanding Service Support (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 52 – SPAA/DCUSA: Being Highly Efficient and Being Responsive (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 53 – SPAA/DCUSA: Being Easy To Work With and Communicating Clearly (trend chart)
% scoring 8-10 out of 10, 2007-2010.
Slide 54 – Satisfaction With Electralink Service: SPAA/DCUSA Sample
Scale: 1 = not at all satisfied, 10 = extremely satisfied. 2010 % scoring 8-10 and mean score:
• *Being knowledgeable: 76%, mean 8.14
• *Demonstrating industry influence: 40%, mean 6.60
• *Quality of written work on product: 64%, mean 7.82
• *Quality of their staff: 81%, mean 8.43
Base: Total SPAA/DCUSA sample (72).
* New statements added in 2010.
Slide 55 – Average Rating Of Main Service Areas: SPAA/DCUSA Sample
Mean score out of 5 in 2010, with the number of attributes rated and the change since 2009 (change compared only on ratings asked in both 2009 and 2010).
• All ratings: 27 attributes, mean 4.44, change +0.09 (26 attributes compared)
• Finance & auditing: 8 attributes, mean 4.58, change -0.02 (7 compared)
• Management of …..: 8 attributes, mean 4.32, change +0.06 (8 compared)
• Helpdesk for SPAA/DCUSA: 5 attributes, mean 4.57, change +0.06 (5 compared)
• SPAA/DCUSA website: 6 attributes, mean 4.14, change +0.11 (6 compared)
Base: All rating each attribute.
Slide 56 – SPAA/DCUSA: Key Service Changes Since 2009
Change in mean score since 2009:
• Management of user access privileges (website): +0.23
• Ease of use of the website: +0.22
• Efficiency with which agrees & confirms meetings: +0.22
• Handling of company secretarial matters: -0.21
• Provision of meeting facilities at Electralink: -0.22
NB: All other changes were less than +/- 0.20.
Slide 57 – Rating Of The Helpdesk For SPAA/DCUSA
Mean score out of 5, 2010 / 2009 / 2008:
• The quality of response you receive: 4.51 / 4.41 / 4.30
• The speed of the response: 4.55 / 4.41 / 4.38
• Overall helpfulness: 4.75 / 4.67 / 4.35
• Receiving a consistent level of service regardless of how you get in touch: 4.51 / 4.54 / 4.22
• Getting consistent information and advice regardless of how you get in touch: 4.54 / 4.50 / 4.22
Average for SPAA/DCUSA: 4.57 / 4.51 / 4.29.
Base: All Helpdesk users – (53) in 2010, (27) in 2009, (39) in 2008.
Slide 58 – Rating Of SPAA/DCUSA Website
Mean score out of 5, 2010 / 2009 / 2008:
• Overall usefulness: 4.03 / 3.98 / 4.13
• Ease of use: 3.82 / 3.60 / 3.77
• Being kept up to date: 4.26 / 4.18 / 4.25
• Management of user access privileges: 4.25 / 4.02 / 4.19
• The accuracy of content: 4.34 / 4.25 / 4.33
• *The accuracy of party details on the website: 4.13 / 4.13 / 4.00
Average for SPAA/DCUSA: 4.14 / 4.03 / 4.11.
Base: All SPAA/DCUSA website users (68).
* Wording changed from 'The accuracy of membership details on the website' since 2009.
Slide 59 – Rating Of SPAA/DCUSA Website, 2010: SPAA Versus DCUSA
Mean score out of 5, SPAA / DCUSA:
• Overall usefulness: 3.94 / 4.10
• Ease of use: 3.66 / 3.93
• Being kept up to date: 4.16 / 4.31
• Management of user access privileges: 4.14 / 4.32
• The accuracy of content: 4.19 / 4.49
• *The accuracy of party details on the website: 3.90 / 4.35
Average: SPAA 4.00, DCUSA 4.25.
Base: All SPAA/DCUSA website users (68).
* Wording changed from 'The accuracy of membership details on the website' since 2009.
Slide 60 – Rating Of SPAA/DCUSA Finance & Auditing Services (1)
Mean score out of 5, 2010 / 2009 / 2008:
• Quality of monthly management accounts: 4.81 / 4.64 / 4.82
• Ensuring financial controls are in place: 4.60 / 4.60 / 4.50
• Management of year-end audit process & AGM: 4.60 / 4.64 / 4.90
• Managing overall financial control: 4.59 / 4.55 / 4.55
• How well support the Financial & Audit Committee: 4.64 / 4.60 / 4.78
Base: Active SPAA/DCUSA board members (18).
Slide 61 – Rating Of SPAA/DCUSA Finance & Auditing Services (2)
Mean score out of 5, 2010 / 2009 / 2008:
• *The handling of company secretarial matters: 4.61 / 4.82 / N/A
• **How efficiently Electralink supports the assessment of financial & commercial risk: 4.31 / 4.45 / 4.40
• ***Input & support in determining/drafting the annual budget: 4.47 / N/A / N/A
Average for SPAA/DCUSA (all finance & auditing attributes): 4.58 / 4.61 / 4.66.
Base: Active SPAA/DCUSA board members (18).
* New statement in 2009.
** Wording changed since 2009.
*** New statement in 2010.
Slide 62 – Rating Of Electralink's Management Of SPAA/DCUSA
Mean score out of 5. The eight attributes rated:
• Quality of SPAA/DCUSA meeting agendas & minutes
• Support of production & circulation of meeting papers
• Efficiency in managing the SPAA/DCUSA change process (not agreement itself)
• Overall quality of service in managing SPAA/DCUSA
• Efficiency in operating the annual voting system
• Provision of meeting facilities at Electralink
• Quality of SPAA/DCUSA meeting papers
• Efficiency with which agrees & confirms meetings
Individual 2010 means range from 4.20 to 4.39; the largest fall is for provision of meeting facilities at Electralink (4.54 in 2009 to 4.32 in 2010).
Average for SPAA/DCUSA: 4.32 / 4.26 / 4.18 (2010 / 2009 / 2008).
Base: Total SPAA/DCUSA sample (71).
Slide 63 – Qualities Important For A Code Administrator To Demonstrate: SPAA/DCUSA Sample (Spontaneous Mentions)
• Accuracy: 18
• Being knowledgeable: 13
• Fairness / neutrality: 10
• Efficiency: 10
• Timeliness: 4
• Consistency: 4
• Helpful / approachable: 4
• Communicating (clearly): 4
• Quality of service: 4
• Reliability: 3
• Good organisation skills: 3
• Assess impact on other industries: 3
• Consider all facts / details: 3
• Being responsive: 3
• Don't know: 6
Other mentions by 1 person only.
Base: SPAA/DCUSA sample (72).
Slide 64 – Qualities Important For A Code Administrator To Demonstrate: SPAA/DCUSA Sample (Prompted With A List)
96% say that it is important to have continuity of service provision from the code administrator.
% rating each quality most important / in their top 3:
• Overall quality of service: 32 / 60
• Being knowledgeable: 18 / 53
• Quality of written work: 15 / 39
• Overall professionalism: 14 / 28
• Quality of their staff: 8 / 28
• Value for money: 4 / 25
• Being easy to work with: 3 / 15
• Being responsive: 1 / 35
• Demonstrating industry influence: 0 / 8
• Something else: 1 / 3
• None in particular: 1 / 3
Base: SPAA/DCUSA sample (72).
Slide 65 – Importance Versus Performance Matrix
Each prompted quality is plotted with performance (% scoring 8+ out of 10) on the horizontal axis and importance (% placing it in their top 3) on the vertical axis, giving four quadrants:
• High importance, low performance – high priority to address/improve
• High importance, high performance – high priority to maintain performance
• Low importance, low performance – low priority need
• Low importance, high performance – potential to exploit existing strength
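A small sketch of the quadrant logic described above is given below. The threshold values used to split "high" from "low" are illustrative assumptions, not figures taken from the report.

```python
# Illustrative quadrant classification (thresholds are assumptions, not from the report).
IMPORTANCE_CUT = 30   # % placing the quality in their top 3
PERFORMANCE_CUT = 65  # % scoring 8+ out of 10

def quadrant(importance, performance):
    """Return the matrix quadrant for one quality."""
    if importance >= IMPORTANCE_CUT:
        return ("High priority to maintain performance" if performance >= PERFORMANCE_CUT
                else "High priority to address/improve")
    return ("Potential to exploit existing strength" if performance >= PERFORMANCE_CUT
            else "Low priority need")

# Examples using published figures from slides 54 and 64:
# Being knowledgeable: importance 53 (top 3), performance 76 (% scoring 8+).
print(quadrant(53, 76))   # High priority to maintain performance
# Demonstrating industry influence: importance 8, performance 40.
print(quadrant(8, 40))    # Low priority need
```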
Slide 66 – Importance Versus Performance Matrix: Plotted
The nine prompted qualities plotted on the matrix. Quality of service and being knowledgeable sit highest on importance; quality of written work and being responsive come next; quality of staff, value for money and professionalism sit lower; being easy to work with and demonstrating industry influence are lowest on importance.
Base: SPAA/DCUSA sample (72).
Slide 67 – Additional Services Would Like To See Electralink Provide (Spontaneous)
92% felt there was nothing they particularly wanted Electralink to provide going forward. Only 3 points were made by more than one person:
• 3 mentioned involvement in new codes (1 mentioning ROC)
• 2 mentioned smart metering (but also acknowledged that they were already doing something)
• 2 mentioned service-related issues, rather than new roles
Slide 68 – DCUSA & SPAA: Suggested Improvements
• Although some feel it has improved, there is some indication of room for improvement on the website, with two areas rated poor by more than 3%:
  – Ease of using the website (6%)
  – Overall usefulness of the website (4%)
• Additionally, feedback on the SPAA website is not yet as strong as on the DCUSA website, with the gap greatest in terms of accuracy.
• Beyond this, there is some call for improvements in documentation and staff knowledge.
Slide 69 – Presentation Coverage (repeat of slide 2)
Slide 70 – Considerations For The Qualitative Phase (1)
There are several issues uncovered by the quantitative research that can be understood in more detail through the qualitative phase:
• DTS – investigate several specific themes highlighted in the research:
  – Website content being up to date, and the terminology used
  – Quality & quantity of communication – why is there still a lack of awareness of new initiatives amongst a substantial number, and how can this gap be reduced?
  – Gateway connections – up-to-date software & integrating with existing systems
  – The future role of Electralink in the industry, and taking a more proactive position
• SPAA/DCUSA – explore a number of areas in more depth:
  – Website ease of use / navigation
  – SPAA website – accuracy of content versus DCUSA
  – Staff knowledge (depth of, and contingency)
  – Written documentation
Slide 71 – Considerations For The Qualitative Phase (2)
We recommend that Electralink review the verbatim comments in detail with two purposes in mind:
• Identifying general issues to be covered in the discussion guide with all/most respondents
• Identifying individuals who have had specific experiences, or hold specific opinions, that we wish to investigate further. Assuming they have agreed to participate in the qualitative phase, we can then recruit them (their identity may still remain anonymous to Electralink unless they give permission otherwise).
A rapid decision will need to be made on the latter in order for them to be included in recruitment for interviews from w/c 18th October.
Closing slide – 2010 Customer Survey: Main Quantitative Report, 11th October 2010