
Contract management – new approaches
Measuring Contractor Performance
Nick Capon
Centre for Enterprise Research and Innovation
www.ceri.ac.uk
Acknowledgements to Stephen DeBoise,
Portsmouth City Council
www.portsmouth.gov.uk
Scope
• Apologies for absence of Richard Tonge
• Recent research at Portsmouth City Council
• Current methods they use
• Strengths, weaknesses
• Proposed improvements
Definitions?
• Before an order is placed
– ‘Contractor Evaluation’
– Pre-Qualification Questionnaire (PQQ)
• After an order is placed
– ‘Contractor Appraisal’, or
– ‘Vendor Rating’
Benefits of measurement?
[Diagram: value flows between the customer, your organisation, the contractor and other stakeholders, with aims and priorities aligned across them]
• Joint understanding of customer needs
• Motivating improvement
• Cost of control minimised
• Reduced waste and complaints
• Benchmarking of contractor performance
• Demonstrating control of the vendor base.
Challenges – excuses?
• Workload – labour and data intensive
• Subjectivity – is there evidence? Do rewards/penalties cause bias? Do the measures recorded depend on how they are explained?
• Motivation – measures can create argument
rather than benefit
• Historical nature – slow
• Investment needed to mechanise data
collection and communication
Theory – QTCC
• If the contractor is not critical to continued success, measure:
– Quality
– Time
– Cost
– Communication
Theory – 7’C’
• If the contractor is critical, measure:
– Competency
– Cost
– Control
– Consistency
– Cash resources
– Commitment
– Capacity
“Measure for Measure.” Supply Management, 1 February 2001, 39
What to measure?
• Or the same QTCC categories in greater detail:
– Cost: prices, target costs, non-performance costs, savings achieved per year
– Quality: end-customer complaints and feedback, SPC capability analysis and SPC trend reporting, SLA achievement, documented service design improvements
– Time: source reliability, staff turnover, compliance with procedures, financial stability, total workload for us as a % of total turnover (<30%), on time
– Communication: relationships, understanding of needs and values, communication delays.
Purchasing Principles and Management, Bailey and Farmer, 2005
What do we measure now?
• Outputs – achieve specification
– Organisation concern: consistency
– Contractor concern: trends, review
• Outcomes – survey of clients
– Organisation concern: difficult
– Contractor concern: low response
• Process – would we work with you again?
– Organisation concern: ethics, innovation
– Contractor concern: must compare
What would we like to measure?
Contractor
1. Outputs, compliance with
specification
2. Sustainability, continuity
3. Value for money
4. Innovation
5. Competition
6. Partnership, shared values
7. Skills, best practice
Organisation
1. Partnership, shared values
2. Outcomes for service users
3. Communication, trust
4. Understanding of needs
5. Value for money
6. Sustainable company
What would help?
What can the organisation do to help the contractor, and what can the contractor do to help the organisation – at need identification, ITT, PQQ, tender, SLA and contract review?
• Time to plan
• Information sharing, also between departments within the organisation
• Reduce complexity
• Transparency
• Stable, agreed expectations
• Feedback, clarity
• Soft market testing to stimulate new suppliers
• Willingness to change measures to suit
Conclusion
Measure of Overall Satisfaction
(9 Excellent, >6 Good, >0 Improvement required, <0 Failing to perform)
• Customer perception (Outcomes): satisfaction survey >90% / >75% / >60% / >0, plus complaints low – score 3 / 2 / 0 / -1
• Contract requirements (Outputs, service levels defined in specifications): exceeds / meets / mostly / none – score 3 / 2 / 0 / -1; evidence required – contractor data, plus periodic independent check
• Process expectations (subjective): exceeds / meets / mostly / none – score 3 / 2 / 0 / -1; ‘Least good at, best at…’
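For illustration only, the banding above can be expressed as simple arithmetic. The minimal Python sketch below (function and constant names are our own assumptions, not part of the proposed scheme) totals the three category scores and maps the result to the overall bands; a total of exactly 0 is not banded on the slide, so the sketch treats it as ‘Improvement required’.

```python
# Minimal sketch of the overall-satisfaction banding described above.
# Each category (outcomes, outputs, process) is scored 3, 2, 0 or -1.
# Names and the handling of a total of exactly 0 are assumptions.

CATEGORY_SCORES = {"exceeds": 3, "meets": 2, "mostly": 0, "none": -1}

def outcome_score(survey_pct: float) -> int:
    """Customer perception (outcomes): score from the satisfaction survey %."""
    if survey_pct > 90:
        return 3
    if survey_pct > 75:
        return 2
    if survey_pct > 60:
        return 0
    return -1

def overall_band(outcomes: int, outputs: int, process: int) -> str:
    """Band the total: 9 Excellent, >6 Good, >0 Improvement required, <0 Failing."""
    total = outcomes + outputs + process
    if total == 9:
        return "Excellent"
    if total > 6:
        return "Good"
    if total > 0:
        return "Improvement required"
    if total < 0:
        return "Failing to perform"
    return "Improvement required"  # total of 0: not banded on the slide (assumption)

# Example: 82% satisfaction survey (score 2), outputs 'exceeds' (3), process 'meets' (2).
print(overall_band(outcome_score(82), CATEGORY_SCORES["exceeds"], CATEGORY_SCORES["meets"]))
# -> "Good" (total 7)
```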
How to measure?
• Check goal alignment
– What does contractor think are your priorities?
• Remove your role and allow the end customer to communicate directly with the contractor where possible
– Examples: website feedback from customers, a measure plus a customer quote
– Travel agent websites reviewing hotels
• Self assessment by contractor of trends
– Encourages involvement
– Reduces workload
– SPC; trends are more important than KPIs
• Independent assessor for depth
– If customer is not web literate
– Example: Help the Aged to assess Care Homes
• 360 degree feedback
Constraints
• Resources in Organisation
– to create three appropriate measures for each
contract
– a simple, transparent database to update results (a minimal sketch follows this list)
• Training for Contractors
• Template for contractors to provide
information
• Sustaining
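One possible reading of the ‘simple transparent database’ constraint is sketched below. This is purely illustrative and assumes SQLite with made-up table and column names; it is not Portsmouth City Council’s actual system.

```python
# Minimal sketch of a 'simple transparent database' for vendor-rating
# results, using SQLite from the Python standard library. The schema,
# names and sample row are illustrative assumptions only.
import sqlite3

conn = sqlite3.connect("vendor_rating.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS rating (
        contract_id TEXT NOT NULL,      -- contract reference
        period      TEXT NOT NULL,      -- e.g. '2024-05' for a monthly report
        measure     TEXT NOT NULL,      -- 'outcomes', 'outputs' or 'process'
        score       INTEGER NOT NULL,   -- 3, 2, 0 or -1
        evidence    TEXT,               -- supporting evidence / comment
        PRIMARY KEY (contract_id, period, measure)
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO rating VALUES (?, ?, ?, ?, ?)",
    ("C-0001", "2024-05", "outputs", 2, "SLA met; one late delivery rectified"),
)
conn.commit()
```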
Pilot testing - Outcomes
• Volume of valid complaints
– compared to a target agreed with the contractor.
• Asking customers
– ‘How likely are you to raise/recommend us to a
friend (0-100%)?’
– if required: ‘What extra should we have done to get a 100% score?’
• Method
– User surveys, comparison with benchmarks (a brief aggregation sketch follows)
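As a purely illustrative aggregation (function names, thresholds and sample figures below are assumptions, not pilot data), the sketch summarises the 0–100% responses and checks the volume of valid complaints against the agreed target.

```python
# Minimal sketch of aggregating the pilot outcome measures described above.
# Variable names and sample data are illustrative assumptions.

def outcome_summary(responses: list[float], complaints: int, complaint_target: int) -> dict:
    """Summarise survey responses (0-100%) and complaint volume versus target."""
    avg = sum(responses) / len(responses) if responses else 0.0
    return {
        "average_likelihood_pct": round(avg, 1),
        "respondents": len(responses),
        "valid_complaints": complaints,
        "within_complaint_target": complaints <= complaint_target,
    }

# Example: five survey responses, three valid complaints against an agreed target of five.
print(outcome_summary([90, 75, 100, 60, 85], complaints=3, complaint_target=5))
```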
Pilot testing - Outputs
• Meet timescale:
– Non-achievement, missed/late deliveries
– Rectification response time
• Quality:
– Capability, maintaining adequate resources
and skills
– Work completed in sufficient detail
– Safety, environment, discrimination
• Price:
– Variations to contract pricing
– Number of cost saving improvements
Pilot testing - Process
• Problem resolution, including 360° feedback
• Communication response
• Invoice accuracy
• Technical innovation
• Financial stability ongoing
• Cultural ethos/values same as ours
• Subjective assessment, with evidence
Illustrative current practice – Outcomes
[Bar chart: % who measure this, for Complaints (reactive, volume) and Satisfaction (proactive, %)]
Methods:
– Complaints: unsolicited praise/criticism by letter, newspaper or telephone; real-time updates of a shared web database; minuted monthly if serious
– Satisfaction: proactive survey
Illustrative current practice – Outputs
[Bar chart: % who measure this, for: meet timescale; rectification response time; non-achievement, missed deliveries; maintaining adequate resources and skills (quality); work completed in sufficient detail; number of cost saving improvements; safety, environment, discrimination; financial overspend, changes/variations to contract price]
Methods:
– Self assessment by contractor
– Some use SPC to monitor trends
Illustrative current practice – Process
[Bar chart: % who measure this, for: problem resolution including 360° feedback; communication response; invoice accuracy; technical innovation; financial stability ongoing; cultural ethos same as ours]
Methods:
– Verbal dialogue
– Monthly minuted discussion
How reported?
• Monthly report, face to face discussion
• Geographical analysis to direct improvement
action
• SPC charts to highlight significant issues
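To illustrate what such an SPC check could look like, here is a simplified individuals-style sketch using the sample standard deviation rather than the usual moving-range estimate; the data and names are invented for illustration, not council figures.

```python
# Minimal sketch of an SPC-style check: flag points outside mean +/- 3 sigma.
# Data and names are illustrative assumptions only.
from statistics import mean, stdev

def out_of_control(values: list[float]) -> list[int]:
    """Return indices of points falling outside mean +/- 3 sigma."""
    centre = mean(values)
    sigma = stdev(values)
    upper, lower = centre + 3 * sigma, centre - 3 * sigma
    return [i for i, v in enumerate(values) if v > upper or v < lower]

# Example: monthly valid-complaint counts; the final month is unusually high.
monthly_complaints = [4, 5, 3, 6, 4, 5, 4, 3, 5, 4, 6, 18]
print(out_of_control(monthly_complaints))  # -> [11]
```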
Action resulting?
• Financial sharing of improvements/
penalties for failure
• Focus for improvement action
• Planned transparent sharing of vendor rating results (via an IT system) with the rest of the organisation, which may buy from the same contractor
Questions?