Auditing SMEs


Organisation for Economic Co-operation and Development
Risk Management and Taxpayer Service
19. Performance measurement
Kampala, 17 – 21 May 2010
Centre for Tax Policy and Administration
Objectives: setting targets and measuring performance
4 broad objectives:
Managing efficiency and effectiveness
Improving decision making in the budget process
Improving external transparency and accountability
Achieving savings
Measurement: a challenging task
Setting clear objectives requires agreement on what the mission is
Stakeholders can have different interests
Finding accurate measures of performance
Outputs and outcomes: outcomes are technically more difficult to measure (complexity, interrelations, time lags, causality)
Establishing and maintaining systems of data collection can be complex and costly
Target levels (not too high or too low)
A realistic number of targets and measurements
Increasing focus on the outcomes to be achieved by Tax Administrations
Direct and indirect measures of taxpayers’ compliance
Measures reflecting the quality of services delivered to taxpayers
Reductions in taxpayers’ compliance burden
Measures reflecting the level of taxpayer satisfaction with, and confidence in, the Tax Administration
(FTA guidance note: Monitoring Taxpayers’ Compliance: A Practical Guide Based on Revenue Body Experience)
Planning and control
Planning
Central level: Policy plan (5 years)
Directorate: Annual plan
Office: Office plan
Team: Team plan
Employee: Personal plan
Planning (2)
Communicate the objectives to be achieved by the organization as a whole, including the expected contribution of the employees.
Control
Control is the process that ascertains whether, and to what degree, objectives are being achieved and which risks are being encountered.
Stakeholders
Minister of Finance
Treasury
Parliament
International organizations
Judiciary
Tax consultants and accountants
Taxpayers
Employees
Government audit committee
Other government agencies
Control: levels and perspectives
Strategic, management and operational control, aimed at outputs and outcomes
Process control, directed at functions and processes
Financial control, aimed at transactions and flows of money
Quality control
Developing the control function
Scorekeeper
Financial controller
Management controller
Quality assurance
Quality assurance
Assessing work by / on behalf of the team manager
Assessing work by technical experts
Supervision by most experienced employee (mentor)
Coaching
Supervision within specialist groups
Critical success factors
Critical success factors are factors that are of decisive
importance to the success of the organization as a whole
Critical success factors stem from the strategy
Performance indicators measure the realization of the
strategic goals
Examples of critical success factors
Critical success factor / Performance indicator
- Collection maximization: pay as you earn; cash flow pace
- Productivity: resources / production
- Carefulness: objections / assessments
Critical success factors / Performance indicators (2)
- Completeness: coverage by contra-information; number of detected taxpayers
- Satisfaction / acceptance: appreciation of the performance by taxpayers
- Confidentiality: number of cases where confidentiality is contested
Critical success factors / Performance indicators (3)
- Equity: number of cases where equity is contested
- Fraud prevention: number of audits; number of proven fraud cases; assessments with corrections; amount of corrections; public perception of compliance/non-compliance, including fraud
Critical success factors / Performance indicators (4)
- Product quality: accepted corrections; court decisions in favour of the revenue body
- Process quality: average throughput time of assessments, appeals and collections
- Efficiency: costs / tax revenue
Critical success factors / Performance indicators (5)
- Staff performance: number of staff with above-average performance; sickness percentage; mismatches; staff satisfaction
PERFORMANCE MANAGEMENT
Business Balanced Scorecard
Financial perspective: results of law enforcement
Innovation perspective: developing the organization
Process perspective: process control
Market perspective: valued by stakeholders
Business Balanced Scorecard
Enforcement results
Compliance
Control of the cash flow
Risk control
Unity of policy and implementation
Effective fraud control
Protection of society from unwanted goods
Business Balanced Scorecard
Developing Organization
Implementation of new technologies
Implementation of new methods
Raising the level of professionalism of employees
Intensification of knowledge management
Business Balanced Scorecard
Process Control
Cost control in the logistic process
Optimization of the data management
Realization of products and services
Optimization of efficiency and flexibility
Optimization of information security
Business Balanced Scorecard
Valued by stakeholders
Integrated and current customer management
Accessible and fast supply of information
Decrease of costs and burden for taxpayers
Competitive terms of employment and a challenging working environment
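
As a rough illustration of how the four perspectives above could be brought together in practice, the sketch below (Python, purely illustrative) holds a scorecard as a plain dictionary. The perspective names follow the slides; the example indicators are partly borrowed from the earlier critical-success-factor slides and partly invented placeholders, not a prescribed set.

    # Minimal sketch: a Business Balanced Scorecard held as a plain dictionary.
    # The four perspectives follow the slides; the indicators listed under each
    # are illustrative only (some borrowed from the critical-success-factor
    # slides above, some invented placeholders).
    scorecard = {
        "Financial (enforcement results)": [
            "Costs / tax revenue",
            "Amount of corrections",
        ],
        "Innovation (developing organization)": [
            "Share of staff trained in new methods",  # invented placeholder
        ],
        "Process (process control)": [
            "Average throughput time of assessments, appeals and collections",
        ],
        "Market (valued by stakeholders)": [
            "Appreciation of the performance by taxpayers",
            "Number of cases where confidentiality is contested",
        ],
    }

    # Print the scorecard, perspective by perspective.
    for perspective, indicators in scorecard.items():
        print(perspective)
        for indicator in indicators:
            print("  -", indicator)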
Two Major Functions of Indicators

They reduce the number of measurements that would normally be required to give an exact presentation of a situation
• But a set containing a large number of indicators will tend to clutter the overview it is meant to provide
They simplify the communication process by which the results of measurement are provided to the user
• Due to adaptation to user needs, indicators may not always meet strict scientific demands to demonstrate causal chains
Criteria for selecting performance indicators
A compliance indicator should:
 Provide a representative picture of compliance conditions
 Be simple, easy to interpret and able to show trends over time
 Be responsive to changes in the environment
 Provide a basis for international comparisons
 Be theoretically well founded and linked to compliance models
Criteria for performance indicators (cont.)
The data required to support the indicator should be:
 Readily available, or made available at a reasonable cost/benefit ratio
 Adequately documented and of known quality
 Updated at regular intervals in accordance with reliable procedures
Evaluation model
[Diagram: Input -> Activities -> Output -> Outcomes; productivity/efficiency relates outputs to inputs, and effectiveness relates outputs (quantity/quality) to outcomes.]
Efficiency = Output / Input

Assertion: if one country gets (1) more output for the same amount of input, or (2) the same amount of output with less input, than another country, then the first country is more efficient, or a better performer, than the other country.
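
A small worked illustration of this assertion (a sketch only; the revenue and cost figures below are invented for the example and do not refer to any actual country):

    # Worked illustration of Efficiency = Output / Input.
    # The figures are invented purely for the example.
    def efficiency(output, input_):
        # Output per unit of input, e.g. tax revenue collected per unit of
        # administrative cost.
        return output / input_

    country_a = efficiency(output=100.0, input_=1.0)   # 100.0
    country_b = efficiency(output=100.0, input_=1.25)  # 80.0

    # Same output with less input: country A is the more efficient performer.
    # Equivalently, its cost-to-revenue ratio (1.0 / 100 = 1%) is lower than
    # country B's (1.25 / 100 = 1.25%), matching the "Costs / tax revenue"
    # indicator used earlier.
    print(country_a > country_b)  # True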
Measuring impacts of compliance treatment
Level 1: Population
Level 2: Group
Level 3: Individual
By looking at a point in time, or at changes over time.
Program Evaluation

Designed to scrutinize tax compliance programs in order to determine their:
• Continued relevance in light of present circumstances
• Results achieved in relation to stated objectives
• Immediate and future impact on taxpayers
• Cost-effectiveness compared to alternative forms of delivery
Key principles for program evaluations and measurements
Carefully consider and reflect on the needs of different user groups
Meet the challenges of decision-making and programme management
Link evaluations and measurements/indicators to policy targets and ensure that they are responsive to evolving policy
Reflect and address factors that determine compliance
Recognize that indicators must be interpreted correctly and meaningfully
Use different categories of evaluation and measurement in conjunction to maximize their value (indexation)
Three primary evaluation designs
1. Experimental designs
• Involves two essentially identical groups, the program ("treatment") group and the control group, assigned at random (see the sketch below)
2. Quasi-experimental designs
• Basically the same as the experimental design, but without the randomness in the selection process
3. Implicit designs
• Only a post-program measure, with no real pre-program measurement or comparison group
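
A minimal sketch of the selection step that separates the first two designs (the taxpayer identifiers and group size are invented for the illustration):

    # Minimal sketch of random assignment for an experimental design.
    # Taxpayer identifiers and group size are invented for the illustration.
    import random

    taxpayers = [f"TP{i:04d}" for i in range(1, 2001)]

    # Experimental design: random assignment, so the treatment and control
    # groups are essentially identical before the program starts.
    random.shuffle(taxpayers)
    treatment_group = taxpayers[:1000]
    control_group = taxpayers[1000:]

    # A quasi-experimental design would build the comparison group without
    # this random step (for example by matching on sector and size), and an
    # implicit design would use no comparison group at all.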
Total tax paid            Treatment group   Control group
Year 1 (no treatment)     1000              1000
Year 2 (with treatment)   1200              1100
Difference                +200              +100

These two differences are compared: the indicator of effect is +200 - (+100) = +100.
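
The same comparison written out as a calculation, using the figures from the table above:

    # The comparison from the table above, step by step.
    treatment_year1, treatment_year2 = 1000, 1200
    control_year1, control_year2 = 1000, 1100

    treatment_change = treatment_year2 - treatment_year1  # +200
    control_change = control_year2 - control_year1        # +100

    # The control group shows what would have happened without the treatment,
    # so the indicator of effect is the difference between the two changes.
    effect = treatment_change - control_change             # +100
    print(effect)  # 100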
Chain of different effects
[Diagram: chain of effects linking the theoretically correct tax, drivers of compliance (confidence in the tax system, confidence in the Tax Authority, perceived risk of detection, social norms, ease of compliance), voluntary compliance and taxes paid, together with the tax gap and the collection loss.]
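
One common way to read the right-hand side of this chain (a standard decomposition, assumed here rather than spelled out on the slide) is:

Tax gap = theoretically correct tax - taxes actually paid
Collection loss = taxes assessed - taxes actually collected

with voluntary compliance covering the part of the theoretically correct tax that is paid without enforcement action.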
Effect indicators
 Taxpayer surveys
 Random selection
 Obvious mistakes
 Business ratios
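
As a small illustration of the last item in this list, the sketch below compares a reported business ratio against a sector benchmark to flag returns worth a closer look (the figures, the choice of gross margin as the ratio, and the 30% tolerance are all invented for the example):

    # Sketch: using a business ratio (here gross margin) as a rough effect
    # indicator by comparing each return to its sector benchmark.
    # The figures and the 30% tolerance are invented for the illustration.
    returns = [
        {"id": "TP0001", "sector_margin": 0.40, "reported_margin": 0.38},
        {"id": "TP0002", "sector_margin": 0.40, "reported_margin": 0.22},
    ]

    def deviates(r, tolerance=0.30):
        # Flag returns whose margin falls more than `tolerance` below the
        # sector norm.
        return r["reported_margin"] < r["sector_margin"] * (1 - tolerance)

    flagged = [r["id"] for r in returns if deviates(r)]
    print(flagged)  # ['TP0002']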
Summary
Outcomes are difficult to measure, but measuring them is necessary.
The purpose of evaluation is to learn more and to improve the way we work.
Workshop
Performance and Compliance Measurement
1. Does your Tax Administration have a system of performance measurement? Please describe the system.
2. Given the strategic goals of your organization for the next 2 – 5 years, what would be a good set of performance indicators for your Tax Administration?
3. Develop (with your group) a set of 15 – 20 indicators using the balanced scorecard method.
4. What different kinds of compliance measures (registration, filing, reporting, and payment compliance) are in use in your country, and how do you collect the data necessary to make the measurements?