Suggested criteria for rankings


Benefit: Cost Ratio

David Pannell

School of Agricultural and Resource Economics, University of Western Australia


Is the project worth doing?

How to judge?

Assume the aim is to maximise the value of environmental outcomes. “Value” can include:

Environmental, social and economic

Tangible and intangible

Resources are limited

Not every good project can be funded

The best strategy is to choose the projects with the highest ratio of benefits to costs

Dividing by costs is crucial:

It reveals the projects that give the best value for money (benefits per dollar spent)

It works even if projects are of different sizes

BCR = Benefit score / Project cost


        Project 1   Project 2   Project 3
B           2           9           2
C           1           8           6
B/C         2           1.12        0.33
Rank        1           2           3
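For concreteness, a minimal Python sketch (illustrative only) that reproduces the B/C ranking in the table above:

```python
# Rank candidate projects by benefit:cost ratio (B/C).
# The figures are taken from the example table above.
projects = {
    "Project 1": {"B": 2, "C": 1},
    "Project 2": {"B": 9, "C": 8},
    "Project 3": {"B": 2, "C": 6},
}

ranked = sorted(projects.items(),
                key=lambda item: item[1]["B"] / item[1]["C"],
                reverse=True)
for rank, (name, p) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: B/C = {p['B'] / p['C']:.2f}")
```

Ranking on B/C rather than on B alone is what allows projects of different sizes to be compared on the same footing.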


Measuring benefits

For ranking to work, benefits need to be measured in a consistent way

Not necessarily in dollars

Our approach is to define a scoring system and apply it consistently

Benefit score = (Asset value) × (Prop’n expected impact of project on value) = 50 × 0.2 = 10

BCR = 10 / Project cost

Here 10 is the benefit score for the project. It indicates that the project generates benefits equivalent to 10% of the value of a 100-point asset.


BCR = (V × W × A × B × F × P × G × 1/(1+r)^L) / (C + PV(M+E))

That is: potential project benefits (V × W) × expected prop’n of required adoption (A × B) × (1 − risk of failure) (F, P, G) × discount factor for time lags (1/(1+r)^L), divided by project cost (C + PV(M+E)).

V: asset value
W: effectiveness of works
A: adoption
B: compliance
F: feasibility
P: socio-political
G: long-term funding
r: discount rate
L: time lag to benefits
C: project cost
M: annual maintenance cost
E: polluter-pays compliance costs
PV: summed present value over 20 years
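To make the arithmetic concrete, here is a minimal Python sketch of how the factors above can combine, following the grouping shown on this slide. It is an illustration only, not the official INFFER implementation: all parameter values are hypothetical, and F, P and G are treated as the probabilities that the feasibility, socio-political and funding risks do not derail the project.

```python
# Illustrative sketch of the BCR calculation outlined above (hypothetical values).

def present_value(annual_cost: float, rate: float, years: int = 20) -> float:
    """Summed present value of a constant annual cost over `years` years."""
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

def bcr(V, W, A, B, F, P, G, r, L, C, M, E) -> float:
    benefits = V * W                    # potential project benefits
    adoption = A * B                    # expected prop'n of required adoption and compliance
    success = F * P * G                 # feasibility, socio-political and funding factors
    lag_discount = 1 / (1 + r) ** L     # discount factor for time lag to benefits
    cost = C + present_value(M + E, r)  # project cost plus PV of maintenance and compliance costs
    return benefits * adoption * success * lag_discount / cost

# Hypothetical example: 100-point asset, 20% expected impact, 5-year lag, 5% discount rate.
print(round(bcr(V=100, W=0.2, A=0.8, B=0.9, F=0.9, P=0.8, G=0.7,
                r=0.05, L=5, C=2.0, M=0.05, E=0.02), 2))
```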


The benefits are “expected” in a statistical sense, weighted by probabilities
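For example (illustrative figures only): if a project has a 70% chance of delivering its full benefit score of 10 and a 30% chance of delivering nothing, the expected benefit entering the BCR is 0.7 × 10 + 0.3 × 0 = 7.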


Simulations of other metrics

In terms of the expected value of environmental benefits, the INFFER metric is about 100% better than commonly used metrics:

Some add where they should multiply

Many ignore costs

Most omit one or more of the benefits factors (often feasibility and adoption)

Data quality

The process uses the best available data and knowledge

Can be based on expert opinion if necessary

Can be updated with improved science or modelling if it becomes available

Step 3 elicits:

quality of information

data gaps

a strategy to deal with data gaps

The asset and spin-offs

INFFER quantifies benefits that are directly related to the natural asset

Other benefits (e.g. increased social capital from doing the project) can be captured qualitatively and reported in the Project Assessment Report

The asset can be defined more broadly to capture two or more sub-assets

Simplification

The current version of INFFER asks for only one response for each parameter

In reality, there might be heterogeneity within the asset:

High feasibility to protect one part

Low feasibility for another part

The response should be an overall average

The price of simplicity

[Chart: the price of simplicity for a heterogeneous asset]


Simplification

Assumes overall benefits are proportional to the level of adoption or compliance

Might be non-linear

Usually too little info to know

[Chart: benefits as a function of adoption]


How is it used


How is it used?

Quantitative data are collected by completing the Project Assessment Form (PAF)

Qualitative information provides context and helps select quantitative values

A PAF is completed for multiple assets

Projects are ranked on the basis of BCR


How is it used?

We would not expect mechanistic application of this ranking

Priorities are also influenced by funder priorities, opportunities, quality of information, …

Project    BCR    Rank
Lake X     5      1
River Y    1.5    3
Park Z     2.2    2


Can also compare versions of the same project:

Different scales

Different on-ground actions

Different policy mechanisms

Example: Gippsland Lakes



Frequently Asked Questions

How does INFFER compare projects for different types of assets?

On the basis of overall value for money

The V score is not specific to a particular type of asset

The other elements of the benefit ratio are all proportions or probabilities

This allows consistent comparison across asset types


V seems subjective. Is that a problem?

It is subjective. Values are.

V makes explicit what we already do implicitly, which provides transparency

Consistency in scoring is important:

Relate it carefully to the table of V examples

Have a group that reviews all V scores for consistency

V is usually not the most uncertain factor

Often W or A are more uncertain

How important is accuracy of the numbers?

We rarely have highly accurate numbers

It matters, but great precision is not needed:

W = 0.1 vs W = 0.8 makes a big difference

W = 0.11 vs W = 0.13 doesn’t

Missing out a variable matters a lot

The design of the BCR metric matters a lot

Data inaccuracy matters a bit
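To see why: the metric is multiplicative, so the BCR scales in direct proportion to W. Moving from W = 0.1 to W = 0.8 multiplies the score by eight and can easily reorder projects, whereas moving from W = 0.11 to W = 0.13 changes it by less than 20% and is unlikely to change a ranking.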


How does INFFER compare a one-year project with a five-year project?

On the basis of overall value for money. We ask which of the two projects has the greatest environmental benefits per dollar spent.

In both cases, we ask for information about the need for ongoing expenditure (beyond the project) and factor that in.


How does INFFER compare large and small projects?

On the basis of overall value for money. We ask which of the two projects has the greatest environmental benefits per dollar spent.


How does INFFER deal with projects that require investment over a long time frame?

In the Project Assessment Form, we ask for an estimate of ongoing annual maintenance costs, which are factored into the assessment of cost-effectiveness. Maintenance costs are converted to a “present value” using standard discounting methods.
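For instance (illustrative figures): with a constant annual maintenance cost M and discount rate r over the 20-year horizon, PV(M) = M × (1 − (1 + r)^-20) / r. At an assumed 5% rate the factor is about 12.5, so $100,000 per year of maintenance adds roughly $1.25 million to the project cost used in the BCR.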


How does INFFER deal with projects where there is a long time frame until the benefits are generated?

It is designed for a project of, say, 5 years, with benefits over a longer time frame. For example, a project may avert degradation that is not expected to happen for decades, or it may take years for current actions to repair an already degraded asset.

We collect information about the likely time lag until benefits. The BCR then includes a discount factor, so projects with more immediate benefits get higher weight (other things being equal).
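For illustration, with an assumed discount rate of 5% and a 10-year lag until benefits begin, the discount factor is 1/(1.05)^10 ≈ 0.61, so those delayed benefits carry about 61% of the weight of the same benefits delivered immediately.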


For more details

See the BCR page and FAQs at http://www.inffer.org


Acknowledgements

Affiliations of the INFFER team:

University of Western Australia

Department of Primary Industries, Victoria

North Central Catchment Management Authority

Future Farm Industries CRC

Other key funders:

Australian Research Council (Federation Fellow Program)

Department of the Environment, Water, Heritage and the Arts (CERF Program)

Department of Sustainability and Environment, Victoria