Day 03 01 Monitoring information for CEAFM decision making


Monitoring information for CEAFM decision making: reflections on LMMA’s learning
Caroline Vieux - SPREP
James Comley - USP
Previous experience - purpose of monitoring - J
• Community/stakeholder involvement
• Adaptive management
– Community/stakeholder learning for management
– Project or organizational learning for management
• Stock assessment
• Project/donor M&E
• Network or portfolio learning
• Global or academic learning
• Advocacy
Previous experience - what has been monitored, and how has it been done - C
• Species population status – UVC, belt transects, CPUE, interviews
• Ecological processes, e.g. SPAGs – UVC
• Habitat health indicators – point intercept transects, photo and video transects
• Socio-economic status, including governance and compliance – household surveys, key informant/focus group interviews
• Physical conditions (temperature) – loggers
• Water quality – sampling and analysis
Previous experience - who has been involved in monitoring - J
• Community unaided and unsupported by outside agencies
• Community assisted directly by outside agency/NGO
• Outside agency assisted by community
• Outside researchers
Lessons learned - purpose of monitoring - J
• LMMA network set out an ambitious framework
• Need to define the purpose of monitoring and ensure it is fit for purpose
• Monitoring should be tied to the objectives of the management plan
• Standardisation is unlikely to match the primary motivations/interests of individual sites
Lessons learnt – biological monitoring results - C
Methodology issues, not surveyor issues.
Many lessons for the biological monitoring, the main one being:
• In all the studies reviewed, statistical power is not sufficient to detect changes; standard deviations are too high (a minimal sample-size sketch follows this list)
• Differences in the implementation of the methodologies (number and length of transects vary from site to site) – do we really know what effort is needed?
• The number of transects needed varies between sites and species (e.g. from Fiji LMMA, transects needed to detect changes within the tabu area: Lutjanus gibbus = 153, Naso unicornis = 200, Scarus ghobban = 4; within the control site: L. gibbus = 38, N. unicornis = 60, S. ghobban = 5)
• Not enough transects done, and wrong placement
• Current design not suited to most invertebrates, which are too patchily distributed
• Analysis done at the species level; if fish assemblages are looked at through multivariate analysis, results are more robust
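To make the sample-size point concrete, here is a minimal sketch of the standard normal-approximation calculation for how many transects per stratum are needed to detect a given change in mean counts. The SD and change values are illustrative assumptions, not the Fiji LMMA figures.

    from math import ceil
    from statistics import NormalDist

    def transects_needed(sd, change, alpha=0.05, power=0.80):
        # Normal-approximation sample size per group for a two-sample
        # comparison of mean counts: detect a difference of `change`
        # when counts have standard deviation `sd`.
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
        z_beta = z.inv_cdf(power)            # ~0.84 for power = 0.80
        return ceil(2 * ((z_alpha + z_beta) * sd / change) ** 2)

    # Illustrative values only: a patchy species with SD of 8 fish per
    # transect needs ~252 transects per stratum to detect a change of
    # 2 fish, while one with SD of 2 needs only ~16.
    print(transects_needed(sd=8, change=2))   # 252
    print(transects_needed(sd=2, change=2))   # 16

The same arithmetic explains why the required transect numbers differ so sharply between species in the Fiji LMMA example: the more variable the counts, the more transects it takes to detect the same change.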
Lessons learnt – socio-economic monitoring results - C
• Socioeconomic monitoring:
– still very new in most cases; not many lessons to date, except for the LMMA network, where data have been of very poor quality
– SEM-Pasifika has been developed, training conducted and funds allocated through NOAA and accessible by all PICs, but interest has been quite limited so far… is it really needed?
– more one-point-in-time socioeconomic surveys than monitoring
• Perceptions: vary quite a lot from the biological surveys
• CPUE: low cost and low tech compared to Underwater Visual Census, but sampling effort has to be spread over a sufficient amount of time to be relevant (a minimal CPUE sketch follows this list)
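As a minimal sketch of what CPUE monitoring involves (the periods, catches and effort hours below are illustrative assumptions, not survey data): CPUE is simply total catch divided by total fishing effort, aggregated per sampling period, which is why a single short sampling window cannot show a trend.

    from collections import defaultdict

    # Hypothetical catch records: (period, catch in kg, effort in hours fished)
    records = [
        ("2010-Q1", 12.0, 6.0),
        ("2010-Q1", 8.0, 4.0),
        ("2010-Q2", 9.0, 6.0),
    ]

    catch = defaultdict(float)
    effort = defaultdict(float)
    for period, catch_kg, effort_hours in records:
        catch[period] += catch_kg
        effort[period] += effort_hours

    # CPUE per period = total catch / total effort; a trend only becomes
    # readable once enough periods have been sampled.
    for period in sorted(catch):
        print(period, round(catch[period] / effort[period], 2), "kg per hour")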
The role of communities in monitoring - J
Motivations
– Participation/stewardship
Successes
– Ability of communities to count reliably
– Opportunity monitoring presents for adaptive management
Challenges
– Resourcing: remuneration?
– High turnover
– Ongoing commitment to monitor
Have monitoring results been used for management? - C
• Some instances of it being used, though generally results have not been widely used for adaptive management
• In Fiji and PNG, 25% of the sites used the results of monitoring for adaptive management
• Reasons:
– Communities do not understand the results (no training on data interpretation)
– Data are not significant
– Other factors drive the decision-making
– Adaptive management is taking place without the results of monitoring
– The data are not relevant to management questions
– Certain species are not accurately assessed
– Data collected do not inform on resource stocks
Has it been worth it? What information is needed - J
• 60% of the budget of some project countries spent on monitoring
• CBEAFM (vis-a-vis CBAM) in its purest form intended to be “learning by doing”
Key questions/issues of concern - J
• What information is needed for CEAFM?
• Who has responsibility for monitoring?
• Who should pay for monitoring, and how much of the total budget should be spent on monitoring?
• What methods are most cost-effective and appropriate?
Direction in Fiji - J
• Responsive to community needs
• Re-tiered approach:
– Less-data monitoring at all sites
– Community monitoring on specific factors relevant to them, at a small number of sites
– Ad-hoc, research-driven monitoring at a small number of sites