IAOD Evaluation Seminar “Demystifying Evaluation in WIPO Best Practices from Initial Evaluations” Evaluation Section Internal Audit and Oversight Division (IAOD) World Intellectual Property Organization (WIPO)
Geneva, November 8, 2012
Some myths on evaluation
- Evaluations are « audits in disguise »
- Evaluations assess individual performance
- Evaluations are redundant, as they often state the obvious to those who know their business better than any independent « experts »
- There are only wrong-time evaluations
…and their counterpoints
- Evaluations use standard methodologies and criteria
- Evaluations draw on each individual's experiences, but information is never attributed to persons
- Evaluations bring independent external perspectives into programs and projects
- Evaluations are planned according to needs and potential for accountability and learning, either after or towards the end of a program or project (cycle)
By which frameworks is the Evaluation Function’s work guided?
Internal Audit and Oversight Charter
UN Norms and Standards for Evaluation
WIPO Evaluation Policy and Evaluation Guidelines
Evaluations to do what?
In general: assess whether the Organization is doing the right things and whether it is doing them the right way.
For managers:
- Contribute to learning and knowledge sharing (from good practice and challenges)
- Generate and use evaluative information for decision-making
- Identify what works to develop a balanced and accessible international IP system and what can be replicated and scaled up
And what did we do?
UNDERTAKING OF 6 INDEPENDENT EVALUATIONS / VALIDATION:
1) Pilot Country Portfolio Evaluation Kenya (2005-2010)
2) Startup National Intellectual Property Academies (Development Agenda Project DA_10_01)
3) Technology and Innovation Support Centers (TISCs) (Development Agenda Project DA_08_01)
4) Developing Tools for Access to Patent Information (Development Agenda Project DA_19_30_31_01)
5) Improvement of National, Sub-regional and Regional IP Institutional and User Capacity (Development Agenda Project DA_10_05)
6) IAOD Report: Validation of the Program Performance Report for 2010-2011
Lessons Learned: CPE Kenya Evaluation Process
COUNTRY PORTFOLIO EVALUATION: UPS AND DOWNS

POTENTIAL ADDED VALUE:
- Improves country assistance
- Holistic approach useful for designing country plans
- Improves efficiency, effectiveness, relevance, sustainability and possibly impact
Four Development Agenda (DA) Project Evaluations
Main Characteristics of DA Project Evaluations
1. Single projects with unified project documents, including a performance framework, to allow for evaluation of their efficiency and effectiveness
2. Time frame under evaluation is the duration of the project, typically 2 or 3 years
3. Most projects evaluated (sometimes pilots) had pending decisions (by CDIP) for extension/follow-up phases or a special relevance
What were the main challenges?
1. Choice of experts to assist with evaluations is key
2. Level of effort is similar to bigger evaluations
3. Need to reach out to key stakeholders, including in-country (time and cost factor)
4. Reference group as a learning entity to benefit from the process
Lessons Learned from Project Evaluations
Process:
- Open dialogue with all members of the project team
- Learning Resource or Reference Group meets at critical junctures of the evaluation process
- Regular monitoring of performance data
- Keep all performance records on file, including those gathered through face-to-face interviews

Products:
- Short and well-documented reports and debriefing presentations
- Showing the evidence chain (findings - conclusions - recommendations)
- Validate recommendations through dialogue and keep them specific and actionable
PPR Validation Process: Accountability vs. Learning
Where do we want to go now?
Q4 2012 – Q1 2013:
- Program Evaluation (Patent Law, Program One)
- Thematic Evaluation (Capacity Building)

Q2 – Q4 2013:
- Country Evaluation II
- Strategic Evaluation …as needed?
Discussion Points
- How to improve frameworks and the monitoring of outcome-oriented Key Performance Indicators (KPIs)?
- How to identify knowledgeable experts who have not been involved in the WIPO activities under evaluation?
- How to raise ownership among key stakeholders for the uptake of evaluation recommendations?
- When best to launch an evaluation in the course of a program or project?
- What approaches and focuses work best to be useful for management?