

How did we do? Insights from a CLIMAS pilot evaluation

Dan Ferguson, Anne Browning-Aiken, Gregg Garfin, Dan McDonald, Jennifer Rice, Marta Stewart
Climate Prediction Application Science Workshop, March 7, 2008

Overview

• CLIMAS/RISA
• Purpose of the project
• Evaluation project team
• Process and methods
• Who is participating
• Research/evaluation questions
• Where we are in the process

CLIMAS/RISA

• Climate Assessment for the Southwest (CLIMAS) is one of 8 currently funded Regional Integrated Sciences and Assessments (RISA) programs

CLIMAS mission/mode

We both do climate research and work iteratively with stakeholders, partners, and collaborators to provide timely, pertinent, and (hopefully) useful information, tools, and services (or access to these) about climate to those who need them to make decisions.

CLIMAS team

• Program is 10 years old; cast of characters changes through time
• Currently 10 investigators + affiliate investigators + grad students + core office staff
• HQ at University of Arizona, but currently have an investigator (Deborah Bathke) at New Mexico State University
• Highly interdisciplinary: anthropology, climatology, decision-support system development, geography, hydroclimatology, Latin American studies, paleoclimatology, resource economics

Purpose(s) of the evaluation project

Purpose of the evaluation project

• Broad evaluation of the RISA model as expressed by CLIMAS
– Not an evaluation of a particular product, information source, etc., but rather a first crack at an overall evaluation of CLIMAS
– Roughly bounded in time: 2002-2007
• Looking for key insights about penetration of information; perceived salience, credibility, and legitimacy* of CLIMAS; and changes in knowledge, behavior, and understanding as a result of interactions with CLIMAS

*After: Cash, D., W. Clark, et al. (2002). Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making. Cambridge, MA: John F. Kennedy School of Government, Harvard University. 24 pp.

Purpose of the evaluation project (cont.)

• Input to CLIMAS program manager and investigators
• Input to the Climate Program Office and the other RISAs
• Input to NIDIS as it develops

Evaluation Team

Evaluation team

• Mixed team:
– two members directly affiliated with CLIMAS (Ferguson and Garfin)
– four members not previously affiliated with CLIMAS (Browning-Aiken, McDonald, Rice, Stewart)

Evaluation team roles

• Garfin = ‘Encyclopedia of CLIMAS’
• Ferguson = lead investigator/coordinator, but not conducting data collection
• Browning-Aiken + Rice = interviews
• McDonald = survey
• Stewart, Browning-Aiken, Rice = focus groups

Evaluation methods and team process

Methods

• Survey (online)
• Interviews (primarily telephone)
• Focus groups will follow the survey and interviews; they will be used to probe deeper into issues and ideas that emerge from the survey and interview results

Survey

• Multiple iterations involving the whole team
• Piloted survey with ~15 colleagues
– Teased out obvious issues
– Included required IRB disclaimer language, but hid it behind a click
• Used professional web team to develop/build
– Very fast turnaround, reliable product, able to customize and troubleshoot
– Helped us understand options, e.g., email login

Team Process

• Develop research questions based on broad strokes of the proposal
• Utilize whole team for development of research questions + all data collection instruments
– Collaboratively and iteratively developing and refining data collection instruments = a very good thing

Who is participating?

A sample of organizations with whom we work (~25-35 interviews):

Arizona Department of Environmental Quality; Arizona Division of […]; Arizona State University; Bureau of Reclamation; California Department of Water Resources; Central Arizona Project; Clark County; Cornell University; Desert Research Institute; Environmental Defense; Environmental Protection Agency; Maricopa County; Matrix Consulting Group, Inc.; Mohave County; National Agroclimate Information Service; National Climatic Data Center; National Drought Mitigation Center; National Environmental Satellite, Data, and Information Service; National Interagency Coordination Center; National Interagency Fire Center; National Park Service; National Wildlife Federation; Natural Resources Conservation Service; New Mexico Department of Agriculture; New Mexico Office of the State Engineer; New Mexico Rural Water Association; New Mexico State University; Northern Arizona University; Northwest Interagency Coordination Center; Pacific Institute; Pima Association of Governments; Pima County; Pinal County; Salt River Project; San Carlos Apache Tribe; Sandia National Laboratories; Santa Cruz County; Sonoran Institute; The Nature Conservancy; United States Department of Agriculture; United States Geological Survey; University of Arizona; University of California, Irvine; University of Montana; University of New Mexico; US Fish and Wildlife Service; USDA Forest Service; Western Governors' Association

Our spectrum of relationships

• Communication: e.g., receive the Southwest Climate Outlook, e-mail updates, or other publications; call or e-mail CLIMAS team members with specific questions
• Consultancy: e.g., ask for an expert speaker for a workshop/meeting; seek consultation on project development
• Partner: e.g., co-sponsor an event; have been invited to speak at a meeting or workshop
• Collaboration: e.g., ongoing or long-lasting research collaborations; long-term engagement to address a particular issue

Research Questions

Research/evaluation questions

• Is CLIMAS achieving the overall RISA goals of being responsive, stakeholder-oriented, and use-inspired?

• Is CLIMAS perceived as salient, credible, and legitimate?

Research/evaluation questions (cont.)

• Is CLIMAS perceived by collaborating organizations as a reliable and responsive partner?

• What are the outcomes (short and medium term) that result from interactions with CLIMAS?

• How is CLIMAS accessed, and is it reaching populations in need of climate information?

Lessons learned (so far), or what I know today that I didn’t really know in October but probably should have

Lessons learned so far (common sense warning)

• Try to keep track of your stakeholders
• Mixed team (inside/outside the program) has worked out very well
• Use the whole team to develop/refine research questions and instruments
• Using pros to develop the web survey interface and database = major time/headache saver
• Take the time to pilot a survey; big return on small investment
• Institutional Review Board, oye
• Understand that sometimes “evaluation is intervention”

Where we are in the process

• Interviews beginning this week
• Survey link is being distributed over the next week
• Focus groups will follow, probably late April/May

Then what?

• White paper aimed at RISA, NIDIS, and other programs and organizations grappling with similar issues
• Peer-reviewed publication
• Better grip on next steps for ongoing evaluation

Questions?

Dan Ferguson
University of Arizona/CLIMAS
[email protected]

http://www.climas.arizona.edu/