
The NSF Course, Curriculum, and Laboratory Improvement (CCLI) Program
Jill Singer
Program Director, Division of Undergraduate Education
Directorate for Education & Human Resources
National Science Foundation
Email: [email protected]
UNCG Research Expo
April 22, 2009
Elliott University Center
Applying what you learn during this workshop can make preparing your CCLI proposal easier.
Outline of Topics
 The CCLI Program
 What’s new in the 2009/2010 solicitation
 Advice and Resources
 What Happens to Your Proposal?
 Questions
NSF web site (www.nsf.gov)
Division of Undergraduate Education
Course, Curriculum, and Laboratory Improvement (CCLI): Vision and Scope (1)
Vision:
Excellent STEM education for all undergraduate students
Supports efforts that:
 Bring advances in STEM disciplinary knowledge into the curriculum
 Create or adapt learning materials and teaching strategies
 Develop faculty expertise
 Promote widespread implementation of educational innovations
 Prepare future K-12 teachers
 Enhance our understanding of how students learn STEM topics
 Enhance our understanding of how faculty adopt instructional approaches
 Build capacity for assessment and evaluation
 Further the work of the program itself
 Note: The CCLI solicitation has changed – read NSF-09-529 carefully
Course, Curriculum, and Laboratory Improvement (CCLI): Vision and Scope (2)
Program especially encourages projects that:
 Have the potential to transform undergraduate STEM education
 Produce widespread adoption of classroom practices based on how students learn
 Explore cyberlearning
What is New for 2009/2010
 TYPES have replaced PHASES
 Raised limit on proposal size
 Explicit encouragement of projects with the potential to be transformative
 New Central Resource project opportunity
 Increased emphasis on building on knowledge of how students learn, building on prior work, and encouraging widespread adoption of excellent teaching methods
Project Types: Scale, Scope, Stage, & Sustainability
 Three levels of support – Type 1, 2, and 3
 Types are independent
 Type 2 and 3 projects reflect greater dependence on previous work
 Type 1 Projects: total budget up to $200,000 ($250K when 4-year colleges and universities collaborate with 2-year colleges) for 2 to 3 years
 Type 2 Projects: total budget up to $600,000 for 2 to 4 years
 Type 3 Projects: budget negotiable, but not to exceed $5 million over 5 years
 NEW! CCLI Central Resource Projects – budget negotiable, depending on the scope and scale of the activity; duration up to 5 years
 Projects provide leadership and implementation of activities that sustain a community of practice engaged in transforming undergraduate STEM education
Important Project Components
 Creating Learning Materials and Strategies
 Instrumentation and equipment requests are appropriate but must be justified by their impact on student learning
 Implementing New Instructional Strategies
 Program encourages projects that lead to widespread adoption of promising pedagogical techniques
 Developing Faculty Expertise
 From short-term workshops to sustained activities
 Assessing and Evaluating Student Achievement
 Conducting Research on Undergraduate STEM Education
Creating New Learning Materials and Teaching Strategies
 Type 1 projects can focus on piloting new educational materials and instructional methodologies; Type 2 projects on larger-scale development, broad testing, and assessment.
 Type 1 projects can focus on outcomes at a single site, but must include assessment and community engagement.
 Can be combined with other components, especially faculty development in Type 2.
Implementing Educational Innovations
 Generally Type 1 projects
 Projects must result in improved STEM education at the local institution by implementing exemplary materials, laboratory experiences, or educational practices developed and tested at other institutions.
 CCLI-Implementation projects should stand as models for broader adaptation in the community.
 Proposals may request funds in any budget category supported by NSF, including instrumentation.
Instrumentation and CCLI
 Acquisition of instrumentation fits best under the first two program components
 A focus can be the integration of data collection and analysis into classroom and research experiences
 Tip: The proposal should center on the impact of the project activities on student learning, not on the instrument and its capabilities
 Tip: The budget can include salary for faculty members and students involved in the development of the project
Developing Faculty Expertise
 Methods that enable faculty to gain expertise
 May range from short-term workshops to sustained activities
 Foster new communities of scientists in undergraduate education
 Cost-effective professional development
 Diverse group of faculty
 Leading to implementation
 May be combined with other components, especially materials development and assessment
 Excellent opportunities exist for you to participate in regional and national workshops
Assessing Learning and Evaluating Innovations
 Design and test new assessment and evaluation tools and processes
 Apply new and existing tools to conduct broad-based assessments
 Must span multiple projects and be of general interest
Conducting Research on STEM Teaching and Learning
 Develop new research on teaching and learning
 Synthesize previous results and theories
 Practical focus
 Testable new ideas
 Impact on STEM educational practices
 May be combined with other components
Ways CCLI Can Support UGR Activities
 Acquisition of research-quality equipment and its integration into undergraduate courses.
 Labs can be constructed that integrate advanced equipment, prepare students for research, and draw on faculty research expertise.
 Incorporation of inquiry-based projects into laboratory courses.
 Partnerships with local research and informal education institutions.
 Service learning can provide relevant problems while addressing the needs of the local community.
Human Subjects and the IRB (Institutional Review Board)
 Projects collecting data from or on students or faculty members are considered to involve human subjects and require IRB review
 The proposal should indicate IRB status on the cover sheet
 Exempt, Approved, or Pending
 Grants will require an official statement from the IRB (not the PI) declaring the research exempt or approved
 See the “Human Subjects” section in the GPG
 NOTE: For CCLI, IRB approval usually is obtained during award negotiations
Important Features of Successful CCLI Projects
 Quality, Relevance, and Impact
 Student Focus
 Use of and Contribution to the STEM Education Knowledge Base
 STEM Education Community-Building
 Expected Measurable Outcomes
 Project Evaluation
Quality, Relevance, and Impact
 Innovative
 State-of-the-art products, processes, and ideas
 Latest technology in laboratories and classrooms
 Have broad implications for STEM education
 Even projects that involve a local implementation
 Advance knowledge and understanding
 Within the discipline
 Within STEM education in general
Student Focus
 Focus on student learning
 Project activities linked to STEM learning
 Consistent with the nature of today’s students
 Reflect the students’ perspective
 Student input in design of the project
STEM Education Knowledge Base
 Reflect high-quality science, technology, engineering, and mathematics
 Rationale and methods derived from the existing STEM education knowledge base
 Effective approach for adding the results to the knowledge base
Community-Building
 Include interactions with
 Investigators working on similar or related approaches in the PI’s discipline and others
 Experts in evaluation, educational psychology, or other similar fields
 Benefit from the knowledge and experience of others
 Engage experts in the development and evaluation of the educational innovation
Expected Measurable Outcomes
 Goals and objectives translated into expected measurable outcomes
 Project specific
 Some expected measurable outcomes on
 Student learning
 Contributions to the knowledge base
 Community building
 Used to monitor progress, guide the project, and evaluate its ultimate impact
Project Evaluation
 Include strategies for
 Monitoring the project as it evolves
 Evaluating the project’s effectiveness when completed
 Based on the project-specific expected measurable outcomes
 Appropriate for the scope of the project
Lessons From Prior Rounds of the Program
 Type 1 is an open competition – many new players
 Type 2 requires substantial demonstrated preliminary work
 Type 3 is for projects from an experienced team operating at a national scale
Write Your CCLI Proposal to Answer Reviewers’ Questions
 Goals, etc.: What are you trying to accomplish? What will be the outcomes?
 Rationale: Why do you believe you have a good idea? Why is the problem important? Why is your approach promising?
 Evaluation: How will you manage the project to ensure success? How will you know if you succeed?
 Dissemination: How will others find out about your work? How will you interest them?
Program Director’s Notes (1)
 Read the program solicitation
 Determine how your ideas match the solicitation and how you can improve the match
 Articulate goals, objectives, & outcomes
 Outcomes should include improved student learning
 Build on the existing knowledge base
 Review the literature
 Present evidence that the proposed project is doable, will enhance learning, and is the best approach
 Explore potential collaborations (industry, business, academic)
 Use data to document existing shortcomings in student learning
Program Director’s Notes (2)
 Describe the management plan
 Provide tasks, team responsibilities, timeline
 Provide clear examples of the approach
 Integrate the evaluation effort early
 Build assessment tools around defined objectives and expected outcomes
 Connect with independent evaluation experts
 Identify strategies for dissemination
 Define a plan to contribute to the knowledge base
 Address broader impacts
 Collaborate, form partnerships (build community)
Program Director’s Notes (3)
 What does the knowledge base say about the approach?
 What have others done that is related?
 What have been the problems/challenges?
 Why is this problem important?
 Is it a global or local problem?
 What are the potential broader impacts?
 How will it improve the quality of learning?
 What is the evidence that the approach will solve the problem?
 Will it address and achieve the defined outcomes and improve student learning?
 What are alternative approaches?
Funding and Deadlines
 Expects to fund, across all disciplines:
 130 Type 1 projects
 45 Type 2 projects
 4-6 Type 3 projects
 1-3 Central Resource Projects (CRP)
 Proposal Deadlines
 Type 1: May 21-22, 2009
 Type 2 and 3, and CRP: January 13, 2010
 Focused CRP workshops by agreement
What’s ‘hot’ in the Geosciences?
 Bringing new research findings into the classroom
 Understanding how our students learn geoscience concepts
 Visualization software and improving our students’ ability to visualize data in 3D
 Research equipment for undergraduates (e.g., lidar)
 Topics of special interest: climate change, sustainability, energy
 Interdisciplinary projects that combine geosciences with other STEM disciplines
 To find out what is ‘hot’ in your particular STEM discipline, contact a program officer (the solicitation provides names and emails for program officers working in the various STEM disciplines)
Resources for Models and Examples
 Disciplinary Education Journals
 Journal of Geoscience Education
 SERC – the Science Education Resource Center at Carleton College (http://serc.carleton.edu)
 CUR “Quarterly”
 Faculty Development Workshops – “Cutting Edge”
 NSF Award Search
 http://nsf.gov/awardsearch/
 Search by program, key word(s)
 Program pages often include links to recent awards (abstracts)
Merit Review Criteria (1)
 Intellectual merit of the proposed activity
 How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?
 How well qualified is the proposer to conduct the project?
 How well conceived and organized is the proposed activity?
 Is there sufficient access to resources?
Merit Review Criteria (2)
 Broader impacts of the proposed activity
 How well does the proposed activity advance discovery and understanding while promoting teaching, training, and learning?
 How well does the proposed activity broaden the participation of underrepresented groups?
 To what extent will it enhance the infrastructure for research and education?
 Will the results be disseminated broadly to enhance scientific and technological understanding?
 What may be the benefits of the proposed activity to society?
Writing a Proposal: Preparing to Write
 Start EARLY
 Outline what you want to do
 Review the literature and descriptions of funded projects. Know what is being done in your field and how your project is similar/different
 Use NSF Award Search (http://www.nsf.gov/awardsearch/)
 Read program solicitations to find the program that best meets your needs
 If you still need clarification, contact the appropriate program officer (e-mail is best) to discuss your idea
 This may cause you to refine your idea and may prevent you from applying to the wrong program
 Give yourself and your grants office enough time to complete the process and submit the proposal
Writing a Proposal: Writing
 Organize the proposal – use the proposal guidelines
 Make it easy for reviewers to find key items in your proposal by using such aids as bullets and an outline format
 Be sure you clearly describe what you want to do and how you will do it, as well as the problem you want to solve (goals and objectives)
 For programs such as CCLI, describe how you will follow the progress of your project, determine whether it is successful, and disseminate the results
 Consider the research potential of the project. Could the results add to the knowledge we have about what works and why in STEM education? If appropriate, relate your efforts to current research about what works and why
 Be sure the budget and budget explanation ‘match’ and that the budget reflects the size of the project team and the level of commitment for each member of the project team. Instrumentation, participant support, and/or travel requests should be clearly explained and justified.
Some Common Reasons for Proposal Decline
 Lack of evidence that the PI is aware of the relevant literature and is building upon it
 Diffuse, superficial, and unfocused plan
 Lack of sufficient detail
 Apparent lack of the requisite expertise or experience on the part of the proposers
 Lack of a clear plan to document and evaluate activities and outcomes and to disseminate the results
 Evaluation plans that are mainly surveys of user satisfaction, with no clear mechanism for documenting changes in student learning, faculty approaches to presenting material, and/or approach to education (at the disciplinary, department, or institutional level)
 Proposals that do not explicitly address both Intellectual Merit and Broader Impacts, or that exceed the page limit, are returned without review
Formatting, FastLane, and Grants.gov
 NSF proposal format requirements
 15 single-spaced pages
 Check the required type fonts
 Intellectual Merit & Broader Impacts explicit in the Project Summary
 FastLane submission
 Web-based software – access from any browser
 Mature, well-supported system for NSF
 Accepts many file types, converts to .pdf
 Grants.gov
 Stand-alone software downloaded to a local computer
 May eventually be used for any Federal agency
 Still under development and does not support all NSF processes (for example, collaborative proposals)
 Accepts only .pdf files
 Delayed error messages
What Happens to Your Proposal? (1)
 Submission of the proposal via FastLane
 Proposals are reviewed by mail and/or by panels of faculty within the discipline(s) [Note: DUE primarily uses panels]
 A minimum of three persons outside NSF review each proposal
 For proposals reviewed by a panel, individual reviews and a panel summary are prepared for each proposal
 An NSF program staff member attends the panel discussion
 The Program Officer assigned to manage the proposal’s review considers the advice of reviewers and formulates a recommendation
 Negotiations may be necessary to address reviewers’ comments, budget issues, and other concerns
What Happens to Your Proposal? (2)
 NSF is striving to be able to tell applicants whether their proposals have been declined or recommended for funding within six months.
 Verbatim copies of reviews, not including the identity of the reviewers, are provided to the PI.
 Proposals recommended for funding are forwarded to the Division of Grants and Agreements (DGA) for review.
 Only Grants and Agreements Officers may make awards.
 Notification of the award is made to the submitting organization by a DGA Officer.
How to Really Learn about Programs and Process
 Become a reviewer for the proposals submitted to the program
 Give us a business card
 Send e-mail to the lead or disciplinary program officer
 Your name will be added to the database of potential reviewers
 We want to use many new reviewers each year, especially for Type 1