Transcript: Nuts & Bolts Session, National Science Foundation CCLI Grant Writing
Linnea Fletcher [email protected]
ASMCUE
Program 7 – 9 pm (2 hours) May 28, 2009
Who is My Audience?
Graduate Students Postdoctoral Students New Faculty Faculty Administrators
Caution
Most of the information presented in this talk represents the opinions of the individual program officers and is not an official NSF position.
Outcomes
- Describe how to navigate to DUE and find information on programs and awards, and sign up for NSF updates
- Describe the CCLI program: Type 1, Type 2, Type 3, and Centers
- Identify the common strengths and weaknesses of CCLI proposals and provide strategies for dealing with them
- Explain Intellectual Merit, Broader Impacts, and Transformative Work, and give examples
- Explain the practical aspects of the review process
- Explain what should be included in a proposal outline
Format
Questions posed throughout the presentation:
- Answer individually (30-45 min)
- Work together (1 hour): Think -> Pair -> Share with the group
- More questions (15 min)
Navigating the NSF Site
www.nsf.gov
“EHR’s mission is to promote the development of a diverse and well-prepared workforce of scientists, engineers, mathematicians, educators, and technicians and a well-informed citizenry who have access to the ideas and tools of science and engineering.”
Look Up Awards and Ask for Proposals
Course, Curriculum, and Laboratory Improvement (CCLI)
NEW PROGRAM SOLICITATION NSF 09-529
CCLI
Vision: Excellent STEM education for all undergraduate students.

Goal: Stimulate, disseminate, and institutionalize transformative or innovative developments in STEM education through the production of knowledge and the improvement of practice.

*Most comprehensive program

Have you participated in an exemplary CCLI workshop? What was stimulating, transformative, or innovative?
CCLI
Supports efforts that
Bring advances in STEM disciplinary knowledge into curriculum using the appropriate pedagogy
Create or adapt learning materials and teaching strategies (must be significantly different)
Develop faculty expertise
Promote widespread implementation of educational innovations (Type 3 > Type 2 > Type 1)

For the majority of declined proposals, the science is fantastic but the learning and teaching need work!
CCLI
Supports efforts that
Prepare future K-12 teachers
Are you involved in teacher training, or do you have teachers in your classes?

Enhance our understanding of how students learn STEM topics

Enhance our understanding of how faculty adopt instructional approaches
What infrastructure needs to be in place? How much support do faculty need and for how long? What is the best way to educate faculty?
Increase knowledge of assessment and evaluation
Further the work of the program itself
CCLI
Program especially encourages projects that:
Have potential to transform undergraduate STEM education
Produce widespread adoption of classroom practices based on how students learn
Explore cyberlearning (find info at NSF site)
How often do you visit the NSF website?
PROJECT COMPONENTS
New Materials and Strategies
Incorporate ideas from research on teaching and learning AND
Incorporate scientific advances in the disciplines

What can you do if you’re not an expert on teaching and learning?
PROJECT COMPONENTS
New Instructional Strategies
Implement proven or promising techniques in ways that encourage widespread adoption.

What are some techniques that would accomplish this goal? Be proactive: a passive website IS NOT ENOUGH!
PROJECT COMPONENTS
Developing Faculty Expertise
Increase instructors’ knowledge and skills in curricula and teaching practices
How are you going to ensure this goal is accomplished?
Involve a diverse group of faculty

IF YOU DO THIS, MAKE SURE IT IS COMPREHENSIVE!
PROJECT COMPONENTS
Assessing and Evaluating Student Achievement
Develop and disseminate valid and reliable tests of STEM knowledge and skills
Linked to the latest educational research
Sources of information?
PROJECT COMPONENTS
Conducting Research on Undergraduate STEM Education
Explore how undergraduate STEM students learn

Explore how practices have diffused and how faculty and programs implement changes

Again, what are the sources for this type of information and/or expertise?
DUE and DRL
PROJECT COMPONENTS
NOTE: Instrumentation and equipment requests are appropriate -- based on learning impact!
There are examples of using expensive, large pieces of equipment in undergraduate settings
Type 1 Projects
70 to 75 awards expected
Total budget up to $200,000 for 2 to 3 years (up to $250,000 when four-year and two-year schools collaborate)
What do you think reviewers want to see?
Deadline
May 21, 2009 (A-M states)
May 22, 2009 (N-Z states)
Type 1 Projects
Typically involve a single institution & one program component
Contribute to the understanding of undergraduate STEM education
Type 2 Projects
20 to 25 awards expected
Total budget up to $600,000 for 2 to 4 years.
Deadline January 13, 2010
Type 2 Projects
- Typically involve multiple institutions & several program components (but there are exceptions)
- Typically based on prior work with results explicitly described (but there are exceptions)
- Produce evidence of effectiveness
- Institutionalize at the participating schools
Type 3 Projects
Large scale efforts
Typically based on prior work with results explicitly described – but exceptions
Produce evidence of student learning in a broad population
Describe impact of the work on the prevailing models
Describe strategies for implementation in new contexts
CCLI Central Resource Projects
- 1 to 3 awards expected
- Budget negotiable, depending on the scope and scale of the activity
- Small focused workshop projects: 1 to 2 years & up to $100,000
- Large-scale projects: 3 to 5 years & $300,000 to $3,000,000
- Deadline: January 13, 2010
CCLI Central Resource Projects
Implement activities to sustain the STEM community
Increase the capabilities of and communications in the STEM community
Increase and document the impact of CCLI projects
NSF Review Criteria
Intellectual Merit: scientific, educational, and management plans… What else?

Broader Impacts:
- Within your school, community, state, and across the nation
- Diversity
- Transformative (NSF will fund high risk!)
Activity 1 Strengths & Weaknesses Identified by Reviewers
Pretend you analyzed a stack of panel summaries to identify the most commonly cited strengths and weaknesses
Predict the outcome of the analysis by:
Listing the four most frequently cited strengths found in proposals
Most Common Strengths

Strengths cited in more than 20% of the panel summaries (bar chart of percentages omitted; KB = knowledge base):
- Evaluation plan
- Build on prior work or products
- Large impact
- Dissemination, contribution to KB
- Potential for involving women and minorities (W&M)
- Collaboration details
- Strong PIs
- Important, timely, or responsive
Answer Now
Is this strength identified as part of (a) intellectual merit, (b) broader impacts, or (c) both?
10. Important, timely, or responsive
11. Strong PIs
12. Collaboration details
13. Potential for involving women and minorities
14. Dissemination, contribution to the STEM knowledge base
15. Large impact
16. Build on prior work or products
17. Evaluation plan
Activity 1 continued
Generate a list of suggestions for potential PIs to ensure these strengths are part of their proposal
Activity 1 continued
Listing the four most frequently cited weaknesses found in proposals
Most Common Weaknesses

Weaknesses cited in more than 20% of the panel summaries (bar chart of percentages omitted):
- Collaboration details
- Large impact
- Innovative or novel
- Build on prior work or products
- Potential for involving women and minorities (W&M)
- Dissemination & contribution to the knowledge base
- Activities doable & related to outcomes
- Evaluation plan
- Sufficient detail and clear plans
Answer Now
Is this weakness identified as part of (a) intellectual merit, (b) broader impacts, or (c) both?
18. Sufficient detail and clear plans
19. Activities doable and related to outcomes
20. Innovative or novel
Activity 1 continued
Generate a list of suggestions for potential PIs to ensure that these weaknesses are not part of their proposal
Practical Aspects of the Merit Review Process
Phase I

At the DUE Web Site (http://www.nsf.gov/div/index.jsp?div=DUE):
- Create a personalized alert service
- Consult the program solicitation and the NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 09-1)
- Alert the Sponsored Research Office
- Test drive FastLane
- Get copies of previously funded proposals, either directly from the PI or from Leslie Jensen ([email protected])

If you are NOT going to submit a proposal, contact a program officer (PO) and offer to review proposals.
Phase I: Write the Proposal
- Cover Sheet
- Data Sheet: project codes
- Project Summary: description, Intellectual Merit, Broader Impacts
- Table of Contents
- Project Description: 15 pages; number the pages, refer to supplementary documents
- References
- Biographical Sketches
- Budget
- Current and Pending Support
- Facilities, Equipment and Other Resources
- Special Information/Supplementary Documentation
Phase I: Write the Proposal
Start with an Outline
Follow NSF requirements for proposals involving Human Subjects (IRB)
Project Summary: Separate Intellectual Merit and Broader Impacts
Discuss prior results
Provide details
Include evaluation plan with timelines and benchmarks
Cite the literature

Follow page and font size limits

Check grammar and spelling

Have someone who understands the Merit Review Process read the proposal
Meet deadlines
Phase I: Writing the Proposal
Put yourself in the reviewers’ place
How much time do reviewers have to be impressed by your proposal?
Phase II
- Reviewers are picked by NSF program directors (PDs) based on qualifications and interest.
- Reviewers are expected to read the proposals ahead of the panel meeting and enter their reviews on FastLane before the panel meets.
- There are usually 10 to 13 proposals per panel.
- Reviewers specifically look at the Intellectual Merit and Broader Impacts of the proposal, and they also consider whether it is transformative!
- They also rate the proposal (E, V, G, F, P). The panel meets and discusses the proposals; reviewers can change their ratings.
- Panels meet for 1½ days (SHORT TIME FRAME).
- Reviewers are expected to write complete sentences, or at least complete thoughts, and use proper grammar.
- Proposals that end up getting funded usually have E’s and V’s; proposals with an average rating below 3.5 are usually considered non-competitive (E = 5, V = 4, G = 3, F = 2, P = 1).
- PDs meet to decide which proposals are recommended or declined.
- THE BEST WAY TO LEARN ABOUT A PROGRAM IS TO VOLUNTEER TO BE A REVIEWER!
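The rating arithmetic described above can be sketched in a few lines of Python. This is a hypothetical illustration of the letter-to-number mapping and the 3.5 threshold mentioned in the session, not a description of NSF's actual decision process, which is a panel discussion and program-director judgment:

```python
# Hypothetical sketch of the CCLI panel rating scale described in the session:
# E(xcellent) = 5, V(ery good) = 4, G(ood) = 3, F(air) = 2, P(oor) = 1.
RATING_VALUES = {"E": 5, "V": 4, "G": 3, "F": 2, "P": 1}

# Threshold mentioned in the session: averages below ~3.5 are usually
# considered non-competitive (funding decisions involve more than this number).
COMPETITIVE_THRESHOLD = 3.5

def average_rating(letter_ratings):
    """Average the letter ratings a panel assigned to one proposal."""
    return sum(RATING_VALUES[r] for r in letter_ratings) / len(letter_ratings)

def looks_competitive(letter_ratings):
    """Rough screen: does the panel average clear the 3.5 threshold?"""
    return average_rating(letter_ratings) >= COMPETITIVE_THRESHOLD

# A proposal rated E, V, V averages above 3.5; one rated G, G, F does not.
strong = looks_competitive(["E", "V", "V"])
weak = looks_competitive(["G", "G", "F"])
```

The point of the sketch is simply that a panel of mostly E's and V's clears the threshold comfortably, while a mix of G's and F's does not.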
Phase III
- An officer in the Division of Grants and Agreements (DGA) reviews the recommendation from the program.
- The decision is usually made within 30 days.
- Only an officer in DGA can make the award.
- Bottom line: proposals need to follow guidelines established both by the Division and by DGA.
Answer Now
7. The program solicitation should a) never be read b) be read by the PI c) be read by all of the PIs and the SRO officer
Answer Now
9. An NSF proposal is awarded by the a) PD b) Director of the Division (DD) c) an officer in DGA
Application
Work Together
Prof. X is a new assistant professor who has been assigned to teach the Introduction to Microbiology course. She finds the course antiquated by her standards, and students appear uninterested in learning the material in the lecture or in the laboratory. She has no training in educational methods.
Scenario: Improving an Introduction to Microbiology Laboratory Course
She has an idea for greatly improving the course by adding “new stuff.”
“New stuff”
- Materials (e.g., modules, web-based instruction)
- Activities (e.g., laboratories, projects)
- Pedagogy (e.g., problem-based learning)

She has done a preliminary evaluation and decides to prepare a CCLI proposal. Now what should she do?
Scenario: Professor X Prepares an Initial Proposal Outline
EXAMPLE
Goals: Develop “new stuff” to enhance student learning at College of Y.

Rationale: Observed shortcomings in the educational experience of students at College of Y and felt that “new stuff” would improve the situation.

Project Description: Details of the “new stuff.”

Evaluation: Use College of Y’s course evaluation forms to show difference or value.

Dissemination: Describe the “new stuff” through conference papers, workshops, journal articles, and a web site.
Think Like a Reviewer
Goals etc.:
- What are you trying to accomplish? What are the goals?
- How will you accomplish your goals? What are the objectives?
- What will be the outcomes?

Rationale:
- Did the PI review the literature?
- Why does the PI believe this is a good idea? Why is the problem important? IS IT TIMELY?
- Why is your approach promising?
- How will you manage the project to ensure success?

Evaluation:
- How will you know if you succeed?

Dissemination:
- How will others find out about your work? How will you interest them? How will you excite them?
Goals, Objectives, Outcomes

Goal: Broad, overarching statement of intention or ambition.

Sample goal for Prof. X: Design a microbiology laboratory curriculum that is vertically integrated with the lecture curriculum so that theoretical concepts are illustrated through engaging, application-driven exercises.
Defining Objectives
A goal typically leads to several objectives
Specific statement of intention
More focused and specific than a goal
Measurable
List possible objectives for the stated goal
Outcomes
From your list of objectives, develop a list of measurable outcomes
Project Rationale
Rationale is the narrative that provides the context for the project
It’s the section that connects the “Statement of Goals and Outcomes” to the “Project Plan.”

What’s the purpose of the rationale?
What should it contain?

What are the potential problems & limitations? What can be done about them?
Has the applicant done prior work?

Has funded work led to interesting results?

Are there any preliminary data, and what do they show?

What should it accomplish?
What should Professor X include in her rationale?
Description of Activities
Easy to visualize, concise, detailed. “Less” is usually better in terms of the number of goals, objectives, activities, and outcomes!
Goals, Objectives -> “Miracles Occur!” -> Outcomes Achieved
Evaluation and Assessment
Evaluation & assessment have many meanings:
- Individual’s performance (grading)
- Program’s effectiveness (accreditation)
- Project’s progress or success (monitoring and validating)

Build assessment tools around defined objectives and expected outcomes.

Project evaluation:
- Formative: monitoring progress
- Summative: characterizing final accomplishments
Examples of Tools for Evaluating Learning Outcomes
- Surveys: forced-choice or open-ended responses
- Interviews: structured (fixed questions) or in-depth (free flowing)
- Focus groups: like interviews but with group interaction
- Observations: actually monitor and evaluate behavior

(Olds et al., JEE 94:13, 2005; NSF’s Evaluation Handbook)
Provide examples of student assessment that would be applicable for the listed outcomes
Dissemination Plan
Consider multiple modes and venues of communication,
For example:
NSDL, NISOD, PKAL, NABT, ASMCUE
State Academy of Science meetings
Science news publication
Lay press
Professional society listservs (also specialty listservs, e.g., the signal processing site at Rice)
Commercial publishers, software houses, equipment manufacturers
OTHERS?
Additional Proposal Strategy Suggestions
- Use data to document existing shortcomings in student learning. If the data or information is NOT in the literature, it is sometimes worthwhile to do a small pilot study.
- Describe a management plan: provide tasks, team responsibilities, and a timeline.
- Provide clear examples of the approach.
- Integrate the evaluation into the project description (not just a paragraph inserted at the end).
- Build assessment tools around defined objectives and expected outcomes.
- Evaluation should be formative and summative.
- Connect with independent evaluation experts.
Proposal Strategy Suggestions
Identify strategies for dissemination
- Define a plan to contribute to the knowledge base
- Address broader impacts
- Collaborate, form partnerships (build community)
Lessons Learned
What are the two most surprising ideas you encountered in the session?
Reflective Exercise
Identify the single most important piece of advice you would give to a colleague writing a CCLI proposal. Write it down with your earlier answers.