
Kevin Mandernack, PhD
Department of Earth Sciences
IUPUI
• Basic Information on NSF
• Who they serve
• Recent budgets and funding rates
• Merit review criteria
• Overview of review process
• Tips on Writing a Successful Grant Application: dos and don’ts
• Contacting your Program Officer
• Interpreting proposal reviews
• What to do after you receive an NSF award
• Leveraging your existing funds
• Non-conventional sources of NSF funding
• Some useful links to NSF
• Answer YOUR Questions
Including but not limited to:

• Astronomy
• Atmospheric Sciences
• Biological Sciences
• Behavioral Sciences
• Chemistry
• Computer Science
• Earth Sciences
• Engineering
• Information Science
• Materials Research
• Mathematical Sciences
• Oceanography
• Physics
• Social Sciences
Who NSF serves:
• Universities and colleges
• Academic consortia
• Nonprofit institutions
• Small businesses
• University and industry collaborations
• National research centers
• International research and education efforts

NSF RESEARCH GRANT PROFILE (FY 2012)
Awards (research): 6,636
Average annual award: $165,831
Median annual award: $125,171
Average duration (research): 2.89 years
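A rough, back-of-the-envelope illustration of what these averages imply about total award size (simply multiplying the figures above; actual awards vary widely):
average annual award × average duration ≈ $165,831 × 2.89 years ≈ $479,000 per award
median annual award × average duration ≈ $125,171 × 2.89 years ≈ $362,000 per award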
Recent NSF research budgets
2012: $5.7 billion
2013: $5.5 billion (sequester)
2014: $5.8 billion
NSB Recommendations: Three Merit Review Principles and Five Review Elements
Three Merit Review Principles
1. All NSF projects should be of the highest quality and
have the potential to advance, if not transform, the
frontiers of knowledge.
2. NSF projects, in the aggregate, should contribute more
broadly to achieving societal goals. These “Broader
Impacts” may be accomplished through the research
itself, through activities that are directly related to
specific research projects, or through activities that are
supported by, but are complementary to, the project.
Three Merit Review Principles
(continued)
3. Meaningful assessment and evaluation of NSF funded
projects should be based on appropriate metrics,
keeping in mind the likely correlation between the effect
of broader impacts and the resources provided to
implement projects. If the size of the activity is limited,
evaluation of that activity in isolation is not likely to be
meaningful. Thus, assessing the effectiveness of these
activities may best be done at a higher, more aggregated,
level than the individual project.
Five Review Elements
1. What is the potential for the proposed activity to:
a. advance knowledge and understanding
within its own field or across different fields
(Intellectual Merit);
b. benefit society or advance desired societal
outcomes (Broader Impacts)?
2. To what extent do the proposed activities suggest
and explore creative, original, or potentially
transformative concepts?
Review Elements (continued)
3. Is the plan for carrying out the proposed
activities well-reasoned, well-organized, and
based on a sound rationale? Does the plan
incorporate a mechanism to assess success?
4. How well qualified is the individual, team, or
institution to conduct the proposed activities?
5. Are there adequate resources available to the PI
(either at the home institution or through
collaborations) to carry out the proposed
activities?
NIH only gives you two shots; NSF does not impose that limit, but it is still good to follow the rule. Consult with your Program Officer regarding resubmissions.
Page limits make it harder to get your ideas and plans across to reviewers.
Congress and the economy are not being researcher-friendly.
Be Informed & Efficient
_____________________________________________________________________________________________________________________________________________
Do not be afraid to ask questions
The more you know, the better you can plan
Time is your most precious commodity
It is crucial you make the most of it
Advice will vary
Seek guidance from more than one source
The key to success is persistence
Learn from your mistakes, try again
Don’t put all your eggs in one basket
Diversify your funding portfolio
Strategies for Success: Proposal
_____________________________________________________________________________________________________________________________________________
1. Target programs with high success rates.
2. Target special programs you may qualify for.
3. Run your own mini-review.
4. Suggest appropriate and available reviewers.
5. Learn from rejection.
6. Know how to interpret reviews/feedback.
7. Be persistent, but know when it’s time to move on.
Strategies for Success: Professional
_____________________________________________________________________________________________________________________________________________
1. Write down ideas as soon as you get them.
2. Volunteer to be a reviewer.
3. Volunteer to be a panelist.
4. Arrange to meet and talk with agency Program
Officers at national meetings.
5. Visit the agency and “do the walk”.
6. Attend and participate in agency-sponsored “community workshops.”

1. Address agency/program mission
2. Discuss size and scope of intellectual payoff
3. Hypothesis-driven proposal, with tests

• Must have novel ideas and show evidence of being transformative (not a simple extension of Ph.D. thesis or postdoc work)
• Clearly stated hypotheses (no “fishing expeditions”)
• Limit hypotheses to ~3, which capture the overarching goals of the entire proposal

• Sell the “Big Picture” and global significance early
• Explain how your proposed work will significantly advance the “Big Picture” presented above
• Hypotheses should come soon after both of the above (~page 3-4)
• Address any external or negative issues directly

• Periodically bring your reader back to the “big picture”
• Summarize for the non-expert (panel member) the relevance and implications of the details/methods you presented
• The above tips are particularly important for interdisciplinary proposals that necessitate diverse reviewers, and for any proposal receiving a panel review

• Broader Impacts need to be creative and original at many levels:
◦ Scientific impact
◦ Societal impact, public outreach, underrepresented groups
◦ Educational impact, including creative approaches that foster learning among diverse communities, at different levels (K-12, teachers, undergrad., grad.), and for policy makers
◦ Utilize in-house facilities (e.g., CTL, Signature Centers)

• Keep it as simple as possible, but include all details and necessary methods, at least briefly – convince the panel you can do this
• Panelists read stacks of proposals; the faster and more easily they can grasp and remember your idea, the better. Use frequent, informative subheadings, flow charts, and figures
• Show good productivity from prior results (PAPERS!)

• Is the topic appropriate for the intended program?
◦ Beware the “black holes” of NSF (e.g., coastal processes often do not get reviewed by NSF OCE)
◦ Have a conversation with the Program Director first

1. “This proposal suggests a clear, elegant, well-documented approach to a problem that has plagued this field for decades.”
2. “This is certainly adventurous, and I frankly would have doubted it could be done. Yet the PI has proven the method in preliminary work AND had it accepted by a peer-reviewed journal!”
3. “The PI has a beautiful plan. Undergraduates or new graduate students can step right into this work, yet it solves a major problem and will be publishable in a first-rate journal.”
4. “This reads like a dream. I have rarely seen a proposal, even from long-established investigators, that shows such careful thought and meticulous presentation.”

Proposal “don’ts”:
• Include a picture of students doing something without safety gear
• Leave out clearly visible hypotheses
• Scope of work not proportional to budget or time requirements (don’t be overly ambitious)
• Not enough detail on methodology/sampling
• Frequent typos, grammar mistakes, or using a smaller font to squeeze in added verbiage
• Failure to reference previous and important studies
• Overselling the value of what you are doing
The Program Officer: Your Secret Weapon
_____________________________________________________________________________________________________________________________________________
• Answers questions
• Solicits proposal evaluations
• Runs merit review process
• Informs/makes funding recommendations
• Administers grant, revises budgets
• Helps you prepare competitive proposals
Seek their input, preferably in person
Program Officer: Not All Are Created
Equal
_____________________________________________________________________________________________________________________________________________
• Some are permanent, some are rotators
• Some are activists, others more conservative
• Some more knowledgeable than others
• Some travel more than others
• Some more autonomous than others
Program Officer: Questions
_____________________________________________________________________________________________________________________________________________
• Do you fund the kind of thing I want to do?
• Am I eligible? (if applicable)
• What is the review process and who makes
the final recommendation?
• What is the projected success rate?
• How much money is for new initiatives?
• Are there special programs I qualify for?

Everyone Gets Bad Reviews!

Reasons: 1. Flaw in idea, logic, or approach
2. Written in a way that allows that criticism
3. Reviewer wrong
(if noted by more than one reviewer, you’ve got a problem)

Strategy:
• Read review
• Blow off steam (in private, not to the program people)
• Read again, annotate trouble spots in proposal
• Now read pretending this is someone else’s proposal
• Think about what they are REALLY saying

Don’t be Fooled by High Marks!*

An Example: “An excellent proposal, but….”



(Analysis: Not ready for prime time)
It is the content, not the score that matters!
*Note: People in the same subdiscipline always feel that work
is very important. People who rank proposals across the
scientific spectrum do not always agree with that view.

[Chart: proposal rating vs. likelihood of funding – categories range from “Almost always funded” and “Typically funded” through a “Grey zone” to “Almost never funded.”]
After you receive your first NSF Award
• Supplemental funding
• International travel funds
• Research Experiences for Undergraduates (REU) supplements
• Other (early June, before the fiscal year end, is a good time to approach your PD for supplements, including “CAREER” supplements)
• Supplements may be approved for up to 20% of the original funded level at the sole discretion of the PD (external review if > 20%); see the worked example below
• Give a seminar at NSF on results from your grant
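As a worked illustration of that 20% ceiling (the award amount here is hypothetical, chosen only to make the arithmetic concrete): a project originally funded at $300,000 could receive up to 0.20 × $300,000 = $60,000 in supplements at the Program Director’s discretion; a larger supplement request would trigger external review.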
NSF Proposal Resources
• Grant Proposal Guide (GPG)
www.nsf.gov/pubsys/ods/getpub.cfm?gpg
• Early Concept Grants for Exploratory Research
(EAGER), section II.D.2 of GPG
• RAPID Response Grants, section II.D.1 of GPG
www.nsf.gov/pubs/policydocs/pappguide/nsf11001/gpg_2.jsp#IID1
NSF Proposal Resources
• RAPID Response Grants, section II.D.1 of GPG
“proposals having a severe urgency with regard to availability of, or access to data,
facilities or specialized equipment, including quick-response research on natural or
anthropogenic disasters and similar unanticipated events”
Contact relevant Program Officer first
3-5 page project description
up to $200k for one year, no external review
No-cost extension OK, supplemental funding possible,
renewal possible with external review
NSF Proposal Resources
• EAGER Grants, section II.D.2 of GPG
“to support exploratory work in its early stages on untested, but potentially transformative, research … ‘high risk-high payoff’ …”
Contact relevant Program Officer first
5-8 page project description
up to $300k for two years, no external review
No-cost extension OK, supplemental funding possible,
renewal possible with external review
NSF Proposal Resources
• Workshop proposals
• <$50K, only Program Director approval
• >$50k but <$100k, internal review only
• >$100K, external review
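The three budget thresholds above amount to a simple decision rule. Here is a minimal sketch in Python (the function name and the handling of budgets that fall exactly on a boundary are assumptions for illustration, not NSF policy):

    def workshop_review_route(budget_usd):
        """Illustrative only: map a workshop budget to the review route listed above."""
        if budget_usd < 50_000:
            return "Program Director approval only"
        elif budget_usd < 100_000:
            return "internal review only"
        else:
            return "external review"

    print(workshop_review_route(45_000))   # Program Director approval only
    print(workshop_review_route(75_000))   # internal review only
    print(workshop_review_route(250_000))  # external review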
NSF Proposal Resources
• Grant Proposal Guide (GPG)
www.nsf.gov/pubsys/ods/getpub.cfm?gpg
• NSF Publication on Broader Impacts
www.nsf.gov/pubs/2003/nsf032/bicexamples.pdf
• 2013 Report on NSF Merit Review system
www.nsf.gov/bfa/dias/policy/merit_review/
• Recently Funded NSF Proposals
www.fastlane.nsf.gov/servlet/A6RecentWeeks
• NSF Program Announcements -- eligibility,
goals, special requirements
NSF Organizational Chart
[Organization chart, summarized:]
• National Science Board (NSB); Director; Deputy Director
• Staff offices: Office of Diversity & Inclusion; Office of the General Counsel; Office of International & Integrative Activities; Office of Legislative & Public Affairs; Office of the Inspector General (OIG)
• Directorates: Biological Sciences (BIO); Social, Behavioral & Economic Sciences (SBE); Computer & Information Science & Engineering (CISE); Engineering (ENG); Education & Human Resources (EHR); Mathematical & Physical Sciences (MPS); Geosciences (GEO)
• Offices of Budget, Finance & Award Management (BFA) and Information & Resource Management (IRM)
FY 2014 Request: Total R&D by Agency
(Budget authority in billions of dollars; total R&D = $144.2 billion)
DOD $71.2; HHS (NIH) $32.0; DOE $12.7; NASA $11.6; All Other $6.7; NSF $6.2; DOC $2.7; USDA $2.5
FY 2014 Request by Account ($7,626 million)
• Research & Related Activities: $6,212 million
• Education & Human Resources: $880 million
• Major Research Equipment & Facilities Construction: $210 million
• Administrative accounts: AOAM $304 million; OIG $14 million; NSB $4 million
Totals may not add due to rounding.
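As a quick check of these account-level figures against the rounding note: $6,212M + $880M + $210M + $304M + $14M + $4M = $7,624M, which differs from the stated $7,626 million total only because the individual figures are rounded.
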
Kathy Marrs
School of Science, IUPUI


• Prepare students to be leaders, teachers, and innovators in emerging and rapidly changing STEM fields
• Develop a scientifically literate populace
Both depend on the nature and quality of the undergraduate education experience.
Research-based and research-generating approaches to:
• Understand/advance STEM learning
• Design, test, and study curricular change
• Widely disseminate and implement best practices
• Broaden participation of individuals and institutions in STEM fields

• Develop the STEM/STEM-related workforce
• Advance science
• Broaden participation in STEM
• Educate a STEM-literate populace
• Build capacity in higher education
• Improve K-12 STEM education
• Encourage life-long learning

• Experiential learning
• Assessment/metrics of learning and practice
• Scholarships
• Foundational education research
• Professional development
• Institutional change
• Formal and informal learning environments
• Undergraduate disciplinary research
NSF DUE website:
https://www.nsf.gov/funding/pgm_list.jsp?org=DUE
Includes a variety of specific and cross-cutting programs:
• Robert Noyce Teacher Scholarship Program (deadline: March 5, 2014)
• STEM-C Partnerships: MSP (STEM-CP: MSP) (deadline: March 18, 2014)
• Research Experiences for Undergraduates (REU) (deadline: May 23, 2014)
• NSF Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) (deadline: August 12, 2014)
• International Research Experiences for Students (IRES) (deadline: August 19, 2014)
• Science of Learning Centers (SLC) (accepted anytime)
Improving Undergraduate STEM Education (IUSE)
Replaces previous DUE programs (TUES, WIDER, STEP).
NSF is seeking projects that:
• Broaden participation and student retention in STEM
• Prepare students to participate in science for tomorrow
• Improve students' STEM learning outcomes
• Generate knowledge on how students learn and on effective practice in undergraduate classrooms


• In FY14, NSF is also accepting proposals for developing “IDEAS Labs” in biology, engineering, and geosciences.
• Intent: bring together relevant disciplinary and education research expertise to produce research agendas that address discipline-specific workforce development needs.
• Note: 2/4/2014 deadline; watch for further opportunities
• Intellectual Merit – the potential to advance knowledge.
• Broader Impacts – the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.
Both criteria, Intellectual Merit and Broader Impacts, will be given full consideration during the review and decision-making processes. Proposers must fully address both criteria.

• What is the potential for the proposed activity to:
◦ Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
◦ Benefit society or advance desired societal outcomes (Broader Impacts)?
• To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts?
• Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
• How well qualified is the individual, team, or organization to conduct the proposed activities?
• Are there adequate resources available to the PI (either at the home institution or through collaborations) to carry out the proposed activities?
Human Subjects:
• IRB exemption or approval documentation is required at the time of the award in order to receive FY 2014 funding
• Plan for the time necessary to obtain institutional IRB approval
Reviewers are also asked to review: Facilities, Equipment and Other Resources; the Data Management Plan; and the Postdoctoral Researcher Mentoring Plan.
NSF's FY 2014 budget request is $7.626 billion,
an increase of $592.69 million (8.4%)
over the 2012 level.
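A quick check of the arithmetic in that statement: $7,626M − $592.69M = $7,033.31M implies a baseline of roughly $7.03 billion, and $592.69M / $7,033.31M ≈ 0.084, consistent with the stated 8.4% increase.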
Classroom Resources:
 http://www.nsf.gov/news/classroom/
Kathy Marrs

[email protected]
Jeff Watt






The summary is the first thing the reviewers and NSF
staff read!
It will determine whom the program director selects as reviewers.
It must introduce the reviewer to the story that your
proposal is going to tell.
It must be written clearly and concisely, stating:
◦ the problem,
◦ the objectives,
◦ the expected outcomes,
◦ the project activities, and
◦ the audience to be addressed.
NSF will publish the summary, if funded.
Considerable effort and thought should be spent in
preparing a well-written summary.
A well-written summary does the following:
• paints a picture or tells a story that sticks in the reviewer’s head after reading 20 proposals;
• uses terms easily understood or known by various reviewers;
• is free of jargon and local or institution-specific vocabulary;
• describes the problem to be studied and why it should be solved (importance);
• provides realistic numbers on the size and scope of impact;
• communicates work already done or expertise of the investigators, on which the proposal will be built;
• provides an overview of the activities funded by this proposal;
• provides an overview of the expected outcomes; and
• describes how the project will be sustained after the grant ends.
By Wesley Wright, Grant Services Manager
Indiana University
Office of Research Administration

• Before review begins, a Kuali Coeus Proposal Development Document must be fully signed by all Responsibility Centers.
• A fully signed routing lets our office know there is a Proposal Development Document in the queue for assignment.
• The e-mail inbox for Proposal Development Documents is monitored by Front Office Staff for assignment to a Grant Consultant.
• Once the Grant Consultant receives the proposal assignment, an intro e-mail is sent to the Principal Investigator and Department Contact.
• The proposal is then reviewed to ensure compliance with applicable guidelines.

• As the Grant Consultant reviews the proposal, any needed corrections are relayed to the appropriate contacts.
• An example of this is biosketches being limited to two pages.
• Once any needed corrections are completed, the proposal is submitted according to agency guidelines.
• Ideally, ORA likes to submit electronic proposals the day before the due date.
• For paper proposals, it is ideal that they are mailed two days before the due date.
• For NSF, we encourage submission through FastLane.

• Sending intro e-mails to the appropriate parties in a timely manner
• Reviewing the proposal’s content for things such as page limits and keeping all parties in the communication loop
• Ensuring the budget categories follow any applicable guidelines and are calculated properly
• Pre-reviewing budgets upon request

• Department Contacts and Principal Investigators need to be available to facilitate needed proposal corrections.
• Contact ORA when there are technical issues involving routings.
• When multiple Responsibility Centers are involved, make sure they are all in the communication loop about the need to approve.
• For agencies that require COI disclosures, have them in place before proposal submission.
• Adhere to the ORA Deadline Policy.

• The Deadline Policy became effective in 2011: http://www.researchadmin.iu.edu/Policies/Internal_Submission_Policy_2011_01_04.pdf
• It applies to all Sponsored Project submissions.
• For electronic submissions, all pieces except the technical proposal and cited literature are to be provided to ORA in final form 5 business days before the agency deadline.
• This also includes a fully signed Proposal Development Document in Kuali Coeus.
• The narrative and cited literature are due 2 days prior to the sponsor’s deadline.
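As an illustration of how these two internal deadlines relate to a sponsor deadline, here is a minimal sketch in Python (the function name, the Monday-through-Friday definition of a business day, the neglect of holidays, and the use of calendar days for the 2-day narrative window are assumptions made for illustration, not part of the policy text):

    from datetime import date, timedelta

    def ora_internal_deadlines(sponsor_deadline, admin_business_days=5, narrative_days=2):
        """Illustrative only: estimate ORA internal due dates from a sponsor deadline."""
        # Administrative pieces: walk back the stated number of business days
        # (Monday-Friday counted as business days; holidays ignored here).
        d = sponsor_deadline
        remaining = admin_business_days
        while remaining > 0:
            d -= timedelta(days=1)
            if d.weekday() < 5:  # 0-4 = Monday through Friday
                remaining -= 1
        admin_due = d
        # Narrative and cited literature: treated as calendar days here.
        narrative_due = sponsor_deadline - timedelta(days=narrative_days)
        return admin_due, narrative_due

    # Example with a hypothetical sponsor deadline of Friday, June 6, 2014:
    print(ora_internal_deadlines(date(2014, 6, 6)))
    # -> (datetime.date(2014, 5, 30), datetime.date(2014, 6, 4))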




• For paper proposals, all pieces except the technical proposal and cited literature are to be provided to ORA in final form 5 business days before the agency deadline.
• The complete proposal in final form must be delivered to ORA no later than 3 days prior to the sponsor’s submission deadline, with the number of copies required by the sponsor plus one copy for Grant Services.
• Any violation of the above results in an e-mail being sent to the Principal Investigator, the Department Contact, and the Dean or Chair of the respective Center.
• Once the e-mail is received, the Principal Investigator must explain why the proposal is late before submission will be considered.

• Be sure to give the Sponsored Research Office (SRO) access so your proposal can be reviewed by ORA staff.
• Be mindful of spending, as Grants Management Officers monitor income drawn down in the Award Cash Management System.
• NSF now requires that COI disclosures be in place for all Key Personnel upon proposal submission. For subcontracts, a non-IU Conflict of Interest disclosure is required if the organization is not a member of the Federal Demonstration Partnership (FDP).

• Make sure Progress Reports are submitted in a timely manner in Research.gov, as funding from NSF is now incremental.
• Time is of the essence with things such as Proposal Updates, Letters of Intent, and No-Cost Extension Requests.
• Voluntary Committed Cost Sharing is prohibited unless required by the solicitation.
• Be sure to include the current proposal being submitted in the Current and Pending Support section.
Questions?