Performance Measures to Data Collection and the DCTAT


Performance Measurement and the
OJJDP Data Collection Tool
presented at the
OJJDP National Grantee Orientation
April 6–7, 2010
1
CSR’s DCTAT & Performance Measurement Team
• Agnes Cholewa
• Ashley Hayward
• Mary Leonard
• Elizabeth Logan
• Ursula Murdaugh
• Monica Robbers
• Matt Watson
2
Outline
• Requirements
• Performance Measurement
• Data Collection
• Reporting Performance Measurement Data to OJJDP
3
Requirements
Projects are required to:
• Collect and report performance measurement data
• Participate in an OJJDP DCTAT training session
• Submit a report on these data to OJJDP semiannually
4
Performance Measures
• Concerned with collecting information to determine
whether a program achieved its goals and objectives
• Information from performance measurement is used to
improve the operation of the program
• Inputs, outputs, and outcomes are collected and reported
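For illustration, the three data types can be kept distinct from the start of collection. A minimal sketch in Python, with hypothetical field names (not the DCTAT's actual schema):

    # Illustrative only: one semiannual record separating the three data
    # types reported for performance measurement. Field names are hypothetical.
    record = {
        "inputs": {"funds_awarded": 150_000, "staff_fte": 2.5},
        "outputs": {"youth_served": 120, "sessions_delivered": 48},
        "outcomes": {"completed_program": 95, "exhibited_desired_change": 80},
    }

    for category, measures in record.items():
        print(category, measures)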
5
Performance Measurement vs. Evaluation

Feature   | Performance Measurement | Evaluation
Question  | How much?               | What does it mean?
Example   | Game score              | Game analysis
Offers    | A tally                 | Causality
Timeframe | Continuous (Ongoing)    | Interval (Discrete)
Cost      | Less expensive          | More expensive

Performance measurement is necessary, but not sufficient, for evaluation.
6
Performance Measurement and Data Collection
• Performance measures and data collection are building
blocks of evaluation
• Provides hard evidence of what your program is doing, and how, when, and why
• Documentation supports sustainability efforts
• Specifically:
– Strengthens accountability
– Enhances decision-making (helps governments and communities determine effective resource use)
– Improves customer service
– Supports strategic planning and goal setting
7
Federal Initiatives on Performance Measurement
• Government Performance and Results Act (GPRA, 1993)
– Shift from accountability for process to accountability for results
– Programs must show effectiveness to justify funding
• Federal Agency Rating of Programs
• President's Agenda – “Transparency and accountability a
priority”
• Several State-level efforts also in place
8
Funding and Information Flows
Funds flow down the chain: Congress and OMB → OJJDP → Grantees → Communities
9
History of Performance Measurement at OJJDP
A Brief History…
• 2002 – JABG performance measures developed; JABG PART; DCTAT opened for JABG data reporting
• 2004 – DCTAT opened for Title V and Formula Grants data reporting; JABG Report to Congress included quantitative performance data
• 2004/2005 – DCTAT opened for T-JADG data reporting; Title V Report to Congress included quantitative performance data
• 2006 – DCTAT opened for TYP, EUDL BG, and Discretionary Grant data reporting; PART of Juvenile Justice Programs
10
Office of Juvenile Justice and Delinquency Prevention
Mission/Purpose:
• Authorizing legislation is the Juvenile Justice and
Delinquency Prevention Act of 2002
• Focus is on helping States and localities to respond to
juvenile risk behavior and delinquency
• Primary function of the agency is to provide program grant
funding, and support research and technical
assistance/training
• Long-term goal is prevention and reduction in juvenile crime
and victimization
11
Diversity of Programs
• Formula, Block Grants for States
• Tribal Youth Programs
• Discretionary Competitive Programs
• Enforcing Underage Drinking Laws (Block and Discretionary Grants)
• Victimization Grants (Amber Alert, Internet safety)
• Congressional Earmark Grants
12
OJJDP Funding
OJJDP generally funds 4 types of programs/projects:
• Direct-Service Prevention
• Direct-Service Intervention
• System Improvement
• Research and Development
13
Development of Core Measures for OJJDP Programs
• A small number of measures that directly link to OJJDP’s
core mission
• Comparability within and across programs
• A focus on quality services and youth outcomes
14
Snapshot of Discretionary Performance Measures
Category: Direct-Service Prevention and Direct-Service Intervention
Mandatory measures:
1. Funds awarded for activity/program
2. Number of youth or families served
3. Implementation of an evidence-based program
4. Number served with evidence-based program
5. Number who successfully complete program
6. Number who exhibit desired change in targeted behavior
7. Number who offend/reoffend, or
8. Number victimized/re-victimized

Category: System Improvement
Mandatory measures:
1. Funds awarded for activity
2. Implementation of an evidence-based program
3. Number served with evidence-based program

Category: Research and Development
Mandatory measures:
1. Funds awarded for activity
2. Funds awarded for evaluation
3. Number of programs (by type) evaluated
4. Number of final reports accepted
5. Number of training curricula accepted
15
Evidence-Based Programs*
27% of discretionary grantees are implementing one or more evidence-based programs (July–December 2009 reporting period).
*Definition: Programs and practices that have been shown, through rigorous evaluation and
replication, to be effective at preventing or reducing juvenile delinquency or victimization, or
related risk factors. Evidence-based programs or practices can come from many valid sources
(e.g., Blueprints for Violence Prevention, OJJDP’s Model Programs Guide). Evidence-based
practices may also include practices adopted by agencies, organizations, or staff that are
generally recognized as “best practice” based on research literature and/or the degree to which
the practice is based on a clear, well-articulated theory or conceptual framework for
delinquency or victimization prevention and/or intervention.
16
OJJDP’s “Behavior” Measure Options
Percentage of program youth who exhibit a desired change in the
targeted behavior. (Several options – select the most relevant behavior)
• Substance use
• Social competence
• School attendance
• GPA
• GED
• High school completion
• Job skills
• Employment status
• Family relationships
• Family functioning
• Antisocial behavior
• Gang-related activities
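Whichever behavior is selected, the measure itself is a simple percentage. A minimal sketch of the arithmetic, with hypothetical counts (not actual program data):

    # Illustrative arithmetic for the behavior measure: percentage of
    # program youth who exhibit the desired change. Counts are hypothetical.
    youth_tracked = 120              # youth with data on the targeted behavior
    youth_with_desired_change = 80

    percent_changed = 100 * youth_with_desired_change / youth_tracked
    print(f"{percent_changed:.1f}% exhibited the desired change")  # 66.7%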
17
Other Data Results
From the July – December 2009 Reporting Period
• Number of Youth Served: 109,656
• Number of Youth Who Offend or Reoffend: 644
• Funds Used For:
– Direct Service Prevention: $3,253,214
– Direct Service Intervention: $1,719,168
– System Improvement: $1,163,804
– Research and Development: $336,584
18
OJJDP’s Performance Measures Website
http://ojjdp.ncjrs.org/grantees/pm/
19
Data Collection
20
Data Collection
• Need up-front planning
• Need a sense of what you are trying to accomplish
• What data will you collect and why?
• What data sources are available, and which will you use?
• How will you use the data beyond just reporting it to OJJDP?
21
Purpose of Data Collection
• An ongoing process that keeps the project focused
• Provides the information needed to report on performance
measures
• Data and data collection are the building blocks of
performance evaluation
• Use data collection to enhance your ability to monitor and
evaluate your program
22
Data Collection Standards
• Program documentation
– Clearly describe and document performance measures
– Keep logic model and performance measure
documentation together as part of the history of your
program
• Formal agreements for data collection
– Make sure that written agreements are clear
• Collect valid and reliable data
– Report accurate data
23
Data Collection Standards (cont.)
• Analyze Data
– Quantitative data (e.g., data from surveys) and qualitative data (e.g., data from interviews) should be appropriately and systematically analyzed
– Obtain training and technical assistance for this if necessary
• Justify Conclusions
– Justify the conclusions you draw from your data
• Protect Rights of Program Participants
– Design and conduct data collection to protect the rights and welfare of all participants
– Obtain training and technical assistance for this if necessary
24
Keeping Track of Data
• Use a data collection planning tool
• Identify a staff member to coordinate and monitor data collection
• Assemble data collection checklists
– Develop forms and instruments
– Develop procedures or policies for collecting needed data
• Must collect accurate data in a systematic manner
• Develop a codebook to define the data you collect
• Policies and data collection codebooks can help keep the program on track even with staff turnover
Pilot-test your procedures!
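A codebook does not need to be elaborate. A minimal sketch, with hypothetical variable names and codes:

    # A minimal, hypothetical codebook: each entry defines a variable's
    # meaning, allowed values, and source, so data stay consistent even
    # with staff turnover.
    codebook = {
        "youth_id":   {"description": "Unique ID assigned at intake",
                       "values": "string", "source": "intake form"},
        "attendance": {"description": "Sessions attended this period",
                       "values": "integer >= 0", "source": "session log"},
        "completed":  {"description": "Successfully completed program",
                       "values": "1 = yes, 0 = no", "source": "exit record"},
    }

    for name, spec in codebook.items():
        print(name, "-", spec["description"])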
25
Plan for Performance Measurement in Ongoing
Program Assessment
To assess your program, include plans for:
• Analysis/synthesis – How performance measurement data
will be analyzed and summarized
• Interpretation – How the program will interpret what the data
mean
• Dissemination – Which program stakeholders will receive the results
of the performance measurement
• Recommendations – How the group will identify
recommendations based on the results of the performance
measurement
26
Reporting Performance
Measurement Data to OJJDP
27
The Data Collection and Technical Assistance Tool (DCTAT)
• The OJJDP Data Collection Tool (DCTAT) is a resource for your program
– Lists data submission deadlines
– Includes a training PowerPoint on how to use the DCTAT
– Lists webinar-based training schedules, plus a phone number and e-mail for technical assistance
– Links to performance measure (indicator) grids
– Generates reports
– Generates documentation for your program
• Include with semiannual CAPRs
• For use in your program
• Changes and improvements to the DCTAT are ongoing
28
The DCTAT
Steps to Complete Reporting in the DCTAT:
• Log in
• Profile (Review, Complete, or Revise)
• Select a Reporting Period
• Step 1: Enter Award Information (Includes Target Population
Information)
• Step 2: Select Program Categories
• Step 3: Select Performance Indicators
• Step 4: Enter Data
• Step 5: Create a Report to Submit to OJJDP
• Complete the User Feedback Form
29
DCTAT Sign-in Screen
This screen contains information and resources for your program.
The Grantee will be provided with a user ID and password by the System Administrator.
Grantee (Grantor) is defined as the primary recipient of funds from OJJDP.
Website address: http://www.ojjdp-dctat.org
30
Profile Screen
The Profile screen contains information received via a download from GMS.
If you are a first-time user, the system will take you to this screen first.
Please update this page frequently so that you receive important e-mails from the DCTAT.
Most screens in the DCTAT have help desk contact info.
31
Grant Program Selection Screen
If you are a returning user, the system will
take you to this screen first.
The purpose of this screen is for you to select the reporting period for
which you need to enter data, or to view data entered previously. If you
are not sure, please call the DCTAT help desk.
32
Designation Screen
The purpose of this screen is for you to inform the DCTAT how you, as the
Grantee, administer your funds. There are two methods: 1) the Grantee spends
funds and/or awards funds to subaward recipients (subgrantees); 2) the
Grantee solely uses all funds.
NOTE: Subgrantees are secondary recipients of funds from the Grantor (not
from OJJDP). Secondary awards are made from the primary award received
from OJJDP.
33
Grantee Status Summary Screen
This screen provides
the status of
performance
measures data entry
at the grantee level
34
Grantee Status Summary Screen with Subgrantees
This screen provides the
status of performance
measures data entry at the
grantee level and subgrantee
level (if applicable)
The system has red buttons
that lead you to the next
action or step. “Follow the
red buttons!”
35
Step 1: Award Information Screen (1 of 3)
This is a view of the first data entry screen. It asks general
questions about your award or subaward.
36
Step 1: Award Information Screen (2 of 3)
Target Population Information Continued
• Tell OJJDP about the population that is served/funded by your award. This will differ at the grantee level versus the subgrantee level.
• Programs that directly provide services/programs to youth are asked to define the population by race/ethnicity, justice involvement, gender, age, and geographic location of the population served by the federal award.
• Grants that use funds for “system improvement” type projects should select the option “Youth population not directly served.”
37
Step 1: Award Information Screen (3 of 3)
Target Population Information Continued
The “other” category lets you specify additional factors that may define
the population you are serving.
Are these additional factors ones that were proposed when you applied for
funding?
38
Step 2: Program Category Selection Screen
The next step is to Select
Program Categories.
Remember that activities funded by your award are organized into these 4
categories:
• Prevention – Youth has not had any involvement in the juvenile justice (JJ) system but may have risk factors for involvement
• Intervention – Youth has had some involvement in the JJ system, and you would like to intervene to prevent further involvement
• System Improvement – A program or project may involve hiring staff, staff training, new policies/procedures, or MIS development/enhancement
• Research and Development – A project is research- or evaluation-focused; related to a juvenile justice program or population; or develops materials that will be considered for use with a juvenile justice population or program
39
Step 3: Indicator Selection Screen
Indicators Can Be Mandatory or Optional
The next step is to select indicators (performance measures) that
represent your grant-funded activities.
The indicators are presented as mandatory (those that OJJDP requires you
to report to support its “core” measures), followed by optional indicators.
You are encouraged to select as many optional indicators as apply to your
grant-funded activities; these data may help you maintain and manage your
program activities.
40
Step 4: Data Entry Screen
This screen provides you with all of the
mandatory and optional measures that
were selected for data reporting.
• If a mandatory measure does not relate
to your grant-funded activities, enter
zero.
• If you do not have data this reporting
period for a selected optional or
mandatory measure, just enter zero.
• In the comments section of the
Performance Data Report, you can
explain the zero values that were
reported.
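A small pre-check of your prepared numbers can make the comments step easier. A hypothetical sketch (the measure names and the check are illustrative, not part of the DCTAT):

    # Illustrative pre-check: flag zero values so you remember to explain
    # them in the comments section of the Performance Data Report.
    # Measure names and values are hypothetical.
    entries = {
        "Number of youth served": 120,
        "Number who successfully complete program": 0,  # no data this period
        "Funds awarded for evaluation": 0,              # not applicable
    }

    for measure, value in entries.items():
        if value == 0:
            print(f"Explain in comments: '{measure}' was reported as zero")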
41
Step 5: Reports Menu
There is 1 Mandatory Report Type:
Once all data entry has been completed,
you are ready to create the mandatory
report that should be submitted to OJJDP
through the Grants Management System
(GMS).
Performance Data Report:
Aggregates your data; submit this one to
OJJDP through GMS
42
Step 5: Reports Menu (cont.)
In addition to the mandatory report, the DCTAT provides other reports for
your use:
1. Performance Data Summary Report – Provides a comparison of a grantee’s aggregated data to an aggregate of national data by federal program.
2. Subaward Detail Data Report – Contains performance measurement data for all active awards at the grantee and/or subgrantee level for the reporting period.
3. Performance Data Report by Subgrantee – An aggregate data report by subgrantee by federal award (only displays when applicable).
4. Close Out Report – Provides, in aggregate form, data reported across all reporting periods during the life of the award. It should be submitted as part of the close-out package when the close-out process has been initiated in GMS.
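To illustrate what these aggregate reports do with your numbers, here is a minimal sketch of rolling subgrantee data up to a grantee total (the records and field names are hypothetical, not the DCTAT's internal format):

    # Hypothetical subgrantee records for one reporting period; aggregate
    # reports roll numbers like these up to the grantee level.
    subgrantee_data = [
        {"subgrantee": "Youth Center A", "youth_served": 40, "completed": 30},
        {"subgrantee": "Mentoring Org B", "youth_served": 25, "completed": 22},
    ]

    totals = {"youth_served": 0, "completed": 0}
    for record in subgrantee_data:
        for measure in totals:
            totals[measure] += record[measure]

    print(totals)  # {'youth_served': 65, 'completed': 52}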
43
User Feedback Form
Wait - before you go!
Let us know about your experience
using the DCTAT and how you would
like to use your data!
44
Please Remember!
• Report accurate data!
• Prepare your data before entering it in the tool
• Follow the red buttons to get to the next step
• When data entry is complete, select “Mark data as complete
and create final Performance Data report”
• Export the Performance Data Report (PDF or Word format)
and save to your computer
• After saving to your computer, be SURE to upload this
document to GMS as an attachment to get credit for reporting
45
DCTAT and GMS Reporting Schedule
Congressional Earmark and Discretionary Grantees
Activity Period | DCTAT Due Date | Upload to GMS?
January – June  | July 30        | Yes, by July 30
July – December | January 30     | Yes, by January 30
46
Questions/Comments
47
Contact Information
Website
• To access the DCTAT website, please go to: http://www.ojjdp-dctat.org
Technical Assistance
• E-mail: [email protected]
• Toll-free: 1 (866) 487-0512
48