ECE Finance: Framing Our Discussion


Transcript

QRIS Standards Learning Table
Session #4: Efficiency: Streamlining QRIS Using Your State Knowledge and Data-Based Experience
National Center on Child Care Quality Improvement
Introductions and Updates
• Introduce the state team (Name, title, agency)
AL, CA, CT, GA, HI, NV, OR, VI
• Update us on what your state team has been
working on in the development of your QRIS
since our last call.
• If a certain resource or idea has been
particularly helpful, tell us about that.
• What is your current, most pressing
challenge?
Homework Discussion
AL, CA, CT, GA, HI, NV, OR, VI
• What did your state consider in the
development of QRIS standards?
• What type of data are you collecting to inform
future revisions?
• How is your state using research to inform
your selection of standards?
Overview of Today's Presentation
• Data systems and standards
• Using data for decision-making in QRIS design
and revision
• Oregon experience using data
• NAEYC experience using data
• Data efforts (national)
• KY – slides and notes at the end as a resource
QRIS Data Systems Support Implementation
• Online application (provider portals for uploading
documents, connecting to relevant resources)
• Data import from other systems (regulation,
registry, onsite assessment reports, etc.)
• Calculating ratings, relationship between
standards/policies and program participation and
levels of quality
• Supporting the QI/TA functions
• …Data!
Use Data to Eliminate Criteria
• If your state data show that all or most
providers meet a criterion (no variation by
level), consider dropping it.
• Or move the criterion to Level 1.
• Or if it’s an essential element defining quality,
keep it, but don’t use it to determine ratings.
Use Data to Move/Revise Criteria
• Suppose your state data show that very few or
no providers meet a criterion.
• If it’s not an essential element of quality,
consider dropping it completely.
• If it is an essential element of quality, consider
– moving the criterion to the top Level or
– moving it into the CQI section of your QRIS
– focusing TA and PD on improvement on it, and not
including it in ratings until practice has advanced.
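The two slides above amount to a simple pass-rate screen on provider-level data. The sketch below is illustrative only: it assumes a hypothetical export, qris_criteria.csv, with a `level` column and one 0/1 column per criterion named `crit_*`; the file name, column names, and the 95%/5%/10-point thresholds are assumptions, not values from the presentation.

```python
# Illustrative sketch of the pass-rate screens described above (assumed data layout).
import pandas as pd

df = pd.read_csv("qris_criteria.csv")              # hypothetical provider-by-criterion export
criteria = [c for c in df.columns if c.startswith("crit_")]

pass_rates = df[criteria].mean()                   # share of providers meeting each criterion
by_level = df.groupby("level")[criteria].mean()    # same, broken out by rating level

# Nearly everyone meets these: candidates to drop, move to Level 1,
# or keep as definitional but leave out of the rating calculation.
near_universal = pass_rates[pass_rates >= 0.95]

# Almost no one meets these: candidates to drop, move to the top level,
# shift into CQI, or target with TA/PD before counting them in ratings.
rarely_met = pass_rates[pass_rates <= 0.05]

# Little spread across levels suggests a criterion is not differentiating quality.
spread = by_level.max() - by_level.min()
low_variation = spread[spread < 0.10]

print(near_universal, rarely_met, low_variation, sep="\n\n")
```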
Use Data to Find ‘Predictor’ Criteria
• With research partners, explore the
relationships among criteria.
• Is there a set of items that consistently are
met?
• It is possible to determine statistically whether one of them is a "predictor": if it is met, it is very likely that the others are also met.
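One way a research partner might operationalize this, sketched under the same assumed qris_criteria.csv layout as above; the simple conditional-rate calculation here is an illustration, not the specific statistical method the presenters have in mind.

```python
# Illustrative screen for "predictor" criteria: when criterion c is met,
# how much of the rest of the criterion set is also met, on average?
import pandas as pd

df = pd.read_csv("qris_criteria.csv")              # hypothetical provider-by-criterion export
criteria = [c for c in df.columns if c.startswith("crit_")]

rows = []
for c in criteria:
    met = df[df[c] == 1]                           # providers that meet criterion c
    others = [o for o in criteria if o != c]
    rows.append({
        "criterion": c,
        "n_met": len(met),
        # average share of the other criteria met, given that c is met
        "others_met_given_c": met[others].mean(axis=1).mean() if len(met) else float("nan"),
    })

report = pd.DataFrame(rows).sort_values("others_met_given_c", ascending=False)
print(report.head(10))                             # top rows behave most like "predictors"
```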
Use Data to Revise QRIS
• Suppose the data show that programs in your state QRIS are meeting many (but not all) of the criteria in the block above where they are now.
• Use criterion-level data from the programs currently participating in the QRIS to model how programs might score under alternative rating structures (points or hybrid). Kentucky has done that (see the KY resource at the end); a simple simulation of this kind is sketched below.
• Oregon will tell us about its use of research to inform QRIS development.
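A minimal sketch of the kind of simulation described in the second bullet above. It is not Kentucky's actual model: the weights, cut scores, and the hypothetical crit_lead_teacher_degree column are assumptions; the point is only to show how criterion-level data from current participants can be re-scored under a points structure and compared with current block ratings.

```python
# Sketch only: re-score current QRIS participants under a hypothetical
# points-based structure and compare with their current block-style level.
# Assumes the same hypothetical qris_criteria.csv used above.
import numpy as np
import pandas as pd

df = pd.read_csv("qris_criteria.csv")
criteria = [c for c in df.columns if c.startswith("crit_")]

# Illustrative weights: one point per criterion; adjust individual items as
# your alternative model requires (the override below uses a hypothetical name).
weights = {c: 1 for c in criteria}
if "crit_lead_teacher_degree" in weights:
    weights["crit_lead_teacher_degree"] = 3

points = sum(df[c] * w for c, w in weights.items())

# Illustrative cut scores for a five-level points structure.
cuts = [0, 10, 20, 30, 40, np.inf]
simulated_level = pd.cut(points, bins=cuts, labels=[1, 2, 3, 4, 5], right=False)

# Cross-tab of current level vs. simulated level shows who would move where.
print(pd.crosstab(df["level"], simulated_level,
                  rownames=["current level"], colnames=["simulated level"]))
```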
Oregon’s Process to Streamline
QRIS Standards
National Center on Child Care Quality Improvement
Brought together two groups:
• Structural Indicators of Quality (QI)
• Environmental Indicators of Quality (OPQ)
Vision, Mission and Guiding Principles
Workgroups' Charge
1. Merged indicators of quality together with
intensive input from the Standards
Workgroup. (Dec - March)
2. Reviewed input to the standards. (Jan-Aug)
3. Provided final recommendations based on
input. (May-Sept)
Input to the Standards
• Gather input from as many interested parties as possible.
• Give interested groups both access and time to provide
input.
• Seek input in a variety of ways.
• Work for a balance between achievability and perfection.
• Remember TQRIS isn’t a silver bullet.
• Consider recommendations in the larger context of the whole system.
• Few and powerful
• Understandable, relevant and intuitive
• Measurable and feasible to monitor
• Progressive/distinct among the levels
• Goals of the State
Sources of input to the Standards Development
• Standards Workgroup of Statewide Partners
• Research from Oregon's Quality Indicators
• Research from Oregon Program of Quality Field Test
• Monitoring Learning Labs with North Carolina
• Early Learning Guidelines, including the Head Start Child Development Early Learning Framework and Birth to Three Early Learning Guidelines
• Race to the Top Grant Feedback
• Cost Modeling from national TQRIS experts
• Cultural and Linguistic Competency Technical Assistance from Build Foundation
• Oregon's Licensing Regulations
Focus Group input to the Standards Development
• Focus Groups of 250 child care and early education providers and
programs across Oregon
• Focus Groups of 13 Child Care Resource and Referral agencies
• Focus Groups of Oregon’s licensing specialists
• Focus Groups of health and nutrition specialists across Oregon
• Focus Groups of child care union members
• Focus Groups of Oregon’s Professional Development Committee
NAEYC Accreditation Reliability
and Validity Study
Why NAEYC Accreditation is important and can
inform QRIS development
Findings of note regarding QRIS and accreditation
• Validity: Meaningful and significant differences in the
percent of criteria met in several standards (Teaching,
Relationships, Assessment of Child Progress) between
programs that achieve accreditation and those that do not.
• Content: Strong positive relationship between meeting lead
teacher qualifications and meeting higher proportion of
criteria in Relationships;
• Content: On overall diversity and cultural competence
criteria, significant difference between programs that
achieve accreditation (91% met) and those that do not
(77% met)
NAEYC Accreditation as a Mark of
Program Quality
Kyle Snow, Ph.D.
Senior Scholar and Director
Center for Applied Research
National Association for the Education of Young Children
Research Policy Practice
Goals
1. Short Overview of NAEYC Accreditation
2. What do we know about Accreditation?
3. NAEYC Accreditation & QRIS Congruence
About NAEYC Accreditation
• NAEYC Accreditation is a meaningful tool for program quality improvement for programs serving children from birth through kindergarten.
• Developed in the early 1980s.
• A comprehensive system review and reinvention was fully implemented in fall 2006.
• In 2010, an independent review of the site visit and decision protocols was completed, validating these processes.
A Portrait of Accredited Programs
As of 11/24/12, there are 6,748 accredited programs serving 592,675 children
• Corporate Structure:
  – Non Profit 60.3%
  – Public Agency 19.0%
  – For Profit 19.0%
  – Not stated 1.6%
• Special Populations:
  – None 47.6%
  – Migrant workers 4.8%
  – Teen parents 23%
  – Homeless families 17.5%
  – Other: 19.0% (incl. 13.5% low income)
• Program Affiliations:
  – College/University 5.6%
  – Employer-Sponsored 7.1%
  – Faith-based Institution 9.5%
  – Head Start 31.7%
  – Hospital 2.4%
  – Migrant services 1.6%
  – Military Installation 2.4%
  – Public School 19.8%
  – US Government Facility 3.2%
  – Parent Cooperative 11.1%
  – Indian Tribe 0.8%
  – Alaskan Native Village 0.8%
About NAEYC Accreditation
4-Step Process
1. Enrollment in Self-Study
2. Becoming an Applicant (Self-Assessment)
3. Becoming a Candidate (Site Visit)
4. Meet and Maintain Standards (Quality Improvement)
NAEYC Program Standards and Criteria
NAEYC Program Standards
1 – Relationships
2 – Curriculum
3 – Teaching
4 – Assessment of Child Progress
5 – Health
6 – Teachers
7 – Families
8 – Community Relationships
9 – Physical Environment
10 – Leadership and Management
• Standard
  – Topic
    • Criteria
      – Indicator(s)
Sources of Evidence
NAEYC Program Standards and Criteria
Possible Outcomes:
– Accredited
– Deferred
– Denied
To be accredited:
– 80% of all assessed criteria in each standard
– 70% on all criteria assessed in each group
– All Required Criteria
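As a concrete restatement of the decision rule listed above, here is a minimal sketch with hypothetical inputs; it collapses "deferred" and "denied" into a single not-accredited outcome because the slide does not spell out how those two decisions differ.

```python
# Illustrative check of the accreditation thresholds listed above
# (not NAEYC's actual scoring engine).
def accreditation_decision(standard_pct, group_pct, required_met):
    """standard_pct / group_pct map names to the percent of assessed criteria met;
    required_met is an iterable of booleans, one per required criterion."""
    if not all(required_met):
        return "not accredited (a required criterion was missed)"
    if any(p < 80 for p in standard_pct.values()):
        return "not accredited (a standard fell below 80%)"
    if any(p < 70 for p in group_pct.values()):
        return "not accredited (a group fell below 70%)"
    return "accredited"

# Hypothetical example: strong standards overall, but one assessed group under 70%.
print(accreditation_decision(
    standard_pct={"Relationships": 92, "Curriculum": 85, "Teaching": 81},
    group_pct={"Group A": 95, "Group B": 68},
    required_met=[True, True, True],
))
```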
NAEYC Accreditation - Recap
• Programs strive to meet NAEYC program standards
• Programs self-assess
• Programs are assessed against 10 research-based standards
• Performance is based upon multiple indicators and multiple sources of evidence
• Process allows for self-assessment and NAEYC performance
feedback
• Process includes quality indicator and improvement systems
• But – does it really define quality, can programs attain it, can
they maintain it, and can it be monitored?
What do we know about Accreditation?
• Reinvention and Criteria validation
– During field tests for reinvention, NAEYC (2005) reported significant correlations
between criteria (at the standard level) and Early Childhood Environment Rating
Scale (ECERS) scores among 70 early childhood programs. The strongest relationships
were found between overall quality and program standards for relationships,
curriculum, and teaching.
• Validation studies
– Sachs and Weiland (2010): schools engaged in accreditation scored higher on subscales of the ECERS-R, and children had higher scores on the Peabody Picture Vocabulary Test (PPVT-III) than peers in programs that were not accredited (even after controlling for initial PPVT scores).
• State-level data within QRIS systems
– PA Keystone STARS program (OCDEL, 2010) showed significant correlations between
accreditation and environmental ratings of program quality (ECERS, ITERS, SACERS)
What do we know about Accreditation?
• Trend Briefs (http://www.naeyc.org/academy/primary/trendbriefs)
– communications intended to share data on programs seeking accreditation
and to connect the findings to early childhood research trends.
– Releases to date:
• Teaching: Accreditation of Programs for Young Children Standard 3
• Assessment of Child Progress: Accreditation of Programs for Young
Children Standard 4
• Relationships: Accreditation of Programs for Young Children Standard 1
• Supporting Cultural Competence: Accreditation of Programs for Young
Children Cross-Cutting Theme in Program Standards
– Upcoming:
• Family Engagement: Accreditation of Programs for Young Children Cross-Cutting Theme in Program Standards
What do we know about Accreditation?
• Trend Briefs data source:
– Sample included 130 programs receiving accreditation site visits between September 2009 and July 2010.
– Data captured on all 417 NAEYC criteria.
– Comparisons between accredited and not-accredited programs' performance on all criteria.
[Chart: Criterion 3.G.03 met/not met rates for accredited vs. not-accredited programs. Programs that were accredited: 90% pass, 10% fail. Programs deferred or denied: 31% pass, 69% fail.]
What do we know about Accreditation?
• Trend Briefs - Selected findings:
– Relationships (NAEYC Standard 1)
• Differences are noted in terms of programs’ means of dealing with
challenging behavior, but even more so in the degree to which programs
provide a “predictable, consistent, and harmonious” classroom.
– Teaching (NAEYC Standard 3)
• Programs differ primarily among criteria that assess the use of
scaffolding strategies in the classroom.
– Assessment of Child Progress (NAEYC Standard 4)
• Programs accredited by NAEYC demonstrate a planned, intentional use
of child assessment and communication of assessment results: using
assessments to improve instruction and program design, and to
effectively communicate assessment results to other teachers and
families.
What do we know about Accreditation?
• Trend Briefs - Selected findings:
– Supporting Cultural Competence (Cross-Standard)
• Many of the same criteria that prove the most challenging overall also
differentiate between programs that became accredited and those that
did not.
• Differences in how programs can connect with diverse families and
engage them in the child’s program
• Differences in programs’ ability to understand, and respect, diversity in
family values, especially when they may differ from those of the teacher.
• Differences in hiring diverse staff and ensuring staff receive training that
includes working with diverse families.
• Differences in providing children with varied and deep experiences to
support their own cultural competence.
What do we know about Accreditation?
• Some data suggest accreditation is a valid indicator of quality
– More validation studies and data are needed
• Analyses of accreditation data show differentiation between programs that are accredited and those that are not, even when all attempt to meet the same criteria
– Future analyses can identify performance clusters and possibly examine program performance from pre-self-study through the site visit, to gauge the potential of quality improvement processes
Accreditation and QRIS Congruence
• State recognition of accreditation within QRIS ratings
• Some states use NAEYC Standards for specific areas
– Alignment of program standards
– Streamlining for programs that meet accreditation standards
• Accreditation Facilitation (Program Quality
Improvement) Project models
Accreditation and QRIS Congruence
• State QRIS systems include accreditation in various
ways:
– Not recognized
– Awarding additional points towards rating (overall or in
specific areas, varying by system)
– Enter at top (or near-top) rating
• Some combine accreditation with ERS visits
• Some differentiate accrediting bodies
Accreditation and QRIS Congruence
In what ways can states benefit from NAEYC experience
through accreditation in designing and implementing
QRIS systems for program quality recognition and
improvement, and in communicating with families?
Data Can Facilitate Cross-State
Sharing and Comparison
• What data elements does your system need?
• Are there common definitions of data
elements?
• National data efforts to be aware of…
Common Education Data Standards
• Early Learning is one domain in the overall P20 data model
• https://ceds.ed.gov/Default.aspx
Quality Initiatives Research and
Evaluation Consortium (INQUIRE)
• INQUIRE supports high quality, policy-relevant research
and evaluation on quality rating and improvement
systems (QRIS) and other quality initiatives by
providing a learning community and resources to
support researchers.
• The INQUIRE Consortium also provides input and
information to state administrators and other
policymakers and practitioners on evaluation
strategies, new research, interpretation of research
results, and implications of new research for practice.
• Child Trends helps to facilitate INQUIRE activities
INQUIRE and Data
QRIS/QI Data Elements workgroup of INQUIRE
• worked with US Department of Education group
focusing on Common Education Data Standards
(CEDS) to create a recommended list of data
elements, which is out now for public comment.
• is developing a list of recommended data elements for QRIS and Quality Improvement purposes
• will be developing a set of data elements,
especially for child care state administrators and
CCDF reporting
Questions, Reflections, Comments?
Homework for January 17, 2013
Effective Cross-Sector QRIS: Challenges and
Opportunities
Cross-sector QRIS means one that aims for participation by most group early care and education providers, regardless of funding stream or auspice. At a minimum, this includes child care centers and family child care homes, Pre-K, and Head Start, i.e., all publicly supported and licensed settings, but not informal caregivers.
A SurveyMonkey link will be emailed to you for use in completing the homework questions.
Due January 4th (for the January 17, 2013 webinar)
Homework Questions for the 1.17.13 Session
• Do you have a plan to include a cross-sector approach in the QRIS? Why did you make that decision? Identify the phase-in plan for different sectors (i.e., are you beginning with 'all in' or phasing in over a few years)?
• What challenges have you experienced in your efforts to develop and/or implement a cross-sector QRIS?
• What successes have you had with cross-sector QRIS?
• How do license-exempt centers (e.g., preK programs located in public or private schools) participate in your QRIS? Have you created an 'equivalent' standard for licensing?
• What have you learned about strategies for effectively engaging the support systems of other sectors (e.g., the Head Start T/TA system or early intervention training) in QRIS supports?
• Have you tried to engage monitoring or accountability systems from other sectors (such as collaborating with Head Start or PreK monitoring)?
• Have you worked with systems like early intervention, child welfare, and others to ensure that they understand QRIS and prioritize child placements in higher-quality settings?
Thank You
National Center on Child Care Quality Improvement
NCCCQI does not endorse any non-Federal organization, publication, or resource.
Follow-up Contacts:
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
www.qrisnetwork.org
[email protected]
[email protected]
Presented with permission from Child Trends (2012)