The Munro Review of Child Protection


Implications of the Interim Report: “The Child’s Journey” for Data Collection and Reporting

Karen Marcroft and Mike Pinnock
Local Authority representatives on the Munro Review Performance and Inspection Sub-Group

Contents

• overview of Interim Report
• the cost of measurement
• the purposes of recording
• the twin-core
• who needs what?
• some closing thoughts

Overview

• Focus on “the characteristics of an effective child protection system”
• Unintended consequences of previous reforms
• Evaluating impact of systems change in five partner local authorities
• End announced inspections and broaden unannounced
• Changes to SCR process to emphasise learning
• Separate out professional advice from statutory guidance
• Potential strengthening of role of LSCB
• Clear lines of local accountability to DCS
• “Twin-set” minimum data set: reduced national set & standard local set
• Acknowledges role of early identification and family support

Overview continued …

• Support to consensual common assessment process
• Improving referral-taking by shift to integrated/locality-based working
• Prospect of changes to over-prescribed assessment processes
• Highlights the effect of bureaucratisation of process – particularly ICS
• Emphasises need for critical reflection and evidence-informed practice
• Need to improve career pathways for front-line practitioners
• Improve public understanding of complexity of child protection work
• Need for professional oversight over whole-system working
• Acknowledges potential impact of financial constraints and need for multi-disciplinary approach to systems change

“What gets measured gets done”

Peter Drucker

…and so what doesn’t get measured doesn’t get done?

The cost of measurement

• Human cost
  – Less quality time with children and families
  – Impact of frustration and distraction on front-line staff
  – Lack of analysis to support reflection and learning
  – Loss of “moral authority” for managers
  – False assurance and loss of public confidence
• Financial cost
  – Developing inadequate IS systems
  – Data processing task
  – Efficient ineffectiveness

Three purposes of recording

• individual casework and planning
• performance management
• workforce development and service improvement

Individual casework and planning

• To assess, inform, understand, reflect and plan
• A record for the child in the future of what decisions were made and why
• Clearly tells the child’s story
• Accurate reflection of the child’s experiences, history and observations
• Provides an evidence trail should the worker or agency be held to account for their work

Performance management

• To collect information required for national reporting purposes, with data collection and data entry not over-burdensome for social workers
• Data collection above the minimum requirements should be justified: the data should be required, and useful for understanding and managing performance (link to the CIN Review)

Workforce & service improvement

• senior managers and commissioners need accessible information to help services adapt to changing demand
• for partnerships to understand impact of their collective efforts
• stability of staff group, staff turnover, vacancies (agency staff levels?)
• motivation of staff & sickness levels

“A reduced and re-focussed data set”

• National minimum data set:
  – Outcome-based accountability (partnership level)
    • Rate of offences against children
    • Perception of safety
  – Performance-based accountability (agency level)
    • How much?
    • How well? Including timeliness e.g. CPP duration
    • What difference are we making? e.g. reunification

continued…

• Local discretionary set (comparable & published)
  – Understanding context
  – Detecting system changes
  – Monitoring capacity and capability

National Returns: Child level or aggregated data?

• Who needs capability to map child’s journey through the system?
• Child level data returns allow for national research, cross-linkage and longitudinal studies by central government, but at a cost (£50K)
• Aggregated returns are cheaper, and put the onus on local authorities to have high quality information and actively interrogate it
• Views?

Closing thoughts…

• Not just what we measure but why and how we measure it:
  – Shared, understood and acted upon at ALL levels
  – “Nose to tail” data
  – Active use of reports at operational level
  – Track record of software houses
  – Last one out turn the printer off…
• Must be clear about “the golden thread” between all this effort and better outcomes