Creating Effective Consortia: Insights for Race to the Top Assessment Program


Presentation to the US Department of Education, Wednesday, January 13th, 2010

BUILDING EFFECTIVE CONSORTIA – OVERVIEW

Design attributes

Focus, clarity and specificity in up-front design elements and objectives

Governance choices

Decision rules and rights must be clearly pre-defined and tiered by type of decision

Capacity investments

Significant investment in institutional capacity to implement, manage partner relationships for performance, and innovate rapidly

Aligned incentives

Metrics, funding, and feedback loops to align consortia incentives with value creation for participants and end users (e.g., educators)

TWO EXAMPLES OF EFFECTIVE MULTI-PARTNER CONSORTIA

Design attributes

Star Alliance (commercial airlines)
▪ Alliance had clear objectives: (1) develop consistent customer experience and marketing; (2) integrate systems (ticketing, ground & baggage handling, airport lounges, crisis management & communications)

The GAVI Alliance (public health)
▪ Objectives are clear and focused: (1) support use of vaccines in the poorest countries; (2) leverage partners’ distinctive capabilities to be “more than the sum of its parts”; (3) GAVI plans to be country led

Governance choices

Star Alliance
▪ CEO board sets direction, promotes the alliance, approves budget and processes, and identifies areas for strategic development
▪ Alliance “managing partner” develops policies, plans, and operating budgets, and implements alliance products

The GAVI Alliance
▪ Board sets strategy; approves policies, budget, and programs; and helps align constituencies
▪ Independent Review Committee (IRC) approves grants and tracks performance
▪ Secretariat develops policies, plans, and budgets; handles program execution/monitoring

Capacity investments

Star Alliance
▪ Dedicated “managing partner” who oversees an alliance management company of ~45 people

The GAVI Alliance
▪ Dedicated Secretariat of ~110 people that oversees Alliance program implementation

Aligned incentives

Star Alliance
▪ Incentives aligned through market mechanisms (customer demand, cross bookings, incremental profits, etc.)

The GAVI Alliance
▪ Incentives aligned through country demand and performance metrics

FOUR IMPORTANT LESSONS FROM SUCCESSFUL AND UNSUCCESSFUL MULTI-PARTY CONSORTIA AND COLLABORATIONS

Successful multiparty collaboration requires…
1. Focus – agree on agenda-setting approaches and on mechanisms to define and regularly re-evaluate the agenda
2. Ensure clarity around participation and the “rules of engagement”
3. Design roles, decision rights, and rules that balance the need for input and efficiency
4. Tailor performance monitoring and evaluation processes at all levels (e.g., overall alliance and initiatives, member contributions)

Unsuccessful multiparty collaboration…
1. “Big Bang” approach – take on too much at once
2. Seek consensus among partners with fundamentally different strategies and operating styles
3. Create too many governance bodies with too many layers and multiple partners involved
4. Lack of focus on value creation for end users

Source: McKinsey & Company review of over 150 private, public, and non-profit consortia and public-private partnerships

DESIGN ATTRIBUTES: APPLICATION TO RTTT ASSESSMENT PROGRAM

Potential application to effective common assessment consortium

Focused purpose and deliverables
• Define narrow core objectives (e.g., range, frequency, timeliness, alignment to college-ready standards) that apply to 100% of students in consortium states
• Credible plan to deliver immediate “must haves” rapidly (e.g., within ~18-24 months)
• Plan to deliver “good to haves” (e.g., additional subjects, adaptive testing) in subsequent annual “release cycles”

Pursue clear value creation objectives for end users
• Standardized and clear reporting to parents and students on the “trajectory” to college and career readiness; alignment to college entrance exams
• Ensure usefulness to educators in improving teaching practice and shaping career paths – e.g., for how many educators are student learning growth results available before the end of the school year?
• Design protocols to allow school systems and researchers to analyze data to assess the effectiveness of schools, intervention programs, teacher training, etc.

Economically sustainable model
• Achieve superior assessment “quality” with a sustainable cost structure (an illustrative calculation follows this list):
  – Development cost per test item can be higher than for a single state (fixed cost)
  – Per-student cost of test administration must not be higher (variable cost)
  – Credible roadmap to lowering assessment administration costs over time
• By creating standard interfaces, aim to lower the costs of complementary third-party services (e.g., formative assessments, professional development, etc.)
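For illustration only, the cost-structure point above can be sketched with hypothetical numbers: the development budgets, student counts, and per-student administration costs below are assumptions, not figures from this presentation. The sketch shows how a higher fixed development cost, amortized across many more tested students, can still lower total per-student cost, provided the variable administration cost does not rise.

```python
# Minimal sketch with hypothetical figures (not from the presentation):
# pooling test development (a fixed cost) across consortium states can lower
# total per-student cost even if development spending rises, as long as the
# per-student administration cost (the variable cost) does not increase.

def per_student_cost(development_cost, tested_students, admin_cost_per_student):
    """Amortized development cost plus per-student administration cost."""
    return development_cost / tested_students + admin_cost_per_student

# Hypothetical single-state program
single_state = per_student_cost(
    development_cost=10_000_000,   # assumed development budget for one state
    tested_students=500_000,       # assumed tested students in that state
    admin_cost_per_student=20.0,   # assumed administration cost per student
)

# Hypothetical multi-state consortium: richer (more expensive) shared item
# development, amortized over far more students
consortium = per_student_cost(
    development_cost=60_000_000,   # assumed shared development budget
    tested_students=10_000_000,    # assumed tested students across all states
    admin_cost_per_student=20.0,   # must not exceed the single-state figure
)

print(f"Single state: ${single_state:.2f} per student")  # $40.00
print(f"Consortium:   ${consortium:.2f} per student")    # $26.00
```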

GOVERNANCE CHOICES: APPLICATION TO RTTT ASSESSMENT PROGRAM

Potential application to effective common assessment consortium

Critical decisions and metrics defined in consortium development phase
• Core objectives and expectations (e.g., range, frequency, timeliness, alignment to college-ready standards) defined in consortium proposals
• Participating States aligned through MOUs around core objectives
• Explicit “end user” outcomes represent “true north” (e.g., alignment to college entrance exams, % of teachers receiving student learning results by end of year, use of aligned formative assessments)

Clear distinction in consortium structure between policy, technical, and operations arms
• Steering Committee of State Chiefs makes a small set of major policy decisions (e.g., sign-off on major assessment module guidelines)
• Strong operational executive oversees a delivery unit that creates plans and operating budgets and leads implementation of the assessment system; holds the entire enterprise accountable to end user outcomes
• Technical working teams (inclusive of state education staff, experts, and the delivery unit) develop and recommend assessment content guidelines
• Sub-contractors managed by the delivery unit create annual assessments

Clear decision rules and rights, tiered by decision type
• Clear delineation between policy decisions (domain of the Steering Committee) and operational decisions (chief executive and delivery unit) is essential
• Decisions requiring uniformity (e.g., core design elements) handled differently than those that can vary (e.g., exact reporting formats)
• Decisions to innovate or expand (e.g., number of partners that must agree to extend to new subject areas, procedure for admitting an additional state)

CAPACITY INVESTMENT: APPLICATION TO RTTT ASSESSMENT PROGRAM

Potential application to effective common assessment consortium

Deliver on core objectives and expectations
• Strong executive leadership (e.g., a consortium CEO or Administrator) is needed to make sufficiently rapid operational decisions
• Highly capable Delivery Unit under the consortium chief executive:
  – Creates/manages implementation plans, operating budgets, and contracting
  – Manages to delivery milestones and end user value creation metrics
  – Identifies barriers to consortium success; raises them to the Steering Committee/States

Manage network of partnerships
• Executive and delivery unit should manage partnerships
• Multiple vendors will need to be managed for performance against milestones, end user objectives, and fidelity to design criteria including costs – equally true for for-profit and not-for-profit partners
• Participating state education departments must be aligned functionally to work closely with the consortium delivery unit (e.g., on reporting results)
• Create and refresh “application interfaces” and protocols for third parties

Innovate, expand, and increase network value
• Delivery Unit works with technical teams and vendors to develop a roadmap for subsequent “product releases” (e.g., adaptive testing, new subject matter)
• Build “outreach” capacity to demonstrate methods and impact to other States
• Create “application interfaces” and protocols for third parties to develop useful tools that increase network value for participating states, LEAs, and educators (e.g., formative assessments, pre-service and in-service training, etc.)

ALIGNED INCENTIVES: ROLE OF US DEPARTMENT OF EDUCATION

Department of Education role & influence on behalf of end user

Consortia design & selection RFP process
• Award points for State alignment on design choices and consortium governance:
  – Specificity of MOUs; translation of a common philosophy into design choices
  – Consortium operational leadership capacity with clear decision authority
  – Organizational alignment of relevant SEA functions to consortium operations
• Award points for sustainable delivery of value to end users:
  – Don’t penalize plans that responsibly sequence “good to haves” over time
  – Credible plans to deliver end user value (e.g., number of educators, parents, students, and struggling schools benefitting), not just assessments themselves
  – Creative consortium support/ease of use for third party “application developers”
  – Lowering assessment administration costs for participating States over time

Ongoing policy & performance management incentives
• Release RTTT funds in stages, based on operational milestones (including major formal decisions) and end user value milestones achieved by participating States
• Award “bonus” RTTT assessment funds for expansion of winning consortia to additional states and additional subjects (including by a subset of States)
• If possible, preserve significant funding for creation of end-user applications
• Ensure ESEA accountability and incentives are geared toward individual student growth, college readiness, and persistence (rather than static proficiency metrics)