Student Assessment and Data Analysis
Oakland Schools
MAEDS – 6 October 2005
Tammy L. Evans

Why are educators so fired up about data? Superintendents ask…
• How do we know if teachers are teaching our curriculum?
• How do we maximize the value of dollars spent for assessment and data management?
• Are all of our students achieving at acceptable levels?

Professional learning communities ask…
• What is it we want our students to know and be able to do?
• How will we know when they have learned it?
• What will we do when students are not learning?

Why are educators so fired up about "data"? Improving student achievement!

Creating some common language about data in schools
• What are the major systems?
• How are they related?
• What have districts done?
• Where do we want to go?

4 Major Data & Technology Systems in Schools
• Student Information Systems
• Assessment Systems
• Data warehouse (see the Data Warehouse PP on CD)
• Data analysis systems
Oakland Schools' focus is on assessment and analysis.

SAS-DAT Purpose – Student Assessment System & Data Analysis Tool
• Improve teaching and increase learning for all
• Useful reports for teachers, principals and district administration
• Common assessments tied to GLCEs
• Item banks tied to GLCEs
• Multiple district on-ramps

What is an Assessment System?
• A tool for gathering achievement information
• It assesses what is going on in classrooms.

Who needs what data? A single assessment cannot meet all needs.
• Administrators, the public, and legislators need large-grain-size information for evaluation, accountability, and long-range planning, e.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than they were doing last year?
• Teachers, parents, and students need fine-grain-size, very specific achievement information for diagnosis, prescription, placement, and short-range planning, e.g., Who understood this concept? Why is Becky having trouble reading? (A short data sketch illustrating this distinction follows "The Committee" slide below.)

Oakland Schools' Path to Student Achievement
• Fall 2004 – Meetings with focus groups; create the RFP
• Oct 2004 – Meeting with Assessment, Curriculum and Technology directors from Oakland districts to discuss requirements, including multiple "on-ramps"
• June 2005 deadline

The RFP
• Input gathered from LEA focus groups in Curriculum, Assessment, Instruction and Technology.
• The RFP was authored at Oakland Schools through a collaboration between Career Focused Education, Learning Services, Research Evaluation and Assessment, Purchasing, School Quality and Technology Services.
• A draft copy was provided to LEA Technology and Assessment Directors for input.
• Click here for details of the RFP.
• Click here for details of the vendor pricing submitted.

The Committee
• OCSA charged Oakland Schools and the LEAs with moving forward on acquisition of an assessment and analysis system.
• The RFP evaluation committee was formed, consisting of ISD and LEA staff representing Assessment, Curriculum and Technology.
• Representatives came from OCREAC, the Teaching and Learning Council, Oakland County Technology Directors, and the OCSA Instruction & Technology subcommittee.
• Committee members were from Berkley, Huron Valley, Lamphere, Lake Orion, Troy, Novi, South Lyon, Walled Lake and West Bloomfield.
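To make the grain-size distinction from the "Who needs what data?" slide concrete, here is a minimal sketch. The records, field names, and GLCE codes are hypothetical and not drawn from the Oakland Schools system; the point is only that a large-grain summary and a fine-grain drill-down can come from the same assessment records.

# Hypothetical assessment records; student names, fields, and GLCE codes are illustrative only.
records = [
    {"student": "A. Patel", "glce": "N.ME.04.01", "correct": True,  "met_standard": True},
    {"student": "B. Jones", "glce": "N.ME.04.01", "correct": False, "met_standard": False},
    {"student": "C. Lee",   "glce": "N.FL.04.08", "correct": True,  "met_standard": True},
]

# Large grain size (administrators, public, legislators):
# "What percent met standards on 4th grade math?"
by_student = {r["student"]: r["met_standard"] for r in records}
pct_met = 100 * sum(by_student.values()) / len(by_student)
print(f"Percent meeting standard: {pct_met:.0f}%")

# Fine grain size (teachers, parents, students):
# "Who did not understand the concept tied to GLCE N.ME.04.01?"
needs_help = [r["student"] for r in records
              if r["glce"] == "N.ME.04.01" and not r["correct"]]
print("Needs reteaching on N.ME.04.01:", needs_help)

The same records answer both questions; only the level of aggregation changes, which is why a single report format cannot serve every audience.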
ISD Collaboration
• Jan 2005 – Oakland Schools and Wayne RESA met to review strategic goals around assessment and data analysis.
• A joint RFP was created.
• Wayne RESA joined the RFP evaluation committee.
• Wayne RESA and Oakland Schools separated scoring and recommendations for their individual needs and approvals.

The evaluation begins
• 10 vendors responded to the RFP.
• The committee met to review the responses.
• The committee chose three vendors for demonstrations.
• Click here for the Debriefing Voting Results.

The demonstrations
• Vendors were asked to cover specific points.
• Half-day demonstrations for each vendor were held at the Farmington Training Center on March 10 & 11, 2005.
• All Oakland Schools LEAs were invited to send representatives to the demonstrations.
• Over 100 participants reviewed the products and were asked to complete a survey.
• Click here for the Survey results.

Further evaluation
• After the demonstrations, the committee met to discuss the products and created a pros/cons list for each vendor.
• Using an audience response system, the group prioritized the functionality of the products and rated each vendor on those functional areas. (See the SAS-DAT PP on CD for the full presentation; a small scoring sketch follows the "Facilitated Product Demonstration" slide below.)
• Click here for the Functionality Summary.

Vendor References
• A subcommittee was formed to conduct reference interviews; it included committee members from Huron Valley, South Lyon, Walled Lake, West Bloomfield and Oakland Schools.
• References contacted: Plato – two references; EduSoft – two references; Pearson – three references.
• Click here for the Reference Questions.
• The reference information was synthesized and presented to the committee on April 11.
• Click here for the Reference Call Summary.

Further Analysis
• Reviewed the goals of the RFP
• Reviewed priority & ranking from the vendor demonstrations
• Reviewed the vendor reference calls
• Reviewed pricing

The Evaluation
• Committee members filled out evaluation sheets. Click here for the Evaluation Form.
• Results tallied:
  – Plato – 4680
  – EduSoft – 4350
  – Pearson – 5720

Site Visit
• May 4, 2005 – Putnam City Schools, OK
• Met with the Curriculum Director and principals to review the product in use.

Facilitated Product Demonstration
• May 5, 2005 – Oakland Schools
• SAS-DAT Committee members were invited to participate in a test drive of Benchmark and Inform.
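As a rough illustration of the scoring steps described above (prioritizing functional areas, rating each vendor on them, and tallying the results), here is a minimal sketch. The functional areas, weights, and ratings are made up for the example; they are not the committee's actual criteria or data.

# Hypothetical priority weights for functional areas (higher = more important),
# e.g., as gathered with an audience response system.
weights = {"reporting": 5, "item_bank": 4, "ease_of_use": 3}

# Hypothetical 1-5 ratings of each vendor on each functional area.
ratings = {
    "Vendor A": {"reporting": 4, "item_bank": 3, "ease_of_use": 5},
    "Vendor B": {"reporting": 3, "item_bank": 4, "ease_of_use": 4},
    "Vendor C": {"reporting": 5, "item_bank": 5, "ease_of_use": 3},
}

# Tally a weighted total per vendor: each rating multiplied by its area's priority.
for vendor, scores in ratings.items():
    total = sum(weights[area] * score for area, score in scores.items())
    print(f"{vendor}: {total}")

Weighting the ratings keeps the tally aligned with the group's priorities rather than treating every functional area as equally important.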
Principal's Dashboard
• Key feature: at the school and classroom levels, every bar in a graph links to student names and information.
• Pearson Inform helps you target assistance and provide early intervention.

Teacher's Dashboard
• Key feature: Pearson Inform provides "Concept Analysis" at district, school, and class views … a big help in planning instruction, aligning curriculum, and identifying student needs. (A small roll-up sketch of this idea appears at the end of this transcript.)

Parent's / Student's Dashboard
• Pearson Inform's Parent Access & Family Views

Oakland Schools Support
• Models defined to support the diverse needs of districts and multiple on-ramps
• Monetary support
• Curriculum, item banks, and assessments delivered to all districts

The Partnership Created
• Benchmark "Lite" – host for Oakland Schools' standard curriculum, units / lesson plans, and assessments
  – MCF – Michigan Curriculum Framework
  – Common assessments tied to GLCEs
  – Item banks tied to GLCEs
  – Allows districts to create their own assessments
• Benchmark "Full"
  – Administer tests (scanned or web-based)
  – Report scoring
• Inform
  – Analyzes test responses down to the individual student

Where we are now…
• Conversion for 27 of 29 districts
• Training
• Implementation! August 2005+
• Sharing experience with other MI districts
  – The contract allows for state purchase
  – Increased participation reduces cost for all

The MACUL 2006 presentation will cover…
• Success stories
• Lessons learned
• Examples of classroom assessment
• Examples of analysis
• Website and demonstration

Questions
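To illustrate the kind of roll-up behind the "Concept Analysis" views mentioned on the Teacher's Dashboard slide, here is a minimal sketch. It is not Pearson Inform's implementation; the response records, school names, and GLCE codes are hypothetical. The idea is simply percent correct per GLCE, aggregated at whichever level (class, school, or district) a report needs.

from collections import defaultdict

# Hypothetical item responses; each response is tagged with the GLCE its item assesses.
responses = [
    {"district": "D1", "school": "Elm",   "class": "4A", "glce": "N.ME.04.01", "correct": True},
    {"district": "D1", "school": "Elm",   "class": "4A", "glce": "N.ME.04.01", "correct": False},
    {"district": "D1", "school": "Maple", "class": "4B", "glce": "N.ME.04.01", "correct": True},
]

def concept_analysis(responses, level):
    """Percent correct per GLCE, rolled up to 'district', 'school', or 'class'."""
    totals = defaultdict(lambda: [0, 0])  # (unit, glce) -> [correct, attempted]
    for r in responses:
        key = (r[level], r["glce"])
        totals[key][0] += r["correct"]
        totals[key][1] += 1
    return {key: 100 * c / n for key, (c, n) in totals.items()}

print(concept_analysis(responses, "school"))
# {('Elm', 'N.ME.04.01'): 50.0, ('Maple', 'N.ME.04.01'): 100.0}

Because every response carries its GLCE tag, the same records can be rolled up for a district view or drilled down to a single class, which is the pattern the dashboard slides describe.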