Transcript Document

OAA Assistant Principals' Meeting, January 25, 2006

Data-Informed Decision Making That Improves Teaching and Learning

Why are educators so fired up about data? Superintendents ask:

•How do we know if teachers are teaching our curriculum?

•How do we maximize the value of dollars spent for assessment and data management?

•Are all of our students achieving at acceptable levels?

Professional learning communities ask:

•What is it we want our students to know and be able to do?

•How will we know when they have learned it?

•What will we do when students are not learning?

Creating some common language about data in schools

What are the major systems?

How are they related?

What have districts done?

Where do we want to go?

Four Major Data & Technology Systems in Schools

• Student information systems
• Assessment systems
• Data analysis systems
• Data warehouses

Data analysis process

From Matt Stein, Making Sense of the Data: Overview of the K-12 Data Management and Analysis Market, Eduventures, Inc., Nov. 2003.

What is a Student Information System?

• Registers new students
• Demographic information (address, emergency contacts, etc.)
• Attendance
• Scheduling of classes
• Achievement data
• Examples include CIMS, Skyward, Chancery, Pentamation, Zangle, etc.

It is not keeping track of what is going on in classrooms.
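As a rough sketch of the kind of record an SIS maintains (the field names below are hypothetical, not drawn from any particular vendor):

```python
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    """One student's core SIS record (hypothetical field names)."""
    student_id: str                                            # district-assigned ID
    name: str
    address: str
    emergency_contacts: list[str] = field(default_factory=list)
    schedule: list[str] = field(default_factory=list)          # course section IDs
    attendance: dict[str, str] = field(default_factory=dict)   # date -> status
    grades: dict[str, str] = field(default_factory=dict)       # course -> grade

# Registering a new student and recording a day of attendance
s = StudentRecord("OK-001234", "Becky Jones", "123 Oak St.",
                  emergency_contacts=["555-0100"])
s.schedule.append("MATH-4A")
s.attendance["2006-01-25"] = "present"
```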

What is an Assessment System?

A tool for gathering achievement information.
– Some deliver item banks:
• Benchmark by NCS Pearson
• MAP by the Northwest Evaluation Association
– Some deliver intact tests:
• Assess2Learn by Riverside
• EdVision by Scantron
• Homeroom by Princeton Review
– Most are web-based.

It is assessing what is going on in classrooms.

Who needs what data?

A single assessment cannot meet all needs.

• Administrators, public, legislators
– Evaluation
– Accountability
– Long-range planning
e.g., What percent met standards on 4th-grade MEAP math?

Are students doing better this year than they were doing last year?

• Teachers, parents, students
– Diagnosis
– Prescription
– Placement
– Short-range planning
– Very specific achievement information
e.g., Who understood this concept? Why is Becky having trouble reading?

Large grain size vs. fine grain size
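To make the grain-size contrast concrete, here is a small invented example: the same item-level results answer both the administrator's coarse question and the teacher's fine-grained one.

```python
# Hypothetical item-level results: one row per student per concept.
results = [
    ("Becky", "fractions",   False),
    ("Becky", "place value", True),
    ("Dylan", "fractions",   True),
    ("Dylan", "place value", True),
]

# Large grain (administrators): what percent of students met the standard?
by_student = {}
for student, _, correct in results:
    by_student.setdefault(student, []).append(correct)
met = [s for s, marks in by_student.items() if sum(marks) / len(marks) >= 0.75]
print(f"{100 * len(met) / len(by_student):.0f}% met the standard")

# Fine grain (teachers): who needs help with which concept?
for student, concept, correct in results:
    if not correct:
        print(f"{student} needs help with {concept}")
```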

What is a “data analysis system?”

• The vendor maps your data to their system
• Predefines the kinds of analyses staff will do
• Allows users to create answers to questions
• Lots of nice graphs, lists, etc.

Examples: AMS by TurnLeaf, SAMS by Executive Intelligence, QSP, STARS by SchoolCity, Pinnacle by Excelsior, Inform by Pearson.

FileMaker lets districts invent their own system.

D’Tool and TestWiz are “sort of” data analysis systems.

What is a data warehouse?

It brings all the various sets of data together:
– Financial data
– Personnel data
– Building infrastructure data
– Student demographic information
– Student program information
– Student achievement information
Example: the Center for Educational Performance and Information's Michigan Education Information System.

(80% of work is data cleansing.)
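A toy illustration of why cleansing dominates warehouse work: the same student arrives from two source systems with inconsistent IDs and formats (all values below are invented).

```python
# The same student, as delivered by two source systems.
sis_row  = {"id": "ok-001234 ", "name": "JONES, BECKY", "dob": "3/7/1996"}
meap_row = {"id": "OK001234",   "name": "Becky Jones",  "dob": "1996-03-07"}

def clean_id(raw: str) -> str:
    """Normalize IDs to one canonical form before matching records."""
    return raw.strip().upper().replace("-", "")

# Before cleansing these look like two different students; after, they match.
assert clean_id(sis_row["id"]) == clean_id(meap_row["id"])
print("matched:", clean_id(sis_row["id"]))
```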

What's in CEPI's data warehouse?

• School Code Master
• Single Record Student Database (SRSD)
• Financial Information Database (FID)
• Registry of Educational Personnel (REP)
• School Infrastructure Database (SID)
• Student Test and Achievement Repository (STAR): MEAP, ACT, SAT

Why some things aren't in a warehouse…

• Easier to ignore
• Hoarding
• Stray
• Not sure what it is or how to measure it
• Overlooked

How are these things related?

You can have a Student Info System and nothing else.

You can have an assessment system and nothing else (but most assessment systems “depend” on data from the SIS).

There is no point in having a data analysis system unless you have data. If you have a SIS & an assessment system, you’ll probably want a data analysis system.

The State of Michigan is creating a data warehouse.

A data analysis system could also use data from the warehouse.

A data analysis system can bring the pieces together without a warehouse.
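What "bringing the pieces together" amounts to in practice is a join on a shared student ID. A minimal sketch using pandas (the tool choice, column names, and data are all illustrative):

```python
import pandas as pd

# Hypothetical extracts from the two systems
sis = pd.DataFrame({"student_id": [1, 2, 3],
                    "grade_level": [4, 4, 4],
                    "ell": [False, True, False]})
scores = pd.DataFrame({"student_id": [1, 2, 3],
                       "math_scale_score": [412, 388, 431]})

# The join a data analysis system performs behind the scenes
merged = sis.merge(scores, on="student_id")

# Now a question like "how did ELL students do?" is one line
print(merged.groupby("ell")["math_scale_score"].mean())
```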

Oakland Schools Board of Education agreed to spend up to $1,600,000 in 2005-06 to make Pearson Benchmark “Lite” & Inform available to all districts.

What we are trying to do: provide technology that will help improve teaching and increase learning for all.

• Useful reports for teachers, principals, and district administration
• Common assessments tied to GLCEs
• Item banks tied to GLCEs
• Multiple district on-ramps

Project Planning Process

• Fall 2003 – Meetings with focus groups
• Fall 2004 – Create RFP
• Oct 2004 – Meeting with assessment, curriculum, and technology directors from Oakland districts to discuss requirements
• Dec 2004 – RFP sent out to bid
• Jan 2005 – 10 responses received
• May 2005 – Committee selects products
• July 2005 – Oakland Schools BOE approval

[Chart: Product ratings from Oakland & LEA members (N = 15) and the Oakland-only SAS-DAT team, comparing Vendor A, Vendor B, and Pearson on a five-point scale from Strongly Disagree (1.5) to Strongly Agree (5.0). Items are arranged by "Importance" rating, from higher to lower: useful reports for teachers; useful longitudinal reports for administrators; comprehensive data analysis tool; useful reports for administrators; aligns to district curriculum; software is user-friendly; easy to create & manage tests; useful item & test statistics for administrators; scanning & scoring is good; useful longitudinal student profile for teachers; allows creation of multiple item types; helps with collaboration on instruction; provides item bank; SIS interface is good; training model is good; useful reports for students & parents; web-based testing is good; provides instructional resources.]

Major Parts of Each System

"Pearson Benchmark" – Student Assessment System
• Curriculum framework (GLCEs)
• Items
• Tests
• Administer tests
• Score & report
An assessment portfolio "for learning"

"Pearson Inform" – Data Analysis Tool
• Interface with SIS
• Import external tests
• Import Benchmark tests
• Select & analyze groups
• Graphs and drill-down
An electronic CA-60 "of learning"

Measure, Manage and Maximize Student Achievement

Benchmark Test Results

By Test
• This view displays one or all tests that the selected student population has taken. Student scores are plotted across a proficiency scale.
• The view displays the percentage of students who scored within the range of each level on the proficiency scale.
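In spirit, the By Test view reduces to bucketing scores by cut points. A sketch with invented cut scores and sample data:

```python
from collections import Counter

# Invented cut scores defining a four-level proficiency scale.
levels = [("Apprentice", 0), ("Basic", 300), ("Met", 350), ("Exceeded", 400)]
scores = [412, 388, 431, 298, 356, 344, 377]

def level_of(score: int) -> str:
    """Return the highest level whose cut score this student reached."""
    name = levels[0][0]
    for label, cut in levels:
        if score >= cut:
            name = label
    return name

counts = Counter(level_of(s) for s in scores)
for label, _ in levels:
    print(f"{label:10s} {100 * counts[label] / len(scores):5.1f}% of students")
```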

Benchmark Test Results

By Standard
• This view displays each assessed standard and graphs the percentage of students who mastered and did not master the standard on each assessment.
• Selecting a single test displays detailed results by standard for that test.
• Selecting all tests displays student performance on the standards over time.
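The By Standard view boils down to a percent-mastered calculation per standard per test; sorted by date, the same numbers show growth over time. A sketch with hypothetical mastery records:

```python
from collections import defaultdict

# Hypothetical mastery records: (test date, standard, student, mastered).
records = [
    ("2005-10", "I.1.E.1", "Becky", True),
    ("2005-10", "I.1.E.1", "Dylan", False),
    ("2005-12", "I.1.E.1", "Becky", True),
    ("2005-12", "I.1.E.1", "Dylan", True),
]

# Group mastery flags by (standard, test); each cell is one bar in the view.
cells = defaultdict(list)
for test, standard, _, mastered in records:
    cells[(standard, test)].append(mastered)

# Sorted by date, the same cells show performance over time: 50% -> 100%.
for (standard, test), flags in sorted(cells.items()):
    print(f"{standard} {test}: {100 * sum(flags) / len(flags):.0f}% mastered")
```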

Benchmark Test Results

By Individual – View Mastery Details
• This view displays all mastery records for the given student, sorted by standard.
• This represents a detailed running record of a student's mastery across all benchmark tests.

Benchmark Test Results

Item Analysis
• This view displays each test question and the percentage of students in the current sample who responded with each option (A, B, C, etc.).
• The bar graph displays the percentage of students who answered each question correctly and incorrectly.
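A bare-bones version of the item-analysis computation (responses are invented; a real system also handles blanks, multiple forms, and demographic splits):

```python
from collections import Counter

key = {"Q1": "B", "Q2": "D"}                  # invented answer key
responses = {                                  # invented scanned answers
    "Q1": ["B", "B", "C", "A", "B"],
    "Q2": ["D", "A", "D", "D", "A"],
}

for item, answers in responses.items():
    dist = Counter(answers)
    n = len(answers)
    # Percentage choosing each option, plus overall percent correct.
    breakdown = ", ".join(f"{o}: {100 * c / n:.0f}%" for o, c in sorted(dist.items()))
    print(f"{item}: {100 * dist[key[item]] / n:.0f}% correct ({breakdown})")
```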

Benchmark Test Results

Item Analysis
• Click on the question number to see the question itself.
• Click on the icon next to the question number to see a breakdown of the item's performance by demographic category.

Benchmark Test Results

Frequency Distribution
• This view plots a line-dot graph based on the test frequency distribution, and calculates the range, mean, standard deviation, and standard error.
• In addition to this baseline data, you can choose to plot up to four graphs for particular demographic groups.
• The sample displays the distribution of female scores compared to the overall baseline.
• The view also displays how the scores fall along the selected proficiency scale.
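The summary statistics behind this view can be reproduced with the standard library (sample scores invented; "standard error" here is taken to be the usual s/√n):

```python
import math
import statistics

scores = [412, 388, 431, 298, 356, 344, 377, 401, 365, 390]  # invented

rng  = max(scores) - min(scores)
mean = statistics.mean(scores)
sd   = statistics.stdev(scores)          # sample standard deviation
se   = sd / math.sqrt(len(scores))       # standard error of the mean

print(f"range={rng}  mean={mean:.1f}  sd={sd:.1f}  se={se:.1f}")

# Overlaying a demographic group is the same computation on a subset,
# e.g., female students' scores plotted against the overall baseline.
```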

Pearson Benchmark – Benchmark "Lite" ends here

Science – Kindergarten (SCoPE)

I Constructing New Scientific Knowledge
I.1 Constructing New Scientific Knowledge

I.1.E.1 – Generate questions about the world based on observation.
I.1.E.1.01 – Generate questions about the physical characteristics of plants or animals based on observation.
I.1.E.2 – Develop solutions to problems through reasoning, observation, and investigations.
I.1.E.2.01 – Create clues to help identify physical objects.
I.1.E.2.02 – Develop solutions to problems of waste management through reasoning.
I.1.E.3 – Manipulate simple devices that aid observations and data collection.
I.1.E.4 – Use simple measurement devices to make measurements in scientific investigations.
I.1.E.5 – Develop strategies and skills for information gathering and problem solving.
I.1.E.6 – Construct charts and graphs and prepare summaries of observation.
I.1.E.6.01 – Construct graphs based on observations of the physical characteristics of animals or plants.
I.1.E.6.02 – Construct a chart classifying objects based upon physical attributes/properties.

What Attaches Where?

Science – Sequence of Study; Grade Level Overview (K-11)

Kindergarten – Units of Study (documents in their entirety); Grade Level Overview (K only)

I Constructing New Scientific Knowledge

I.1 Constructing New Scientific Knowledge

I.1.E.1 Generate questions about the world based on observation. – Test Items

I.1.E.1.01 Generate questions about the physical characteristics of plants or animals based on observation. – Lesson Plans, Test Items

* Resources to be attached (hyperlinked) in blue text. *
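One way to picture the attachment scheme is a tree keyed by curriculum code, where each node carries the resources attached at that level and a lookup walks up the hierarchy. Codes and resource names below are illustrative only:

```python
# Resources attached at each level of a (hypothetical) curriculum path.
curriculum = {
    "Science":                      ["Sequence of Study", "Grade Level Overview (K-11)"],
    "Science/K":                    ["Units of Study", "Grade Level Overview (K)"],
    "Science/K/I.1.E.1":            ["Test Items"],
    "Science/K/I.1.E.1/I.1.E.1.01": ["Lesson Plans", "Test Items"],
}

def resources_for(path: str) -> list[str]:
    """Collect resources attached at a node and at every level above it."""
    parts = path.split("/")
    found = []
    for i in range(1, len(parts) + 1):
        found.extend(curriculum.get("/".join(parts[:i]), []))
    return found

print(resources_for("Science/K/I.1.E.1/I.1.E.1.01"))
```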


School District Self-Guided Product Tour

Principal’s Dashboard

All users can run queries and reports (teachers, principals, counselors, etc.).

All tests are also broken down by Concepts (“Strands”)

Parent’s / Student’s Dashboard

Quick Start

We suggest you start simple with Pearson Benchmark…

…that means giving your first few tests using “Answer Key Only”

Quick Start

Why Answer Key Only?

• You get up and running in the shortest amount of time
• You get up and running with the least amount of up-front setup
• You get access to content-based reports
• You don't have to put items into Benchmark
• You can use the paper tests that you have been using all along
• You've minimized your "degrees of freedom," which will maximize your chance for success!

Quick Start

Why NOT Answer Key Only?

• You won’t get reports that include the actual test item.

Quick Start

Steps for AKO tests

• Tell Benchmark how many items there are on the test
• Tell Benchmark what the answers are
• Tell Benchmark how the items relate to the curriculum
• Assign/print/administer the test
• Scan answer sheets
• Emerge from your office, victorious!
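Conceptually, AKO scoring needs nothing more than the key, the item-to-GLCE mapping, and the scanned responses. A hypothetical stand-in for what Benchmark stores and computes:

```python
# Steps 1-3: item count, answer key, and item-to-GLCE mapping (all invented).
key   = ["B", "D", "A", "C"]
glces = ["I.1.E.1", "I.1.E.1", "I.1.E.2", "I.1.E.2"]

# Step 5: one scanned answer sheet.
sheet = {"student": "Becky", "answers": ["B", "A", "A", "C"]}

# Scoring: overall, then rolled up by curriculum strand.
correct = [a == k for a, k in zip(sheet["answers"], key)]
print(f"{sheet['student']}: {sum(correct)}/{len(key)} correct")

by_glce: dict[str, list[bool]] = {}
for glce, ok in zip(glces, correct):
    by_glce.setdefault(glce, []).append(ok)
for glce, flags in by_glce.items():
    print(f"  {glce}: {sum(flags)}/{len(flags)} correct")
```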

OS Support

Oakland Schools’ Continuing Role in Pearson Benchmark

We’re here for you…help is just a phone call away!

Oakland Schools’ Continuing Role in Inform

• Create a structure for naming/filing queries for principals and teachers
• Create a consistent set of queries for each
• Teach all principals to run their own queries
• Get additional test data into Inform

Professional Development for LEAs

• Using data to inform instruction
• Using Benchmark & Inform for grouping and differentiation
• Using Benchmark with common assessments
• Using Benchmark for classroom assessments
• Administrator use of Inform
• SIP planning using both products

OS Support

A “modularized” notion of PD

1. Stage setting (planning)
2. System administration training
3. AKO use
4. Curriculum management
5. Test item input
6. Test construction
7. Online test delivery
8. Reports
9. Test diagnostics
10. Others?

Current Status

[Chart: Pearson Benchmark & Inform district adoption as of November 21, 2005, showing district counts (between 18 and 26) for Inform data validation (completed, scheduled, or planned), Inform, Benchmark Lite, and Benchmark Full.]

Early successes

Lake Orion High School
• 5 departments
• 14 courses
• 36 teachers (about 25%)
• 72 sections
• Over 2,200 scan sheets

Phase I (Sept-Nov)

• Meet individually with department heads
• Review exams with course teams
• Create answer keys
• Verify data
• Distribute results to participating teachers
• Review detailed results with participating teachers
• All-staff professional development (11-11-05)

Impact of Phase I

• Improved dialogue between participating teams
– Discussion and modification of the course assessment schedule
– Question issues
– Assessment design
• Increased participation
• Improved teacher comfort level with common assessment procedures

Phase II (Nov-Jan)

• Try online testing
• Try using rubrics
• Additional course benchmarks
• Build new tests
• Identify & train department experts