/resources/retreats/data_retreat.ppt


Data Retreat
January 12, 2006
Please sign in
Introduction
• Welcome by Dean Miller
• Preview of retreat
• Complete the evaluation and leave it at your
table before you leave
Assessment Objectives
2005-2006
1. Data is collected at each transition point – each
transition point includes an evaluation at which
data is collected.
2. All faculty are able to enter data from courses and
clinicals/internships.
3. Data is evaluated for sufficiency and reliability –
issues of fairness, accuracy, and consistency are
analyzed.
4. Data is used by multiple stakeholders to make
decisions at transition points.
5. Improvements in data collection, analysis, and
decision-making are documented and implemented.
Areas for Improvement
• The assessment system at the advanced
level has not yet been implemented to the
level achieved at the initial level.
• At the initial level, only the clinical
experiences transition point reports data
aligned to program standards.
Focus of Retreat
2005-2006 Objectives 3 & 5
3. Data is evaluated for sufficiency and
reliability – issues of fairness, accuracy,
and consistency are analyzed.
5. Improvements in data collection,
analysis, and decision-making are
documented and implemented.
Objective #3
Data is evaluated for sufficiency and
reliability – issues of fairness, accuracy,
and consistency are analyzed.
Data Analogy
• Goal is losing weight
• Check to see whether clothes fit
• Weigh on scale
• Ask others: "Do I look like I have lost
weight?"
• Correct goal?
  – Healthy weight
  – Fitness level and activities
Sufficiency
• Is there enough data to determine a
pattern?
• The three data points can come from across
measures, across participants, or across time.
• Does sufficient data exist to help with
understanding of demographics,
achievement, perceptions, and program
processes?
Data Analogy

Considerations | Weight Loss | Professional Ed
Why improve | Doctor; Clothes; Sense the need | NCATE; National & state mandates; Sense the need
Standards to base improvements on | Height/weight charts; Sense of what feels right | National & state standards; NCATE; Professional experience
Methods | Diet; Exercise | Course work; Clinical experiences
Data collection | Daily calories; Scale; Exercise chart | Core assessments; Clinical evaluations; Dispositions
Evaluation | Reach weight goal; Feel good; Clothes fit | Praxis; Follow-up surveys; Advisory Committee
Reliability
• Does the instrument consistently measure
the same construct or student ability?
Fairness
• Is the assessment free of cultural bias?
• Is the assessment equally accessible to all
participants?
• Is the time frame needed to conduct the
assessment appropriate for a quality
measurement for both the participant and
the evaluator?
Accuracy
• Do we have measurements from all
participants?
• Is there missing data?
• Does the data collected match the
instrument?
Consistency
• When the instrument is used by more than
one evaluator, do they arrive at
comparable results for similar
participants?
• Are some scores higher for some
participants in identical courses or groups?
• If the measurement were taken again
today, would you likely get the same
result?
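The consistency questions above can be checked with a simple percent-agreement calculation: when two evaluators score the same participants with the same rubric, how often do their scores match exactly? A minimal sketch, with invented scores (not data from the assessment system):

```python
# Percent exact agreement between two evaluators scoring the same
# participants with the same rubric. All scores here are invented
# for illustration only.
rater_a = [3, 2, 4, 3, 1, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 4, 2, 4]

# Count participants on whom both evaluators gave the same score.
matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
agreement = matches / len(rater_a)

print(f"Exact agreement: {agreement:.0%}")  # prints "Exact agreement: 75%"
```

Low agreement would suggest the instrument or its scoring guide needs revision, or that evaluators need calibration before results are compared across sections.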
Question 1: How would you evaluate the data
you expect to have with respect to (refer to
your Standards Matrix document):
• Sufficiency
• Accuracy
• Reliability
• Fairness
• Consistency
Make revisions to Standards Matrix
Objective #5
Improvements in data collection, analysis,
and decision-making are documented
and implemented.
Random Acts of Improvement
Focused Acts of Improvement
Four Quadrants of Data
• Demographics
• Student Learning or Achievement
• Perceptions
• Processes
How Is Data Collected?
• University Assessment System Database
– Clinical evaluations
– Core assessment evaluations (rubrics)
– Praxis pass rates
– Surveys at admissions and student teaching (initial
licensure programs)
• Academic Advising Office
– writing assessment (initial licensure programs)
– GPA at admission to Professional Ed
• Office of Professional Ed – follow-up surveys
of graduates and employers
• What is not being collected?
Reports of Data from University
Assessment System
• Usable vs. unusable reports
• Default report format and content (to be
approved by Professional Ed Assessment
Committee)
• Other options can be requested
• Program faculty are encouraged to
consider their specific needs for reports
What will be most useful?
• Trend Analysis Over Time – Use
frequencies and percentages
• Examine frequencies at each level
• Sub measures provide more detail about
performance
• Major is helpful for departments like KSP
and content areas
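A frequency-and-percentage trend report of the kind described above can be sketched in a few lines. The rubric scores and semester labels below are invented for illustration, not actual assessment-system data:

```python
# Trend analysis over time: frequencies and percentages of students
# at each rubric performance level, by semester. Scores are invented
# for illustration only.
from collections import Counter

semesters = {
    "Fall 2005":   [1, 2, 2, 3, 3, 3, 4, 4],
    "Spring 2006": [2, 2, 3, 3, 3, 4, 4, 4],
}

for semester, scores in semesters.items():
    counts = Counter(scores)          # frequency at each level
    total = len(scores)
    levels = ", ".join(
        f"level {lvl}: {counts[lvl]} ({counts[lvl] / total:.0%})"
        for lvl in sorted(counts)
    )
    print(f"{semester}: {levels}")
```

Comparing the percentage at each level across semesters shows whether performance is shifting upward, downward, or holding steady; the same tabulation can be repeated per sub-measure or per major for more detail.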
Evaluation Reports
• Means and/or Frequency and Percentages
• Defined by…
– Rubric
– Semester
– Courses
– Sections
– Major(s)
– Race/Ethnicity
• Format: Excel; PDF; Web; Word
Default Report Format
• Course or Program
• Measurement Instrument Name
• Criteria (Sub measures/parts of the rubric
in the measurement instrument or by
student major if only one measurement)
• Performance Indicators (rank by number
and percent).
Examining Reports of Data
• Student Learning reports
– Core assessments
– Clinical evaluations
– Student teaching
• Perception surveys of graduates
• Process spreadsheet of Professional Ed
Review Board requests (variances)
Patterns
• Three data points
– Over time
– Across programs
– Across measurements
• Significant and insignificant impact
• Observations are crafted as statements of
fact
What patterns do you observe?
Question 2: What decisions can be
made with the data at each
transition point for:
• Students
• Faculty
• Courses
• Programs
Group Task: Complete table
Question 3: What additional data
would you need at each transition
point to make better decisions for:
• Students
• Faculty
• Courses
• Programs
Group Task: Complete table
Where are the gaps?
• Types of data
• Transition point data
• Collection of data
• Analysis of data or reporting of data
• Use of data to make improvements
Group Task: Examine questions sheet
Analyze on table handout
Review of Standards Matrix
Group Task
• Consider comments on matrix
• Review for sufficiency, reliability, etc.
• Review ability to make decisions from data
reports
• Respond to questions on charts
• Review for ability to answer questions
• Make further revisions to Standards matrix
Collect standards matrices
electronically
At tables
Next Steps
• Revise assessments and standards matrix
(if applicable) [due February 9th]
• Modify or add to program measurements
• Review comprehensive data as a program
to consider the picture it reveals
• Make changes to curriculum or policy
• Make changes to procedures, advising
and supporting materials
Aligning Our Improvement
Retreat Evaluation
• Complete evaluation
• Leave on table as you leave
Conclusion
• Dean Miller – where are we heading?