TSPC MEETING JULY 20-22, 2011


ACCREDITATION SITE VISITS

DIVISION 010 – SITE VISIT PROCESS

DIVISION 017 – UNIT STANDARDS

DIVISION 065 – CONTENT STANDARDS





· Teams were selected from higher education peers and K-12 educators.
· Institutions would present evidence at the TSPC office for review.
· Teams would review the evidence and visit the institution.
· Teams evaluated evidence based on the standards.
· The purpose of the site visit was to determine compliance with standards.
· Programs were approved by the commission and reapproved as part of the unit site visit.
· Critics of the process noted:
1. The process was subjective.
2. Evaluations were inconsistent.
3. Teams made recommendations based on site visit findings.
4. The culture of evidence was undefined.
5. There was no program review process.

· Move the purpose of the process from compliance to continuous improvement.
· Change the definition of culture of evidence:
1. Define required assessment systems.
2. Define required categories of data to demonstrate candidate competencies.
3. Define processes for the use of data for program improvement.

· Create a rigorous program review process as part of the accreditation process.
1. Emphasis on assessments, rubrics, and scoring guides
2. Emphasis on the quality of data for purposes of continuous improvement
3. Use of data in the continuous improvement process

Key standards for accreditation:
1. Candidate competencies evidenced by data
2. Assessment systems
3. Field experiences
4. Cultural competency/diversity and inclusion
5. Faculty qualifications
6. Unit resources and governance



· Site teams use rubrics to determine whether standards are met.
· The rubrics allow a standard to be met while still identifying Areas for Improvement (AFI).




· Program review is a new process in accreditation.
· Evidence is used to demonstrate the validity of candidate competency data during the unit site visit.
· The program review process is virtual in nature, based on electronic exhibits.
· Program reviews are conducted six months prior to unit site visits.


The commission has adopted a template for the program review process associated with site visits, major program modifications, and new endorsement programs. The intent is to provide clear directions on the requirements for program review, addition, and modification. Electronic submission of materials is required for easier review by commissioners and site team members.







PRINCIPLES TO FOLLOW FOR DATA COLLECTION
· Candidates' ability to impact student learning
· Knowledge of content
· Knowledge of content pedagogy
· Pedagogy and professional knowledge
· Dispositions as defined by state standards or the unit's conceptual framework
· Technology


The following rubric will be used when considering whether a program meets state standards.
Acceptable: The program is aligned to the state program standards. Assessments address the range of knowledge, skills, and dispositions stated in the standard or by the unit. Assessments are consistent with the complexity, cognitive demands, and skills required by the standards they are designed to measure. Each assessment measures what it purports to measure. The assessments are defined. The assessments and scoring guides are free of bias.

Assessment instruments provide candidates or supervisors with guidance as to what is being sought. Assessments and scoring guides allow levels of candidate proficiency to be determined. The assessments address candidate content knowledge, content pedagogy, pedagogy and professional knowledge, student learning, and dispositions. Field experiences meet the requirements of the standards. There is evidence that data have been summarized and analyzed. The data have been presented to the consortium. Syllabi clearly align with and clearly address the program standards.


AFI Example: Key assessments do not provide candidates or supervisors with substantive guidance as to what is being sought.
Rationale: Scoring guides use simple labels (e.g., unacceptable, emerging, proficient, or exemplary) and are left to broad interpretation.


AFI Example: Instruments and scoring guides do not allow levels of candidate proficiency to be determined.
Rationale: Data demonstrate little or no distribution of candidates across the scoring guide scale; all candidates receive predominantly the same score.
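
This rationale amounts to a check on how candidate scores spread across the rubric levels. The following is a minimal illustrative sketch in Python, not part of the TSPC process: the rubric labels, the function names, and the 90% threshold are all assumptions chosen for the example.

```python
from collections import Counter

# Hypothetical rubric levels, for illustration only.
RUBRIC_LEVELS = ["unacceptable", "emerging", "proficient", "exemplary"]

def score_distribution(scores):
    """Return the share of candidates at each rubric level."""
    counts = Counter(scores)
    total = len(scores)
    return {level: counts.get(level, 0) / total for level in RUBRIC_LEVELS}

def fails_to_discriminate(scores, threshold=0.9):
    """Flag an assessment when one level holds nearly all candidates,
    i.e. the instrument shows little or no distribution across the scale."""
    dist = score_distribution(scores)
    return max(dist.values()) >= threshold

# Example: an assessment where almost every candidate scores "proficient".
sample = ["proficient"] * 28 + ["exemplary"] * 2
print(score_distribution(sample))   # 'proficient' holds about 0.93 of candidates
print(fails_to_discriminate(sample))  # True: the AFI condition described above
```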


State Program Review Results Report:
The State Program Review Results Report is the document that will be submitted by the program review site team to the Commission for review at the meeting prior to the submission of the unit's Institutional Report.

The program review site team will make recommendations to the Commission regarding whether the Commission should extend full state recognition of the program(s), recognition with conditions, or denial of the program's recognition. (See Division 010 for the levels of program review recognition.)




Small group activity:
Question: Does the acceptable level in the rubric clearly define expectations for program review and approval?
Question: Should teams review syllabi against program standards?
Question: Should teams evaluate assessments and data for quality?




Small group activity (cont.):
Question: Should programs provide evidence of consortium review of data?
Question: National standards require three years of data. What should Oregon's standard be?
Question: At what point should conditions be imposed? At what point should recognition be denied?