Comprehensive Assessment Reports


Comprehensive Assessment Reports
1
Fred Trapp, Ph.D.
Administrative Dean, Institutional Research/Academic Services (Retired)
Long Beach City College
Cambridge West Partnership, LLC
Robert Pacheco, Ed.D.
Director of Institutional Planning, Research and Resource Development
Barstow College
Outcomes for the Session
2
 The participant will be able to:
 Describe the comprehensive assessment report concept.
 Locate best-practice examples from other colleges through web links.
 Discuss how the comprehensive report idea can be used as part of the institution’s learning process and as a means by which the institution provides quality assurance to the public.
 Indicate national trends and efforts of consortia and national organizations to provide quality assurance about student experiences and learning outcomes.

Please hold questions until the end.
Curriculum Map
3
 Introduction (Purpose and Goals)
 Where Can I Go?
 How Does the Process Meet Our Pledge to the Communities We Serve?
 What Might Comprehensive Reports “Look Like”?
 How Do the Reports Improve Institutional Learning?
ACCJC Institutional Effectiveness Rubric
4
 Part III: Student Learning Outcomes
 Proficiency: Comprehensive assessment reports exist and are completed on a regular basis.
What Did We Look At, With Whom Did We Consult?
5
 ACCJC. Institutional Effectiveness Rubric
 ACCJC. 2002 Standards
 ACCJC. Themes
 ACCJC. Guide to Evaluating Institutions
 Professional literature
 Efforts of national groups/institutes
 Institutional web sites, listservs and colleagues
Guiding Questions
6
 How can the report-writing experience:
 Help faculty explore the student learning process?
 Determine the extent to which the curriculum is working?
 Identify where time, energy and/or money can be allocated for continuous improvement in learning?
 Exploit the writing process and dialogue about results to gain broader institutional learning experiences?
 Help meet our quality assurance pledge to the community?
Illustration Selections
7
 Council for Higher Education Accreditation (CHEA)
 Annual Award for Outstanding Institutional Practice in Student Learning Outcomes
 Demonstrated commitment to and developed highly effective practices for using SLO assessment
 Willingness to share the practices they developed
 Selection committee
 Selection criteria
 Articulation & evidence of outcomes
 Success with regard to outcomes
 Information to the public about outcomes
 Using outcomes for improvement
Illustration Selections (continued)
8
 Cited by scholars and peers
 Schools with assessment work cited in scholarly books and articles
 Schools with assessment work selected for presentation at national conferences
 Web presentations publicly available for you to eavesdrop on
 Prominent national movements/initiatives regarding learning outcomes & assessment documentation (including public quality assurance)
What Might Be Included?
9
 Assessment focus: course, program, general education, etc.
 What outcomes were assessed?
 How and when were they assessed?
 Who was assessed?
 What were the results?
 Who reviewed the results, made sense of them, and what conclusions were reached?
 What are the implications for practice, policy and/or future assessment work?
 How were the results used?
CC of Baltimore County (MD)
Course-level Reporting
10
 Middle States Commission on Higher Education
 Community College Futures Assembly, Bellwether Award, 2008
 Instructional Programs & Services, for high-impact course-level assessment
 CHEA award winner, 2006
 Institutional Progress in Student Learning Outcomes
 National Council on Student Development (NCSD) Exemplary Practice Award winner
CC of Baltimore County (MD)
Course-level Reporting
11
 Projects are at least three semesters long
 Individual and high-impact courses (all sections) included
 Project proposal by a faculty group
 Measurable objectives
 External review & approval in selecting methods/instruments & analyzing results; benchmarking should be included if possible
 Controls and sample size considered
 Course improvements based on data analysis
 Reassessment expected
 Results/report shared across the college and posted on the web
CC of Baltimore County (MD)
Course-level Reporting
12
 Learning Outcomes Assessment Final Report Template
1. Design & proposal for the LOA project
2. Implementation of design & data collection
3. Redesign of the course to improve student learning
4. Implementation of course revisions & reassessment of student learning
5. Final analysis and results
 Eavesdropping
 http://www.ccbcmd.edu/loa/CrseAssess.html
 Two-page executive summaries available
CC of Baltimore County (MD)
Course-level Reporting
13
 CHEM 108
 An initial “failure” turned into success and collaboration with a four-year school
 HLTH 101
 Addressing an achievement gap with professional development
and increased communication with students
 CRJU 101 and 202
 Statewide group assessment development effort and creativity
in the interventions used
Riverside CC (CA)
Course-level Reporting
15
 GEG 1
 Assessment as part of the 2008 program review
 GEG 1 appendix
 GEG 1L appendix
 Eavesdropping
 http://www.rcc.edu/administration/academicAffairs/effectiveness/assess/resources.cfm
Program-level Reporting
16
 North Central Association of Colleges & Schools,
Higher Learning Commission
 CHEA award winner, 2008
 Institutional Progress in Student Learning Outcomes
Hocking College (OH)
Program-level Report
17
 All programs have individual assessment plans
 Mission statement & central objective for assessment
 Institutional success skills (GE)
 Program exit competencies
 Criteria for and means of assessment
 Reporting of results
Hocking College (OH)
18
 Learning outcomes data collected in a student e-portfolio
 Direct internal and external evidence (1 to 10 measures)
 Indirect evidence (1 to 4 measures)
 Evidence drawn from samples of student work, to which faculty apply an agreed-upon holistic rubric
 Eight general education outcomes (student success skills)
 Discipline-specific exit competencies or outcomes
Hocking College (OH)
Program-level Report
19
 Eavesdropping
 Cloud reference, not the college URL, as links are broken there
 Various reports available in each program profile
 Curriculum matrix
 Criteria statements (exit competencies)
 Instructional Program Outcomes (assessment plan)
 Trend charts for performance criteria
Hocking College (OH)
Program-level Report
20
 Example reports and analysis
 Culinary Arts Technology
 Forestry Management Technology
 Nursing Technology
Mesa Community College (AZ)
General-Education Reports
21
 North Central Association of Colleges & Schools,
Higher Learning Commission
 CHEA award winner, 2007
 Institutional Progress in Student Learning Outcomes
Mesa Community College (AZ)
General Education Report
22
 Multiple outcomes assessed annually
 Annual report elements
 Executive summary
 Methodology
 Results & observations (GE & workplace)
 Findings from indirect measures
 Appendices of past results
Mesa Community College (AZ)
23
 General education studies completed 2007-08 and 2005-06
 Numeracy
 Scientific inquiry
 Problem solving/critical thinking
 Information literacy
 Workplace skills (CTE)
 General education studies completed 2006-07 and 2004-05
 Arts & humanities
 Cultural diversity
 Oral communication
 Written communication
Mesa Community College (AZ)
General Education Reports
24
 Eavesdropping
 http://www.mesacc.edu/about/orp/assessment/index.html
 Annual reports and summaries available
 Eight years of history and experience
Capital CC (CT)
General-Education Reports
26
 New England Association of Schools and Colleges,
Commission on Institutions of Higher Education
 Cited in the Art and Science of Assessing General
Education Outcomes: A Practical Guide (AAC&U,
2005)
Capital CC (CT)
General Education Reports
27
 Multiple reports
 One per outcome
 Each study commonly takes a year
 Report elements
 Introduction
 Methods
 Results and findings
 Conclusions and recommendations
 Implications for future assessments
 Appendices of assignment, rubric, notes to teachers,
etc.
Capital CC (CT)
General Education Reports
28
 General education studies completed
 Writing, 2001-02
 Math, 2002-03
 Critical thinking, 2003-04
 Global perspective, 2004-05
 Eavesdropping
 http://www.ccc.commnet.edu/slat/
 Annual reports and summaries available
Portland CC (OR)
General Education Reports
30
 Northwest Commission on Colleges and Universities
 Eavesdropping
 http://www.pcc.edu/resources/academic/learningassessment/
 One general education theme a year
 Learning assessment focus for 2009-10: Critical Thinking & Problem Solving
 Physical Science, Geology and General Science
 Bioscience Technology
 Management and Supervisory Development
 Culinary Assistant Program
Truman State University (MO)
Various Reports
31
 North Central Association of Colleges & Schools, Higher Learning Commission
 Eavesdropping
 Assessment work began in 1970
 http://assessment.truman.edu/
 Assessment Almanac: a compilation of results from each year’s assessment work (versions from 1997 to 2009 are posted)
 General education outcomes are assessed in the context of the major field of study
 Portfolio Project: required of all seniors to show their best work, assessed by faculty for the nature & quality of liberal arts and sciences learning outcomes (versions from 1997 to 2008 are posted)
Authorship
32
 Course-level, program-level & general education
 Teaching faculty study team, with technical assistance from institutional research or the assessment committee
 No “lone ranger” authors
 Institutional summary
 Academic administrator, with assistance from the learning outcomes coordinator or assessment committee
 Compilation of work accomplished in one or two academic years across the institution
 Automated reporting (TracDat)
 Sierra College examples
Location
33
 Location of course, program and general education comprehensive assessment reports
 Teaching faculty study team members, assessment committee chair, assessment website
 Location of institutional summary reports
 Academic administrator, assessment committee chair, assessment website
 Not in the library basement; actively used to promote a learning organization
Distribution
34
 To all affected participants
 Campus committees
 Curriculum, assessment, resource allocation group, unit (department) leadership, general academic and college leadership
 Campus fairs, brown-bag lunches, poster sessions for
information sharing
 Faculty professional development programs
 Accreditation self-study committee work groups
 College web site for the public
Reports & a Learning Organization
35
 Learning organization
 Environment that promotes a culture of learning
 Individual & group learning enriches & enhances the organization as a whole
 Systematic problem solving using data for decisions
 Learning from experiences in assessing organizational performance
 Comparing yourself to others (benchmarking) and borrowing ideas
Adriana Kezar, ed. Organizational Learning in Higher Education. New Directions for Higher Education, No. 131. Jossey-Bass, Fall 2005.
Characteristics of Organizational Learning
36
 Researchers have found some critical features of learning organizations (Lieberman, 2005, pp. 87-98). In particular, a college that is an effective organizational learner:
 Maintains a scholarly approach to the questions and problems that the institution faces;
 Approaches campus problems as a learner and not as an expert;
 Develops a culture of evidence that drives decision-making;
 Links organizational learning to the college’s mission;
 Makes connections throughout the college, not just within individual units (e.g., faculty, administration); and
 Recognizes and rewards the college’s efforts to become a “learner.”
Reports as Institutional Learning
& Resource Allocation
37
 The assessment data sense-making process = a faculty learning experience
 Linking results to future interventions = a learning experience for
 Faculty, assessment committee, academic administration, planning & resource allocation groups
 Using results to inform an intervention, then reassessing = a learning experience (accomplished one or more terms later) for
 Faculty, assessment committee, academic administration
 A reference for future assessment work and for other groups on campus
Reports as Institutional Learning
& Resource Allocation
38
 Hocking College (OH)
 Student e-portfolio
 Annual summary
 Improvements in the program in the previous year brought on by study of assessment results
 Expenditures of time, money & materials for the assessment program
 Requests for assistance in implementing assessment
 Recommendations for altering the institution’s assessment process
 Transition from evaluating individual students to assessing groups of students & the curriculum experience
Reports as Institutional Learning
& Resource Allocation
39
 Community College of Baltimore County (MD)
 Learning Outcomes Assessment Advisory Board
 Links findings in assessment reports to other college-wide initiatives and professional development opportunities
 Use of assessment processes and results (findings)
 Challenged faculty to reexamine prompts used in assessment
 Clarity of the written prompt & the extent to which it supports program goals
 Common assignment options and a common rubric increase faculty understanding and buy-in
 Builds faculty unity toward common goals
 Public web page enhances communication and accessibility of information
Reports as Institutional Learning
& Resource Allocation
40
 Northern Arizona University
 CHEA award winner, 2010
 Institutional Progress in Student Learning Outcomes
Reports as Institutional Learning
& Resource Allocation
41
 Northern Arizona University: Seals of Assessment Achievement & Excellence
 Purpose:
 To recognize programs that have demonstrated significant progress with assessing student learning
 To promote “best practices” in assessment by sharing practical experiences
 To encourage programs to showcase program-level achievements and to adjust curricula when appropriate.
Reports as Institutional Learning
& Resource Allocation
42
 Feedback & recognition
 Feedback rubric for annual assessment reports
 Conversations and action
 Collection and analysis of evidence
 Implementation of findings
 Recognition (achievement & excellence)
Reports as Institutional Learning
& Resource Allocation
43
 Seal of Assessment Achievement
 Academic programs earning this recognition have demonstrated in their annual report that
 learning outcomes have been assessed through two or more methods, and
 findings have been discussed among the faculty.
Reports as Institutional Learning
& Resource Allocation
44
 Seal of Assessment Excellence
 Academic programs earning this recognition have demonstrated
 a thorough implementation of assessment plan(s)
 the reporting of meaningful assessment data
 the discussion of findings among faculty and perhaps students
 the use of findings to showcase student achievements and to make curricular adjustments.
Reports as Institutional Learning
& Resource Allocation
45
 Mesa Community College (AZ)
 Results Outreach Committee
 Promotes use of outcomes data in relation to faculty
development, pedagogy and academic climate
 Groups of faculty offer a proposal for summer or academic year
work above the course level
 Resulting report placed on the web and used for campus
discussion and action
Report as Quality Assurance
46
 National Institute for Learning Outcomes Assessment (NILOA)
 Assists institutions & others in discovering & adopting promising practices in the assessment of college student learning outcomes.
 Documenting what students learn, know and can do is of growing interest to colleges and universities, accrediting groups, higher education associations, foundations and others beyond campus, including students, their families, employers, and policy makers.
Report as Quality Assurance
47
 NILOA
 2010 webscan report, Exploring the Landscape: What Institutional Websites Reveal About Student Learning Outcomes Assessment Activities
 2010 Connecting State Policies on Assessment with Institutional Assessment Activity
 Eavesdropping
 www.learningoutcomeassessment.org
Report as Quality Assurance
48
 Promising Vehicles for Expanding Information to the Public
 Brief narrative reports from annual assessment reports
 Simple statistical reports on learning outcomes or surveys
 Best-practices stories supported by assessment
Peter Ewell. Accreditation and the Provision of Additional Information to the Public about Institutional and Program Performance. CHEA, May 2004.
Quality Assurance to the Public
49
 Voluntary System of Accountability
 APLU & AASCU (520 public institutions, awarding 70% of bachelor’s degrees in the US each year)
 Started in 2007
 College Profile (includes learning outcomes & links to campus)
 Proactive initiative to document learning gains and average institutional scores (choice of 3 national instruments)
 Proactive initiative to illustrate unique campus learning outcomes assessment work
 Promoting a learning institution
 Eavesdropping
 http://www.collegeportraits.org/
Quality Assurance to the Public
VSA Example, Cal Poly Pomona
50
 http://www.collegeportraits.org/map
 Cal Poly Pomona
 http://www.csupomona.edu/~academic/programs/ge_assessment/
Quality Assurance to the Public
51
 National Association of Independent Colleges and Universities
 Assessment programs on campus tied to the institution’s mission
 Eavesdropping
 http://www.naicu.edu/special_initiatives/accountability/Student_Assessment/id.514/default.asp
 Pepperdine University
 http://services.pepperdine.edu/oie/learningoutcomes/learning-outcomes-overview.aspx
Where Can I Go?
Resource
52
 Filesanywhere.com
 http://www.filesanywhere.com/fs/v.aspx?v=8a69668b5c6773a96f6d
Contacts & Questions
53
 Robert Pacheco (Barstow College)
 [email protected]
 Fred Trapp (Cambridge West Partnership)
 [email protected]
 Questions and Comments
Session Evaluation
Outcomes for the Session
54
 The participant will be able to:
 Describe the comprehensive assessment report concept.
 Locate best-practice examples from other colleges through web links.
 Discuss how the comprehensive report idea can be used as part of the institution’s learning process and as a means by which the institution provides quality assurance to the public.
 Indicate national trends and efforts of consortia and national organizations to provide quality assurance about student experiences and learning outcomes.