Transcript: TracDat Overview - Ferris State University, Michigan
Office of Academic Affairs: Robbie Teahen, Kim Wilber
http://www.youtube.com/watch?v=DRBW8eJGTVs
Simply: To advance the quality of student learning through careful elaboration of intended learning, meaningful measures of students’ learning achievements, and systematic collection of data that informs instructional and other improvements – at the level of courses, programs, colleges, and institutions. Assessment involves going beyond the evaluation of individual student performance (Teahen, 2008).
Curriculum Development (cycle):
Determine learning needs
Analyze learner needs
Specify learning outcomes
Plan learning activities
Assess performance
As a result of participating in today’s session, you will:
Log into the system
Change your password
Describe key terms, such as “reporting” or “assessment” units and “learning outcomes”
Create an appropriate learning outcome
Enter an assessment plan (e.g., program or course)
Enter assessment results
Locate implementation plans/timelines
Produce a curriculum map
Create an ad hoc report
Be prepared to continue to enter target outcomes/goals, assessment plans, and your results on an ongoing basis
Create a repository for conserving assessment information
Stimulate cross-discipline sharing of learning and institutional effectiveness outcomes
Focus unit-level efforts on specifying and monitoring assessment outcomes
Encourage use of assessment results to inform course-, program-, and institution-level learning enhancements
Streamline reporting requirements for academic units
Contribute to a data-informed, data-driven culture
User roles
Security
Access
Changing the password
Entering data – outcomes, plans, and results
Attaching documents
Linking assessment unit outcomes to reporting unit standards or plans
Producing standard and custom reports
Resources
Role Options:
User
Reports-Only
Program-level administrator (the primary difference is that the program-level administrator is responsible for the Assessment Unit tab; see next slide)
Ferris Administrator (Kim Wilber)
Access and security levels and permissions:
Form (see the Academic Affairs assessment website)
Approvals by Department Head and Dean
Submit requests to Maureen Milzarski
Comparing instructor performance . . .
Outcomes and measures are expected to be highly unique, so it will not be possible to compare “performance” across programs or disciplines.
Instructor evaluation: Instructors will continue to be evaluated by existing methods. If they want to use this information to demonstrate their effectiveness, that is their option.
Determining What Information is Reported
Inclusion is decided at the appropriate level: what appears is what the program personnel decide to enter. Most reporting will be at the program level, and course-level reporting is expected to evolve.
This Too Shall Pass . . .
Assessment of student learning has been required since 1995, and expectations have heightened.
I Don’t Have Time (It’s Too Much Work) . . .
Assessment is integral to the work of a professional educator. It is a part of the cycle of instructional design and the continuous improvement approach of “plan, do, check, act.”
Show/Tell (using the Demo Site). Components included:
Unit Name
Mission
Sites Offered
Accreditation Entity, Date of next accreditation visit
Certification and Licensing
Online status and plans
Advisory Board?
Next Academic Program Review Date
Show/Tell/Do (using the Demo or Sandbox):
Outcome name
Outcome statement
Assessment method
Criterion for success
Related goals
Additional assessment methods
Learning Outcomes:
describe one of the major skills that is an intended outcome for a course or program
represent a skill that a competent individual would use outside the classroom
begin with an action verb describing what the learner will be able to do
are measurable and observable
present a clear, concise, and precise statement describing the action
specify a single performance/outcome, not a combination
describe learner performance, not the instructor’s activities
Learning Outcome: Technology Use
Outcome Statement: Learners will demonstrate their use of common functions associated with software relevant to the discipline (e.g., MS Office, SPSS, CAD, etc.)
Measure(s):
1. Capstone project assignments will incorporate the utilization of common software applications associated with the field. Rubrics will be provided for each that address learner performance in technology use.
2. Exams in the second-year major course will incorporate timed tests utilizing identified software to produce documents appropriate to meet external performance requirements. Standards of the profession will be utilized to assess learners’ performance.
3. Throughout the program, individual course requirements will incorporate and report on technology-use performance by students, as appropriate.
Outcome Name
Outcome Measures
Outcome Type
Status (active or not)
Asterisks represent required fields; over time, more will be required.
SAVE CHANGES (button at bottom)
Relating to a particular outcome, specify:
Method Category (offer your suggestions)
Method (description)
Criterion: What will success look like for this program or course?
Schedule: When will the assessment method(s) be implemented? With what frequency?
Multiple Measures Especially at the program level, multiple measures should be used.
Outcome Statement: Learners will demonstrate their use of common functions associated with software relevant to the discipline (e.g., MS Office, SPSS, CAD, etc.)
Measure(s):
1. Capstone project assignments will incorporate the utilization of common software applications associated with the field. Rubrics will be provided for each that address learner performance in technology use.
Results:
1. Review of 32 capstone projects for students in the X program during the spring of 2008 revealed that 95% of the learners were able to perform all specified functions within MS Word and PowerPoint, but just 62% could demonstrate their abilities to perform specified functions within Access. Further, AutoCAD design capabilities were rated to be at an average level of 3 on a scale of 1 to 5, with 10% of the soon-to-graduate students not meeting minimum standards for the profession.
2. Review of 18 capstone projects in spring 2009 . . . .
Action Plan:
1. Faculty within the major will meet in August 2008 to examine the curriculum to determine where and how Access and CAD are introduced and reinforced, and to develop supplemental units to help students more adequately achieve intended outcomes. Faculty meetings will address this performance concern and monitor curricular changes and end-of-year performance of students in spring 2009. Results from spring 2009 will be reported and a determination made about whether additional curricular reform is required.
2. During the fall of 2009, faculty will incorporate more practice assignments in each software-related course and utilize ITAP students to support instructors in labs where enrollment exceeds 24 students.
Create folders and Attach documents. . .
Examples:
Rubrics
Standards
Assignments
Comprehensive analyses as backup to summary results
Link to standards or criteria, such as:
Accreditation standards
Program outcomes
General education outcomes
Industry standards, such as Microsoft Certification
A reporting unit is a group of two or more assessment units for which individuals may want to produce reports. Examples include:
College of Business
Department of the Built Environment
General Education: Global Consciousness
English and Writing-Intensive Courses
Your Liaison should let us know what Reporting Units you need and which assessment units should be linked to each.
Report types: Standard (refer to list) and Ad Hoc (custom reports)
All Unit information should be entered
All program outcomes and assessment plans should be entered
Results for at least one program outcome should be entered
Course assessment plans for multi-section and general education courses should be entered soon (the original plan was spring 2009, but there were delays in getting courses loaded)
At a minimum, enter outcomes as they become available for course-level assessments
Courses must be entered before you can produce the curriculum map. Purposes include:
Identify gaps
Identify unnecessary redundancies
Identify appropriate progression across the curriculum (i.e., introduction precedes reinforcement and assessment)
Identify actionable improvements when evaluating program-level outcomes
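The gap and progression checks a curriculum map supports are, in essence, simple operations on a course-to-outcome table. As a rough illustration only (TracDat produces the map itself; the course names, outcome names, and the I/R/A marking scheme below are hypothetical, though “introduce / reinforce / assess” is a common convention), the idea can be sketched as:

```python
# Illustrative sketch (not TracDat): a curriculum map as a table of courses
# vs. program outcomes, marked I (introduce), R (reinforce), A (assess).
# All course and outcome names are made up for the example.
ORDER = {"I": 0, "R": 1, "A": 2}

curriculum_map = {
    "TECH 101": {"Technology Use": "I"},
    "TECH 210": {"Technology Use": "R", "Communication": "I"},
    "TECH 499": {"Technology Use": "A", "Communication": "A"},
}

program_outcomes = {"Technology Use", "Communication", "Critical Thinking"}

# Gap check: program outcomes that no course addresses at all.
covered = {o for marks in curriculum_map.values() for o in marks}
gaps = program_outcomes - covered

# Progression check: an outcome's marks, read in course-sequence order,
# should begin with an introduction and move I -> R -> A, never backward.
def progression_ok(outcome, seq=curriculum_map):
    seen = [seq[c][outcome] for c in seq if outcome in seq[c]]
    return not seen or (seen[0] == "I" and sorted(seen, key=ORDER.get) == seen)

print(sorted(gaps))                      # → ['Critical Thinking']
print(progression_ok("Technology Use"))  # → True (I -> R -> A)
```

Running this flags “Critical Thinking” as a gap and confirms that “Technology Use” progresses correctly across the sequence, mirroring the gap and progression purposes listed above.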
Liaison responsibilities:
Primary communicator of needs to the Academic Affairs (AA) office
Responsible for maintaining currency of the Assessment Unit page
Provide assistance to users within the College
Participate in occasional meetings with the AA office regarding process improvements
Assist in unit-level reporting as required or appropriate
See the list on the website
90% of Ferris courses will be in TracDat with clear, measurable outcomes
80% of Ferris courses will have effective assessment methods with criteria for success
The courses in 75% of the programs will be integrated into a curriculum map to program outcomes by December 2009
All faculty will be engaged in active assessment at the course level to enhance student learning
Find the entire assessment plan here: http://www.ferris.edu/htmls/administration/academicaffairs/assessment/plan0809.htm
"The result of this paradigm shift is a college where faculty are the designers of powerful learning environments, where curriculum design is based on an analysis of what a student needs to know to function in a complex world rather than on what the teacher knows how to teach, where the college is judged, not on the quality of the entering class, but on the quality of aggregate learning growth possessed by its graduates, where compartmentalized departments are replaced by cross-disciplinary cooperatives, and where every employee has a role to play and a contribution to make in maintaining a learner-centered environment (p. 5).” -Bill Flynn, Palomar College, 1998