Progress in Graduate
Attribute Assessment in
Mechanical Engineering at
McMaster University
M.F. Lightstone
July 2014
Organizational Structure
• Faculty of Engineering Grad. Attributes Comm.
– Formed in early 2010 to develop strategy/guidelines
for graduate attributes assessment
– 1 member from each department
– Fall 2010: a set of ‘indicators’ for each attribute was developed
• Mech Eng Departmental Grad Attributes Comm.
– Formed in early 2011
– Mandate: assist faculty with curriculum mapping,
measurement of indicators, continuous improvement,
rubric development and documentation of
measurements.
Mech. Eng. Progress – Early 2011
• Curriculum mapping of indicators was performed based on
2010/11 calendar.
• For each course and indicator, the instructor assessed coverage (see the sketch below):
0 – indicator not covered
1 – mentioned
2 – taught and graded
3 – significant part of the course
• Revealed areas for continuous improvement:
– Technical communications
– Professionalism
– Ethics
– Teamwork
– Conflict resolution
– Sustainability
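The slides do not show how the mapping data were handled; the following is a minimal, hypothetical sketch of how a 0–3 coverage map could be recorded and scanned for indicators that no core course addresses beyond “mentioned”. The course codes, indicator names, scores, and threshold are illustrative assumptions, not the department’s actual records.

# Hypothetical sketch only: a curriculum map on the 0-3 scale
# (0 = not covered, 1 = mentioned, 2 = taught and graded, 3 = significant part of course).
# Course codes, indicator names, and scores are made-up examples.
curriculum_map = {
    "MECH ENG 2W04": {"engineering fundamentals": 3, "technical communication": 0, "ethics": 0},
    "MECH ENG 4M06": {"engineering fundamentals": 2, "technical communication": 2, "ethics": 1},
}

def weakly_covered(cmap, threshold=2):
    """Return indicators whose best coverage across all mapped courses is below threshold."""
    best = {}
    for scores in cmap.values():
        for indicator, level in scores.items():
            best[indicator] = max(best.get(indicator, 0), level)
    return sorted(ind for ind, level in best.items() if level < threshold)

print(weakly_covered(curriculum_map))  # -> ['ethics']: a candidate for continuous improvement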
Mechanical Engineering Progress:
Continuous Improvement (2011/12)
• Major changes made to curriculum (2011/12):
– New course MECH ENG 2A03 (“Design Communication”): includes a module on technical communication. Also added AUs for CEAB input-based assessment.
– ENGINEER 4A03 (“Engineering & Social Responsibility”): includes professionalism and ethics. Now a required course.
Mechanical Engineering Progress:
Continuous Improvement (2011/12)
• Capstone design course (MECH ENG 4M06):
Incorporated additional lectures on:
– Professionalism and ethics (Ross Judd)
– Library research (Andrew Colgoni, McMaster Library)
– Teamwork/conflict resolution/emotional intelligence
(Sonia Hawrylyshyn, Manager, Employee Career
Services, Human Resources McMaster)
– Success in the Workplace (Dr. John Mackinnon, Vice-President Engineering, AMEC-NSS)
– Students were tested on lecture content.
Measurement – 2011/12 – Trial Year
• MECH ENG 4M06 – Capstone design:
– Introduced rubrics for assessment of presentations (2
per year) and final report (April 2012).
– Rubrics were carefully written to incorporate
‘indicators’ associated with:
• engineering design
• communication
• sustainability
• economics and project management
– Created web-based survey to assess indicators
associated with “individual and team work”
Measurement (2011/12 – TRIAL) (cont’d)
• Knowledge base for engineering: (Indicator:
“competence in engineering fundamentals”)
• Measure in core courses, span years 2-4:
• MECH ENG 4V03 “Thermo-fluids Sys. Des.” (fall 2011)
• MECH ENG 3F04 “Numerical Methods” (fall 2011)
• MECH ENG 2W04 “Thermodynamics” (winter 2012)
• MECH ENG 4R03 “Control Systems” (winter 2012)
• Method for measurement: performance on
specific questions on tests/exams
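The mechanics are not spelled out on the slide; below is a minimal sketch, assuming per-student marks on an indicator-tagged exam question are available, of how they might be binned into performance levels. The four level names and the thresholds are assumptions for illustration, not the official McMaster or CEAB scheme.

# Hypothetical sketch: bin per-student marks on one indicator-tagged exam question
# into four performance levels. Thresholds (50 %, 65 %, 80 %) are illustrative only.
def performance_levels(scores, max_mark, thresholds=(0.5, 0.65, 0.8)):
    levels = {"below": 0, "marginal": 0, "meets": 0, "exceeds": 0}
    for s in scores:
        frac = s / max_mark
        if frac < thresholds[0]:
            levels["below"] += 1
        elif frac < thresholds[1]:
            levels["marginal"] += 1
        elif frac < thresholds[2]:
            levels["meets"] += 1
        else:
            levels["exceeds"] += 1
    return levels

# e.g. marks (out of 10) on a question tagged to "competence in engineering fundamentals"
print(performance_levels([3, 6, 7, 9, 8, 5, 10, 4], max_mark=10))
# -> {'below': 2, 'marginal': 2, 'meets': 1, 'exceeds': 3}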
2011/12 – What did we learn?
• Professors were unclear as to how to do
measurements
• Training the faculty on how to actually do GA
measurements was critical
• Needed a step-by-step method that would remove the “fuzziness” and provide some background information on the “jargon” used.
Workshop on GA Measurement – April 2012
• Ken, Carlos and Marilyn worked on developing a faculty-wide workshop on how to actually do GA measurement.
• Includes:
– Background on GA, indicators, Learning Outcomes, Bloom’s
Taxonomy,…
– How to create a rubric
– How to do the measurement – the logistics and examples
– What to include in the final report
– Importance of continuous improvement
• First workshop given on April 25, 2012.
• Has been repeated numerous times at both Faculty of
Engineering level and to individual departments.
At the Department Level
• Measurements are essentially done at the departmental level and organized by the Department GA Committee (with some guidance from the Faculty-level committee).
• In Mechanical Engineering – measurements are done for all CORE
courses.
• Key point: ALL professors are expected to participate!
• Departments responsible for:
– Developing a “Measurement Plan” for each academic year
– Reviewing the quality of the reports that each professor submits
(checklist provided)
– Creating a “year-end” report that integrates the measurements taken from individual courses (see the sketch below)
• Sharing documentation and processes between departments is important.
• Organizing all the material: Marilyn developed a website that facilitated sharing of key documents and processes
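As a rough illustration of the “year-end report” step above, here is a minimal sketch of how per-course measurement results could be summed into an indicator-level summary. The report fields and numbers are assumptions, not the department’s actual format.

# Hypothetical sketch: combine per-course GA measurement reports into a
# year-end summary per indicator. Field names and counts are made up.
from collections import Counter

course_reports = [
    {"course": "MECH ENG 3F04", "indicator": "engineering fundamentals",
     "levels": {"below": 4, "marginal": 10, "meets": 30, "exceeds": 16}},
    {"course": "MECH ENG 4R03", "indicator": "engineering fundamentals",
     "levels": {"below": 2, "marginal": 8, "meets": 25, "exceeds": 20}},
]

def year_end_summary(reports):
    """Sum student counts per performance level for each indicator across all core courses."""
    summary = {}
    for r in reports:
        summary.setdefault(r["indicator"], Counter()).update(r["levels"])
    return {indicator: dict(counts) for indicator, counts in summary.items()}

print(year_end_summary(course_reports))
# -> {'engineering fundamentals': {'below': 6, 'marginal': 18, 'meets': 55, 'exceeds': 36}}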
Summary of TimeLine
• Fall 2010 – Developed indicators (Faculty level)
• Early 2011 – Indicator mapping (Dept level)
• 2011/12 – Trial Year
– Addressed gaps in mapping (Dept level)
– Trial measurements in a few courses (Dept level)
– GA measurement workshop (Faculty level – April 2012)
• 2012/13 – Nearly Real Measurements
– Measured indicators for 6 attributes for all core courses
– Instructors were still learning, so not all reports were of the same quality
– GA measurement workshop given again + presentations by each dept (May 2013)
– Summer 2013 – Faculty streamlined the indicator list; dept did a remapping
– Note that Marilyn was on sabbatical that year
• 2013/14 – Real Measurements
– Measured indicators for other attributes for core courses (plus “Knowledge Base”
measured every year)
– Help from Minha on teamwork and communication
• 2014/15 – CEAB Evaluation year
– Bring in “stakeholders” into Department GA committee
– Prepare for site visit in fall of 2015.
Important Points
• Need to provide clear training to professors on
how to do the measurements
• Sharing of documentation, rubrics, processes,
between departments is critical
• Document measurement results and store
documents centrally
• Check for continuous improvement (have professors made the changes identified in the previous measurement?)
Documents that U. Guelph might want to see:
• GA measurement workshop presentation
• Checklist for GA reports
• Sample measurement report
• Sample ‘checklist’ for reviewing GA reports
• GA – suggested management structure document by Marilyn