Transcript Slide 1
The Quality of Solutions to Open-Ended Problem Solving Activities and its Relation to First-Year Student Team Effectiveness
Tamara J. Moore, Heidi A. Diefes-Dux, P.K. Imbrie

What are Model-Eliciting Activities?
- MEAs are authentic, open-ended assessment activities posed by a fictitious client
- Connect mathematical modeling to other fields
- Elicit students' thinking in the process of solving
- Require teams of problem solvers

Research Question
- What relationship exists between student team functioning (as measured by interdependency, goal setting, and potency) and performance on Model-Eliciting Activities?

Setting
- ENGR 106: Engineering Problem Solving and Computer Tools
- First-year required introductory course in engineering (approx. 1,400 students)
- Problem solving – mathematical modeling
- Teaming
- Engineering fundamentals – statistics/economics/logic development
- Computer tools – Excel/MATLAB

Factory Layout MEA
- Client: the general manager of a metal fabrication company
- Provide results for a 122,500 ft² square plant layout
- Placement of departments: extrusion, heat-treat, shipping/receiving, and office space
- Total distance and order of material travel for each product
- Final department dimensions
- Propose a reusable procedure to determine any square plant layout that takes spatial concerns and material travel into account

Teaming
- What are teams?
  - Task-oriented
  - Interdependent social entities
  - Individual accountability to the team
- Why encourage teaming?
  - Research indicates student participation in collaborative work increases learning and engagement
  - Accreditation Board for Engineering and Technology (ABET) criteria
  - Demand from industry

Team Effectiveness Scale
- Student-reported questionnaire to measure team functionality
- 26-item Likert scale
- Given immediately following the MEA
- Internal reliability measured: Cronbach's alpha > 0.95 (N ≈ 1,400); an illustrative computation appears at the end of this transcript
- Subscales: Interdependency, Potency, Goal Setting, and Learning

Researcher Observations
- Observation of one group per lab visited
- Items based on the teaming literature
  - Interdependency – 3 items
  - Potency – 2 items
  - Goal Setting – 2 items
- Teams received a 1-5 score on each of the 7 items
- Detailed field notes also taken

Quality Assurance Guide: Does the product meet the client's needs?
(Performance level / How useful is the product?)
- 1, Requires redirection: The product is on the wrong track. Working longer or harder won't work.
- 2, Requires major extensions or revisions: The product is a good start toward meeting the client's needs, but a lot more work is needed to respond to all of the issues.
- 3, Requires only minor editing: The product is nearly ready to be used. It still needs a few small modifications, additions, or refinements.
- 4, Useful for this specific data given: No changes will be needed to meet the immediate needs of the client, but this is not generalizable to new but similar situations.
- 5, Sharable or reusable: The tool not only works for the immediate situation, but it also would be easy for others to modify and use in similar situations.

Results
- 11 student teams observed
- Correlation of the rankings of:
  1. the 11 teams' self-reported ranking
  2. the 11 teams' observation score ranking
  3. the aggregate score ranking
  with the MEA Quality Score

Results
- MEA Quality Score vs. the 11 teams' self-reported ranking
- Kendall's tau-b coefficient is -0.410 (an illustrative computation of this statistic follows this slide)
- Not statistically significant at the 0.05 level (2-tailed correlation, p = 0.108)
- Moderate degree of correlation
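The rank correlations reported on these Results slides can be reproduced in spirit with a few lines of code. The sketch below is illustrative only: the team ranking and MEA quality scores are made up, not the study's data, and scipy.stats.kendalltau is used as a stand-in for the authors' analysis (it computes Kendall's tau-b when ties are present).

# Minimal sketch: Kendall's tau-b between a team-effectiveness ranking and
# MEA quality scores. The data below are hypothetical placeholders.
from scipy.stats import kendalltau

# Hypothetical: rank 1 = most effective team; MEA scores use the 1-5 rubric.
self_reported_rank = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
mea_quality_score  = [4, 5, 3, 4, 3, 3, 2, 3, 2, 1, 2]

tau, p_value = kendalltau(self_reported_rank, mea_quality_score)
print(f"Kendall's tau-b = {tau:.3f}, p = {p_value:.3f}")
# A negative tau is the expected direction here because rank 1 is "best":
# better-ranked teams (smaller rank numbers) tend to have higher MEA scores.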
Results
[Figure: MEA Score vs. Self-Reported Team Rank – scatter plot of MEA Quality Score against Self-Reported Team Rank, R² = 0.29]

Results
- MEA Quality Score vs. the 11 teams' observed ranking
- Kendall's tau-b coefficient is -0.572
- Statistically significant at the 0.05 level (2-tailed correlation)
- Moderate degree of correlation

Results
[Figure: MEA Score vs. Observed Team Rank – scatter plot of MEA Quality Score against Observed Team Rank, R² = 0.31]

Results
- MEA Quality Score vs. the aggregate team score ranking
- Kendall's tau-b coefficient is -0.733
- Statistically significant at the 0.01 level (2-tailed correlation)
- Marked degree of correlation

Results
[Figure: MEA Score vs. Aggregate Teaming Rank – scatter plot of MEA Quality Score against Aggregate Team Effectiveness Rank, R² = 0.63]

Findings
The data suggest that:
- More work is needed in helping students understand how to self-assess their teaming abilities
- Research is needed to understand which of the team functioning categories are most important, especially in the observer rankings

Significance of the Study
- Provides insight into the fundamental question: Does team functionality affect team performance?
- Leads to other research questions:
  - Which characteristics of teaming are more likely to create better solutions?
  - How are these team attributes best fostered in the classroom?
- Contributes to the discussion on ABET and the role of teaming and problem solving in undergraduate engineering education

Questions?
To contact me: Tamara Moore, [email protected]
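Illustrative computation of Cronbach's alpha, the internal-reliability measure cited on the Team Effectiveness Scale slide. This is a minimal sketch assuming a generic (respondents x 26-item) response matrix filled with random placeholder values, not the study's survey data; the function name cronbach_alpha is an illustrative helper, not part of any library.

# Minimal sketch of Cronbach's alpha for a k-item Likert scale.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)       # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 26-item, 5-point Likert responses for 40 students.
# Random, uncorrelated items will yield a low alpha; the study reported
# alpha > 0.95 on its actual responses (N ~ 1,400).
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(40, 26))
print(f"alpha = {cronbach_alpha(demo):.2f}")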