Student Learning Outcomes: Strategies, Tips, and Tools for Facilitating Learning Outcomes Assessment in Student Services
Jerry Rudmann, Irvine Valley College, February 2008
Overview - Student Services
1. Fine-tuning assessment
   a. Tips for writing survey items
   b. Focus groups
2. Helpful technology tools
   a. Clickers - promote active learning and record SLO information
   b. Rubric generators - a way to measure most anything
   c. PDF Acrobat forms - autoscoring and recording student input
   d. Portfolios - making students responsible and reflective
   e. Scanning - some ideas
   f. Tracking software - organizing all this stuff
3. Several options / strategies for making SLOs meaningful
   a. SSO versus SLO
   b. Problem focus
   c. Less is better
   d. Use what you already have
   e. Think of SLOs in the context of student development
   f. Qualitative assessment is OK
   g. Other…?

Some Options / Strategies for Making SLOs Meaningful
1. Address "robust" SLOs (overarching outcomes)
2. Problem focus
3. Less is better
4. Share SLOs with students
5. Use what you already have
6. Think of SLOs in the context of student development
7. Qualitative assessment is OK
8. SSOs vs. SLOs…

General Tip 1: Problem Focus Approach
What competencies do students have difficulty mastering? Focus SLO activities on problem areas.

General Tip 2: Keep It Simple But Meaningful
Corollary: often, less is better.

General Tip 3: Student Development Approach
Student development:
- Academic self-efficacy (Bandura)
- Academic self-regulation
- Campus involvement (Astin)
- Mentoring professor studies
Student Services DO help student success.

Surveys

Surveys - SLO Uses
- Students self-rate their competencies on program- or college-level learning outcomes.
- Students rate their satisfaction with various student services.

Types of Questions
- Open-ended: respondents answer in their own words.
- Closed-ended: respondents are limited to a finite range of choices.

Types of Questions (continued)
- Open-ended: flexible; hard to code answers; good for preliminary work to finalize a survey.
- Closed-ended: easier to code answers, process, and analyze; hard to write good closed-ended items.

Item Format
- Visual analogue scale: "Food in the cafeteria is…" Poor _______________ Excellent
- Likert scale: "Food in the cafeteria is outstanding!" SD (Strongly Disagree), D (Disagree), N (Neutral), A (Agree), SA (Strongly Agree)

Nine Tips for Designing and Deploying a Survey
1. Don't call it a survey.
2. Provide a carefully worded rationale or justification at the beginning.
3. Group items by common format.
4. Start with more interesting items.
5. Put demographic items last.
6. Mix in negative wording to catch acquiescence (aka "response set").
7. Automate scoring when possible (a worked scoring sketch appears at the end of this survey section).
8. If asking for sensitive information, use procedures designed to assure anonymity.
9. Always, always, always pilot test first.

Survey Administration Methods
- Face to face
- Written: group administration, mail
- Computerized: http://research.ccc.cccd.edu; password protected; validation rules; branching and piping
- Telephone

Focus Groups
Focus groups can be especially insightful and helpful for program- and institutional-level learning outcome assessment. Have your college researcher provide some background materials. See Focus Groups: A Practical Guide for Applied Research by Richard A. Krueger and Mary Anne Casey. The RP Group sponsored several "drive in" workshops over the last few years.
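The survey tips above (automate scoring, mix in negatively worded items to catch acquiescence) can be illustrated with a short script. This is a minimal sketch, not material from the presentation: the item names, the five-point response codes, and the example answers are all hypothetical.

```python
# Minimal sketch: automating Likert-scale scoring with reverse-coded items.
# Item names, scale codes, and responses are hypothetical.

LIKERT = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}
REVERSED_ITEMS = {"q3"}  # negatively worded items used to catch acquiescence


def score_response(answers):
    """Convert one respondent's Likert answers to a mean item score (1-5)."""
    scores = []
    for item, choice in answers.items():
        value = LIKERT[choice]
        if item in REVERSED_ITEMS:   # flip negatively worded items
            value = 6 - value        # on a 5-point scale: 1<->5, 2<->4
        scores.append(value)
    return sum(scores) / len(scores)


# Example: one respondent's answers to a three-item satisfaction scale
print(score_response({"q1": "A", "q2": "A", "q3": "D"}))  # -> 4.0
```

The same flip-then-average logic works whether the responses come from a scanned answer sheet, an online survey export, or a PDF form.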
Goal for This Section: Technology Uses and Technology Tools
Expected outcome: be able to select and use technology-based approaches to assess student learning outcomes.

Assessment Challenges
- Engaging students in self-evaluation
- More efficient assessment

Some Technology Tools
- Online rubric builders
- eLumen (SLO assessment/tracking)
- Classroom responders ("clickers")
- Scannable and online tests and surveys
- ePortfolios
- Adobe Acrobat forms
- Excel spreadsheets

Rubrics
A way to measure the heretofore immeasurable: products and performances. A rubric breaks the assessment into important components, and each component is rated along a well-labeled scale.

Let's Develop an Assessment Rubric for a Resume
Factor: lists educational background. Levels: Needs Improvement (0 points), Satisfactory (1 point), Excellent (2 points).

Chocolate Chip Cookie Rubric
Characteristic: Texture
- Poor (1): cookie is overcooked or undercooked.
- Fair (2): cookie is fully cooked but only crisp or only chewy.
- Satisfactory (3): cookie is crisp on the outside and chewy on the inside.
- Excellent (4): cookie is crispy on the outside, chewy on the inside, and moist but not greasy.
Additional rows: Appearance, Total Score, and an overall score (A, B, C, D).

Rubrics Are Good!
- Facilitate staff dialogue regarding satisfactory performance.
- Create a more objective assessment.
- Make expectations more explicit to the student.
- Encourage the metacognitive skill of self-monitoring one's own learning.
- Facilitate scoring and reporting of data.

Online Discussion Rubric
http://www.uas.alaska.edu/sitka/IDC/resources/onlineDiscussionRubric.pdf

Design Your Own Rubric
Please work in groups and use the worksheet in your packet to design a scoring rubric for assessing one of the following: coffee shops, syllabi, customer service at retail stores, grocery stores, or online courses.

Online Rubric Builders
Rubrics to guide and measure learning. Tools:
- Rubistar: http://rubistar.4teachers.org
- Landmark Rubric Machine: http://landmark-project.com/rubric_builder

Rubistar Art History Rubric (Rubistar screenshot)

Rubric Builder Screen Shot

Adobe Acrobat Forms
- Make the form using MS Word.
- Import the form and save it as a PDF form.
- Adjust the fields.
- Add fields to tally sub-scores and total scores.

How Do You Report Results?
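Rubrics like the resume and cookie examples above translate naturally into a small data structure, which is one way to automate the kind of tallying that the Acrobat form fields perform. The sketch below is not from the slides; the component names and point values are illustrative only.

```python
# Minimal sketch: a scoring rubric as a data structure, with component
# ratings summed into a total. Components and point ranges are illustrative.

RESUME_RUBRIC = {
    # component: (max points, description of the top performance level)
    "educational background": (2, "Complete, relevant, reverse-chronological"),
    "work experience":        (2, "Accomplishment-oriented bullet points"),
    "formatting":             (2, "Consistent, readable, one page"),
}


def total_score(ratings):
    """Sum component ratings after checking each is within the rubric's range."""
    total = 0
    for component, points in ratings.items():
        max_points, _ = RESUME_RUBRIC[component]
        if not 0 <= points <= max_points:
            raise ValueError(f"{component}: {points} is outside 0-{max_points}")
        total += points
    return total


# Example: one resume rated Satisfactory (1) or Excellent (2) on each component
print(total_score({"educational background": 2,
                   "work experience": 1,
                   "formatting": 2}))  # -> 5 of a possible 6
```

Keeping the rubric itself in one structure makes it easy to share the same criteria across raters and to report component-level results, not just totals.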
eLumen to Assess SLOs (http://www.elumen.info)
- Reduce time spent creating reports
- Assess course-, program-, and/or degree-level outcomes
- Share assessment rubrics across classes and programs
- View individual or aggregated results
- Use online or offline

The following eLumen screens are excerpted from eLumen: A Brief Introduction by David Shupe, July 2007:
- Use Online or Offline
- Criterion-Based Assessment: rubrics are attached to each SLO
- Rubrics Describe Criteria (e.g., "Writes prose clearly")
- Library of Degree-Level SLOs
- And Rubrics Link to SLOs
- Science and Gen Ed SLOs/Rubrics: from the Biology Department, the faculty committee on critical thinking, the Science committee, and the faculty committee on communication skills
- Scorecard for All Students in the Course
- Class Scores by Student
- Aggregated Data for Course
- Course Aggregates by Program

Classroom Responders
- Engage students
- Monitor student understanding
- Quickly and easily collect and store assessment data
- Use publisher item banks or create your own

Renaissance Classroom Response System (PBS demo)

The most valuable tip is…
A. Finding ways to use technology to make SLO and SSO assessment easier and more efficient
B. Concentrating SLO work on skills students have difficulty mastering
C. Building SLOs around student development (self-efficacy, goal clarity, etc.)

See Renaissance Learning for clicker training resources: http://www.renlearn.com

Scanning Technology (http://www.scantron.com and http://www.renlearn.com)
- A way to gather survey input from students
- A way to test students' knowledge

Surveys and Tests
Online or scannable surveys:
- Pre and post surveys of students' self-evaluation of progress
- Gather stakeholder (faculty, business community leaders, advisory groups) input on expected learning outcomes
- Student satisfaction with a service (SSO)
Quizzes/tests: practice and graded.

Some Survey Software Options
- Scannable surveys and quizzes: Remark optical mark recognition (OMR) software. Requires the software and a Fujitsu scanner; use a word processor to create scannable bubble-in surveys or answer sheets; produces item analysis output.
- Online survey tools: eListen (Scantron Co.); SelectSurvey.NET: http://www.classapps.com/SelectSurveyNETOverview.asp

Excel Spreadsheets
Example of autoscoring and record keeping in a Japanese program.
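The kind of autoscoring and record keeping the Excel example illustrates can also be done with a short script. The sketch below is not the Japanese-program spreadsheet itself; the answer key, student answers, and file name are hypothetical.

```python
# Minimal sketch: auto-score a short quiz and append the result to a running
# CSV gradebook. Answer key, student ID, and file name are hypothetical.

import csv
from datetime import date

ANSWER_KEY = ["B", "D", "A", "A", "C"]


def score_quiz(answers):
    """Return the number of answers matching the key."""
    return sum(given == correct for given, correct in zip(answers, ANSWER_KEY))


def record_score(student_id, answers, path="quiz_records.csv"):
    """Append one dated, scored attempt to the gradebook file."""
    score = score_quiz(answers)
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), student_id,
                                score, len(ANSWER_KEY)])
    return score


print(record_score("A001234", ["B", "D", "A", "C", "C"]))  # -> 4 out of 5
```

Because the results accumulate in a plain CSV file, they can be opened directly in Excel for the same kind of record keeping shown in the slide.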
ePortfolios
Advantages:
- Document artifacts of learning
- Support diverse learning styles
- Authentic assessment
- Course-, program-, or degree-level tracking
- Job skill documentation
Proprietary or open source options: ePortfolio and Open Source Portfolio.

ePortfolio.org Assessment (http://www.eportfolio.org)
- Lock assignments after submission
- Random selection of assignments by learning objective
- Anonymity of the student who produced the assignment and of the instructor
- Access to the work and the scoring rubrics
- Reports aggregate scores and generate frequencies/means
- Ability to download raw data, which can be analyzed in another format

Open Source Portfolio (http://www.osportfolio.org)
- Aligned with Sakai
- Administrators or faculty can structure and review work
- A learning matrix documents levels of work

Resources
- eListen: http://www.elisten.com
- eLumen: http://www.elumen.info
- ePortfolios:
  - ePortfolio.org: http://eportfolio.org
  - Open Source Portfolio: http://www.osportfolio.org/
  - For others, see the EduTools ePortfolio product comparison: http://eportfolio.edutools.info/item_list.jsp?pj=16
- Online rubric builders:
  - Rubistar: http://rubistar.4teachers.org
  - Landmark Rubric Machine: http://landmark-project.com/rubric_builder/index.php
  - Coastline Rubric Builder: http://rubrics.coastline.edu
- Remark survey software: http://www.principiaproducts.com/web/index.html
- Renaissance classroom responders: http://www.renlearn.com/renresponder/
- SelectSurvey.NET: http://www.classapps.com/SelectSurveyNETOverview.asp

Contact Info & Acknowledgements
Dr. Jerry Rudmann, Professor of Psychology, Irvine Valley College, [email protected]
Much of this slide show was adapted (with express written permission) from Pat Arlington, Instructor/Coordinator, Instructional Research, Coastline Community College, [email protected]

Development of New SLO Measures
Procedure, findings, conclusions, and recommendations from a recent exploratory study.

Purpose of the Study
The study was designed to explore whether assessment tools used to measure cognitive variables (e.g., goal clarity, self-efficacy) could serve as learning outcome measures in Student Services.

The Spark for This Study
- The need for truly appropriate and genuinely useful assessment measures in Student Services.
- Ideas generated by interviews with counselors.
Possible Relationships: Attributes of New Students and Students' Academic Outcomes
- Attributes of new students: confidence level, goals, motivation, study skills and habits.
- Short-term outcomes: semester GPA, units earned, % of units completed, return next semester.
- Long-range outcomes: GPA, units earned, certificate, degree, and/or transfer.

Possible Relationships: Adding Student Services
Student Services that may link new students' attributes to these academic outcomes: college success courses, academic counseling, Career Center presentations, career counseling, career courses, clubs, teams, chorus, band, student government or other forms of social connectedness, formal and informal recognition for progress, non-academic counseling, Transfer Center programs, peer advisors, the tutoring center, and university tours.

Procedure
- Counselor interviews (preliminary brainstorming)
- Literature survey for promising assessment tools
- Recruitment presentations at Region 8 DSPS and EOPS meetings
- A website was created with all assessments online

Study Website (screenshot)

Measures We Tried
- Academic and Career Goal Clarity
- Academic Self-Efficacy
- Dispositional Hope
- Self-Regulation
- Optimism
- Positive Affect
- Negative Affect

Summary and Examples of Measures Found Most Useful in This Study

Academic Self-Efficacy: beliefs about one's capabilities to learn or perform at designated levels. Compared with students who doubt their learning capabilities, those who feel efficacious for learning or performing a task participate more readily, work harder, persist longer when they encounter difficulties, and achieve at a higher level. Example items: "I know how to schedule my time to accomplish tasks." "I know how to study to perform well on tests."

Academic Self-Regulation: confidence in one's ability to perform various academic tasks. Example items: "I can take notes of class instruction." "I know how to use the library to get information for assignments."

Academic and Career Goal Clarity: measures the clarity of immediate and long-range academic plans and the extent to which the student has career clarity. Example items: "I have worked with a counselor to develop a plan listing the courses I need to complete my lower division coursework." "I have decided on an academic major." "I am familiar with the daily work routine for people working in my desired career."
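Scoring these instruments amounts to averaging a student's responses within each scale. The sketch below is not the study's scoring code; the item groupings, item names, and 1-5 response format are assumptions made for illustration.

```python
# Minimal sketch: turn raw item responses into scale scores for the three
# measures described above. Item names and groupings are hypothetical.

SCALES = {
    "academic_self_efficacy": ["se1", "se2"],        # e.g., scheduling, studying
    "self_regulation":        ["sr1", "sr2"],        # e.g., note taking, library use
    "goal_clarity":           ["gc1", "gc2", "gc3"], # e.g., ed plan, major, career
}


def scale_scores(responses):
    """Average the 1-5 item responses within each scale for one student."""
    return {
        scale: round(sum(responses[item] for item in items) / len(items), 2)
        for scale, items in SCALES.items()
    }


student = {"se1": 4, "se2": 5, "sr1": 3, "sr2": 4, "gc1": 2, "gc2": 5, "gc3": 3}
print(scale_scores(student))
# -> {'academic_self_efficacy': 4.5, 'self_regulation': 3.5, 'goal_clarity': 3.33}
```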
Participation
- DSPS study: seven colleges; 142 students pre-tested, 127 post-tested.
- EOPS study: six colleges; 276 students pre-tested, 154 post-tested.

Descriptive Statistics from the DSPS Study (table)

Intercorrelations Among Scales and Academic Outcomes, DSPS Study (table)

Predicting Academic Outcomes (based on the correlation matrix and stepwise regression analyses)
Predictors of short-term academic outcomes:
- DSPS study: GPA was predicted by Self-Regulation and Academic Self-Efficacy; % of units earned was predicted by Academic Self-Efficacy, Self-Regulation, and Goal Clarity.
- EOPS study: GPA was predicted by Academic Self-Efficacy and Self-Regulation; % of units earned was predicted by Academic Self-Efficacy, Self-Regulation, Hope, and Goal Clarity.

Impact of Services on Student Outcomes
- DSPS study: academic or career counseling was related to GPA and % of units earned.
- EOPS study: transfer assistance and peer advisement were each related to gains in goal clarity.

DSPS Study: Self-Regulation, Receipt of Counseling, and Semester GPA (chart)
Semester GPA by level of initial self-regulation (low vs. high) for students who did or did not receive academic or career counseling services. Group sizes: low SR with counseling = 32, low SR without counseling = 31, high SR with counseling = 20, high SR without counseling = 20 (N = 103).

DSPS Study: Self-Regulation, Receipt of Counseling, and Percentage of Units Earned of Units Attempted (chart)
Percentage of units earned by level of initial self-regulation for students who did or did not receive academic or career counseling services. Group sizes: low SR with counseling = 32, low SR without counseling = 32, high SR with counseling = 20, high SR without counseling = 20 (N = 103).

EOPS Study: Self-Regulation, Peer Advising, and Semester GPA (chart)
Semester GPA by level of initial self-regulation for students who did or did not receive peer advising. Group sizes: low SR with peer advising = 17, low SR without = 51, high SR with = 15, high SR without = 55 (N = 138).

Changes in Goal Clarity and Receipt of Transfer Assistance (chart)
Change in goal clarity over the semester by level of initial goal clarity (low vs. high) for students who did or did not receive transfer assistance.

Changes in Goal Clarity and Receipt of Peer Advisement (chart)
Change in goal clarity over the semester by level of initial goal clarity for students who did or did not receive peer advisement.

Limitations of the Study
- Lack of random selection and assignment to treatments
- Self-selection bias
- Results are correlational, not causal
- Data are aggregated across the participating colleges, but there may be significant differences among colleges, procedures, services, personnel, etc.

Thoughts…
The instruments are inexpensive and easy to complete and score. They can help identify "at risk" students and help formulate appropriate ways to assist them. Gain scores derived from pre- to post-test assessments can be useful. Use them as SLO assessment instruments that are good matches to the services provided within Student Services.

Recommendations
- "Map" your services to the constructs measured by these instruments.
- Develop new interventions where none currently exist.
- Create an assessment referral system.
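Two of the ideas above, pre-to-post gain scores and an assessment referral system, are simple to compute once pre- and post-test scale scores exist. This is a minimal sketch, not the study's analysis code; the at-risk cutoff and the example data are hypothetical.

```python
# Minimal sketch: compute pre-to-post gain scores and flag students whose
# initial scores fall below a cutoff for referral. Cutoff and data are
# hypothetical.

AT_RISK_CUTOFF = 3.0  # hypothetical threshold on a 1-5 scale


def gain_scores(pre, post):
    """Return post-minus-pre gains for students assessed at both points."""
    return {sid: round(post[sid] - pre[sid], 2) for sid in pre if sid in post}


def referral_list(pre):
    """Students whose pre-test score suggests a counseling referral."""
    return sorted(sid for sid, score in pre.items() if score < AT_RISK_CUTOFF)


pre_goal_clarity = {"s01": 2.4, "s02": 3.8, "s03": 2.9}
post_goal_clarity = {"s01": 3.6, "s02": 3.9}            # s03 did not post-test

print(gain_scores(pre_goal_clarity, post_goal_clarity))  # {'s01': 1.2, 's02': 0.1}
print(referral_list(pre_goal_clarity))                   # ['s01', 's03']
```

Only students assessed at both time points contribute gain scores, which mirrors the drop from pre-test to post-test counts reported for the DSPS and EOPS samples.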
The Three Instruments Found Most Useful
- Academic and Career Goal Clarity (Tucker & Rudmann, 2006): measures overall goal clarity and sub-components of goal clarity.
- Academic Self-Efficacy (Chemers, Hu, & Garcia, 2001): measures confidence in reaching positive academic outcomes.
- Efficacy for Self-Regulated Learning (Zimmerman, Bandura, & Martinez-Pons, 1992): measures confidence in one's ability to manage and regulate the academic tasks students face in college.

Goal Clarity Instrument Structure
A structure matrix for the 23 goal clarity items, extracted by principal component analysis with promax rotation and Kaiser normalization; the slide shows each item's loadings on four components. The items:
1. I have identified at least one area of interest that I would like to pursue in my education.
2. I have decided on an academic major.
3. For my major or academic goal, I know the list of courses that I need to take.
4. I have worked with a college counselor to develop a plan listing the courses I need for my lower division course work.
5. I am aware of the steps it will take for me to complete my highest academic goal.
6. I am clear about how long it will take for me to complete my education to meet my final academic goal.
7. I am pretty sure about the amount of time it will take me to complete all of my lower division (freshman and sophomore) course work.
8. I know how many and the specific classes I will need to take each semester to complete my academic goal.
9. All in all, I am set with a clear academic plan toward completing my educational goal.
10. I have a pretty good idea of the college to which I plan to transfer.
11. I have at least one alternative college in mind just in case I'm not accepted into the college or university to which I most want to transfer and attend.
12. I am sure about what I want to do for my occupation.
13. I have several career options in mind for myself.
14. I've thought about the type of work environment that I desire for my career.
15. I know the most important skills needed for at least one of the careers I have in mind.
16. I have a pretty good idea of the college degree requirements for the career I have in mind.
17. I am familiar with the daily work routine for people working in my desired career.
18. I know the approximate salary range for at least one of my occupational choices.
19. I know the steps that I need to take to enter the career of my choice.
20. I know the typical working hours for at least one of my career choices.
21. I know what a curriculum vitae or resume is.
22. I know how to make a curriculum vitae or resume of my own.
23. I have spoken with or heard a talk given by someone about the career I want to have.

One Interpretation
The same structure matrix with interpretive labels attached to the components, among them a detailed educational plan, transfer, academic goal, and career-related goal clarity.

Converting to Adobe Acrobat Interactive Form (screenshot)

Current & Potential Services for Enhancing These Important Student Learning Outcomes
SLO / assessment tool / what we do or could do to increase low scores:
- Academic Self-Efficacy: efficacy scale; ?
- Self-Regulation: self-regulation scale; ?
- Academic and Career Goal Clarity: goal clarity scale; ?
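The structure matrix above came from a principal component extraction with promax rotation. For readers who want to run an analysis in the same spirit on their own item data, here is a minimal sketch; it assumes the third-party factor_analyzer package, a hypothetical file of item responses, and is not an exact reproduction of the SPSS procedure used for the slide.

```python
# Minimal sketch: extract and promax-rotate four components from the 23 goal
# clarity items. Assumes the factor_analyzer package and a CSV of item
# responses (hypothetical file name), one column per questionnaire item.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Respondents x 23 items, each scored 1-5 (hypothetical data file)
items = pd.read_csv("goal_clarity_items.csv")

fa = FactorAnalyzer(n_factors=4, rotation="promax", method="principal")
fa.fit(items)

# Loadings table analogous to the slide's structure matrix
loadings = pd.DataFrame(fa.loadings_,
                        index=items.columns,
                        columns=[f"Component {i + 1}" for i in range(4)])
print(loadings.round(2))
print("Proportion of variance explained:", fa.get_factor_variance()[1].round(2))
```

Interpreting the components then proceeds as on the "One Interpretation" slide: look at which items load highly together and name the cluster accordingly.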
Research Team
- Jerry Rudmann, PhD, Professor of Psychology, Irvine Valley College, [email protected]
- Kari Tucker, PhD, Professor of Psychology, Department Chair, [email protected]
- Shañon Gonzalez, MA, Research Assistant III, Coastline College, [email protected]

Four Sources of Efficacy Beliefs
- Mastery experiences: outcomes interpreted as successful raise efficacy; those interpreted as failures lower it.
- Vicarious experiences: the success or failure of models.
- Verbal persuasion: positive or negative appraisals by others.
- Physiological states (e.g., anxiety, stress, arousal, fatigue, mood): these act as information about efficacy beliefs and can raise or lower efficacy.