Evidence-Based Best Practices for Interactive Online Learning Environments
Dr. Curtis J. Bonk, Associate Professor, Indiana University; President, CourseShare.com
http://php.indiana.edu/~cjbonk, [email protected]
Tons of Recent Research
• Not much of it is any good...

Problems and Solutions (Bonk, Wisher, & Lee, in review)
Problems:
1. Tasks overwhelm and confuse students
2. Students are too nice due to limited shared history
3. Postings lack justification
4. Too much data
5. Communities are not easy to form
Solutions:
1. Train, be clear, and structure due dates
2. Develop roles and controversies
3. Train students to back up claims
4. Use e-mail pals
5. Embed informal/social activities

Benefits and Implications (Bonk, Wisher, & Lee, in review)
Benefits:
1. Shy students open up online
2. Minimal off-task behavior
3. Delayed (asynchronous) collaboration can be richer than real-time collaboration
4. Students can generate lots of information
5. Minimal disruptions
6. Extensive e-advice
7. Students are excited to publish
Implications:
1. Use asynchronous conferencing
2. Create social tasks
3. Use asynchronous tools for debates; synchronous tools for help and office hours
4. Structure idea generation and require reflection/comment
5. Foster debates and critique
6. Find experts or practitioners
7. Ask permission (to publish)

Basic Distance Learning Finding?
• Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting. (Per Russell, 1999, The No Significant Difference Phenomenon, 5th edition, NCSU, based on 355 research reports; http://cuda.teleeducation.nb.ca/nosignificantdifference/)

Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)
• Anecdotal evidence; minimal theory
• Questionable validity of tests
• Lack of control groups
• Hard to compare studies given different assessment tools and domains
• Fails to explain why the drop-out rates of distance learners are higher
• Does not relate learning styles to different technologies or focus on the interaction of multiple technologies

Online Learning Research Problems (Bonk & Wisher, 2001)
• Studies serve different purposes or domains: in our review, 13% concerned training, 87% education
• Flaws in research designs: only 36% had objective learning measures; only 45% had comparison groups
• When a course is effective, it is difficult to know why: course design? instructional methods? technology?

Evaluating Web-Based Instruction: Methods and Findings (41 studies) (Olson & Wisher, in review)
[Chart: number of studies reviewed, by year of publication, 1996-2001]

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)
"…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive."
• Variables examined include demographics (age, gender), previous experience, course design, instructor effectiveness, technical issues, levels of participation and collaboration, recommendation of the course, and desire to take additional online courses.

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)
Variables studied:
1. Type of course: graduate (18%) vs. undergraduate (81%)
2. Level of Web use: fully online (64%) vs. blended/mixed (34%)
3. Content area: math/engineering (27%), science/medicine (24%), distance education (15%), social science/education (12%), business (10%), etc.
Other data:
a. Attrition data collected (34%)
b. Comparison group used (59%)

Different Goals…
• Making connections
• Appreciating different perspectives
• Students as teachers
• Greater depth of discussion
• Fostering critical thinking online
• Interactivity online

Wisher's Wish List
• An effect size of .5 or higher in comparison to traditional classroom instruction.
Average effect size and number of studies:
  Web-based instruction: .31 (11 studies)
  Computer-based instruction (Kulik): .32 (97 studies)
  Computer-based instruction (Liao): .41 (46 studies)
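The effect sizes in the comparison above are standardized mean differences between online (or computer-based) and conventional classroom groups. The slides do not spell out the statistic; for reference only, a standard Cohen's-d-style definition (an assumption added here, not taken from the talk) is:

$$ d \;=\; \frac{\bar{X}_{\text{online}} - \bar{X}_{\text{classroom}}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}} $$

On this reading, Wisher's wish for an effect of .5 or higher asks online sections to outscore comparable classroom sections by at least half a pooled standard deviation.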
Electronic Conferencing: Quantitative Analyses
• Usage patterns: number of messages, cases, and responses
• Length of cases, threads, and responses
• Average number of responses
• Timing of cases, comments, and responses
• Types of interactions (one-to-one, one-to-many)
• Data mining (logins, peak usage, location, session length, paths taken, messages per day/week)

Electronic Conferencing: Qualitative Analyses
• General, specific, and emergent analyses, including: observation logs, reflective interviews, retrospective analyses, focus groups, semantic trace analyses, forms of learning assistance, talk/dialogue categories (content talk, questioning, peer feedback, social acknowledgments, off-task), levels of questioning, degree of perspective taking, case quality, and participant categories

Overall frequency of interactions across chat categories (6,601 chats): on-task 55%, social 30%, mechanics 15%.
[Chart: percentage of chats in each category across months 1-2, 3-4, and 5-6]

Research on Instructors Online
• If teacher-centered, students explore, engage, and interact less (Peck & Laycock, 1992)
• Informal, exploratory conversation fosters risk-taking and knowledge sharing (Weedman, 1999)
• Four key acts of instructors: pedagogical, managerial, technical, social; instructors tend to rely on simple tools (Ashton, Roberts, & Teles, 1999; Peffers & Bloom, 1999)
• The job varies: planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997)
1. More than 50 percent of messages were reactive.
2. Only around 10 percent were truly interactive.
3. Most messages were factual statements or opinions.
4. Frequent participants were more reactive than low-frequency participants.
5. Interactive messages contained more opinions and humor.
6. More self-disclosure, involvement, and sense of belonging.
7. Participants were attracted to fun, open, frank, helpful, supportive environments.

[Diagrams: starter-centered interaction vs. scattered interaction (no starter), week 4]

Collaborative Behaviors (Curtis & Lawson, 1997)
• Most common: (1) planning, (2) contributing, and (3) seeking input.
• Other common events: (4) initiating activities, (5) providing feedback, (6) sharing knowledge.
• Few students challenge others or attempt to explain or elaborate.
• Recommendation: use debates and model appropriate ways to challenge others.

Online Collaboration Behaviors by Categories (US and Finland)
Percent of conference postings:
Behavior category        Finland    U.S.    Average
Planning                   0.0       0.0      0.0
Contributing              80.8      76.6     78.7
Seeking input             12.7      21.0     16.8
Reflection/monitoring      6.1       2.2      4.2
Social interaction         0.4       0.2      0.3
Total                    100.0     100.0    100.0
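Several of the quantitative analyses above (messages per day, peak usage, and per-category percentages like those in the US/Finland table) reduce to simple counting over a coded message log. The sketch below is only an illustration of that kind of tally, not a tool from the talk; the message fields (author, timestamp, coded category) are assumed for the example.

```python
from collections import Counter
from datetime import datetime

# Hypothetical coded conference log: (author, ISO timestamp, coded category).
messages = [
    ("kim", "2002-03-04T09:15", "contributing"),
    ("lee", "2002-03-04T21:40", "seeking input"),
    ("kim", "2002-03-05T10:05", "contributing"),
    ("ana", "2002-03-05T10:20", "reflection/monitoring"),
    ("lee", "2002-03-06T22:10", "contributing"),
]

# Per-category percentages, as in the US/Finland comparison table above.
category_counts = Counter(cat for _, _, cat in messages)
total = sum(category_counts.values())
for category, count in category_counts.most_common():
    print(f"{category:<25} {100 * count / total:5.1f}%")

# Messages per day and peak posting hour: two of the data-mining measures listed.
days = Counter(datetime.fromisoformat(ts).date() for _, ts, _ in messages)
hours = Counter(datetime.fromisoformat(ts).hour for _, ts, _ in messages)
print("Average messages/day:", total / len(days))
print("Peak posting hour:", hours.most_common(1)[0][0])
```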
Dimensions of the Learning Process (Henri, 1992)
1. Participation (rate, timing, and duration of messages)
2. Interactivity (explicit interaction, implicit interaction, and independent comment)
3. Social events (statements unrelated to content)
4. Cognitive events (e.g., clarifications, inferencing, judgment, and strategies)
5. Metacognitive events

Cognitive Skills Displayed in Online Conferencing: Some Findings (see Hara, Bonk, & Angeli, 2000)
• Social (in 26.7% of units coded): social cues decreased as the semester progressed; messages became less formal.
• Cognitive (in 81.7% of units): more inferences and judgments than clarifications.
• Metacognitive (in 56% of units): more reflections on experience and self-awareness; some planning, evaluation, regulation, and self-questioning.
[Bar chart: percent of coded units for each cognitive skill - elementary clarification, in-depth clarification, inferencing, judgment, and application of strategies]

Surface vs. Deep Posts (Henri, 1992)
• Surface processing: making judgments without justification, noting that one shares stated ideas or opinions, repeating what has been said, asking irrelevant questions; i.e., fragmented, narrow, and somewhat trite.
• In-depth processing: linking facts and ideas, offering new information, discussing advantages and disadvantages, making judgments supported by examples or justification; i.e., more integrated, weighty, and refreshing.

Level of Cognitive Processing: All Posts
• Deep 55%, surface 33%, both 12%.

Critical Thinking (Newman, Johnson, Webb, & Cochrane, 1997)
• Used Garrison's five-stage critical thinking model.
• Critical thinking occurred in both CMC and face-to-face environments.
• Depth of critical thinking was higher in the CMC environment: students were more likely to bring in outside information, link ideas and offer interpretations, and generate important ideas and solutions.
• Face-to-face settings were better for generating new ideas and creatively exploring problems.

Unjustified Statements: Sample Thread (Study #3, Fall 1997)
24. Katherine (Apr. 27, 1998, 3:12 AM): "I agree with you that technology is definitely taking a large part in the classroom and will more so in the future…" [coded: social acknowledgment]
25. Jason (Apr. 28, 1998, 1:47 PM): "I feel technology will never overtake the role of the teacher... I feel, however, this is just to help us teachers..." [coded: unsupported statement]
26. Daniel (Apr. 30, 1998, 0:11 AM): "I believe that the role of the teacher is being changed by computers, but the computer will never totally replace the teacher... I believe that the computers will eventually make teaching easier for us and that most of the children's work will be done on computers. But I believe that there…" [coded: justified extension]

Indicators for the Quality of Students' Dialogue (Angeli, Valanides, & Bonk, in press)
ID  Indicator                                          Examples
1   Social acknowledgment / sharing / feedback         "Hello, good to hear from you"; "I agree, good point, great idea"
2   Unsupported statements (advice)                    "I think you should try this…"; "This is what I would do…"
3   Questioning to clarify and extend the dialogue     "Could you give us more info?"; "Explain what you mean by…?"
4   Critical thinking / reasoned judgment              "I disagree with X, because in class we discussed…"; "I see the following disadvantages to this approach…"

Social Construction of Knowledge (Gunawardena, Lowe, & Anderson, 1997)
Five-stage model:
1. Sharing ideas
2. Discovering inconsistencies among ideas
3. Negotiating meaning / areas of agreement
4. Testing and modifying
5. Phrasing agreements
• In a global debate, the discussion was very task driven; dialogue remained at Phase 1, sharing information.

Social Constructivism and Learning Communities Online (SCALCO) Scale (Bonk & Wisher, 2000)
___ 1. The topics discussed online had real-world relevance.
___ 2. The online environment encouraged me to question ideas and perspectives.
___ 3. I received useful feedback and mentoring from others.
___ 4. There was a sense of membership in the learning here.
___ 5. Instructors provided useful advice and feedback online.
___ 6. I had some personal control over course activities and discussion.
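The slides list the SCALCO items but not a scoring scheme. As a hedged illustration only, the snippet below assumes each item is rated on a 1-5 agreement scale and reports a simple item mean per respondent; the scale range and the aggregation rule are assumptions, not part of the instrument as presented.

```python
# Illustrative scoring for the six SCALCO items shown above.
# Assumption (not from the slides): each item is rated 1 (strongly disagree)
# to 5 (strongly agree), and a respondent's score is the mean of the items.
def scalco_score(ratings: list[int]) -> float:
    if len(ratings) != 6 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected six ratings between 1 and 5")
    return sum(ratings) / len(ratings)

print(scalco_score([4, 5, 3, 4, 4, 5]))  # hypothetical respondent -> about 4.17
```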
Evaluation…

16 Evaluation Methods
1. Formative evaluation
2. Summative evaluation
3. CIPP model evaluation (Context, Input, Process, Product)
4. Objectives-oriented evaluation
5. Marshall & Shriver's five levels (self, materials, curriculum, modules, transfer)
6. Bonk's eight-part evaluation plan
7. Kirkpatrick's four levels
8. Return on investment (Level 5)
9. Level 6: budget and stability of the team
10. Level 7: e-learning champion(s) promoted
11. Cost/benefit analysis
12. Time to competency
13. Time to market
14. Return on expectation
15. AEIOU: Accountability, Effectiveness, Impact, Organizational context, Unintended consequences
16. Consumer-oriented evaluation

My Evaluation Plan… Considerations in the Evaluation Plan
1. Student  2. Instructor  3. Training  4. Task  5. Tech tool  6. Course  7. Program  8. University or organization

1. Measures of Student Success (focus groups, interviews, observations, surveys, exams, records)
• Positive feedback, recommendations
• Increased comprehension, achievement
• High retention in the program
• Completion rates or course attrition
• Jobs obtained, internships
• Enrollment trends for the next semester

1. Student: Basic Quantitative
• Grades, achievement
• Number of posts
• Participation
• Computer log activity: peak usage, messages per day, time on task or in the system
• Attitude surveys

1. Student: High-End Success
• Message complexity, depth, interactivity, questioning
• Collaboration skills
• Problem finding/solving and critical thinking
• Challenging and debating others
• Case-based reasoning and critical thinking measures
• Portfolios, performances, PBL activities

2. Instructor Success
• High student evaluations; more students signing up
• High student completion rates
• Uses the Web to share teaching
• Course recognized in tenure decisions
• Varies online feedback and assistance techniques

3. Training: Outside Support
• Training (FacultyTraining.net)
• Courses and certificates (JIU, e-education)
• Reports, newsletters, and publications
• Aggregators of information (CourseShare, Merlot)
• Global forums (FacultyOnline.com; GEN)
• Resources, guides/tips, link collections, online journals, library resources

3. Training: Inside Support
• Instructional consulting
• Mentoring (strategic planning $)
• Small pots of funding
• Facilities
• Summer and year-round workshops
• Office of distributed learning
• Colloquiums, tech showcases, guest speakers
• Newsletters, guides, active learning grants, annual reports, faculty development, brown bags

RIDIC5-ULO3US Model of Technology Use
4. Tasks (RIDIC5): Relevance, Individualization, Depth of discussion, Interactivity, Collaboration-Control-Choice-Constructivistic-Community
5. Tech tools (ULO3US): Utility/usable, Learner-centeredness, Opportunities with Outsiders Online, Ultra-friendly, Supportive

6. Course Success
• Few technological glitches/bugs
• Adequate online support
• Increasing enrollment trends
• Course quality (interactivity rating)
• Monies paid
• Accepted by other programs

7. Online Program or Course Budget
• Budget basics: how the course is paid for, how large it is, tech fees charged, number of courses, tuition rate, etc.
• Indirect costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on-site orientation and tech training, faculty training, office space
• Direct costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage
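Several of the evaluation methods listed earlier (return on investment, cost/benefit analysis) and the budget slide above imply a simple per-learner cost calculation. The sketch below is a hypothetical illustration of that arithmetic; the figures are invented, and the categories are drawn loosely from the direct/indirect cost lists rather than from any costing model presented in the talk.

```python
# Hypothetical cost-per-completed-learner calculation for an online course.
# All figures are invented for illustration; a real budget would enumerate the
# direct and indirect cost categories listed in the slide above.
direct_costs = {"courseware": 12000, "instructor": 18000, "help_desk": 3000, "bandwidth_server": 4000}
indirect_costs = {"faculty_training": 2500, "library_resources": 1500, "office_space": 2000}

enrolled = 60
completed = 48  # completion rate is itself one of the student success measures

total_cost = sum(direct_costs.values()) + sum(indirect_costs.values())
print("Total cost:", total_cost)                       # 43000
print("Cost per enrollment:", total_cost / enrolled)   # about 716.67
print("Cost per completion:", total_cost / completed)  # about 895.83
```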
8. Institutional Success
• E-enrollments from new students, alumni, and existing students
• Additional grants
• Press, publications, partners, attention
• Orientations, training, support materials
• Faculty attitudes
• Acceptable policies (ADA compliant)

Best Practices?
Part I. Best Practices: Who are some of the key scholars and players…???

Karen Lazenby: Instructor Qualities
Deputy Director, Telematic Learning and Education Innovation (now Director, Client Service Center), University of Pretoria (Nov. 2001, [email protected])
• Flexible enough to shift between roles
• Patient, responsive
• Friendly, positive, supportive
• Limit lecture
• Publish the best student work
• Set clear rules for posting and interaction
• Involve outside experts

Online Teaching Skills (The Online Teacher, TAFE, Guy Kemshal-Bell, April 2001)
• Technical: email, chat, Web development
• Facilitation: engaging, questioning, listening, feedback, providing support, managing discussion, team building, relationship building, motivating, positive attitude, innovative, risk taking
• Managerial: planning, reviewing, monitoring, time management
• From provider of content to designer of learning experiences; from solitary teacher to team member

Ron Oliver, Edith Cowan University: Collaborative and Constructivist Web Tasks (McLoughlin & Oliver, 1999; Oliver & McLoughlin, 1999)
1. Apprenticeship: Q&A; "ask an expert" forums
2. Case-based and simulated learning: exchange remote views; enact events online
3. Active learning: design Web pages and databases
4. Reflective/metacognitive learning: reflect in online journals and bulletin boards
5. Experiential learning: post (articulate) ideas to discussion groups
6. Authentic learning: PBL, search databases

John Hedberg, Singapore (formerly University of Wollongong), RILE Monograph (2001): Online Environments
• The learner must be active in the learning process
• Provide a variety of contexts and viewpoints
• Learning is a process of construction
• Immerse learners in authentic contexts
• Reflective thinking is the ultimate goal
• Learning involves social negotiation
• Need to develop realistic strategic, pedagogical, and commercial models for online learning

E-Moderating, by Gilly Salmon (Salmon, 1999, Kogan Page; [email protected])
1. Know when to stay silent for a few days.
2. Close off unproductive conferences.
3. Offer a variety of relevant conference topics.
4. Deal promptly with dominance and harassment.
5. Weave, archive, co-participate, acknowledge.
6. Provide sparks or interesting comments.
7. Avoid directives and right answers.
8. Support others in taking on the e-moderator role.

Robin Mason's (1991) Three Roles (The Open University; [email protected]; http://iet.open.ac.uk/pp/r.d.mason/main.html)
• Organizational: set the agenda, objectives, timetable, and procedural rules; be patient, vary things, spur discussion, invite participation
• Social: welcome, thank, provide feedback, and set a generally positive tone; reinforce good contributions, invite people to be candid
• Intellectual: probe, ask questions, refocus, set goals, weave and synthesize comments; know when to summarize and when to leave the discussion alone

Morten Paulsen's Pedagogical Techniques (Paulsen, 1995, The Online Report on Pedagogical Techniques for Computer-Mediated Communication; [email protected])
1. Collective databases; access to online resources
2. Informal socializing (online cafés)
3. Seminars (read before going online)
4. Public tutorials
5. Peer counseling and learning partnerships (online support groups)
6. Simulations, games, and role plays
7. Free-flowing discussions/forums
8. Email interviews
9. Symposia or speakers on a theme
10. The notice board (class announcements)

Prof. Dr. Betty Collis, University of Twente (UT), Faculty of Educational Science & Technology (TO); [email protected]
• Led the successful development and implementation of the TeleTOP (http://teletop.edte.utwente.nl) Web-based course-management system (1997), now in use throughout the university and beyond.
• Learning is active, collaborative, constructive, and contributive (i.e., learner-centered).
• Give learners support tools and options.

Ideal Environment for the Synchronous Trainer, by Jennifer Hoffman (Insync Training, [email protected])
• A private, soundproof room
• High-speed connection; telephone; powerful computer; an additional computer; tech-support phone number
• Studio microphone and speakers
• A "Do Not Disturb" sign
• Near a restroom; a pitcher of water

Zane Berge's Pedagogical Recommendations (Berge, 1995, The role of the online instructor/facilitator; [email protected])
• Draw attention to conflicting views
• Don't expect too much per thread
• Do not lecture (a long, coherent sequence of comments yields silence)
• Request responses within a set time
• Maintain a non-authoritarian style
• Promote private conversations

Linda Harasim, Online Collaborative Learning (Simon Fraser University, [email protected])
• In 1985, Dr. Harasim was one of the first to teach a fully online graduate course. The following year, she and her colleagues at the Ontario Institute for Studies in Education delivered the first professional development courses taught online.
• Harasim, L. (2001). Shift happens: Online education as a new paradigm in learning. The Internet and Higher Education, 3(1). Elsevier Science, New York, NY.
• Harasim, L. The virtual university: A state of the art. Advances in Computers, Volume 54. Academic Press, London, UK.

The Sharp Edge of the Cube: Pedagogically Driven Instructional Design for Online Education (Nishikant Sonwalkar, Syllabus Magazine, Dec. 2001)
• Five functional learning styles: apprenticeship, incidental, inductive, deductive, discovery.
• http://www.syllabus.com/syllabusmagazine/article.asp?id=5858

Dealing with Online Students (Vanessa Dennen, San Diego State University)
• Students don't participate: because it isn't required, or because they don't know what is expected.
• Students all participate at the last minute: because that is what was required, or because they don't want to be the first.
• The instructor posts at the last minute.

Just a Lot of Bonk (Curt Bonk, Indiana University)
• Variety: tasks, topics, participants
• Interaction that extends beyond the class
• Make learners also teachers
• Allow multiple ways to succeed
• Embed personalization and choice
• Clarity and an easy-to-navigate course

Instructor Tips
• Archive work, repurpose it, use it
• Take a course online; be a student
• Conduct usability testing and simplify
• Schedule someone due early in the course
• Market/share what you do
• Find a tech mentor
• Be flexible

What do we need??? FRAMEWORKS!!!
1. Reflect on the Extent of Integration: The Web Integration Continuum (Bonk et al., 2001)
Level 1: Course marketing/syllabi via the Web
Level 2: Web resources for student exploration
Level 3: Publish student-generated Web resources
Level 4: Course resources on the Web
Level 5: Repurpose Web resources for others
================================
Level 6: Web component is substantive and graded
Level 7: Graded activities extend beyond the class
Level 8: Entire Web course for resident students
Level 9: Entire Web course for offsite students
Level 10: Course within a programmatic initiative

2. Reflect on Interactions: Matrix of Web Interactions (Cummings, Bonk, & Jacobs, 2002)
• Instructor to student: syllabus, notes, feedback; to instructor: course resources, syllabi, notes; to practitioner: tutorials, articles, listservs
• Student to student: introductions, sample work, debates; to instructor: voting, tests, papers, evaluations; to practitioner: Web links, resumes
• Practitioner to student: internships, jobs, fieldtrips; to instructor: opinion surveys, feedback, listservs; to practitioner: forums, listservs

3. Study of Four Classes (Bonk, Kirkley, Hara, & Dennen, 2001)
• Technical: training, early tasks, being flexible, an orientation task
• Managerial: initial meeting, FAQs, detailed syllabus, calendar, posting administrivia, assigning e-mail pals, gradebooks, email updates
• Pedagogical: peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry
• Social: café, humor, interactivity, profiles, foreign guests, digital pictures, conversations, guests

Some Final Advice… Or Maybe Some Questions???