From Objectives to Outcomes - Welcome to Laerdal Medical


Transcript

Beth Fentress Hallmark, PhD, RN Belmont University Nashville, TN

Belmont University

Nashville, Tennessee

• 5,000+ students

College of Health Sciences Inter-professional Education

• Nursing: Accelerated, Fast-Track, and Traditional BSN; FNP
• Social Work (BSW)
• Physical Therapy (DPT)
• Occupational Therapy (OTD/MSOT)
• Pharm D

Belmont University

Nashville, Tennessee

• Two 8-bed Adult Health laboratories
• 8-bed “Acute Care” lab
• 4-bed Peds lab
• 8-bed Health Assessment/OB lab
• 4 interprofessional private patient areas

From Objectives to Outcomes:

Learning Objectives:

• Identify the components of healthcare simulation
• Discuss the importance of outcomes evaluation and the challenges to traditional assessments
• Discuss the importance of validity, reliability, and feasibility as they relate to assessment
• Discuss types of assessments and their application in healthcare education

Components of Healthcare Simulation

Jeffrey A. Groom, PhD, CRNA (2009)

Communicate

Healthcare Education

Teamwork

HOW ARE WE DOING?

Study Finds No Progress in Safety at Hospitals

• November 24, 2010, NY Times: study of 10 North Carolina hospitals
• 25.1 injuries per 100 admissions
• 42.7% resulted in extra time in the hospital
• 2.9% of patients suffered a permanent injury
• More than 8% were life-threatening
• 2.4% caused or contributed to a patient’s death
• Medication errors caused problems in 162 cases

How do we measure our improvement?

Safe/competent practitioners: whatever the discipline/setting.

• Initial and continued competence
• Acquisition of relevant knowledge
• Development of psychomotor skills
• Application of this knowledge and skill

Current Assessments

• Measuring performance in the clinical area with current methods is difficult:
• Confidentiality
• Faculty-to-student ratio
• Patient safety
• Preceptors: valid? reliable?
• Adjunct faculty
• Tools

Model of Competence

Problem with “Knowing”

• Knowing is measured using examinations such as the NCLEX, the NREMT cognitive exam, FNP certification exams, calculation tests, etc.

• Recalling basic facts, principles, and theories
• Multiple-choice and true/false questions
• Test question design: valid, reliable
• Bloom’s taxonomy
• Critical thinking questions

Problem with “Knowing”

• Cognitive domain
• Belmont’s NCLEX pass rate (May 2005) was 98.6%; the NP exam pass rate was 100%

• Educational institutions employ strategies aimed at passing these exams.

Does this mean that each of these students will be prepared to care for you or your loved ones?

Model of Competence

Problem with “Knows How”

• “Knows How”: “application of knowledge to problem solving and decision making” (Wass, 2001)
• “A thought process stimulated by a problem” (Wass, 2001)

 “ability to solve problems, make decisions and describe procedures” (Scalese, 2008)  Case studies and essays  Multiple/multiples  Again… are these students prepared to provide safe proficient care.

Model of Competence

“Shows how vs. Does”

• “Shows How”: “demonstration of skills in a controlled setting” (Scalese, 2008)
• Educating by these methods includes simulation-based education (SBE)
• OSCE, SP, simulations, log books, portfolios
• Technical skills
• Includes higher-level thinking
• “Does”: moves from the simulated environment to the real-life setting

Assessment vs. Evaluation

Assessment and evaluation are often used interchangeably; however, for our purposes:
• Assessment describes the measurement of learner outcomes
• Evaluation describes the measurement of course/program outcomes

Why do we assess learner outcomes?

• Provides baseline data
• Provides summative and formative feedback
• “Drives learning”
• Allows measurement of individual progress

• Encourages “student” reflection
• Assures the public that providers are competent
• Licensure/credentialing requirements

Why do we evaluate our programs?

• Demonstrates change and growth in programs/courses
• Identifies gaps in programs/courses
• Fundamental to outcomes- or competency-based education
• Accrediting/credentialing facilities/programs
• Allows administration to make informed allocation decisions

Objectives/Outcomes of Program

• Define outcomes based on accrediting/professional organizations, etc.
• Objectives/outcomes lead to competency and mastery.
• Identify the Knowledge, Skills, and Attitudes/Affective Behaviors (KSA).
• Curricular/program specific; event specific.
• Measurable, clearly defined standards.

Simulation

[Cycle diagram: Student Learning Outcomes → Curriculum → Teaching & Learning → Assessment → Evaluation → change/refine]

Simulation Education

• Knowledge
• Skills
• Attitudes
• Advance these throughout the curriculum via assessment
• For example: from injections to team training

Preparing assessments

• What should be assessed?
• Every aspect of the curriculum should be considered:
• Essential content
• Content with significant designated teaching time
• Assessments should be consistent with learning outcomes that are established as the competencies students should master/perform at a given phase of study

Use of Assessment in Simulation

Formative or Summative

Rosen, M.A., et al. (2008). Measuring team performance in simulation-based training: Adopting best practices for healthcare. Simulation in Healthcare, 3, 33–41.

Assessment

Formative Assessment

• Lower-stakes assessment
• One of several assessments administered over a course or program
• May be evaluative, diagnostic, or prescriptive
• Often results in remediation or progression to the next level

Summative Assessment

• Higher-stakes assessment
• Generally the final course or program assessment
• Primary purpose is performance evaluation
• Often results in a go/no-go outcome

Assessments - peer

• Enables learners to hone their ability to work with others and their professional insight
• Enables faculty to obtain a view of students they do not otherwise see
• An important part of peer assessment is for students to justify the marks they award to others
• Justification can also be used as a component when faculty evaluate attitudes and professionalism

Assessments - standard setting

• Standards should be set to determine competence
• Enables certification to be documented, accountable, and defensible
• Appropriately set standards for an assessment will pass those students who are truly competent
• Standards should be neither too low (false positives: passing those who are incompetent) nor too high (false negatives: failing those who are competent)

Assessments - standard setting

• Standards should be set around a core curriculum that includes the knowledge, skills, and attitudes required of all students
• When setting a standard, the following should be considered:
• It must reflect the core curriculum
• A high standard in the core components of the curriculum
• Mastery demonstrated at each phase
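As one illustration of how a documented, defensible cut score can be derived, the sketch below implements a modified Angoff procedure, a widely used standard-setting method (the presentation does not prescribe a specific method, and all judge estimates here are invented for illustration).

```python
# Hypothetical sketch of a modified Angoff standard-setting procedure.
# Each judge estimates, per test item, the probability that a minimally
# competent (borderline) student would answer correctly; the cut score
# is the mean of each judge's summed estimates.

# Rows = judges, columns = items (probabilities invented for illustration).
judge_estimates = [
    [0.90, 0.70, 0.60, 0.80, 0.50],   # judge 1
    [0.85, 0.75, 0.55, 0.70, 0.60],   # judge 2
    [0.80, 0.65, 0.50, 0.75, 0.55],   # judge 3
]

# Each judge's expected total score for a borderline student.
judge_sums = [sum(row) for row in judge_estimates]

# Cut score = average expected score across judges (out of 5 items here).
cut_score = sum(judge_sums) / len(judge_sums)
print(round(cut_score, 2))  # raw cut score on a 5-point test
```

A student at or above the cut score is judged competent; because every judge's estimate is recorded per item, the resulting standard is auditable, which supports the "accountable and defensible" requirement above.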

Performance Assessment

Basic to performance – Do they know it and know how?

Competence – Can they do it?

Performance – Do they do it?

Assessing Simulation

• Documenting data: live observation, video recording, software logging systems
• Logistics of documenting data: AV annotation via logging, pencil and paper (wipe-off cards), Scantron, PDA/handheld/Tablet PC
• Assessors: instructors, observers, SIM/patients, peers, participants
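Logging systems like those listed above vary by vendor; as a neutral illustration, here is a minimal sketch of a timestamped event log an assessor might use to score checklist items during a scenario. All class names, items, and data below are hypothetical, not taken from any particular product.

```python
# Hypothetical sketch of a minimal event log for documenting simulation
# performance data. Each event records who scored what, when, and whether
# the criterion was met.

from dataclasses import dataclass, field

@dataclass
class Event:
    timestamp: float      # seconds since scenario start
    assessor: str         # e.g., "instructor", "peer", "SP"
    item: str             # checklist item being scored
    met: bool             # whether the criterion was met

@dataclass
class ScenarioLog:
    events: list = field(default_factory=list)

    def record(self, assessor, item, met, timestamp):
        self.events.append(Event(timestamp, assessor, item, met))

    def score(self):
        """Fraction of logged checklist items marked as met."""
        if not self.events:
            return 0.0
        return sum(e.met for e in self.events) / len(self.events)

# Example: an instructor scores three medication-safety items (invented data).
log = ScenarioLog()
log.record("instructor", "verifies patient identity", True, 12.4)
log.record("instructor", "checks allergy band", False, 30.1)
log.record("instructor", "confirms medication dose", True, 55.9)
print(round(log.score(), 2))  # -> 0.67
```

Because each entry carries a timestamp and an assessor, the same log can later support debriefing (replaying events against video) and inter-rater comparisons.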

Choosing appropriate assessment methods/tools

• When choosing an assessment instrument, the following should be answered:
• Is it valid?
• Is it reliable?
• Is it feasible?

Assessments - validity

 Are we measuring what we are supposed to be measuring?

• Use the appropriate instrument for the knowledge, skill, or attitude you are testing
• The major types of validity should be considered (content, predictive, and face)

Assessments - reliability

• Does the test consistently measure what it is supposed to measure?
• Types of reliability:
• Inter-rater (consistency across raters)
• Test–retest (consistency over time)
• Internal consistency (across different items/forms)
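Inter-rater reliability can be quantified with Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. The sketch below is a minimal, self-contained illustration of that calculation (the slides do not specify a statistic); the two raters and their checklist ratings are hypothetical.

```python
# Illustrative sketch: Cohen's kappa for inter-rater reliability between
# two raters scoring the same simulation checklist. All data hypothetical.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items both raters scored identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(count_a[c] * count_b[c] for c in categories) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Two faculty score 10 checklist items as "met" / "not" (invented ratings).
a = ["met", "met", "not", "met", "met", "not", "met", "met", "not", "met"]
b = ["met", "met", "not", "met", "not", "not", "met", "met", "met", "met"]
print(round(cohens_kappa(a, b), 2))  # -> 0.52
```

Here the raters agree on 80% of items, but kappa is only about 0.52 once chance agreement is removed, which is why kappa (rather than raw agreement) is the usual evidence of inter-rater reliability for an instrument.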

Assessment Tools

Tools should measure KSA within the domains that you are measuring:
• Cognitive
• Psychomotor
• Affective
• Do these domains occur alone, or simultaneously?

• Simulation offers the ability to assess each of these domains: an application of the cognitive domain while performing psychomotor skills, as the student demonstrates how they have internalized values, attitudes, and beliefs.

Where did I start?

• “Low-hanging fruit”: TASTED GREAT!!

• Self-reported:
• Confidence
• Increased critical thinking
• Satisfaction
• Situational awareness

Where should you start?

• Tools developed for your OBJECTIVES!
• To measure clinical judgment, use a tool developed for that purpose, e.g., Lasater (2007).
• Adds to reliability and validity
• May combine instruments
• What about the tool you use for clinical evaluation?
• Is it reliable? Valid? Who developed it?
• Have you had consistency issues with the tool/students in clinical?

Does it measure what you really want it to?

Assessments - feasibility

• Is the administration of the assessment instrument feasible in terms of time and resources?

• Time to construct?
• Time to score?
• Ease of interpreting the score/producing results?
• Practical given staffing/organization?
• Quality of feedback?
• Learner takeaway?
• Motivates the learner?

Practicality

• Number of students to be assessed
• Time available for the assessment
• Number of staff available
• Resources/equipment available
• Special accommodations

Examples of Tools

• Kardong-Edgren, S., Adamson, K.A., & Fitzgerald, C. (2010). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25–e35. doi:10.1016/j.ecns.2009.08.004

Exercise

• Let’s try it: OUT LOUD
• Groups: hospital, emergency, nursing education
• Safe medication administration
• How does this link to the programmatic outcomes and then to your course?

Safe Medication Administration

• How are you measuring this now?

• Summative
• Formative
• Knowledge (cognitive exams)
• Skills/psychomotor (lab check-off)
• Attitudes/affective (what would you examine, or are you examining, here?) Likert satisfaction? Self-confidence?

Model of Competence

Knowing: Safe Med Administration

• Each group: write a high-level multiple-choice question on safely administering a specific medication (choose one).

• Is the student who answers this question safe and competent?

Model of Competence

Knows How: Safe Med Administration

• Write a short case related to giving the same medication.
• What components must the student tell the grader?
• How to administer the med?
• Side effects?
• Teaching?
• What else will we measure?
• For the student who meets all of these assessment criteria: are they competent and safe to give the medication?

Model of Competence

Shows How: Safe Med Administration

• Take the case above and the objectives, and apply them to a simulation.
• This can be simple or advanced, incorporating teamwork, communication, and high acuity.
• What KSAs are required?
• Which “student” do you want taking care of you?

References

Decker, S., Sportsman, S., Puetz, L., & Billings, L. (2008). The evolution of simulation and its contribution to competency. The Journal of Continuing Education in Nursing, 39(2), 74–80.

Groom, J.A. (2009). Creating new solutions to the simulation puzzle. Simulation in Healthcare, 4(3), 131. doi:10.1097/SIH.0b013e3181b3e4c3

Kardong-Edgren, S., Adamson, K.A., & Fitzgerald, C. (2010). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25–e35. doi:10.1016/j.ecns.2009.08.004

McKimm, J. (2010). Learning objectives and learning outcomes. London Deanery. Retrieved from http://www.faculty.londondeanery.ac.uk/e-learning/setting-learning-objectives/learning-objectives-and-learning-outcomes

Scalese, R.J., Obeso, V.T., & Issenberg, S.B. (2008). Simulation technology for skills training and competency assessment in medical education. Journal of General Internal Medicine, 23(Suppl 1), 46–49. doi:10.1007/s11606-007-0283

Wass, V., Van der Vleuten, C., Shatzer, J., & Jones, R. (2001). Assessment of clinical competence. The Lancet, 357(9260), 945–949.