ASSESSMENT of the CONTENT VALIDITY
of the
WATER TRAINING INSTITUTE (WTI)
CURRICULUM
Elizabeth L. Shoenfelt, Ph.D.
Alicia Turner, M.A. Candidate
Patricia Slack, M.A. Candidate
David Normansell, M.A. Candidate
Department of Psychology
Western Kentucky University
OVERVIEW
• Content Validation – Training Program Evaluation Methods
• Criterion Phase
   o Participants
   o Procedure
• Content Phase
   o Participants
   o Procedure
• Analysis
Background: WTI
• 2-year associate program
• 2 certification programs (Water and Wastewater)
• Curriculum: general requirements and 2 specialized tracks (Water and Wastewater)
• 4 projected entry-level jobs that WTI graduates will most likely enter:
o Water Treatment Operator
o Wastewater Treatment Operator
o Distribution Systems Operator
o Collection Systems Operator
Background: NSF Grant
• Grant evaluation includes an assessment of the content validity of the WTI Curriculum
• Will examine the 4 courses currently in place to ensure they are preparing students with the Knowledge, Skills, and Abilities (KSAs) they need to successfully enter the workforce in any of the 4 identified entry-level jobs
Content Validity
 CONTENT VALIDITY: The extent to which the
material taught in the training course reflects the
actual KSAs required for effective job performance
 The more similar the WTI training program content
is to the job, the more effective it should be in
preparing WTI program graduates.
Content Validation – Training Evaluation Methods
• Matching Technique (Ford & Wroten, 1984)
   Analyzed the effectiveness of a police department training program by identifying the Knowledge, Skills, and Abilities (KSAs) considered important to job performance and the KSAs currently taught in the training program (i.e., time spent)
• Used a matching matrix to graph the data to identify Hits and Misses
• Hit: training emphasis reflects training needs (i.e., a high-importance KSA receives high emphasis in the program)
• Miss: training emphasis does not reflect training needs
   o Deficiency: high-importance KSA receives low emphasis
   o Excess: low-importance KSA receives high emphasis
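A minimal Python sketch of this hit/miss classification, assuming the importance and training-emphasis ratings have been averaged to a 1-5 scale; the classify_ksa name and the 3.0 cutoff are illustrative assumptions, not values from Ford and Wroten (1984):

def classify_ksa(importance: float, emphasis: float, cutoff: float = 3.0) -> str:
    """Label one KSA as a hit, a deficiency, or an excess (hypothetical cutoffs)."""
    high_importance = importance >= cutoff
    high_emphasis = emphasis >= cutoff
    if high_importance and not high_emphasis:
        return "miss: deficiency"   # important KSA receives too little emphasis
    if not high_importance and high_emphasis:
        return "miss: excess"       # unimportant KSA receives too much emphasis
    return "hit"                    # training emphasis reflects importance

# An important KSA that receives little course time is flagged as a deficiency
print(classify_ksa(importance=4.6, emphasis=1.8))   # -> miss: deficiency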
Matching Matrix
Content Validation – Training Evaluation Methods
• Linking Technique (Teachout, Sego, & Ford, 1997)
   Used the matching matrix but also linked training emphasis to difficulty of learning
• Provides a more complete picture:
   A high-importance KSA with low emphasis – is it a Deficiency, or does it simply have low Difficulty of Learning?
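A rough sketch of how the linking idea could be layered on top of the matching result, again assuming a hypothetical 1-5 rating scale and 3.0 cutoff; interpret_low_emphasis is an illustrative name, not part of Teachout, Sego, and Ford's (1997) procedure:

def interpret_low_emphasis(importance: float, emphasis: float,
                           difficulty: float, cutoff: float = 3.0) -> str:
    """Decide whether a high-importance, low-emphasis KSA is a true deficiency."""
    if importance >= cutoff and emphasis < cutoff:
        if difficulty >= cutoff:
            return "deficiency: hard to learn, may need more course time"
        return "acceptable: important but easily learned on the job"
    return "not a high-importance / low-emphasis case"

# Important but easy to learn: low training emphasis may be acceptable
print(interpret_low_emphasis(importance=4.5, emphasis=2.0, difficulty=1.5))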
Current Evaluation
• Will examine the content validity of the WTI curriculum
• Will use a combination of the Matching and Linking Techniques
• 2 Phases:
   o Criterion (Job)
   o Content (WTI Curriculum)
Criterion Phase: Procedure
• Developed preliminary Job Knowledge Surveys (JKS) based on materials received from the Associate Director of the Center for Water Resource Studies at WKU that included the KSAs needed for the 4 entry-level jobs
• For each element of job information, the JKS included ratings for:
   a) Time spent on the job
   b) Importance to the job
   c) Difficulty of learning
   d) When learned (before hire / formal training / on the job)
   e) Should it be taught in WTI
   f) Is it needed for certification
Job Knowledge Survey: Sample
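One way the six ratings above might be stored per KSA for analysis; this record layout, the field names, the rating scale, and the sample item are assumptions for illustration, not the actual survey format:

from dataclasses import dataclass

@dataclass
class JKSItem:
    ksa: str               # the knowledge, skill, or ability statement
    time_spent: int        # a) time spent on the job (assumed 1-5 scale)
    importance: int        # b) importance to the job
    difficulty: int        # c) difficulty of learning
    when_learned: str      # d) "before hire" / "formal training" / "on the job"
    teach_in_wti: bool     # e) should it be taught in WTI?
    needed_for_cert: bool  # f) is it needed for certification?

# Hypothetical example item
sample = JKSItem("Adjust chemical feed rates", time_spent=4, importance=5,
                 difficulty=3, when_learned="formal training",
                 teach_in_wti=True, needed_for_cert=True)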
Criterion Phase: Participants
WTI Steering Committee Members
• Individuals knowledgeable about the water and wastewater industry in both Kentucky and Tennessee
• From a variety of organizations in Kentucky and Tennessee, such as the Kentucky Rural Water Association and the Tennessee Association of Utility Districts
• JKSs were distributed to Steering Committee members during their WTI workshop/conference in Louisville, KY. The members pilot tested the surveys and provided feedback to refine the surveys.
Criterion Phase: Procedure
• The JKSs were refined based on Steering Committee feedback
• Steering Committee members collectively identified 40 water and wastewater incumbents to serve as job content Subject Matter Experts (SMEs) to complete the JKSs (approximately 10 per entry-level job)
• The JKSs were mailed to the Steering Committee members for distribution to the identified job incumbent SMEs, with a return deadline of October 30, 2009
• The JKS data have been entered
• We are now in the process of analyzing the JKS data
Content Phase: Participants
 Second Phase is a content analysis of the WTI
Curriculum
 We will develop Course Content Surveys to determine
KSAs taught and training emphasis (time)
 WTI Curriculum Content SMEs:
o Bowling Green Community College WTI Instructor
o Teaching Assistant for the Center for Water Resource
Studies at WKU
o The 4 current WTI students also will complete the Course
Content Surveys
Content Phase: Procedure
 The SMEs will develop a list of KSAs that are taught in
each of the current 4 courses through a “brainstorming”
panel discussion
 At the end of the discussion, Curriculum SMEs will be
provided with the course syllabi and any relevant
training materials to further identify KSAs taught
 The KSAs identified will be used to develop a Course
Content Survey. Each KSA taught in the WTI course will
be rated on the time spent teaching the KSA.
 The SMEs (including students) will complete the survey
Analysis
• Matching Matrix will be used to compare:
   o Importance ratings vs. Time Spent Training ratings
   o Difficulty of Learning ratings vs. Time Spent Training ratings
• Identify hits and misses in the courses:
   o Hits – appropriate emphasis in curriculum
   o Deficiencies – where more emphasis is needed
   o Excesses – where time on topic can be devoted elsewhere
Matching Matrix
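A brief sketch of how such a matching matrix could be tabulated once the JKS and Course Content Survey ratings are merged into one table keyed by KSA; the pandas column names and example ratings below are hypothetical:

import pandas as pd

# Hypothetical merged ratings: mean JKS importance vs. mean course time spent
ratings = pd.DataFrame({
    "ksa":         ["disinfection", "pump maintenance", "record keeping"],
    "importance":  [5, 4, 2],
    "time_taught": [4, 1, 5],
})

# Cross-tabulate importance against training emphasis: cells where importance
# is high but time taught is low suggest deficiencies; cells where importance
# is low but time taught is high suggest excesses.
matrix = pd.crosstab(ratings["importance"], ratings["time_taught"])
print(matrix)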
Formative Evaluation
• Data from the Content Validation can be used:
   o To revise current WTI courses (if needed) to ensure appropriate emphasis of job-related KSAs
   o In developing other WTI courses to ensure they contain appropriate emphasis of job-related KSAs
References
Primary References

Bownas, D., Bosshardt, M., & Donnelly, L. (1985). A quantitative approach to evaluating training curriculum content sampling adequacy. Personnel Psychology, 38, 117-131.

Ford, J., & Wroten, S. (1984). Introducing new methods for conducting training evaluation and for linking training evaluation to program redesign. Personnel Psychology, 37, 651-665.

Sproule, C. Rationale and research evidence supporting the use of content validation in personnel assessment. Retrieved from the International Personnel Assessment Council website: http://www.ipacweb.org

Teachout, M., Sego, D., & Ford, J. (1997). An integrated approach to summative evaluation for facilitating training course improvement. Training Research Journal, 3, 169-184.
Questions?
Content Validity Ratio: CVR = (N important – N not important) / N, where N is the total number of SME raters
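A small worked example of this ratio, assuming every SME rates the KSA as either important or not important (so N important + N not important = N):

def content_validity_ratio(n_important: int, n_not_important: int) -> float:
    """CVR = (N important - N not important) / N total raters."""
    n_total = n_important + n_not_important
    return (n_important - n_not_important) / n_total

# 8 of 10 SMEs rate a KSA as important: (8 - 2) / 10 = 0.6
print(content_validity_ratio(8, 2))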