National Assessment of Educational Progress
NATIONAL ASSESSMENT OF
EDUCATIONAL PROGRESS
Ashley Singer
University of Central Florida
ARE 6905
April 16, 2013
PURPOSE
Respond to teachers' demand for practical applications of
assessment results
Add more value and clarity to existing NAEP
testing by addressing the lack of a teacher
questionnaire in the visual arts assessment, a
component present in nearly all other NAEP subjects.
WHY IT’S IMPORTANT
Motivated by the lack of clarity that accompanies
NAEP visual arts assessment data
Nothing offers possible explanations for the results
Teachers are left to interpret the numbers without
guidance
A questionnaire would add context to the data
Teachers could see what is current or trending in the
classroom
What could be lacking in their curriculum
What is proving successful in their practice
Universities and schools could use the data to see which
educator training programs have been successful in
equipping teachers for their fields
What teachers may be lacking in their classrooms and
how to provide it
RESEARCH QUESTIONS
How can NAEP clarify the results of the visual arts
assessment by adding a teacher questionnaire covering
common practices and teacher background, similar to
the existing teacher questionnaires for other subjects?
How can the demographic and background
information be applied to understanding knowledge
and experience as well as hiring trends?
Do the findings suggest that certain training and
specialties lead to classroom achievement?
What areas of art education are being concentrated
on and what areas are being neglected?
How could we take the results to further develop a
NAEP curriculum and understand best practices?
WHAT I’VE LEARNED
Creating a well-designed questionnaire is difficult
The basic structure and style are simple
The scientific approach is tedious and thought-provoking
Developing a questionnaire involves many steps
It is not just writing whatever questions you want answered
and expecting reliable results from them
It is overwhelming to develop questions that would yield the
best applications for educators while still answering my
research questions
WHAT I’VE LEARNED
It is critical to review who is writing tests, papers,
and surveys
What do they want to know?
Questions are likely based on what the writers know, their
experience, or what they want to know
That may not be a true representation of the information
Boards and panels are important
Writers can also be influenced by a central philosophy or
a philanthropist
Panels reduce bias by drawing on multiple experiences and perspectives
WHAT I’VE LEARNED
Objectivity and Adjustments
Analyzing previous and test-specific data drives
research's ultimate progress
Minor and major changes are made to improve tests
Discussion of limitations shows what could be better
Changes are not personal – just progress
My adjustments create another test
Designed around teachers' training and preparedness
Areas of focus, certification process, work history, etc.
What is making teachers ready for the classroom
WHAT I’VE LEARNED
NAEP
There is an obvious need for more clarity
If the main complaint is a lack of application, NAEP has
to find ways to make its results relevant
Educators must be a part of the process
Either in test development, research, or advocacy
NAEP could find more ways to reach out to teachers
Whether you are a researcher or a teacher, you
cannot continue doing things the same way and
expect different or better results
REVIEW OF LITERATURE
“Finally, the arts assessment reminds us once again that arts
education is for all students, not just for the talented. No one has
suggested that math or science should be taught only to students with
talent in those disciplines. The arts, similarly, provide long-term
benefits that are important for every student. Experience has
demonstrated to arts educators that all children can learn basic arts
skills and knowledge, provided that they begin instruction early
enough.” (Lehman, 1999)
“Most NAEP assessments” have a teacher questionnaire (NAEP, 2012)
The common education practitioner often has difficulty gleaning
consequence and meaning from the scores; we must ask what we know
about these teachers (Eisner, 1999)
“Test performance, like paintings, needs to be ‘read,’ not only seen.
Information needed to give test scores a deep reading is very limited”
(Eisner, 1999)
A recent study “revealed that untrained people do not simply walk
into classrooms and become successful”; prepared and certified
teachers are more successful than untrained ones (Hatfield, 2007)
Test results only leave readers with “value without clarity” (Diket &
Brewer, 2011)
REVIEW OF LITERATURE
“While teachers’ completion of the questionnaire is
voluntary, NAEP encourages their participation since
their responses make the NAEP assessment more
accurate and complete” (Teacher Questionnaire, 2011)
Covers:
“teaching experience, certifications, degrees, major and
minor fields of study, coursework in education, course work
in specific subject areas, the amount of in-service training,
the extent of control over instructional issues, and the
availability of resources for the classroom” (Teacher
Questionnaire, 2011)
“pre- and in-service training, the ability level of the
students in the class, the length of homework assignments,
use of particular resources, and how students are assigned
to particular classes” (Teacher Questionnaire, 2011)
METHODOLOGY
Population
Similar to NAEP sample selection
Respondents need to be directly related to the test results
Teacher questionnaires must match up with NAEP participants’
classrooms, schools, districts, etc.
NAEP participation is entirely voluntary
The teacher survey would also be voluntary
There is no way to accurately forecast who will participate
in the research or how they will represent the actual
population of the United States visual arts classroom
The NAEP visual arts exam covers only eighth grade students
The questionnaire would only be administered to the
corresponding eighth grade teachers of the visual arts program
METHODOLOGY
Procedures
The survey will closely follow NAEP testing to adhere to
procedural protocol
Teachers will be given a general background questionnaire
and a subject-area-specific questionnaire
Each consists of a series of select-response questions
Teachers will mark their answers in their booklet or record
answers online as accurately as possible
Once the survey is finished, the online answers will be
saved or the booklet can be given to the NAEP school
coordinator
Methodology – Descriptive/Quantitative
Used to look for trends and graph opinions, facts, and
demographic data
Used to make recommendations for classroom application
Could prove to be effective information for correlation tests
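The descriptive step above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the field names and response values below are invented for the sketch, not drawn from any actual NAEP instrument.

```python
# Hypothetical sketch of the descriptive analysis: tallying questionnaire
# responses to surface trends in teacher background data. All field names
# and values are invented for illustration.
from collections import Counter

# Each record stands in for one teacher's questionnaire responses.
responses = [
    {"certified": "yes", "degree": "BFA", "years_teaching": 12},
    {"certified": "yes", "degree": "BA Art Ed", "years_teaching": 4},
    {"certified": "no", "degree": "BA Art Ed", "years_teaching": 2},
    {"certified": "yes", "degree": "MFA", "years_teaching": 9},
]

# Descriptive summaries: frequency counts and a simple mean.
cert_counts = Counter(r["certified"] for r in responses)
degree_counts = Counter(r["degree"] for r in responses)
avg_experience = sum(r["years_teaching"] for r in responses) / len(responses)

print(cert_counts)     # Counter({'yes': 3, 'no': 1})
print(avg_experience)  # 6.75
```

Summaries like these could then feed the correlation tests mentioned above.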
INSTRUMENTATION
Development based on:
Other teacher questionnaires
Reading and writing teacher questionnaire. (2011).
National Assessment of Educational Progress.
Writing teacher questionnaire. (2010). National Assessment
of Educational Progress.
Teacher data in NAEP Data Explorer
Considered questionnaire development resources
NAEP 1997 national theatre results. (2002). National
Assessment of Educational Progress.
Gillham, B. (2000). Developing a questionnaire. New York,
NY: Continuum.
NAEP teacher questionnaire overviews
Teacher questionnaire. (2011). National Assessment of
Educational Progress.
DATA ANALYSIS
Analysis is best done by a professional statistician
Descriptive
Per advice for collaboration within quantitative
research (Brewer, 2013)
Analysis will show trends, demographic data, etc.
Correlation
Correlation testing to note potential relationships
between student results and teacher questionnaires
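The correlation step described above can be illustrated with a short sketch. The function below is a standard Pearson product-moment correlation; the teacher and classroom numbers are entirely invented and stand in for one questionnaire variable paired with classroom-level NAEP averages.

```python
# Hedged sketch of the correlation analysis: relating one hypothetical
# teacher-background variable (years of experience) to hypothetical mean
# student scores for that teacher's classroom.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

years_teaching = [2, 4, 9, 12, 15]            # invented questionnaire data
mean_class_score = [142, 150, 155, 158, 160]  # invented classroom averages

r = pearson_r(years_teaching, mean_class_score)
print(round(r, 3))  # a positive r flags a relationship, not a cause
```

Consistent with the limits stated for this study, a coefficient like this can only flag a relationship worth investigating; it cannot establish that the background variable caused the scores.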
RESULTS AND IMPLICATIONS
Speculative in nature
Descriptive and correlation research
Whatever results are reported, they will be limited to:
Making recommendations, not judgments
Seeing relationships, not causes
Add transparency to results
Show which subjects are highly promoted and which are often
neglected in classrooms
See what practices (i.e. writing, production, assessment,
presentation, critical analysis) are being done in classrooms
and which are not
Teachers' educational backgrounds, current practices, and
training in the field
With that information, we can compare the educators with the
“ideal practices” and see how their classrooms performed on
NAEP testing and determine possible explanations for success
or failure by looking for patterns.
RESULTS AND IMPLICATIONS
More background information means the results will likely be
more generalizable and reliable (Brewer, 2013)
NAEP results would follow this principle once teacher
background is added
Usefulness of generalizability
A step toward examining the school structure and culture that
Eisner calls for in order to make improvements in student
achievement (1999)
Could affect the qualifications for hiring and the design of
preparation programs if Hatfield is correct
If relationships appear between student success and certain
visual arts subjects, practices, and classroom structures,
individual classrooms may progress, with the possibility of
curriculum improvements
Rationale for teacher adjustments
The call for direct applications may finally be heard and
answered.
LIMITATIONS
Not having a board or a panel creating the survey
Solely developed by me
Based on what I want to know – no hidden agendas
No other perspectives or experiences
Based on my experience, or lack thereof
May cause assumptions based on what I think I know about
the issues (Gillham, 2000)
Quick development
No pre-pilot or pilot stage
This affects wording and understanding (Gillham, 2000)
Piloting is assumed to have been done for the questionnaires
this one is based on
The sample population is variable
REFERENCES
Brewer, T. (Forthcoming, 2013). A primer for today’s quantitative research in art education. In K. Miraglia & C. Similian
(Eds), Inquiry in Action: Research Methodologies in Art Education. Reston, VA: National Art Education Association.
Diket, R. M., & Brewer, T. M. (2011). NAEP and policy: Chasing the tail of the assessment tiger. Arts Education Policy
Review, 112(1), 35-47. Retrieved from
http://www.informaworld.com.ezproxy.lib.ucf.edu/openurl?genre=article&id=doi:10.1080/10632913.2011.518126
Eisner, E. W. (1999). The national assessment in the visual arts. Arts Education Policy Review, 100(6), 16-20. Retrieved
from
http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&A
N=EJ624037&site=ehost-live
Gillham, B. (2000). Developing a questionnaire. New York, NY: Continuum.
Hatfield, T. A. (2007). The unevenness of arts education policies. Arts Education Policy Review, 108(5), 9-13. Retrieved from
http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&A
N=EJ771257&site=ehost-live
Lehman, P. R. (1999). Introduction to the symposium on the "NAEP 1997 arts report card." Arts Education Policy Review,
100(6), 12-15. Retrieved from
http://ezproxy.net.ucf.edu/login?url=http://search.ebscohost.com.ezproxy.net.ucf.edu/login.aspx?direct=true&db=eric&A
N=EJ624036&site=ehost-live
Mathematics teacher questionnaire. (2013). National Assessment of Educational Progress. Retrieved from
http://nces.ed.gov/nationsreportcard/bgquest.asp
NAEP 1997 national theatre results. (2002). National Assessment of Educational Progress. Retrieved from
http://nces.ed.gov/nationsreportcard/tables/art1997/sdt02.asp
National Assessment of Educational Progress (NAEP). (2012). Questionnaires for Students, Teachers, and Schools.
Retrieved from http://nces.ed.gov/nationsreportcard/bgquest.asp
Reading and writing teacher questionnaire. (2011). National Assessment of Educational Progress. Retrieved from
http://nces.ed.gov/nationsreportcard/bgquest.asp
Teacher questionnaire. (2011). National Assessment of Educational Progress. Retrieved from
http://nces.ed.gov/nationsreportcard/tdw/instruments/noncog_teach.asp
Writing teacher questionnaire. (2010). National Assessment of Educational Progress. Retrieved from
http://nces.ed.gov/nationsreportcard/bgquest.asp