Assignment 1 rtl2
Another article that argues that NAPLAN is not an accurate assessment
program for literacy and numeracy in Australian schools is Thompson et al. (2018). The
arguments raised in the article revolve around the validity of the data, which, according to the
American Educational Research Association, American Psychological Association and National
Council on Measurement in Education (as cited in Thompson et al., 2018), is in statistical terms
the most ‘fundamental consideration in developing an evaluation of tests’ (p. 762). The reason
the validity of the data that NAPLAN offers is questionable is tied to participation rates. The
argument is succinctly put as follows: participation can affect the validity of comparative
claims even when participation rates are around 95% (Thompson et al., 2018, p. 766). In
support of their argument, Thompson et al. (2018) cite Margaret Wu (p. 766), who also
questions the reliability of NAPLAN scores because most Australian schools have low
participation numbers. She argues that most Australian schools have fewer than 100 students
per year level, and that a significant number have fewer than 30 students per year level.
Again, this argument that NAPLAN is a poor assessment regime for statistical reasons, while
valid, should not discredit NAPLAN as an effective assessment program. Although the number
of participants at a school may be low, the information collated does not need to be
statistically significant to allow comparisons of the strengths and weaknesses of students’
results. ACARA’s purpose is to provide information to drive improvement, and schools’ results
are valuable for aligning pedagogical adjustments that assist in the improvement of literacy
and numeracy.
The same theme, questioning the validity of the data, appears in another article by
Johnston (2017). Johnston (2017) conducted a research project on seven independent schools
in NSW, analysing NAPLAN data to see whether schools can enhance their literacy results. The
research project found that using NAPLAN data at the whole-school level and across year
cohorts was difficult because the data were rendered statistically invalid (p. 21). It seems that,
again, statistical considerations are at the forefront of judging the credibility of NAPLAN and
rendering it flawed. Johnston (2017) goes on to say that in its current form NAPLAN ignores
student differences in ability, skill, disability and social class (p. 25). While this might sound
convincing, what is missed in all of this is that NAPLAN does not seek to test for differentiation
or inclusion; rather, it is a snapshot of where schools are at, in comparison with other like
schools, in literacy and numeracy skills.
To conclude this review of the contemporary literature on how effective NAPLAN is
in assessing students in Australian secondary schools: there is an overemphasis on meeting
conditions relating to statistical norms and less emphasis on the overall snapshot that
NAPLAN’s literacy and numeracy measures provide. The overwhelming criticism seems valid
from a statistical point of view, but NAPLAN test results need to be looked at as comparing
schools, not individual student performance. Hence, I conclude that NAPLAN is a good and
effective assessment program in Australian secondary schools when comparing the outcomes
of one school against other like schools. This is supported by the Senate Standing Committee
on Education, Employment and Workplace Relations (as cited in Johnston, 2017), which found
that there were some unintended consequences of NAPLAN in its current form but did not
cite statistical reasons.
References
Mayes, E., & Howell, A. (2018). The (hidden) injuries of NAPLAN: two standardised test events
and the making of ‘at risk’ student subjects. International Journal of Inclusive Education,
22(10), 1108-1123. DOI: 10.1080/13603116.2017.1415383
Rose, J., Low-Choy, S., Singh, P., & Vasco, D. (2018). NAPLAN discourses: a systematic review
after the first decade. Discourse: Studies in the Cultural Politics of Education. DOI:
10.1080/01596306.2018.1557111
Thompson, G., Adie, L., & Klenowski, V. (2018). Validity and participation: implications for
school comparison of Australia’s National Assessment Program. Journal of Education
Policy, 33(6), 759-777. DOI: 10.1080/02680939.2017.1373407
Action Research Protocol
The stakeholders who will be interviewed include a principal, the head of the senior
school, three heads of department and five other teachers from the English and Mathematics
departments. The overwhelming theme from the literature review was that NAPLAN is not
an effective assessment program, with most of the articles reviewed citing the validity of the
data collected. I am expecting some replies describing the data captured by ACARA for the
MySchool website as unreliable. However, to avoid steering the participants towards giving
replies about data validity, there will be no direct mention of data validity in the
questions.
A participation consent form will be provided for participants to sign before they engage
with the interview questions, along with an information sheet about the background of the
research undertaken and the purpose of the interview. They will have the option to consent,
and there is no obligation to answer all the questions during the interview, so they may
decline to answer some questions. There is an ethical dimension to the consent form and the
information sheet: for any questions that might put participants in harm’s way, for example
by potentially costing them a job or causing unnecessary stress when commenting on
NAPLAN and their school, they will have the option not to answer, to avoid any such scenario.
The information sheet accompanying the consent form will also note that all
participants and the school will be de-identified when reporting on the results of the interviews,
in the interest of keeping them anonymous and out of respect for any privacy concerns that
might be raised.
The interviews will take place at the school and will be recorded in writing, with no
audio recordings. They will only occur after participants have signed the consent form and
read the information sheet. The interview will consist of six questions: an opening question
on whether they use NAPLAN data through the MySchool website, some probing questions
about the effectiveness of NAPLAN, a direct question about the intended misuse of NAPLAN,
some follow-up questions, and final questions targeted at their opinion of NAPLAN as an
assessment program compared to other testing programs. The aim is to draw out knowledge
of NAPLAN and its effectiveness as an assessment program directly from stakeholders at the
Western Sydney Secondary School. This will be done by tabling the results into groups:
NAPLAN as an Effective Assessment Program, NAPLAN as an Ineffective Assessment Program,
and Miscellaneous feedback, grouped as either positive or negative in relation to NAPLAN
as an assessment program.
The questions are listed below:
Have you used the MySchool website to monitor the improvement of students in literacy and
numeracy?
Can you tell me how effective NAPLAN is as an assessment program for literacy
and numeracy in secondary Australian schools?
Have you ever given instructions, or followed instructions, to teach students for the NAPLAN
test?
Would you say that there are unintended consequences that you know of arising from the
NAPLAN results, such as political or financial consequences?
Are you able to provide more details about the unintended consequences?