NAPLAN is a valuable tool for teachers of Mathematics.
Dr Judy Hartnett
Making Maths Reason-able.
This week Australian students will sit the annual National Assessment Program – Literacy and Numeracy (NAPLAN) tests. Many researchers, teachers, school administrators, parents and politicians hold differing opinions about the value of these tests. Schools publish the data on their websites, and overall results are reported on the One School website, allowing interested parties to view a school's data and compare it with other schools, their state as a whole, and the country as a whole. The data is highly visible, which adds to the pressure schools feel to prepare students for this assessment.
Schools today are encouraged to be data focussed and data driven. Schools seem to struggle to source rich Mathematics data, and although the curriculum comprises more than content, it tends to be tests that provide the data schools use to track the progress of their students. Teacher professional judgements inevitably contain subjectivity, based on interpretation of curriculum documents and personal experience. The NAPLAN tests provide a source of data that is comparable across schools and worthy of deeper analysis.
The NAPLAN class reports provide teachers and schools with a rich source of information that can be used to guide teaching. These reports show the responses of every student in the class, item by item. They list correct responses as well as the responses chosen or given by each student who answered incorrectly. Teachers and school administrators can look at common errors and hypothesise about the misconceptions and likely causes behind them, rather than just the raw scores, percentages correct, or the bands in which students are placed.
NAPLAN questions are written with an understanding of the possible errors students make embedded within them. An example from the 2016 test shows how the data can be analysed at the question level.
This question was a link item to the Year 5 test (Year 3 Question 23 and Year 5 Question 16).
The question involves two calculations: it is a multi-step problem. Students need to find the total cost of the items and then work out the change from $30. The four options provided are all viable in the context; they represent likely misconceptions and errors as well as the correct response ($6). Analysed at the item level, this gives teachers and schools a rich source of data that can guide further teaching.

Students who chose $24 correctly added the prices of the items but did not complete the final step of working out the change from $30; 29.3% of Queensland Year 3 students and 18.9% of Queensland Year 5 students chose this option. These students, while incorrect, have shown they can add amounts of money presented as dollars and cents. Students who chose $54 also correctly added the prices of the items but then added the total to the $30 instead of subtracting it; 7.4% of Queensland Year 3 students and 1.9% of Queensland Year 5 students chose this option. The remaining option, $7, could have been chosen by students who miscalculated the total or added only the dollar amounts; 25% of Queensland Year 3 students and 13.6% of Queensland Year 5 students chose it. Overall, 37.2% of Queensland Year 3 students and 65.3% of Queensland Year 5 students answered correctly.
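The arithmetic behind each option can be laid out explicitly. The following is a minimal sketch only: the individual item prices are not reproduced here, so it assumes nothing beyond the $24 total and the $30 tendered mentioned above, and maps each distractor to the error described.

```python
# Sketch of the arithmetic behind each answer option for this item.
# Assumption: only the $24 total and $30 tendered are known; individual
# item prices are not reproduced in the article.

total_cost = 24  # sum of the item prices, in dollars
tendered = 30    # amount the change is calculated from

correct = tendered - total_cost              # found the total, then the change
stopped_early = total_cost                   # found the total but skipped the final step
added_not_subtracted = tendered + total_cost # added the total to $30 instead of subtracting

print(correct, stopped_early, added_not_subtracted)  # 6 24 54
```

Laying the options out this way makes the point of the class reports concrete: each wrong answer corresponds to a specific, teachable error rather than random noise.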
There will always be some aspect of the data on any test that is simply incorrect. Some students who do not understand the mathematics in a particular question will guess correctly. Highly able students will rush, or assume they understand what a question is asking, and choose the wrong option. While we always want students to do as well as they possibly can, students who do not understand the mathematics involved in a question really should be getting it wrong, because those results help direct teachers to the concepts in need of instruction. The focus on continual improvement year after year is well meaning but misguided. NAPLAN is one source of data among the plethora of data available in schools. I consider it a particularly rich source and encourage teachers and schools to look a little deeper into it.