Assessing what we value, not valuing what we assess
Monday, Dec 9, 2019, 12:49 AM | Source: Pursuit
By Sandra Milligan
Standardised tests are once again telling us that all is not well in schooling. But not all is well with standardised testing either.
The Programme for International Student Assessment (PISA), a worldwide study by the Organisation for Economic Co-operation and Development (OECD) measuring 15-year-old school students' performance in mathematics, science and reading, finds Australia falling behind.
But the report itself is subject to scathing criticism by academics and other experts.
Australia's own NAPLAN (the National Assessment Program – Literacy and Numeracy), which shows our literacy and numeracy results flatlining, has been controversial since its introduction in 2008; it has been reviewed several times and is now being officially reviewed again.
These reviews and controversies often miss a central point. The problem is not in the tests but in the strategy to which the tests belong.
The underlying strategy was never very good. It has clearly failed even in its own narrow terms and is now, in significant part, obsolete as well. Among the many substantial reforms now needed in schooling is a root-and-branch reform of our assessment regime.
Standardised testing arrived in Australia from the US in the 1980s and 1990s.
I was Director of Curriculum in a state education department at the time and remember the salutary shock delivered to a complacent schooling system.
In crucial areas we were not doing as well as we'd assumed, for Aboriginal, regional, migrant and low socio-economic communities particularly, but for others as well.
A plethora of improvement programs appeared, in teacher training, school leadership, and school and professional development. Gains were made.
But, over time, productive innovation turned into an unproductive narrowing of attention and reform, culminating in Julia Gillard's 2012 announcement of a single goal for the system: that Australia must be in the top five by 2025, where 'top' meant performance in selected aspects of the formal curriculum, as measured by the PISA standardised tests.
That narrow focus continues to dominate.
The annual and triennial reveals of NAPLAN and PISA results generate endless media stories and commentator reviews.
The entrails of the tests are examined in search of answers that do not exist, or at least are not to be found there. Politicians claim success or failure on the slimmest of pretexts.
NAPLAN is particularly toxic because schools and staff can be lauded, compared, berated and rated according to their scores, and every child takes home 'the black dot' report on how they performed on testing day.
The outcomes measured by NAPLAN and PISA are, as is invariably asserted, fundamental.
But these outcomes - and indeed outcomes of any kind - are a very long way from being the only fundamental things in schooling.
I am thinking here not just of the large areas of the formal curriculum that are not up in lights alongside 'the basics', and not just of the so-called 21st-century skills such as collaboration and problem-solving.
I am thinking also of what used to be called 'the hidden curriculum': the relationship between students and their schools, the day-to-day experience of students, and the impact of schools on the wider society.
These aspects of schooling are not only fundamental in themselves. They interact with each other and with the things measured in PISA and NAPLAN.
Trying to fix one or two aspects of schooling without understanding the others is a recipe for failure which Australia has faithfully followed. We need a measurement regime that flows from and supports a strategy as broad as schooling's objectives and effects.
What might such a regime cover, to lift our gaze?
Well firstly, an updated definition of 'literacy'. Reading and writing are as fundamental as ever, but the fourth industrial revolution demands quite new kinds of digital and communication capabilities, sometimes referred to as 'Literacy 4.0'.
Then there are the so-called 'general capabilities', such as critical thinking, social and personal development and intercultural understanding, which are recognised in the Australian Curriculum as key to supporting young Australians to become successful learners, confident and creative individuals, active and informed citizens, and masters of deep learning in disciplinary and other content domains.
'Engagement' is by any definition fundamental, and not just because it is a prerequisite to learning in the formal curriculum.
Schooling is a preparation for the future, but it also comprises around one fifth of many working lives. How many students, in which kinds of schools, look forward to going to work every day because their passions and interests are engaged?
Social formation and outlook: what do students learn about their own and others' place in the social world? How do boys see girls, and vice versa? How do students in faith-based schools see those in secular schools, and vice versa? Muslims and Jews? Rich and poor? Urban and rural, locally born and migrant, Indigenous and other Australians?
This is sensitive terrain, but it is a central part of what schools do whether we like it or not, and we know very little about how well or badly our system is doing it.
And finally, what relationships are formed in the school years? Are they being formed within social groups or between? Do they persist beyond school? Are schools contributing to social cohesion, or eroding it?
Computer-enabled network analysis makes it possible to understand how schools are performing in one of their most basic tasks.
No doubt others would propose a different regime. And no doubt some difficult questions arise, including how to avoid over-testing and the negative effects of published results of the kind we have seen with NAPLAN.
The OECD has work underway to expand the range of outcomes assessed in PISA. The latest review of NAPLAN promises that nothing is off the table. All this is to be applauded.
It is a good time to conduct a national discussion about assessment strategy as extensive as schooling's purposes and effects, not as narrow as a few standardised tests.
We need to assess what we value, not just value what we assess.