Distorted: this feeble report misses the boat on classroom behaviour


At an event at Parliament House earlier this year, I heard that 2024 is going to be the year of education. That is excellent news, given that we haven’t heard much about education from the Albanese government, although, to be honest, the quiet has been something of a blessed reprieve after the hyperventilation of the previous Morrison LNP government.

I have mixed feelings about what might be coming, but I wouldn’t if education policy were informed by evidence rather than politics. It isn’t. The impact of that politicisation is never openly acknowledged, and the policy decisions that are made (or not made) by governments are never the focus of inquiries or reviews. Instead, the “problem” is always framed around alleged deficiencies in students, parents, teachers, and/or universities.

Disagreement among panel members

Take, for example, the Senate Inquiry into the issue of increasing disruption in Australian classrooms. The interim report has just landed and, as with the final report of the Disability Royal Commission, there was disagreement among panel members. Labor and Greens senators have made additional comments acknowledging the complexity of behaviour in schools, and the Greens have only one recommendation: to fully fund public schools at the beginning of the next National School Reform Agreement in 2025.

I was called to give evidence at the Senate Inquiry. At the time, I expressed concern that the Inquiry based its case for ‘increasing disruption’ on PISA data, noting, first, that there are cultural and other differences between countries and, second, that there are problems with the rankings. I will have more to say about the report and its recommendations in time, but for now I want to take readers through points I made in the new first chapter of Inclusive Education for the 21st Century, which extend my comments from the evidence I gave to the Inquiry.

Since that hearing, I have looked more closely at the data on which these claims are based and I’m frankly astonished that the Inquiry team did not do this themselves. Even a cursory glance should have been enough to signal to the Senate that these rankings were not a rigorous enough premise on which to base an Inquiry. 

Let us wade through this numerical sewage together

The claim of ‘increasing disruption in Australian classrooms’ is based on the difference in results between two surveys of 15-year-olds who participated in the OECD’s Programme for International Student Assessment (PISA).

The first survey occurred in 2009 and the second in 2018. The disciplinary climate data is based on five survey items:  

1. Students don’t listen to what the teacher says.

2. There is noise and disorder.

3. The teacher has to wait a long time for students to quiet down.

4. Students cannot work well.

5. Students don’t start working for a long time after the lesson begins.

Here’s where things get interesting! Below are the relevant findings from the two reports.

PISA 2009

  • Participating countries were ranked on the percentage of 15-year-old students who selected ‘never or hardly ever’ and ‘in some lessons’ for Item 1 (‘Students don’t listen to what the teacher says’) and Item 3 (‘The teacher has to wait a long time for students to quiet down’).
  • Australia was ranked 28th for the first item and 25th for the second.
  • Differences between PISA 2000 and PISA 2009 were calculated.
  • Australia was deemed to have an average disciplinary climate that had not significantly changed between the two timepoints.

PISA 2018

  • 79 countries participated and 76 were ranked. This time, the OECD developed a disciplinary climate index that encompasses all five items, with some minor changes in wording.
  • Countries were ranked using their respective Index scores.
  • Australia was ranked 69th.
  • Differences between PISA 2009 and PISA 2018 were calculated.
  • There was a significant difference between timepoints in the responses of Australian students for only two of the five items: Item 3 (‘The teacher has to wait a long time for students to quiet down’) and Item 4 (‘Students cannot work well’).
  • Item 5 also declined (-1.8%) but not significantly, while Items 1 and 2 improved (both +0.8%), again not significantly.

What does all this mean?

First, Australia has not fallen from 28th or 25th in the rankings to 69th. Rather, the number of participating countries has changed over time, and so have the rankings. To be clear, the number of participating countries has grown from 43 (2000) to 65 (2009) to 79 (2018) and, because comparisons can only be made between countries that participated in each assessment, the number of countries in the rankings has changed from 38 in 2009 to 76 in 2018. This is not to dispute that Australia is ranked lower than anyone would like, but there are problems with the rankings that render them meaningless.
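To see that first point concretely, here is a minimal sketch in Python using made-up index scores (the numbers are hypothetical, not PISA data): a country whose score does not move at all still slides down the table when new countries enter the comparison.

```python
# Illustrative only: made-up disciplinary climate index scores, not PISA data.
# A country's rank can worsen simply because more countries are ranked,
# even though its own score is unchanged.

def rank_of(score: float, other_scores: list[float]) -> int:
    """1-based rank, where 1 = highest (best) index score."""
    return 1 + sum(s > score for s in other_scores)

australia = 0.20  # hypothetical score, held constant across both cycles

cohort_a = [0.9, 0.5, 0.3, -0.1]             # four other ranked countries
cohort_b = cohort_a + [0.8, 0.6, 0.4, 0.25]  # same four plus four new entrants

print(rank_of(australia, cohort_a))  # 4 (4th of 5 ranked countries)
print(rank_of(australia, cohort_b))  # 8 (8th of 9 -- same score, lower rank)
```

The score is identical in both cases; only the size and composition of the field has changed.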

Here’s why

1) The types of countries participating in PISA 2009 and PISA 2018 substantively changed due to the entrance of Asian countries. Unlike Australia, these jurisdictions/systems are grounded in Confucian culture, which has a profound effect on teacher-student relationships, classroom interactions, and climate.

2) There was a significant difference between timepoints in the responses of Australian students for only two of the five items. The case for increasing disruption in Australian classrooms therefore rests on a 3.7% decrease in the number of students saying their teacher ‘never or hardly ever’ has to wait a long time for students to quiet down, and a 2.8% decrease in the number saying students cannot work well ‘never or hardly ever’. There was no difference in students’ responses between PISA 2000 and PISA 2009, which suggests that there has been no change in more than 20 years for at least two of the five items.

3) Countries with almost identical disciplinary climate Index scores are ranked above and below each other. For example, Australia and Belgium received Index scores of 0.20 and 0.21 respectively, yet Australia is ranked 69th and Belgium 70th. There is a snowball’s chance in hell that these scores are statistically different to each other, so why is one being ranked above the other? Doing this simply expands the number of places in the ranking, which makes the distance between countries look larger than it really is (see the sketch after this list).

4) No tests of significance between countries or ranks were conducted, so we do not know whether there is a statistically significant difference between Australian students’ responses and the OECD average, or how much of a difference there is between Australia and the countries at the top of the ranking. Similar points have been made numerous times over the years in relation to the rankings for student achievement in reading, mathematics, and science but, at least in those cases, countries with statistically indistinguishable performances are grouped together and given the same rank.

5) Recent research by Sally Larsen from the University of New England indicates no decline in the TIMSS, PIRLS or NAPLAN results of Australian students. Any observed correlations between declines in PISA’s disciplinary climate survey and student academic outcomes should not be interpreted causally.
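Points 3 and 4 can be illustrated the same way. The sketch below uses hypothetical scores and a crude ‘equivalence margin’ as a stand-in for a proper significance test based on standard errors: strict ranking hands near-identical countries separate places, while grouping statistically indistinguishable countries, as the achievement rankings do, gives them a shared rank.

```python
# Illustrative only: hypothetical Index scores. MARGIN is a crude stand-in
# for a proper test of statistical significance using standard errors.
scores = {"A": 0.45, "B": 0.21, "C": 0.20, "D": 0.19, "E": -0.30}
MARGIN = 0.05  # differences smaller than this are treated as noise

ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Strict ranking: B (0.21), C (0.20) and D (0.19) each get their own place.
strict = {name: i + 1 for i, (name, _) in enumerate(ordered)}

# Grouped ranking: a new rank starts only when the gap to the current
# group's leading score exceeds the margin.
grouped, leader, rank = {}, None, 0
for position, (name, score) in enumerate(ordered, start=1):
    if leader is None or leader - score > MARGIN:
        rank, leader = position, score
    grouped[name] = rank

print(strict)   # {'A': 1, 'B': 2, 'C': 3, 'D': 4, 'E': 5}
print(grouped)  # {'A': 1, 'B': 2, 'C': 2, 'D': 2, 'E': 5}
```

Ranked strictly, B, C and D occupy three separate places; grouped, they share one, and the apparent distance between countries shrinks accordingly.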

My view

If politicians are going to look at rankings, then they should look at them all. Let’s consider, for example, that:

1. Australia sits at the top of the ranked countries in terms of the hours that teachers spend in face-to-face teaching.

2. Australian teachers spend more hours teaching than the OECD average (838.28 hours/year versus 800.45 hours/year).

3. Korea is ranked first in classroom disciplinary climate and Australia is ranked 69th. However, Australian teachers spend 321.30 more hours per year in face-to-face teaching than their Korean counterparts, who teach just 516.98 hours/year (see the quick check after this list).

4. In disciplinary climate, the difference between advantaged students and disadvantaged students in Australia (0.34) is double that of Korea (0.17).
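For what it’s worth, the arithmetic behind points 2 to 4 can be checked in a few lines, using only the figures quoted above:

```python
# Quick check of the figures quoted above (hours of face-to-face
# teaching per year, as cited in this piece).
australia, oecd_average, korea = 838.28, 800.45, 516.98

print(f"{australia - oecd_average:.2f}")  # 37.83 hours above the OECD average
print(f"{australia - korea:.2f}")         # 321.30 more hours than in Korea

# Disciplinary climate gap between advantaged and disadvantaged students:
print(f"{0.34 / 0.17:.1f}x")              # 2.0x -- double Korea's gap
```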

These are just some of the gaps and anomalies that arise when the PISA data are subjected to close reading, which is the absolute minimum amount of analysis that should have been conducted, if not prior to, then at least during, an Inquiry that used these data for its rationale.

The questions education ministers must ask

Readers of the Interim Report, especially Education Ministers, should regard it very critically and start asking serious questions:

  • Who stands to benefit from such simple representations of these data?
  • Might there be financial benefits for non-university providers from the ‘deregulation’ of initial teacher education?
  • Are there other data that have been ignored and, if so, what does their omission suggest about rigour and bias?
  • Might Australian students tell a different story if asked by expert researchers using both open-ended and closed-ended questions?

Are we brave enough to ask them?

Linda Graham is professor and director of The Centre for Inclusive Education at Queensland University of Technology (QUT). She has led multiple externally funded research projects and has published more than 100 books, chapters and articles. Her international bestseller, Inclusive Education for the 21st Century: Theory, Policy and Practice, is now in its second edition. In 2020, Linda chaired the Inquiry into Suspension, Exclusion and Expulsion processes in South Australian government schools. She also gave evidence to the Royal Commission into Violence, Abuse, Neglect and Exploitation of People with Disability on the use of exclusionary school discipline and its effects.



