Australia has its first framework for AI use in schools – but we need to proceed with caution

  • Written by Lucinda McKnight, Senior Lecturer in Pedagogy and Curriculum, Deakin University
Federal and state governments have just released a national framework[1] for generative AI in schools. This paves the way for generative AI[2] – algorithms that can create new content – to be used routinely in classrooms around the country.

This provides much-needed guidance, a full year after the launch of ChatGPT. Over the past 12 months, schools have had a range of responses[3] to the technology, from outright bans to attempts to incorporate it into learning.

What is in the framework and what is missing?

Read more: High school students are using a ChatGPT-style app in an Australia-first trial[4]

What is the framework?

The framework was agreed by state and federal education ministers in October and released publicly last week.

It is designed to help schools use generative AI “in a safe and effective way”. It notes the technology has “great potential to assist teaching and learning and reduce administrative workload in Australian schools”. But at the same time it warns of risks and consequences, including:

the potential for errors and algorithmic bias in generative AI content; the misuse of personal or confidential information; and the use of generative AI for inappropriate purposes, such as to discriminate against individuals or groups, or to undermine the integrity of student assessments.

Federal Education Minister Jason Clare also stressed “schools should not use generative AI products that sell student data”.

What is in the framework?

The framework itself is just two pages long, and includes six overarching principles and 25 “guiding statements”. The six principles are:

  • teaching and learning, including schools teaching students how these tools work, as well as their potential limitations and biases

  • human and social wellbeing, including using tools in a way that avoids reinforcing biases

  • transparency, including disclosing when tools are used and their impact

  • fairness, including access for people from diverse and disadvantaged backgrounds

  • accountability, including schools testing tools before they use them, and

  • privacy, security and safety, including the use of “robust” cyber-security measures.

The framework will be reviewed every 12 months.

Read more: AI is now accessible to everyone: 3 things parents should teach their kids[5]

Caution is needed

The framework does important work acknowledging opportunities of this technology, while noting the importance of wellbeing, privacy, security and safety.

However, some of these concepts are much less straightforward than the framework suggests. As experts in generative AI in education, we have moved from optimism to a much more cautious stance about this technology over the past 12 months. As UNESCO has recently warned[6],

the speed at which generative AI technologies are being integrated into education systems in the absence of checks, rules or regulations, is astonishing.

The framework places an extraordinary onus on schools and teachers to do high-stakes work for which they may not be qualified, and which they may not have the time or funding to complete.

For example, the framework calls for “explainability” – but even the developers of AI models struggle to fully explain[7] how they work.

The framework also calls on schools to do risk assessments of algorithms[8], design appropriate learning experiences, revise assessments, consult with communities, learn about and apply intellectual property rights and copyright law and generally become expert in the use of generative AI.

It is not clear how this can possibly be achieved within existing workloads, which we know are already stretched[9]. This is particularly so when the nature[10] and ethics of generative AI are complex and contested[11]. We also know the technology is not foolproof – it makes mistakes[12].

Here are five areas we think need to be included in any further version of this framework.

1. A more honest stance on generative AI

We need to be clear that generative AI is biased. This is because it reflects the biases of its training materials[13], including what is published on the internet.

Such limited datasets are created largely by those who are white, male and United States or Western-based[14].

For example, a current version of ChatGPT does not speak in or use Australian First Nations words. There may be valid reasons for this, such as not using cultural knowledges without permission. But this indicates the whiteness of its “voice” and the problems inherent in requiring students to use or rely on it.

2. More evidence

The use of technology does not automatically improve teaching and learning.

So far, there is little research demonstrating the benefits of generative AI use in education. In fact, a recent UNESCO report[15] confirmed there is little evidence of any improvement to learning from the use of digital technology in classrooms over decades.

But we do have research showing the harms of algorithms. For example, AI-driven feedback[16] narrows the kinds of writing students produce and privileges white voices.

Schools need support to develop processes and procedures to monitor and evaluate the use of generative AI by both staff and students.

3. Acknowledging dangers around bots

There is long-standing research[17] demonstrating the dangers of chatbots and their capacity to harm human creativity and critical thinking. This happens because humans seem to automatically trust bots and their outputs.

The framework should clarify which tasks are and are not suitable for generative AI, for both students and teachers. Low-stakes tasks may be appropriate; high-stakes marking, for example, should be completed by humans.

4. Transparency

So far, the framework seems to focus on students and their activities.

All use of generative AI in schools needs to be disclosed. This should include teachers using generative AI to prepare teaching materials and plan lessons.

5. Acknowledging teachers’ expertise

The global education technology (“edtech”) market was estimated to be worth about US$300 billion[18] (A$450 billion) as of 2022. Some companies argue[19] edtech can be used to monitor students’ progress and take over roles traditionally done by teachers.

Australia’s national education policies need to ensure teachers’ roles are not downgraded as AI use becomes more common. Teachers are experts in more than just subject matter. They are experts in how to teach various disciplines and in their students’ and communities’ needs.

Read more: The rise of ChatGPT shows why we need a clearer approach to technology in schools[20]

References

  1. ^ national framework (www.education.gov.au)
  2. ^ generative AI (www.mckinsey.com)
  3. ^ range of responses (theconversation.com)
  4. ^ High school students are using a ChatGPT-style app in an Australia-first trial (theconversation.com)
  5. ^ AI is now accessible to everyone: 3 things parents should teach their kids (theconversation.com)
  6. ^ recently warned (unesdoc.unesco.org)
  7. ^ fully explain (www.nature.com)
  8. ^ risk assessments of algorithms (www.businessofgovernment.org)
  9. ^ already stretched (www.abc.net.au)
  10. ^ nature (bera-journals.onlinelibrary.wiley.com)
  11. ^ contested (cmci.colorado.edu)
  12. ^ makes mistakes (www.ibm.com)
  13. ^ biases of its training materials (dl.acm.org)
  14. ^ white, male and United States or Western-based (www.bloomberg.com)
  15. ^ a recent UNESCO report (www.unesco.org)
  16. ^ AI-driven feedback (www.tandfonline.com)
  17. ^ long-standing research (dl.acm.org)
  18. ^ US$300 billion (www.holoniq.com)
  19. ^ argue (www.theaustralian.com.au)
  20. ^ The rise of ChatGPT shows why we need a clearer approach to technology in schools (theconversation.com)

Read more https://theconversation.com/australia-has-its-first-framework-for-ai-use-in-schools-but-we-need-to-proceed-with-caution-219094
