The Times Australia
Australia will impose a ‘digital duty of care’ on tech companies to reduce online harm. It’s a good idea – if it can be enforced

  • Written by Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

In an escalation of its battle with big tech, the federal government has announced[1] it plans to impose a “digital duty of care” on tech companies to reduce online harms.

The announcement follows the government’s controversial plans to legislate a social media ban[2] for young people under 16 and impose tighter rules[3] on digital platforms such as Google, Facebook, Instagram, X and TikTok to address misinformation and disinformation.

In a speech last night, Minister for Communications Michelle Rowland explained why the government was planning to introduce a digital duty of care:

What’s required is a shift away from reacting to harms by relying on content regulation alone, and moving towards systems-based prevention, accompanied by a broadening of our perspective of what online harms are.

This is a positive step forward and one aligned with other jurisdictions around the world.

What is a ‘digital duty of care’?

Duty of care is a legal obligation to ensure the safety of others. It isn’t limited to simply avoiding harm; it also means taking reasonable steps to prevent it.

The proposed digital duty of care will put the onus on tech companies such as Meta, Google and X to protect consumers from harm on their online platforms. It will bring social media platforms into line with companies that make physical products, which already have a duty of care to ensure their products don’t harm users.

The digital duty of care will require tech companies to regularly conduct risk assessments to proactively identify harmful content.

This assessment must consider what Rowland called “enduring categories of harm”, which will also be legislated. Rowland said these categories could include:

  • harms to young people
  • harms to mental wellbeing
  • the instruction and promotion of harmful practices
  • other illegal content, conduct and activity.

This approach was recommended by the recent review of the Online Safety Act[4]. Similar duties are already in effect elsewhere around the world, including in the United Kingdom under its Online Safety Act[5] and in the European Union under the Digital Services Act[6].

As well as placing the onus on tech companies to protect users of their platforms, these acts also put the power to combat harmful content into the hands of consumers.

For example, in the EU consumers can submit online complaints[7] about harmful material directly to the tech companies, which are legally obliged to act on these complaints. Where a tech company refuses to remove content, users can complain to a Digital Services Coordinator, who can investigate further. Consumers can even pursue a court resolution if a satisfactory outcome cannot be reached.

The EU act sets out that if tech companies breach their duty of care to consumers, they can face fines of up to 6% of their worldwide annual turnover[8].

The Human Rights Law Centre in Australia supports the idea of a digital duty of care. It says[9] “digital platforms should owe a legislated duty of care to all users”.

Photo of Facebook homepage on a monitor screen through a magnifying glass.
The government’s proposal will put the onus on tech companies such as Facebook to proactively remove harmful material online. Gil C/Shutterstock[10]

Why is it more appropriate than a social media ban?

Several experts[11] – including myself[12] – have pointed out problems with the government’s plan to ban people under 16 from social media.

For example, the “one size fits all” age requirement doesn’t consider the different levels of maturity of young people. What’s more, banning young people from social media merely delays their exposure to harmful content online. It also removes the ability of parents and teachers to engage with children on the platforms and to help them manage potential harms safely.

The government’s proposed “digital duty of care” would address these concerns.

It promises to force tech companies to make the online world safer by removing harmful content, such as images or videos which promote self-harm. It promises to do this without banning young people’s access to potentially beneficial material or online social communities.

A digital duty of care also has the potential to address the problem of misinformation and disinformation.

The fact Australia would be following the lead of international jurisdictions is also significant. This shows big tech there is a unified global push to combat harmful content appearing on platforms by placing the onus of care on the companies instead of on users.

When multiple countries impose similar controls and have similar content expectations, tech companies are far more likely to comply with the legislation.

Woman in black suit speaking in front of white building. On Wednesday night Minister for Communications Michelle Rowland announced the government plans to impose a digital duty of care on tech companies. Mick Tsikas/AAP[13]

How will it be enforced?

The Australian government says it will strongly enforce the digital duty of care. As Minister Rowland said last night:

Where platforms seriously breach their duty of care – where there are systemic failures – we will ensure the regulator can draw on strong penalty arrangements.

Exactly what these penalty arrangements will be is yet to be announced. So too is the method by which people could submit complaints to the regulator about harmful content they have seen online and want to be taken down.

A number of concerns about implementation have been raised in the UK[14]. This demonstrates that getting the details right will be crucial to success in Australia and elsewhere. For example, defining what constitutes harm will be an ongoing challenge and may require test cases to emerge through complaints and/or court proceedings.

And as both the EU and UK introduced this legislation only within the past year, the full impact of these laws – including tech companies’ levels of compliance – is not yet known.

In the end, the government’s turn towards placing the onus on tech companies to remove harmful content at the source is welcome. It will make social media platforms safer for everyone – young and old alike.

References

  1. ^ the federal government has announced (minister.infrastructure.gov.au)
  2. ^ social media ban (theconversation.com)
  3. ^ impose tighter rules (theconversation.com)
  4. ^ review of the Online Safety Act (minister.infrastructure.gov.au)
  5. ^ Online Safety Act (www.legislation.gov.uk)
  6. ^ Digital Services Act (commission.europa.eu)
  7. ^ submit online complaints (hateaid.org)
  8. ^ face fines of up to 6% of their worldwide annual turnover (digital-strategy.ec.europa.eu)
  9. ^ It says (www.hrlc.org.au)
  10. ^ Gil C/Shutterstock (www.shutterstock.com)
  11. ^ Several experts (theconversation.com)
  12. ^ including myself (theconversation.com)
  13. ^ Mick Tsikas/AAP (photos.aap.com.au)
  14. ^ been raised in the UK (theconversation.com)

Read more https://theconversation.com/australia-will-impose-a-digital-duty-of-care-on-tech-companies-to-reduce-online-harm-its-a-good-idea-if-it-can-be-enforced-243682
