We have developed a way to screen student feedback to ensure it's useful, not abusive (and academics don't have to burn it)

  • Written by Abby Cathcart, Professor of Higher Education & Governance, Queensland University of Technology

This week, many Australian universities will be sending academics the results of the first semester student evaluation surveys.

For some this will be a worrying and unpleasant time. The comments university students make anonymously in their teaching evaluations can leave academics feeling fearful[1], distressed[2] and demoralised[3].

And with good reason. As a 2021 survey[4] of Australian academics and their experiences of student feedback found:

Personally destructive, defamatory, abusive and hurtful comments were commonly reported.

Hurtful or abusive comments can remain permanently on record as a measure of performance[5]. These records can affect applications for promotion or for secure continued employment.

The authors of the 2021 survey, led by Richard Lakeman at Southern Cross University, have been among those calling for[6] anonymous online surveys to be scrapped. Some academics, burned by their experience of student feedback, say they no longer open or engage[7] with student evaluation reports, arguing the risk of harm outweighs any benefits.

Read more: 'Lose some weight', 'stupid old hag': universities should no longer ask students for anonymous feedback on their teachers[8]

In the Netflix show The Chair, a memorable scene sees Professor Joan Hambling burn[9] her student evaluations. Clearly, a different solution is needed.

Feedback from students can still be valuable for lifting teaching standards and it’s important students have their say.

We have developed a screening system[10] using machine learning[11] (where software changes its behaviour by “learning” from user input) that allows students to talk about their experiences while protecting academics from unacceptable comments.

Why a new approach is needed

University codes of conduct remind students of their general obligation to refrain from abusive or discriminatory behaviour, but not specifically in regard to student evaluations.

Instead, universities rely on self-regulation or on others to report incidents. Some institutions use profanity blockers to screen comments. Even then, these often fail to detect emerging terms of abuse in online speech.

So, in setting up our screening system, we wanted to:

  • promote staff and student well-being
  • enhance the reliability and validity of student feedback
  • improve confidence in the integrity of survey results.

We developed a method using machine learning and a dictionary of terms to screen for unacceptable student comments. The dictionary was created by QUT, drawing on historically identified unacceptable comments and incorporating prior research into abusive and discriminatory terms.
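By way of illustration only (QUT's actual dictionary, training data and model are not reproduced here), a dictionary-plus-classifier screen of the kind described above might be sketched in Python as follows. The terms, example comments and threshold are placeholders, not the authors' system.

```python
# Illustrative sketch only: a two-stage screen combining a term dictionary
# with a trained text classifier. All terms, examples and the threshold
# below are hypothetical placeholders, not QUT's actual model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

ABUSE_TERMS = {"stupid old hag", "lose some weight"}  # hypothetical dictionary entries

# Hypothetical labelled examples (1 = unacceptable, 0 = acceptable).
train_comments = ["useless lecturer, stupid old hag",
                  "clear slides and helpful feedback"]
train_labels = [1, 0]

classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
classifier.fit(train_comments, train_labels)

def flag_comment(comment: str, threshold: float = 0.5) -> bool:
    """Return True if a comment should be held back for human review."""
    text = comment.lower()
    if any(term in text for term in ABUSE_TERMS):      # stage 1: dictionary match
        return True
    prob = classifier.predict_proba([comment])[0][1]   # stage 2: model score
    return prob >= threshold
```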

Our ‘Screenomatic’ solution

There is not a lot of published work on detecting unacceptable or abusive comments in student evaluation surveys, so our team adapted earlier research[12] on detecting misogynistic tweets. This worked because the student comments we looked at were often about the length of a tweet (which is capped at 280 characters).

Our approach, which we call “Screenomatic[13]”, automatically reviewed more than 100,000 student comments during 2021 and identified those that appeared to be abuse. Trained evaluation staff members manually reviewed about 7,000 flagged comments, updating the machine-learning model after each semester. Each update improves the accuracy of auto-detection.
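A rough sketch of that flag, review and retrain cycle, under the same assumptions as the earlier example, might look like the following. The reviewer function and data structures are hypothetical and stand in for the trained evaluation staff; this is not the authors' code.

```python
# Illustrative human-in-the-loop cycle: auto-flag comments, have trained staff
# confirm or overturn each flag, then refit the model on the corrected labels.
# flag_comment / classifier / train_* come from the sketch above; the reviewer
# callable is a stand-in for a human decision.
def semester_review(comments, reviewer, classifier, train_comments, train_labels):
    withheld = []
    for comment in (c for c in comments if flag_comment(c)):
        is_abusive = reviewer(comment)               # human decision on each flag
        train_comments.append(comment)               # every reviewed comment becomes
        train_labels.append(1 if is_abusive else 0)  # a new labelled training example
        if is_abusive:
            withheld.append(comment)                 # removed before results go out
    classifier.fit(train_comments, train_labels)     # refit after each semester
    return withheld
```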

Read more: Gender bias in student surveys on teaching increased with remote learning. What can unis do to ensure a fair go for female staff?[14]

Ultimately, 100 comments were removed before the results were released to educators and supervisors. University policy enables comments to be re-identified in cases of potential misconduct. The central evaluations team contacted these students and reminded them of their obligations under the code of conduct.

The Screenomatic[15] model can help protect both educators and students. Staff are safeguarded from abuse, while students at risk (those whose comments suggest they need mental health support, allege bullying or harassment, or threaten staff or other students) can be offered help. Universities can also share data to train the model and keep it current.

Importantly, the process enables universities to act morally to harness student voices while protecting people’s well-being.

Useful feedback, not abuse

The number of educators who receive abusive feedback may be relatively small[16]. However, it’s still unacceptable for universities to continue to expose their staff to offensive comments in the full knowledge of their potential impact.

Read more: Our uni teachers were already among the world's most stressed. COVID and student feedback have just made things worse[17]

With last year’s High Court ruling on liability for defamatory posts[18], and attempts to improve online safety[19], there is a growing acknowledgement that people should not be able to post anonymous, harmful messages.

After all, the cost of screening responses is nothing compared to the cost to individuals (including mental health or career consequences). And that’s ignoring the potential costs of litigation and legal damages.

At the end of the day, the anonymous comments are read by real people. As a tweet in response to the Lakeman findings[20] acknowledged, evaluations contain "tons of useful feedback".

The Screenomatic model[21] goes a long way towards enabling that useful feedback to serve its intended purpose while ensuring people aren't harmed in the process.

References

  1. ^ fearful (theconversation.com)
  2. ^ distressed (theconversation.com)
  3. ^ demoralised (theconversation.com)
  4. ^ survey (www.tandfonline.com)
  5. ^ measure of performance (www.tandfonline.com)
  6. ^ calling for (theconversation.com)
  7. ^ open or engage (www.tandfonline.com)
  8. ^ 'Lose some weight', 'stupid old hag': universities should no longer ask students for anonymous feedback on their teachers (theconversation.com)
  9. ^ burn (theconversation.com)
  10. ^ screening system (www.tandfonline.com)
  11. ^ machine learning (theconversation.com)
  12. ^ earlier research (link.springer.com)
  13. ^ Screenomatic (www.tandfonline.com)
  14. ^ Gender bias in student surveys on teaching increased with remote learning. What can unis do to ensure a fair go for female staff? (theconversation.com)
  15. ^ Screenomatic (www.tandfonline.com)
  16. ^ small (link.springer.com)
  17. ^ Our uni teachers were already among the world's most stressed. COVID and student feedback have just made things worse (theconversation.com)
  18. ^ liability for defamatory posts (www.nfplaw.org.au)
  19. ^ improve online safety (www.ag.gov.au)
  20. ^ Lakeman findings (doi.org)
  21. ^ Screenomatic model (www.tandfonline.com)

Read more https://theconversation.com/we-have-developed-a-way-to-screen-student-feedback-to-ensure-its-useful-not-abusive-and-academics-dont-have-to-burn-it-185041
