Report reveals how Australian consumers are being duped online
- Written by Katharine Kemp, Senior Lecturer, Faculty of Law & Justice, UNSW Sydney
Australian consumers’ choices on websites and apps are being manipulated through online designs that take advantage of their weaknesses. That’s according to research on consumers’ online experiences and the presentation of websites and apps, released today[1] by the Consumer Policy Research Centre (CPRC).
The research gives examples of consumers being manipulated or deceived into unintentionally buying items, paying more, or giving up more personal data than they meant to.
Examples include situations where an online store automatically added items to consumers’ carts, and “Hotel California” techniques which make it easy to subscribe to a service, but much harder to unsubscribe.
According to the CPRC’s findings, 83% of Australians surveyed had experienced one or more negative consequences – including financial harm or feeling manipulated – as a result of these “dark patterns[2]”.
Some misleading designs breach the Australian Consumer Law. However, not all designs that have unfair consequences will necessarily be captured under the law. The latest report adds to existing calls to amend consumer law[3] by introducing a ban on unfair trading practices.
Read more: ACCC 'world first': Australia's Federal Court found Google misled users about personal location data[4]
What are dark patterns?
Experts[5] and regulators[6] around the world have highlighted concerning online design techniques in recent years, labelling them “dark patterns” or “deceptive design”.
These designs often take advantage of a consumer’s recognised behavioural biases. For instance, “default bias[7]” is consumers’ tendency to leave default choices in place to avoid making complex decisions. Businesses take advantage of this by pre-ticking boxes in favour of the business’s preferences, regardless of the consumer’s interests.
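To make the mechanics of a pre-ticked default concrete, here is a minimal, hypothetical sketch (the form and field names are invented for illustration and are not drawn from the CPRC research). The point is simply that the consent flag starts out set to “yes”, so a consumer who does nothing is treated as having agreed:

```typescript
// Illustrative sketch only (hypothetical names): a checkout form whose
// marketing-consent field defaults to `true`, so a consumer who never
// touches the checkbox is opted in unless they actively untick it.
interface CheckoutForm {
  email: string;
  receiveMarketing: boolean; // pre-ticked: exploits default bias
}

// The form is initialised with consent already granted.
function newCheckoutForm(email: string): CheckoutForm {
  return { email, receiveMarketing: true };
}

// A neutral design would default this to `false` and let the consumer opt in.
const form = newCheckoutForm("shopper@example.com");
console.log(form.receiveMarketing); // true, even though the consumer never chose it
```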
The Australian Competition & Consumer Commission[8] has examined dark patterns, defining[9] them as:
The design of user interfaces intended to confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions.
The CPRC study conducted a randomised sweep of websites and apps to identify deceptive design features.
Hidden costs: I bought what?
The CPRC found several examples of online stores automatically adding items to consumers’ shopping carts, such as insurance or service plans.
For example, in one case a consumer buying a washing machine from a major online retailer for A$1,059 may or may not have noticed a single line item, “3 Year Care Plan For Home - $160”, in the final steps of their purchase.
In other cases, customers were presented with offers of a product care plan at several points in the checkout process. The CPRC says:
this design approach risks implying that […] a product care plan is required when most faults or problems are adequately covered by the consumer guarantees.
For products sold in Australia, consumer guarantees about the quality of products are provided free of charge under the Australian Consumer Law.
“Hotel California” or forced continuity
Another concerningly common pattern is the relative difficulty consumers experience when trying to unsubscribe from a service, compared with how easy it is to sign up. The CPRC labels this “Hotel California”, after the famous line in the Eagles’ song: “You can check out any time you like, but you can never leave”.
Examples from the CPRC’s findings included attempting to cancel an Amazon Music Unlimited subscription, which required a consumer to navigate more than five screens. Similarly, cancelling an eBay Plus subscription required four additional steps after selecting “cancel membership”.
The CPRC argues it should be as easy to opt out of a service as it is to opt in. While extra steps may not seem disastrous in isolation, they can especially disadvantage those already experiencing vulnerabilities, such as sudden illness, loss of a loved one, or low digital literacy.
This is sometimes combined with another manipulative design technique called “confirmshaming”. With this, consumers are asked to confirm a statement that makes them feel shamed or foolish, such as if they want to “lose their benefits” or if they “refuse to support” a good cause.
Data grabs, colours and countdowns
The CPRC also found the majority of consumers surveyed (89%) had experienced being asked for more personal information than was needed to access the relevant product or service. This was achieved in various ways, including by:
- pre-ticking the option to receive marketing communications
- forcing the consumer to create a profile to browse or purchase a product, and
- treating the mere use of a website as acceptance of data terms or conditions.
Other examples of manipulative design included highlighting the business’s preference in a colour known to entice consumers to agree or act[10] (often green or blue), using a rapid countdown to create a false sense of urgency, and warning that a number of other customers are looking at a product.
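As a rough illustration of how a manufactured countdown differs from a real deadline, the hypothetical sketch below (not taken from the CPRC report) shows a timer that quietly restarts when it hits zero, so the urgency it displays has no connection to any actual offer expiring:

```typescript
// Illustrative sketch only (hypothetical logic): a "sale ends soon" countdown
// that is not tied to any real deadline. When it reaches zero it simply
// resets, so the urgency it conveys is manufactured.
function startFakeCountdown(seconds: number): void {
  let remaining = seconds;
  const timer = setInterval(() => {
    console.log(`Offer ends in ${remaining}s — buy now!`);
    remaining -= 1;
    if (remaining < 0) {
      remaining = seconds; // quietly restart instead of ending the "offer"
    }
  }, 1000);
  // In a real page the timer would drive a banner; here we stop after one cycle.
  setTimeout(() => clearInterval(timer), (seconds + 2) * 1000);
}

startFakeCountdown(10);
```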
Importantly, the research found consumers aged between 18 and 28 were more likely to suffer negative impacts from manipulative design, leading to substantial effects on their financial well-being and privacy. A significant proportion of consumers in this younger age bracket reported they:
- accidentally bought something (12%)
- spent more than they intended (33%)
- disclosed more personal information than they wanted to (27%)
- created an online account when they didn’t want to (37%), and
- accidentally signed up to something (39%).
We need to upgrade business practices and consumer law
For businesses, using dark patterns to boost profit will likely lead to long-term losses in the form of consumer trust and loyalty. Almost one in three people surveyed said they stopped using a website or app (either temporarily or permanently) after experiencing dark patterns.
Misleading designs may also lead to penalties for businesses under the Australian Consumer Law. This happened last year when Google’s privacy settings[11] were found likely to mislead consumers.
However, other designs that have unfair consequences might not fall foul of consumer laws[12], if they don’t meet certain criteria set out by the law.
The CPRC’s research adds to evidence in support of the Australian Competition & Consumer Commission’s existing recommendation[13] that our consumer law should include an unfair practices prohibition, similar to those in the European Union and the United Kingdom.
References
- ^ released today (cprc.org.au)
- ^ dark patterns (www.wired.com)
- ^ amend consumer law (www.theguardian.com)
- ^ ACCC 'world first': Australia's Federal Court found Google misled users about personal location data (theconversation.com)
- ^ Experts (www.forbrukerradet.no)
- ^ regulators (www.ftc.gov)
- ^ default bias (www.intereconomics.eu)
- ^ Australian Competition & Consumer Commission (www.accc.gov.au)
- ^ defining (www.accc.gov.au)
- ^ entice consumers to agree or act (fil.forbrukerradet.no)
- ^ Google’s privacy settings (www.accc.gov.au)
- ^ might not fall foul of consumer laws (cprc.org.au)
- ^ existing recommendation (www.accc.gov.au)