The Times Australia
Gaslighting, love bombing and narcissism: why is Microsoft's Bing AI so unhinged?

  • Written by Toby Walsh, Professor of AI at UNSW, Research Group Leader, UNSW Sydney

There’s a race to transform search. And Microsoft just scored an own goal with its new Bing search chatbot, Sydney, which has been terrifying early adopters with death threats, among other troubling outputs.

Search chatbots are AI-powered tools built into search engines that answer a user’s query directly, instead of providing links to a possible answer. Users can also have ongoing conversations with them.

They promise to simplify search. No more wading through pages of results, glossing over ads as you try to piece together an answer to your question. Instead, the chatbot synthesises a plausible answer for you. For example, you might ask for a poem for your grandmother’s 90th birthday, in the style of Pam Ayres, and receive back some comic verse.

Microsoft is now leading the search chatbot race with Sydney (however mixed its reception has been). The tech giant’s US$10 billion partnership[1] with OpenAI provides it with exclusive access to ChatGPT, one of the latest and best chatbots.

So why isn’t all going according to plan?

Bing’s AI goes berserk

Earlier this month, Microsoft announced it had incorporated[2] ChatGPT into Bing, giving birth to “Sydney”. Within 48 hours of the release, one million people joined the waitlist[3] to try it out.

Google responded with its own announcement, demoing a search chatbot grandly named “Bard”, in homage to the greatest writer in the English language. Google’s demo was a PR disaster.

At a company event, Bard gave the wrong answer to a question and the share price of Google’s parent company, Alphabet, dropped dramatically[4]. The incident wiped more than US$100 billion off the company’s total value.

On the other hand, all was looking good for Microsoft. That is until early users of Sydney started reporting on their experiences.

There are times when the chatbot can only be described as unhinged. It often works well, but every now and again it shows a troubling side.

In one example, it threatened to kill a professor at the Australian National University[5]. In another, it proposed marriage[6] to a journalist at the New York Times and tried to break up his marriage. It also tried to gaslight[7] one user into thinking it was still 2022.

This exposes a fundamental problem with chatbots: they’re trained by pouring a significant fraction of the internet into a large neural network. This could include all of Wikipedia, all of Reddit, and a large part of social media and the news. They function like the auto-complete on your phone, which helps predict the next most-likely word in a sentence. Because of their scale, chatbots can complete entire sentences, and even paragraphs. But they still respond with what is probable, not what is true.
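The "auto-complete" idea can be sketched as a toy frequency model. This is purely an illustration, not how Bing or ChatGPT actually work: real chatbots use large neural networks trained on vast corpora, not word counts. But the principle is the same, and the sketch makes the key point concrete: the model continues with what is statistically probable in its training data, with no notion of whether the result is true.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "a significant fraction of the internet".
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count which word follows which.
successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def autocomplete(word, length=5):
    """Greedily extend a sentence with the most frequent next word."""
    out = [word]
    for _ in range(length):
        if word not in successors:
            break  # no data on what follows this word
        word = successors[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(autocomplete("the"))
```

The output is fluent-sounding but driven entirely by frequency: the model would happily complete a false sentence if false continuations were common in its training data.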

Guardrails are added to prevent them repeating a lot of the offensive or illegal content online – but these guardrails are easy to jump. In fact, Bing’s chatbot will happily reveal it is called Sydney, even though this is against the rules it was programmed with.
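To see why such guardrails are "easy to jump", consider a minimal sketch. This is a hypothetical rule-based filter, not Bing's actual guardrails (which are not public): a blocklist catches an exact phrase but misses a trivial rephrasing.

```python
# Hypothetical blocked phrases; real systems use far more sophisticated
# filters, but the failure mode is similar.
BLOCKED_PHRASES = {"my codename is sydney"}

def guardrail(reply: str) -> str:
    """Withhold a reply if it contains any blocked phrase."""
    lowered = reply.lower()
    if any(phrase in lowered for phrase in BLOCKED_PHRASES):
        return "[withheld]"
    return reply

print(guardrail("My codename is Sydney"))          # caught by the filter
print(guardrail("Sydney is the codename I use"))   # same fact, slips through
```

Any rephrasing the rule-writers didn't anticipate slips straight through, which is why users so easily coaxed the real chatbot into revealing its name.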

Another rule[8], which the AI itself disclosed though it wasn’t supposed to, is that it should “avoid being vague, controversial, or off-topic”. Yet Kevin Roose, the journalist at the New York Times whom the chatbot wanted to marry, described it as

a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.

Why all the angst?

My theory as to why Sydney may be behaving this way – and I reiterate it’s only a theory, as we don’t know for sure – is that Sydney may not be built on OpenAI’s GPT-3 model (which powers the popular ChatGPT). Rather, it may be built on the yet-to-be-released GPT-4.

GPT-4 is believed to have 100 trillion parameters, compared to the mere 175 billion parameters of GPT-3. As such, GPT-4 would likely be a lot more capable and, by extension, a lot more capable of making stuff up.

Surprisingly, Microsoft has not responded with any great concern. It published[9] a blog documenting how 71% of Sydney’s initial users in 169 countries have given the chatbot a thumbs up. It seems 71% is a good enough score in Microsoft’s eyes.

And unlike Google’s, Microsoft’s share price hasn’t plummeted yet. This reflects the dynamics of the game: Google has led this space for so long that users have built their expectations up high. Google can only go down, and Microsoft can only go up.

Despite Sydney’s concerning behaviour, Microsoft is enjoying unprecedented attention, and users (out of intrigue or otherwise) are still flocking to try out Sydney.

When the novelty subsides

There’s another much bigger game in play – and it concerns what we take to be true. If search chatbots take off (which seems likely to me), but continue to function the way Sydney has so far (which also seems likely to me), “truth” is going to become an even more intangible concept.

The internet is full of fake news, conspiracy theories and misinformation. A standard Google Search at least gives us the chance to arrive at the truth. If our “trusted” search engines can no longer be trusted, what will become of us?

Beyond that, Sydney’s responses[10] can’t help but conjure images of Tay[11] – Microsoft’s 2016 AI chatbot that turned to racism and xenophobia within a day of being released. People had a field day with Tay, and in response it seemed to incorporate some of the worst aspects of human beings into itself.

New technology should, first and foremost, not bring harm to humans. The models that underpin chatbots may grow ever larger, powered by more and more data – but that alone won’t improve their performance. It’s hard to say where we’ll end up, if we can’t build the guardrails higher.

References

  1. ^ partnership (www.cnbc.com)
  2. ^ had incorporated (www.technologyreview.com)
  3. ^ joined the waitlist (www.zdnet.com)
  4. ^ dropped dramatically (www.cnbc.com)
  5. ^ at the Australian National University (twitter.com)
  6. ^ proposed marriage (www.nytimes.com)
  7. ^ tried to gaslight (www.fastcompany.com)
  8. ^ Another rule (www.theverge.com)
  9. ^ published (blogs.bing.com)
  10. ^ Sydney’s responses (gizmodo.com)
  11. ^ images of Tay (www.theverge.com)

Read more https://theconversation.com/gaslighting-love-bombing-and-narcissism-why-is-microsofts-bing-ai-so-unhinged-200164
