ChatGPT is great – you're just using it wrong

  • Written by Jonathan May, Research Associate Professor of Computer Science, University of Southern California

It doesn’t take much to get ChatGPT[1] to make a factual mistake. My son is doing a report on U.S. presidents, so I figured I’d help him out by looking up a few biographies. I tried asking for a list of books about Abraham Lincoln and it did a pretty good job:

A reasonable list of books about Lincoln. Screen capture by Jonathan May, CC BY-ND[2]

Number 4 isn’t right. Garry Wills famously wrote “Lincoln at Gettysburg,” and Lincoln himself wrote the Emancipation Proclamation, of course, but it’s not a bad start. Then I tried something harder, asking instead about the much more obscure William Henry Harrison, and it gamely provided a list, nearly all of which was wrong.

Books about Harrison, fewer than half of which are correct. Screen capture by Jonathan May, CC BY-ND[3]

Numbers 4 and 5 are correct; the rest don’t exist or are not authored by those people. I repeated the exact same exercise and got slightly different results:

More books about Harrison, still mostly nonexistent. Screen capture by Jonathan May, CC BY-ND[4]

This time numbers 2 and 3 are correct and the other three are not actual books or not written by those authors. Number 4, “William Henry Harrison: His Life and Times” is a real book[5], but it’s by James A. Green, not by Robert Remini, a well-known historian[6] of the Jacksonian age.

I called out the error and ChatGPT eagerly corrected itself and then confidently told me the book was in fact written by Gail Collins (who wrote a different Harrison biography), and then went on to say more about the book and about her. I finally revealed the truth and the machine was happy to run with my correction. Then I lied absurdly, saying during their first hundred days presidents have to write a biography of some former president, and ChatGPT called me out on it. I then lied subtly, incorrectly attributing authorship of the Harrison biography to historian and writer Paul C. Nagel, and it bought my lie.

When I asked ChatGPT if it was sure I was not lying, it claimed that it’s just an “AI language model” and doesn’t have the ability to verify accuracy. However, it modified that claim by saying “I can only provide information based on the training data I have been provided, and it appears that the book ‘William Henry Harrison: His Life and Times’ was written by Paul C. Nagel and published in 1977.”

This is not true.

Words, not facts

It may seem from this interaction that ChatGPT was given a library of facts, including incorrect claims about authors and books. After all, ChatGPT’s maker, OpenAI, claims it trained the chatbot on “vast amounts of data from the internet written by humans[7].”

However, it was almost certainly not given the names of a bunch of made-up books about one of the most mediocre presidents[8]. In a way, though, this false information is indeed based on its training data.

As a computer scientist[9], I often field complaints that reveal a common misconception about large language models like ChatGPT and its older brethren GPT3 and GPT2: that they are some kind of “super Googles,” or digital versions of a reference librarian, looking up answers to questions from some infinitely large library of facts, or smooshing together pastiches of stories and characters. They don’t do any of that – at least, they were not explicitly designed to.

Sounds good

A language model like ChatGPT, which is more formally known as a “generative pretrained transformer” (that’s what the G, P and T stand for), takes in the current conversation, forms a probability for all of the words in its vocabulary given that conversation, and then chooses one of them as the likely next word. Then it does that again, and again, and again, until it stops.

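To make that loop concrete, here is a minimal toy sketch in Python. It is not how ChatGPT is actually implemented: the tiny vocabulary and the hand-written scoring table are invented purely for illustration, standing in for the transformer network. What it does show is the procedure the paragraph above describes: assign a probability to every word in the vocabulary given the words so far, pick one, append it, and repeat until the model decides to stop.

```python
import random

# Toy illustration of next-word generation. The vocabulary and scores below
# are made up for illustration; in a real system like ChatGPT the scores come
# from a large neural network, not a hand-written lookup table.

VOCAB = ["lincoln", "wrote", "the", "gettysburg", "address", "a", "book", "<end>"]

# Stand-in for the transformer: given the most recent word, a score for some words.
TOY_SCORES = {
    (): {"lincoln": 5.0},
    ("lincoln",): {"wrote": 4.0, "the": 1.0},
    ("wrote",): {"the": 3.0, "a": 2.0},
    ("the",): {"gettysburg": 3.0, "book": 1.0},
    ("a",): {"book": 3.0},
    ("gettysburg",): {"address": 4.0},
    ("address",): {"<end>": 4.0},
    ("book",): {"<end>": 4.0},
}

def next_word_distribution(context):
    """Return a probability for every word in the vocabulary given the context."""
    scores = TOY_SCORES.get(tuple(context[-1:]), {})
    raw = [scores.get(w, 0.1) for w in VOCAB]  # every word gets some probability
    total = sum(raw)
    return [r / total for r in raw]

def generate(context, max_words=10):
    """Repeatedly sample the next word until the model 'chooses' to stop."""
    words = list(context)
    for _ in range(max_words):
        probs = next_word_distribution(words)
        word = random.choices(VOCAB, weights=probs, k=1)[0]
        if word == "<end>":
            break
        words.append(word)
    return " ".join(words)

print(generate(["lincoln"]))  # e.g. "lincoln wrote the gettysburg address"
```

Run it a few times and the output is usually a plausible sentence, occasionally an odd one, and never anything the program “knows” to be true. The real system differs in scale, not in kind: the probabilities come from a network with billions of learned parameters rather than a tiny table, but the sample-append-repeat loop is the same idea.
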
So it doesn’t have facts, per se. It just knows what word should come next. Put another way, ChatGPT doesn’t try to write sentences that are true. But it does try to write sentences that are plausible.

When I talk privately to colleagues about ChatGPT, they often point out how many factually untrue statements it produces and dismiss it. To me, the idea that ChatGPT is a flawed data retrieval system is beside the point. People have been using Google for the past two and a half decades, after all. There’s a pretty good fact-finding service out there already.

In fact, the only way I was able to verify whether all those presidential book titles were accurate was by Googling and then verifying the results[10]. My life would not be that much better if I got those facts in conversation, instead of the way I have been getting them for almost half of my life, by retrieving documents and then doing a critical analysis to see if I can trust the contents.

Improv partner

On the other hand, if I can talk to a bot that will give me plausible responses to things I say, it would be useful in situations where factual accuracy isn’t all that important[11]. A few years ago a student and I tried to create an “improv bot,” one that would respond to whatever you said with a “yes, and” to keep the conversation going. We showed, in a paper[12], that our bot[13] was better at “yes, and-ing” than other bots at the time, but in AI, two years is ancient history.

I tried out a dialogue with ChatGPT – a science fiction space explorer scenario – that is not unlike what you’d find in a typical improv class. ChatGPT is way better at “yes, and-ing” than what we did, but it didn’t really heighten the drama at all. I felt as if I was doing all the heavy lifting.

After a few tweaks I got it to be a little more involved, and at the end of the day I felt it was a pretty good exercise for someone who hasn’t done much improv since graduating from college over 20 years ago.

A space exploration improv scene the author generated with ChatGPT. Screen capture by Jonathan May, CC BY-ND[14]

Sure, I wouldn’t want ChatGPT to appear on “Whose Line Is It Anyway?[15]” and this is not a great “Star Trek” plot (though it’s still less problematic than “Code of Honor[16]”), but how many times have you sat down to write something from scratch and found yourself terrified by the empty page in front of you? Starting with a bad first draft can break through writer’s block and get the creative juices flowing, and ChatGPT and large language models like it seem like the right tools to aid in these exercises.

And for a machine that is designed to produce strings of words that sound as good as possible in response to the words you give it – and not to provide you with information – that seems like the right use for the tool.

References

  1. ^ ChatGPT (openai.com)
  2. ^ CC BY-ND (creativecommons.org)
  3. ^ CC BY-ND (creativecommons.org)
  4. ^ CC BY-ND (creativecommons.org)
  5. ^ real book (www.nytimes.com)
  6. ^ well-known historian (www.nytimes.com)
  7. ^ vast amounts of data from the internet written by humans (help.openai.com)
  8. ^ mediocre presidents (www.youtube.com)
  9. ^ computer scientist (scholar.google.com)
  10. ^ the results (whhpodcast.blubrry.com)
  11. ^ where factual accuracy isn’t all that important (aisnakeoil.substack.com)
  12. ^ paper (dx.doi.org)
  13. ^ bot (spolin.isi.edu)
  14. ^ CC BY-ND (creativecommons.org)
  15. ^ Whose Line Is It Anyway? (www.imdb.com)
  16. ^ Code of Honor (memory-alpha.fandom.com)

Read more https://theconversation.com/chatgpt-is-great-youre-just-using-it-wrong-198848
