The Times Australia
The Times World News


AI tools produce dazzling results – but do they really have ‘intelligence’?

  • Written by Paul Compton, Emeritus professor in Computer Science and Engineering, UNSW Sydney

Sam Altman, chief executive of ChatGPT-maker OpenAI, is reportedly trying to find up to US$7 trillion[1] of investment to manufacture the enormous volumes of computer chips he believes the world needs to run artificial intelligence (AI) systems. Altman also recently said the world will need more energy[2] in the AI-saturated future he envisions – so much more that some kind of technological breakthrough like nuclear fusion may be required.

Altman clearly has big plans for his company’s technology, but is the future of AI really this rosy? As a long-time “artificial intelligence” researcher, I have my doubts.

Today’s AI systems – particularly generative AI tools such as ChatGPT – are not truly intelligent. What’s more, there is no evidence they can become so without fundamental changes to the way they work.

What is AI?

One definition of AI is a computer system that can “perform tasks commonly associated with intelligent beings[3]”.

This definition, like many others, is a little blurry: should we call spreadsheets AI, as they can carry out calculations that once would have been a high-level human task? How about factory robots, which have not only replaced humans but in many instances surpassed us in their ability to perform complex and delicate tasks?

Read more: Not everything we call AI is actually 'artificial intelligence'. Here's what you need to know[4]

While spreadsheets and robots can indeed do things that were once the domain of humans, they do so by following an algorithm – a process or set of rules for approaching a task and working through it.

One thing we can say is that there is no such thing as “an AI” in the sense of a system that can perform a range of intelligent actions in the way a human would. Rather, there are many different AI technologies that can do quite different things.

Making decisions vs generating outputs

Perhaps the most important distinction is between “discriminative AI” and “generative AI”.

Discriminative AI helps with making decisions, such as whether a bank should give a loan to a small business, or whether a doctor diagnoses a patient with disease X or disease Y. AI technologies of this kind have existed for decades, and bigger and better ones are emerging all the time[5].

Read more: AI is everywhere – including countless applications you've likely never heard of[6]

Generative AI systems, on the other hand – ChatGPT, Midjourney and their relatives – generate outputs in response to inputs: in other words, they make things up. In essence, they have been exposed to billions of data points (such as sentences) and use this to guess a likely response to a prompt. The response may often be “true”, depending on the source data, but there are no guarantees.

For generative AI, there is no difference between a “hallucination” – a false response invented by the system – and a response a human would judge as true. This appears to be an inherent defect of the technology, which uses a kind of neural network called a transformer.
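The guessing described above can be illustrated with a deliberately toy sketch: a model that has only seen co-occurrence statistics picks a statistically likely continuation, with no notion of whether it is true. This is a drastic simplification for illustration only (real transformers do not work from a bigram table), and the example sentences are invented.

```python
from collections import Counter, defaultdict

# Toy "training data": the model only ever sees statistics, not facts.
# Note one sentence in the data is false.
corpus = [
    "the capital of australia is canberra",
    "the capital of australia is sydney",
    "the capital of australia is canberra",
]

# Count which word follows each word (a bigram table).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def guess_next(word):
    # Return the most frequent continuation -- likely, but not guaranteed true.
    return follows[word].most_common(1)[0][0]

print(guess_next("is"))  # "canberra", only because it was the more frequent continuation
```

If the false sentence had been the more common one in the data, the same code would confidently "hallucinate" it: truth and frequency are indistinguishable to the model.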

AI, but not intelligent

Another example shows how the goalposts of “AI” are constantly moving. In the 1980s, I worked on a computer system designed to provide expert medical advice on laboratory results. It was written up in the US research literature as one of the first four[7] medical “expert systems” in clinical use, and in 1986 an Australian government report described it as the most successful expert system developed in Australia.

I was pretty proud of this. It was an AI landmark, and it performed a task that normally required highly trained medical specialists. However, the system wasn’t intelligent at all. It was really just a kind of look-up table which matched lab test results to high-level diagnostic and patient management advice.

There is now technology which makes it very easy to build such systems, so there are thousands of them in use around the world. (This technology, based on research by myself and colleagues, is provided by an Australian company called Beamtree.)

Because they perform a task done by highly trained specialists, these systems certainly count as “AI”, but they are still not at all intelligent (although the more complex ones may have many thousands of rules for looking up answers).
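The “look-up table” character of such systems can be sketched in a few lines: a list of rules matched in order against the incoming results, each mapping to canned advice. The rule thresholds and advice strings below are invented purely for illustration; they are not medical advice and not the rules of any real system.

```python
# A minimal sketch of a rule-based "expert system": ordered rules mapping
# lab results to pre-written advice. All values are illustrative only.

RULES = [
    # (condition on the lab results, advice to report)
    (lambda r: r["tsh"] > 10.0, "Pattern suggests hypothyroidism (illustrative only)."),
    (lambda r: r["tsh"] < 0.1, "Pattern suggests hyperthyroidism (illustrative only)."),
]

def interpret(results):
    # Try each rule in order; the first matching condition wins.
    for condition, advice in RULES:
        if condition(results):
            return advice
    return "Results within the reference range (illustrative only)."

print(interpret({"tsh": 12.4}))
```

However large the rule set grows, nothing here reasons or understands: the system only matches inputs against conditions a human expert wrote down.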


The transformer networks used in generative AI systems still run on sets of rules, though there may be millions or billions of them, and they cannot easily be explained in human terms.

What is real intelligence?

If algorithms can produce dazzling results of the kind we see from ChatGPT without being intelligent, what is real intelligence?

We might say intelligence is insight: the judgement that something is or is not a good idea. Think of Archimedes, leaping from his bath and shouting “Eureka” because he had had an insight into the principle of buoyancy.

Generative AI doesn’t have insight. ChatGPT can’t tell you if its answer to a question is better than Gemini’s. (Gemini, until recently known as Bard, is Google’s competitor to OpenAI’s GPT family of AI tools.)

Or to put it another way: generative AI might produce amazing pictures in the style of Monet, but if it were trained only on Renaissance art it would never invent Impressionism.

Nympheas (Waterlilies) by Claude Monet. Google Art Project[8]

Generative AI is extraordinary, and people will no doubt find widespread and very valuable uses for it. It already provides extremely useful tools for transforming and presenting (but not discovering) information, and tools for turning specifications into code are in routine use.

These will get better and better: Google’s just-released Gemini, for example, appears to try to minimise the hallucination problem[9] by using search and then re-expressing the search results.
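The “search, then re-express” pattern described above (often called retrieval-augmented generation) can be sketched as a simple pipeline. The functions below are placeholders standing in for a real search index and a real language model; none of this reflects Gemini’s actual implementation.

```python
# A minimal sketch of search-then-re-express (retrieval-augmented generation).
# search() and rephrase() are hypothetical stand-ins, not any real API.

def search(query):
    # Placeholder: a real system would query a search index here.
    return ["Canberra is the capital of Australia."]

def rephrase(question, passages):
    # Placeholder: a real system would have a language model re-express
    # the retrieved passages as a fluent answer.
    return f"Based on retrieved sources: {passages[0]}"

def answer(question):
    # Ground the response in retrieved text rather than generating it
    # from learned statistics alone, reducing the room for hallucination.
    return rephrase(question, search(question))

print(answer("What is the capital of Australia?"))
```

The point of the design is that the factual content comes from the retrieved passages, so the generator’s job shrinks from inventing an answer to restating one.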

Nevertheless, as we become more familiar with generative AI systems, we will see more clearly that it is not truly intelligent; there is no insight. It is not magic, but a very clever magician’s trick: an algorithm that is the product of extraordinary human ingenuity.

Read more https://theconversation.com/ai-tools-produce-dazzling-results-but-do-they-really-have-intelligence-223311
