The Times Australia
The Times World News
Researchers trained an AI model to 'think' like a baby, and it suddenly excelled

  • Written by Susan Hespos, Psychology Department, Northwestern University, Evanston, Illinois, USA, and Professor of Infant Studies, MARCS Institute, Western Sydney University

In a world rife with opposing views, let’s draw attention to something we can all agree on: if I show you my pen, and then hide it behind my back, my pen still exists – even though you can’t see it anymore. We can all agree it still exists, and probably has the same shape and colour it did before it went behind my back. This is just common sense.

These common-sense laws of the physical world are universally understood by humans. Even two-month-old infants share[1] this[2] understanding. But scientists are still puzzled by some aspects of how we achieve this fundamental understanding. And we’ve yet to build a computer that can rival the common-sense abilities of a typically developing infant.

New research[3] by Luis Piloto and colleagues at Princeton University – which I’m reviewing for an article in Nature Human Behaviour – takes a step towards filling this gap. The researchers created a deep-learning artificial intelligence (AI) system that acquired an understanding of some common-sense laws of the physical world.

The findings will help build better computer models that simulate the human mind by approaching a task with the same assumptions as an infant.

Childish behaviour

Typically, AI models start with a blank slate and are trained on data with many different examples, from which the model constructs knowledge. But research on infants suggests this is not what babies do. Instead of building knowledge from scratch, infants start with some principled expectations[4] about objects.

For instance, they expect if they attend to an object that is then hidden behind another object, the first object will continue to exist. This is a core assumption that starts them off in the right direction. Their knowledge then becomes more refined with time and experience.

The exciting finding by Piloto and colleagues is that a deep-learning AI system modelled on what babies do outperforms a system that begins with a blank slate and tries to learn from experience alone.

Read more: Artificial intelligence can deepen social inequality. Here are 5 ways to help prevent this[5]

Cube slides and balls into walls

The researchers compared both approaches. In the blank-slate version, the AI model was given several visual animations of objects. In some examples, a cube would slide down a ramp. In others, a ball bounced into a wall.

The model detected patterns from the various animations, and was then tested on its ability to predict outcomes with new visual animations of objects. This performance was compared to a model that had “principled expectations” built in before it experienced any visual animations.

These principles were based on the expectations infants have about how objects behave and interact. For example, infants expect two objects should not pass through one another.

If you show an infant a magic trick where you violate this expectation, they can detect the magic. They reveal this knowledge by looking significantly longer at events with unexpected, or “magic” outcomes, compared to events where the outcomes are expected.

Infants also expect an object should not be able to just blink in and out of existence. They can detect[6] when this expectation is violated as well.
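This "violation of expectation" logic can be sketched in code. The toy check below is purely illustrative and is not the researchers' model: it assumes each frame of an event records whether the object is visible and whether something is occluding it, and it flags frames where the object vanishes in plain view, the equivalent of a "magic" outcome.

```python
# Toy violation-of-expectation check (illustrative only):
# an object that disappears while nothing occludes it
# breaks the continuity expectation infants hold.

def surprise(frames):
    """Each frame is a dict: {"visible": bool, "occluded": bool}.
    Return indices of frames where the object vanishes while
    unoccluded -- a 'magic' outcome."""
    violations = []
    was_visible = False
    for i, frame in enumerate(frames):
        if was_visible and not frame["visible"] and not frame["occluded"]:
            violations.append(i)
        was_visible = frame["visible"]
    return violations

# Expected event: the object passes behind a screen, then reappears.
expected = [
    {"visible": True,  "occluded": False},
    {"visible": False, "occluded": True},   # hidden by a screen
    {"visible": True,  "occluded": False},
]

# "Magic" event: the object blinks out with nothing hiding it.
magic = [
    {"visible": True,  "occluded": False},
    {"visible": False, "occluded": False},  # vanished in plain view
]

print(surprise(expected))  # []  -- nothing surprising
print(surprise(magic))     # [1] -- frame 1 flagged as 'magic'
```

A system built with this kind of prior does not need to learn from data that objects persist; it only needs to learn when an apparent disappearance is explained by occlusion.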

Infants can detect when objects seem to defy the basic laws governing the physical world. Shutterstock

Piloto and colleagues found the deep-learning model that started with a blank slate did a good job, but the model based on object-centred coding inspired by infant cognition did significantly better.

The latter model could more accurately predict how an object would move, was more successful at applying the expectations to new animations, and learned from a smaller set of examples (for example, it managed this after the equivalent of 28 hours of video).

An innate understanding?

It’s clear learning through time and experience is important, but it isn’t the whole story. This research by Piloto and colleagues is contributing insight to the age-old question of what may be innate in humans, and what may be learned.

Beyond that, it’s defining new boundaries for what role perceptual data can play when it comes to artificial systems acquiring knowledge. And it also shows how studies on babies can contribute to building better AI systems that simulate the human mind.

References

  1. ^ share (www.sciencedirect.com)
  2. ^ this (psycnet.apa.org)
  3. ^ research (www.nature.com)
  4. ^ expectations (onlinelibrary.wiley.com)
  5. ^ Artificial intelligence can deepen social inequality. Here are 5 ways to help prevent this (theconversation.com)
  6. ^ can detect (www.sciencedirect.com)

Read more https://theconversation.com/researchers-trained-an-ai-model-to-think-like-a-baby-and-it-suddenly-excelled-186563
