Cassandra Willyard

Human babies are far better at learning than even the very best large language models. To write passable English, ChatGPT had to be trained on massive data sets containing millions or even a trillion words. Children, by contrast, have access to only a tiny fraction of that data, yet by age three they're communicating in quite sophisticated ways.

A team of researchers at New York University wondered if AI could learn like a baby. What could an AI model do when given a far smaller data set—the sights and sounds experienced by a single child learning to talk?


A lot, it turns out. The AI model managed to match words to the objects they represent. “There’s enough data even in this blip of the child’s experience that it can do genuine word learning,” says Brenden Lake, a computational cognitive scientist at New York University and an author of the study. This work, published in Science today, not only provides insights into how babies learn but could also lead to better AI models.