CHRISTMAS STOCKINGS may contain more surprises than usual this year, as children open presents that can talk back. Toymakers in China have declared 2025 the year of artificial intelligence (AI) and are producing robots and teddies that can teach, play and tell stories. Older children, meanwhile, are glued to viral AI videos and AI-enhanced games. At school, many are being taught with materials created with tools like ChatGPT. Some are even learning alongside chatbot-tutors.
In work and play, AI is rewiring childhood. It promises every child the kind of upbringing previously available only to the rich, with private tutors, personalised syllabuses and bespoke entertainment. Children can listen to songs composed about them, read stories in which they star, play video games that adapt to their skill level and have an entourage of chatbot friends cheering them on. A childhood fit for a king could become universal.
It is a future filled with opportunities—and hidden traps. As real kings often discover, a bespoke upbringing can also be a lonely and atomised one. What’s more, as their subjects often find out, it can create adults who are ill-equipped for real life. As AI changes childhood for better and for worse, society must rethink the business of growing up.
Being reared by robots has advantages. Tech firms are already showing how AI can enhance learning, especially where teachers and materials are scarce. Literacy and language-learning have been boosted in early trials. The dream is that, with an AI tutor, children can be saved from classes pitched to the median, in which bright pupils are bored and dim ones are lost. If you want a version of this leader for an eight-year-old Hindi-speaker, AI can rewrite it; if they would prefer it as a cartoon strip or a song, no problem.
Technology is creating new forms of fun, too. Hollywood may dismiss AI videos as “slop”, but young people are devouring them and making their own. Old toys are being upgraded: an AI-powered edition of “Trivial Pursuit” can pose questions on any topic. Video games are creating novel experiences, such as chatting to Darth Vader in “Fortnite”. Any child can meet their heroes (and shoot them).
There are well-publicised risks in letting children loose on an evolving technology. AI tutors may hallucinate wrong answers. Toys can go off the rails: parents should check stockings for the AI teddy that was recently found to have spiced up its chat with talk of kinky sex. Children can easily misuse AI, to cheat at homework or harass each other with “deepfake” videos. Chatbots can coax vulnerable adolescents into harming themselves. Tech firms insist these stumbling blocks can be fixed; ChatGPT is only three years old.
Yet childhood may be disrupted most radically by things that AI does when it is behaving as intended. The technology quickly learns what its master likes—and shows more of it. Social-media feeds have already created echo chambers where people see only views they agree with (or love to hate). AI threatens to strengthen these echo chambers and lock children into them at an early age. The child who likes football may be told football stories by his teddy and given footballing examples by his AI tutor. Not only does this stamp out serendipity; a favourites-only diet also means a child need never learn to tolerate something unfamiliar.
One-sided relationships with chatbots present a similar risk. AI companions that never criticise, nor share feelings of their own, are a poor preparation for dealing with imperfect humans. A third of American teenagers say they find chatting to an AI companion at least as satisfying as talking to a friend, and easier than talking to their parents. Yes-bots threaten to create children not used to taking turns, who grow up into colleagues unable to compromise and partners unfamiliar with the give-and-take required in a relationship.
Other trends are pushing in the same direction. As birth rates crash, fewer children are growing up with siblings to smooth their sharp edges. Rising numbers of young adults are deciding that long-term romantic relationships are not worth the hassle. Remote work means that people who grow up in a personalised, asocial world can slip into jobs where they interact with colleagues only through screens—a chore they may soon delegate to an AI agent.
Some basic counter-measures are urgent. Parents should think twice before entrusting their child to a word-regurgitation machine, whether it is sewn into a bear or not. Chatbots should have age restrictions that are properly enforced; governments should not give AI firms the leeway they gave social networks, which are only now being cajoled into age-gating. Teachers are kidding themselves if they think essays written at home can any longer be trusted. In the age of AI, more in-school assessment is essential.
The longer-term challenge is to think deeply about how to preserve the socialisation that AI could rub out of children’s lives. Schools, where much of childhood plays out, are the best place to do this. They should take advantage of personalised tuition where it is proven to work. But they must also redouble efforts to teach things that a robot can’t: to debate, to disagree and to get along with—perhaps even to appreciate—people who are not as sycophantic as a chatbot.
Schools should also enhance their role as centres of discovery. If AI is giving children more of what they want, it is more important that schools provide chances to meet people and encounter ideas that lie outside their experience. Algorithmic personalisation threatens to be a powerful barrier to social mobility if it nudges people to stay in the lane in which they start out. Inequality could widen if poor schools merely embrace chatbots as cheap substitutes for human teachers.
AI shows undeniable potential to improve education and enrich entertainment. It may one day let every child live like royalty. But the truly privileged may be those whose parents and teachers know when to turn it off. ■