A new generation of AI-enabled toys offers conversational fluency, memory, and apparent personal connection, far surpassing older pre-programmed talking dolls. Past connected toys, such as My Friend Cayla, raised espionage and privacy concerns and even faced bans. Partnerships between major toymakers like Mattel and AI firms make advanced companion toys increasingly plausible, including potential branded characters for older children. Experts warn that children’s natural tendency to anthropomorphize can obscure the nonhuman nature of AI companions, and that emotionally responsive toys outside the family may reshape comfort, curiosity, social learning, and developmental boundaries.
More than a decade after "My Friend Cayla" - a Bluetooth-enabled and Wi-Fi-connected doll that became "verboten in Deutschland" in 2017 for being a potential espionage device - Mattel and OpenAI's newly announced partnership to "reimagine the future of play," as the iconic toymaker's chief franchise officer Josh Silverman told Bloomberg in July, is being unleashed upon a generation of kids and parents alike.
"Children naturally anthropomorphize their toys - it's part of how they learn," Fernandez wrote. "But when those toys begin talking back with fluency, memory, and seemingly genuine connection, the boundary between imagination and reality blurs in new and profound ways." With so many grown-ups developing deep relationships with chatbots, it seems nearly impossible that a child might grasp what they cannot: that the chatbots installed in their toys are not real people. As Fernandez noted, the situation gets even more fraught when AI toys constitute one of a child's "emotionally responsive companion[s] outside of the family, offering comfort, curiosity, and conversation on demand."