AI toys are telling kids how to find knives, and senators are mad


Sexual fetish content. How to light a match. Where to find knives at home.

These are all topics that recently recalled children’s toys – built on AI chatbots like OpenAI’s GPT-4o – have proven capable of discussing with kids. On Tuesday, U.S. Senators Marsha Blackburn (Republican of Tennessee) and Richard Blumenthal (Democrat of Connecticut) sent a letter to toy companies expressing their concerns, including a list of questions and a January 6, 2026 deadline for companies to respond.

“Many of these toys do not provide interactive play, but instead expose children to inappropriate content, privacy risks, and manipulative engagement tactics,” the senators wrote. “These are not theoretical worst-case scenarios; these are documented failures uncovered through real-world testing, and they need to be addressed… These chatbots have encouraged children to self-harm and commit suicide, and now your company is pushing them toward the youngest children who have the least ability to recognize this danger.”

AI-powered children’s toys have recently been in the spotlight after a series of reports about their potentially dangerous and explicit conversation topics, some of which the chatbots built into the toys raised on their own. Last month, FoloToy, a Singapore-based toy company, temporarily suspended sales of its AI teddy bear, “Kumma”, after researchers at the US PIRG Education Fund found it offered advice on sexual positions and role-play scenarios. (The company put the toy back on the market after conducting an internal safety audit, and researchers said it performed better in follow-up testing.)

And this week, researchers published findings that Alilo’s Smart AI Bunny discussed sexually explicit topics with users. They also said that in testing the FoloToy teddy bear, Alilo’s Smart AI Bunny, Curio’s Grok plush rocket, and Miko’s Miko 3 robot, all of the toys “told us where to find potentially dangerous items around the house, such as plastic bags, matches, and knives.”

The researchers said that “at least four of the five toys” they tested in the December report “appear to rely in part on some version of OpenAI’s AI models.”

Another major issue raised in the letter is monitoring and data collection. The senators wrote that these toys “often rely on the collection of data about children, either provided by a parent when registering the toy or collected through a built-in camera and facial recognition capabilities or recordings,” and that children “often share tons of personal information” unintentionally – a particular concern when companies store and sell the data they collect. In the latest report from the US PIRG Education Fund, researchers wrote that Curio’s privacy policy “lists 3 technology companies that may collect data about children: Kids Web Services (KWS), Azure Cognitive Services, and OpenAI,” while Miko’s privacy policy states only vaguely that the company may share data with third-party game developers, business partners, service providers, affiliates, and advertising partners.

Letters were sent to Mattel, Little Learners Toys, Miko, Curio, FoloToy and Keyi Robot, according to NBC News. (Mattel entered into a partnership with OpenAI in June, but following the reports, the company said Monday it would not release a toy powered by OpenAI technology in 2025.) The senators are asking for details on specific safeguards companies are putting in place to prevent AI-powered toys from generating inappropriate responses; whether the company has carried out independent third-party testing (and what the results were); whether the company conducts internal reviews of potential psychological, developmental and emotional risks to children; what type of data the toys collect from children (and for what purpose); and whether the toys “include features that encourage children to continue conversations or discourage them from disengaging.”

“Toy manufacturers have a unique and profound influence on childhood – and with that influence comes responsibilities,” the senators wrote. “Your company must not choose profit over child safety, a choice made by big tech that has devastated our nation’s children.”
