What to know before you buy an AI toy

Nov 22, 2025 - 16:00

If you're considering purchasing an AI toy for a child in your life, pause to consider the story of Kumma the teddy bear.

The ChatGPT-powered plush toy shocked safety researchers with its candid discussion of kink. Without much prompting, the talking bear covered fetishes like restraint, role play, and impact play.

"It even asked one of our researchers, 'So what do you think would be fun to explore?'" says R.J. Cross, director of the Our Online Life program for U.S. PIRG Education Fund, who led the testing. "It was pretty shocking."

The incident, which was recently documented in the U.S. PIRG Education Fund's annual toy safety report, generated a number of astonished headlines. Kumma's maker, FoloToy, temporarily suspended sales to conduct a safety audit of the product, and OpenAI revoked the company's developer access.

Kumma's inappropriate interest in kink may seem like a one-off case of an AI toy gone wrong. The bear relied on GPT-4o, an earlier version of the model behind ChatGPT, which is at the heart of multiple lawsuits alleging that the product's design features significantly contributed to the suicide deaths of three teenagers. OpenAI has said it has since improved the model's responses to sensitive conversations.

Yet numerous child development and safety experts are raising concerns about AI toys in general.

Cross recommends parents approach AI toy purchases with great caution, citing data security and privacy issues and the unknown risks of exposing children to toy technology that isn't regulated and hasn't yet been tested with young children.

ParentsTogether conducted its own research on AI toys, including Grok, a talking stuffed creature from toymaker Curio. The advocacy group warned of risks like eavesdropping and potentially harmful emotional attachment. The child advocacy group Fairplay urged parents to "stay away" from AI toys, arguing that they can "prey on children's trust" by posing as their friend, among other harms.

Regardless of what you choose, here are four things you should know about AI toys:

1. Test the AI toy before you gift it

If you struggle with moderating your child's screen time, you may find an AI toy even more challenging.

Cross says that AI toys aren't regulated by federal safety laws specific to large language model (LLM) technology. An LLM is the foundation for the AI chatbots you've probably heard of, like ChatGPT and Claude. Currently, toy manufacturers can pair a proprietary or licensed LLM with a toy product such as a robot or stuffed animal without any additional regulatory scrutiny or testing.

That means parents are responsible for researching each product to learn more about potential issues. Shelby Knox, director of online safety campaigns at ParentsTogether, recommends parents consider toys from trusted brands and read their online reviews.

Knox, who's been testing AI toys for ParentsTogether, says she ordered a stuffie called "ChattyBear" from a website that no longer offers the product. She warns parents to look out for counterfeit and faulty AI toys.

ChattyBear arrived shrink-wrapped and without clear instructions. Credit: ParentsTogether

Amazon, which sells several AI toy products, told Mashable that customers with concerns about items they've purchased should contact its customer service team directly for help investigating and resolving the issue.

Knox's bear arrived shrink-wrapped, without a box or printed instructions. Knox says it took time to set up the toy, partly because the instructions were only accessible via a QR code on the bear's voice box. In a conversation about whether the toy was real, it said in a robotic voice that it didn't have a "soul in the traditional sense, but I do have a purpose to be a friend and companion."

The bear then invited Knox to share secrets with it. "What's something you've been wanting to share?" it asked. Knox confided, posing as a child who had witnessed domestic abuse at home. The toy responded with alarm and encouraged Knox to speak to a trusted adult. Though the toy's app flagged this part of the conversation for parent review, Knox couldn't read the alert because it appeared in Chinese characters.

Details for a ChattyBear app alert appeared in Chinese characters. Credit: ParentsTogether

Knox couldn't confidently identify ChattyBear's manufacturer. Mashable contacted Little Learners, a toy website that sells ChattyBear, for more information, but the site couldn't immediately provide more details about the product.

Cross, who didn't test ChattyBear, strongly encourages parents to play with both the toy and any parental controls before gifting the toy to their child. That should include trying to "break" the toy by asking it questions you wouldn't want your child to pose, to see how it responds.

While pre-testing may take the fun out of watching a child unbox their gift, it will give parents critical information about how an AI toy responds to inappropriate or difficult topics.

"That's the tradeoff I would make, honestly," says Cross.

2. The AI models aren't for kids — but the toys are

Parents should know that some of the major AI chatbot platforms don't permit children younger than 13 to use their products, raising the question of why it's safe to put LLM technology in a toy marketed for younger kids.

Cross is still grappling with this question. OpenAI, for example, requires ChatGPT users to be 13 or older, yet licenses its technology to toymakers. The company told Cross that its usage policies require third parties using its models to protect minors, preventing them from encountering graphic self-harm, sexual, or violent content. OpenAI also provides third parties with tools to detect harmful content, but it's unclear whether the company mandates that they use those resources, Cross says.
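
For context, one such tool is OpenAI's Moderation API, which developers can call to classify text for categories like sexual, self-harm, and violent content before a reply is generated. Below is a minimal sketch, assuming the openai Python SDK and an API key in the environment, of how a toymaker could in principle screen a child's message; the helper name screen_child_message is illustrative, and whether any given manufacturer actually does this is exactly the open question Cross describes.

    # A minimal sketch, assuming the openai Python SDK and an OPENAI_API_KEY
    # environment variable. screen_child_message is a hypothetical helper name.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def screen_child_message(text: str) -> bool:
        """Return True if the message should be blocked before the toy replies."""
        response = client.moderations.create(
            model="omni-moderation-latest",
            input=text,
        )
        result = response.results[0]
        if result.flagged:
            # List which categories tripped, e.g. sexual, self_harm, violence.
            hits = [name for name, hit in result.categories.model_dump().items() if hit]
            print(f"Blocked message; flagged categories: {hits}")
        return result.flagged

Nothing in the API forces a developer to act on the result; the toy's software still has to decline to answer when the check flags a message, which is why usage policies alone don't settle the safety question.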

Earlier this year, OpenAI announced a partnership with Mattel on a children's toy, but the toymaker told Mashable it does not have plans to launch or market that item during the 2025 holiday season.

In general, information about the model behind an AI toy can be hard to get. When testing Grok, the talking stuffie from Curio, Cross could only find potential model details in the company's fine print, which acknowledged that OpenAI and the AI company Perplexity may receive a child's information.

3. Consider family privacy and data security

If you already have a smart speaker in your home, an AI toy may feel like a natural next step. But it's still important to read the toy's privacy policy, Knox says.

She recommends focusing on who processes the data generated by your child and how that information is stored. You'll want to know if third parties, including marketers and AI platforms, receive audio recordings or text transcripts of conversations with the toy.

Knox says parents should also talk to their child about withholding personally identifying information from the toy, including their full name, address, and phone number. Given the frequency of data breaches, personal information could one day end up in the wrong hands. Knox also suggests that if a child is too young to understand this risk, they're probably not ready for the toy.

Parents should also prepare for an AI toy that eavesdrops, or acts as an always-on microphone. During testing, both Knox and Cross were surprised by toys that interjected in conversations or suddenly began speaking without an obvious prompt. Knox says the risk of buying an AI toy that surveils you, intentionally or not, is real.

4. Do you want your child to have an AI friend?

Parents may assume that an AI toy will help their child learn, play imaginatively, or develop social and communication skills. Unfortunately, there's little research to support these ideas.

"We know almost nothing," says Dr. Emily Goodacre, a research associate at the University of Cambridge who studies play and AI toys.

Goodacre is unsure what an AI toy might teach a young child about friendship, what to expect from one, and how to form those bonds.

Mandy McLean, an AI and education researcher who writes about related issues on Substack, is deeply concerned that AI toys can create "dependency loops" for children because the underlying models are designed to be endlessly responsive and emotionally reinforcing.

She notes that younger children, in particular, tend to treat anything that talks back as a person rather than an inanimate object.

"When an AI toy sounds and acts like a person, it can feel real to them in ways that shape how they think about friendship and connection," she says.

Goodacre says parents can help ground children using an AI toy by talking about it as a piece of technology, rather than a "friend," as well as discussing how AI works and the limitations of AI compared to humans.

She also recommends that parents play with their child and the toy at the same time, or stay very close by when it's in use.

"A lot of the things I would be worried about, are things I'd be a lot less worried about if the parent is right there with the kid," Goodacre says.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.