Toys are an essential part of our development as people, whether you’re talking about baby toys that teach color recognition and empathy, collaborative toys that teach sharing and teamwork, or creative toys that encourage imagination and outside-the-box thinking. Just imagine what the toys of the future will be able to accomplish… assuming, of course, that the security issues we’re currently wrestling with are dealt with appropriately.
Unfortunately, there is still work to be done before that hurdle is cleared.
Bondu: An Example of Why Toys Need to Be Built Safely
Imagine your neighbor walking up to you and mentioning that they were waiting on a shipment of fancy stuffed dinosaurs for their children to converse with. That was Joseph Thacker’s experience, as his neighbor knew he worked as a security researcher specializing in AI and its risks to children. This neighbor wanted to pick his brain about their purchase of Bondus, toys capable of holding ongoing conversations with their assigned children.
Thacker teamed up with a colleague—Joel Margolis, a web security researcher—and together they found something deeply troubling: they were able to access the transcript of effectively every conversation a child had ever had with the toy with just a simple Google account.
Without a single line of actual hacking, the researchers suddenly knew how many Bondus had been given nicknames, which owners liked apple juice over grape juice, and who enjoyed certain dance moves more than others. Not only that, but each child’s name, birthday, family members, and parent-selected objectives were also left on display. When asked, the company confirmed that (save for those conversations manually deleted by either a parent or the company itself) every interaction could be accessed… over 50,000 conversations in total.
Considering the types of conversations a child might have with their toys, it is little wonder that Thacker described the experience as “intrusive and weird,” stating, “Being able to see all these conversations was a massive violation of children’s privacy.”
Fortunately, Bondu Responded Quickly to These Findings
To the company’s credit, it quickly put in place stronger data protection measures, including hiring an external firm to confirm that its improvements worked and that its systems would remain secure going forward. The company also has a bounty program for people to report any inappropriate statements or responses coming from the toy… one that has reportedly gone unclaimed.
Despite Bondu’s Fixes, Thacker and Margolis Remain Concerned
As the researchers put it, this incident highlights several key concerns about AI and data collection—particularly when children’s toys are involved. In summary:
- A single employee using a weak password would be enough to recreate this level of data exposure.
- This kind of data could be used to enable and exacerbate child abuse, manipulation, and abduction attempts.
- The use of external tools—such as Google Gemini and OpenAI’s GPT-5—means information is also shared with these platforms and added to their data reserves.
Particularly concerning was the likelihood that companies would use AI to code websites and product software—a process called “vibe coding.” Thacker and Margolis believe this is the likely reason that Bondu’s console had the flaws it did.
It frankly didn’t matter that Bondu’s toy was “safe” for children; its lack of data security simply changed the nature of the threat.
Thacker himself has changed his view on AI-powered toys. Once open to them, he no longer wants them in his house. As he puts it: “It's kind of just a privacy nightmare.”
Is Vibe-Coding Really So Bad?
Yes and no. Carried out safely, it can help businesses accomplish more… the problem arises when AI-generated code ships without a person in the loop ensuring it follows sound security practices.
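To make that concrete, here is a hypothetical sketch in Python (not Bondu’s actual code) of the kind of flaw that unreviewed, AI-generated backends commonly ship with: the lookup confirms that *someone* is signed in, but never checks that the requester actually owns the data they are asking for.

```python
# Illustrative data store: conversation transcripts keyed by ID,
# each tagged with the account that should be allowed to read it.
CONVERSATIONS = {
    "conv-1": {"owner": "parent@example.com", "transcript": "Likes apple juice"},
    "conv-2": {"owner": "other@example.com", "transcript": "Favorite dance: the spin"},
}

def get_transcript_insecure(requesting_user: str, conversation_id: str) -> str:
    """Broken: any authenticated account can read any child's transcript,
    because ownership is never checked."""
    return CONVERSATIONS[conversation_id]["transcript"]

def get_transcript_secure(requesting_user: str, conversation_id: str) -> str:
    """Fixed: verify the requester owns the conversation before returning it."""
    record = CONVERSATIONS[conversation_id]
    if record["owner"] != requesting_user:
        raise PermissionError("Not authorized to view this conversation")
    return record["transcript"]
```

The one missing `if` statement is the entire difference between a product that protects its users’ data and one that hands it to anyone with an account.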
Bondu: A Warning to All Businesses Regarding AI Security
Let’s face facts for a moment and acknowledge that the Bondu situation is already very scary. Let’s also acknowledge that there is very much a chance that an AI tool your business currently uses could be doing the same thing to you, with trade secrets in place of favorite dances.
Any AI You Use for Work Needs to Be Both Safe and Secure
Bondu gives us just another example of this principle in action: the toy was very safe for the kids playing with it—it just wasn’t secure enough to be trusted with the data it relied on.
You simply can’t afford to have the same be true of your business tools. We can help by auditing them to ensure their security is up to par, keeping an eye on your vendors and their practices, and working with your team to ensure they use the tools at their disposal with a focus on security.
Let’s Turn Your IT into the Advantage It Is Meant to Be
Reach out to us so we can keep an eye out for any “stuffed dinosaurs” hiding in your business’ network. Give us a call at (504) 840-9800 to start a conversation about what we can do for you.