The Mozilla Foundation took a closer look at 11 of the more popular romantic AI chatbots and found that they collect an alarming amount of personal information and apparently do little to protect the privacy of that information.
The advent of the chatbot has had an unexpected consequence: the emergence of romantic chatbots designed to interact with people as if they were real partners. Many companies rushed to market romantic AI chatbots to fill this niche, with little to no concern for users' privacy.
Mozilla's investigation revealed that almost all of the chatbots collect large amounts of personal information, including details people wouldn't normally share online, such as sexual health information, prescription medication use, and even gender-affirming care data.
One of the problems is that it's unclear whether these chatbots have any safeguards to protect users from harmful interactions. There's also no indication of what happens to the collected information.
"How does the chatbot work? Where does its personality come from? Are there protections in place to prevent potentially harmful or hurtful content, and do these protections work? What data are these AI models trained on? Can users opt out of having their conversations or other personal data used for that training?" These are some of the questions raised by The Mozilla Foundation.
Researchers found that 73 percent of the companies behind these chatbots don't explain how they handle security vulnerabilities, 64 percent say nothing about encryption or whether it's even in use, and around 45 percent allow weak passwords, including passwords consisting of nothing more than a single number.
Making matters worse, 54 percent of the apps don't let users delete their personal data, and all but one said they might share or sell the collected information.
Silviu is a seasoned writer who has followed the technology world for almost two decades, covering topics ranging from software to hardware and everything in between.