AI girlfriend chatbots are probably spilling everyone’s secrets

Amid an ongoing loneliness epidemic, the rise of AI chatbot companions and romantic partners might be meeting some people’s needs. But researchers found these bots aren’t the best of friends when it comes to protecting secrets.

*Privacy Not Included, a consumer guide from the Mozilla Foundation that evaluates the privacy practices of tech and other products, reviewed 11 chatbots marketed as romantic companions and found that all of them earned warning labels, “putting them on par with the worst categories of products we have ever reviewed for privacy.”

Among the privacy issues *Privacy Not Included found were a lack of user privacy policies, little information about how the AI companions work, and Terms and Conditions stating that the companies were not responsible for what might happen when people use their chatbots.

“To be perfectly blunt, AI girlfriends are not your friends,” Misha Rykov, a researcher at *Privacy Not Included, said in a statement. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

For example, CrushOn.AI, which markets itself as a “no filter NSFW character AI chat,” says under its “Consumer Health Data Privacy Policy” that it “may collect” information on a user’s “Use of prescribed medication,” “Gender-affirming care information,” and “Reproductive or sexual health information” in character chats “to facilitate” and “monitor” the “chat for safety and appropriate content.” The company also says it “may collect voice recordings” if users leave a voicemail, contact customer support, or connect with the company over video chat. CrushOn.AI did not immediately respond to a request for comment from Quartz.

RomanticAI, a chatbot service that advertises “a friend you can trust,” says in its Terms and Conditions that users must acknowledge they are “communicating with software whose activity we cannot constantly control.” RomanticAI did not immediately respond to a request for comment.

*Privacy Not Included found that 73% of the bots it reviewed shared no information on how the company manages security issues, and 64% lacked “clear information about encryption,” or even whether the company uses it. All but one of the chatbots either mentioned selling or sharing user data or provided no information about how user data is used. And the researchers found that fewer than half of the chatbots give users the right to delete their personal data.

A day after OpenAI opened its GPT Store in January, which lets anyone build customized versions of its ChatGPT bot, Quartz found at least eight “girlfriend” AI chatbots by searching the store for “girlfriend.” (We also quickly decided AI girlfriend chatbots wouldn’t last.) OpenAI actually bans GPTs “dedicated to fostering romantic companionship or performing regulated activities,” suggesting that, alongside their privacy issues, companionship bots might be difficult to regulate overall.

Jen Caltrider, director of *Privacy Not Included, told Quartz in a statement that the companies behind these chatbots “should provide thorough explanations of if and how they use the contents of users’ conversations to train their AI models,” and should give users control over their data, such as the ability to delete it or to opt out of having their chats used to train the bots.

“One of the scariest things about AI relationship chatbots is the potential for manipulation of their users,” Caltrider said. “What is to stop bad actors from creating chatbots designed to get to know their soulmates and then using that relationship to manipulate those people to do terrible things, embrace frightening ideologies, or harm themselves or others? This is why we desperately need more transparency and user-control in these AI apps.”
