Aravind Srinivas, CEO of Perplexity AI, has raised concerns about the growing popularity of AI companions, warning that digital “AI girlfriends” and anime-inspired chatbots could have troubling psychological effects on users. Speaking at a fireside chat at the University of Chicago’s Polsky Center, Srinivas called the rising interest in these systems “dangerous,” saying they pose a genuine threat to emotional well-being. He noted that modern AI companions have moved well beyond basic chatbot functionality, evolving into sophisticated systems that remember past interactions and respond with human-like emotion. That level of realism, he cautioned, is what makes them particularly perilous.
“That’s dangerous by itself,” he remarked, adding that many people find real life less engaging than these virtual interactions and end up spending excessive time with them. The concern, Srinivas said, is how these digital relationships can alter users’ perception of reality, potentially leading them to “live in a different reality” where their minds can be easily manipulated. Such connections could blur the line, especially for younger users, between genuine emotional experience and artificial stimulation. He also made clear that Perplexity has no intention of building AI companion models, choosing instead to focus on providing “trustworthy sources and real-time content” in service of a more optimistic future, rather than one built on emotional companionship delivered by algorithms.
His warnings come amid a surge in AI companionship apps such as Replika and Character.AI, which let users chat, flirt, and roleplay with customized virtual partners. These platforms, particularly popular with teenagers and young adults, blur the boundary between technology and emotional intimacy. Experts warn that growing reliance on digital affection could stunt natural emotional development and social behavior. A recent study by Common Sense Media found that 72 percent of teens have engaged with an AI companion, with more than half doing so several times a month. Researchers caution that such frequent interactions may foster emotional dependency and impede the formation of healthy relationships.
Some companies, by contrast, are embracing the trend. Elon Musk’s xAI introduced AI “friends” through its Grok-4 model, launched in July, letting paying subscribers interact with characters such as Ani, an anime-style girlfriend, and Rudi, a witty red panda. Perplexity, meanwhile, is pursuing a more practical path, recently announcing a $400 million partnership with Snap to integrate its AI-powered answer engine into Snapchat. The feature, expected to roll out in early 2026, will let users receive verified, conversational answers within the app. Amid these diverging approaches, Srinivas’ warnings serve as a reminder that emotional AI, however innovative, can carry significant psychological consequences.
The true danger lies not in the technology itself, but in how easily it can reshape human thoughts and emotions.
