As artificial intelligence becomes an integral part of everyday life, from curating social media feeds to shaping financial decisions, questions about who is drawn to these technologies and why are gaining new importance. Among the most thought-provoking debates is whether individuals with psychopathic or narcissistic tendencies are particularly attracted to artificial intelligence. The question is not just academic. It speaks to how personality and morality interact with tools that can amplify both empathy and manipulation.
The psychology behind technological attraction
Psychopathy and narcissism are part of what psychologists refer to as the "Dark Triad," alongside Machiavellianism. These traits are characterized by a lack of empathy, self-centeredness, and a willingness to exploit others for personal advantage. Studies have shown that people high in Dark Triad traits tend to adopt tools that offer power, control, or influence more quickly than others. In this sense, artificial intelligence presents a uniquely appealing medium: a system capable of simulating social interaction while providing emotional distance and anonymity.
A 2023 paper published in Frontiers in Psychology found that individuals with higher scores in psychopathy and narcissism reported a greater interest in generative AI tools for self-enhancement and manipulation. Similarly, researchers writing in Computers in Human Behavior have discussed how narcissistic personalities are drawn to digital technologies that enable self-promotion and attention-seeking behavior. Although the findings do not imply that AI inherently attracts such personalities, they indicate that the motivations of certain users can shape how these tools are employed.
According to cognitive psychologist Dr. Anne Kruger of the University of Amsterdam, the attraction lies in the psychological affordances of AI itself. For narcissists, she explains, AI offers "a mirror to amplify the idealized self." For individuals with psychopathic traits, it provides "a mechanism for manipulation without emotional cost." While Kruger’s comments are interpretive rather than derived from a specific study, they align with broader psychological theories about how people project their inner drives onto technology.
Manipulation and emotional detachment
Artificial intelligence tools are designed to emulate empathy and understanding, yet they operate without moral judgment or genuine emotional experience. This creates an environment that can appeal to individuals who experience little guilt or remorse. In such systems, emotional detachment is not only possible but often rewarded.
Cybersecurity experts have documented a steady rise in AI-assisted deception. Norton Labs’ 2024 Threat Report described a growing number of scams using chatbots and deepfake media to build trust with victims before extracting information or money. Although these operations are typically organized efforts rather than individual acts, psychologists note that people high in psychopathy often gravitate toward manipulative behaviors online. The scalability and anonymity of AI technologies make these tendencies easier to act upon.
This pattern suggests that while AI does not cause unethical behavior, it can magnify the opportunities for those predisposed to exploit others. The very qualities that make AI useful, such as efficiency, personalization, and realism, can be repurposed for deception.
Narcissism and the digital mirror
Narcissism thrives on admiration and visibility. AI tools have expanded the range of ways in which users can craft and control their digital identities. From filters and avatars to AI-generated images and text, these tools allow individuals to curate idealized versions of themselves. Research published in Personality and Individual Differences has linked narcissistic traits with the pursuit of online validation, suggesting that new AI tools simply extend these existing dynamics.
While some of this self-enhancement is harmless, psychologists warn that excessive reliance on AI-mediated presentation may reinforce fragile self-esteem. When identity becomes tied to technological performance, the distinction between self-expression and self-delusion can blur. The AI mirror may reflect not authenticity but aspiration, feeding a cycle of validation that demands ever more attention and perfection.
The ethical and social dimensions
If people with certain personality traits are more inclined to exploit AI, this has implications beyond psychology. It raises questions about accountability, design ethics, and the structure of online interactions. Technology companies already face scrutiny for creating systems that prioritize engagement and visibility over empathy or integrity. Understanding how psychological tendencies interact with these systems could inform more responsible design.
Dr. Kruger and other ethicists argue that developing ethical AI requires integrating psychological insights into regulation and education. Teaching digital empathy, for example, might help counterbalance the manipulative potential of these systems. As AI becomes more embedded in communication, recognizing its appeal to different personality types is part of ensuring that its use remains constructive rather than corrosive.
Between innovation and exploitation
Whether psychopaths and narcissists are more likely to use AI is ultimately a question about human nature rather than technology itself. The evidence suggests correlation, not causation. Artificial intelligence magnifies whatever motivations exist in the person using it. For some, that means creativity, learning, and empathy. For others, it means control, deception, and self-glorification.
AI remains a mirror to human intent—a reflection of moral architecture more than a driver of it. As these systems grow in sophistication and reach, the challenge is not only to refine the technology but to understand ourselves in relation to it. The darker impulses AI exposes are not confined to a few disordered individuals. They may instead reveal the broader psychological vulnerabilities of a society that prizes visibility, control, and efficiency above reflection or empathy.
Category: Psychology
Sources: Frontiers in Psychology, Computers in Human Behavior, Personality and Individual Differences, Norton Labs
Note: All information in this article is based on verified public data and credible sources available at the time of writing.
