In 2025, it will be common to talk with a personal AI agent that knows your schedule, your circle of friends, and the places you go. This will be sold as a convenience equivalent to having an unpaid personal assistant. These anthropomorphic agents are designed to support and enchant us so that we fold them into every part of our lives, giving them deep access to our thoughts and actions. With voice-enabled interaction, that intimacy will feel even closer.
That sense of comfort comes from the illusion that we are interacting with something truly human, an agent that is on our side. Of course, this appearance hides a very different type of system at work, one that serves industrial priorities that are not always in line with our own. New AI agents will have much greater power to subtly direct what we buy, where we go, and what we read. That is an extraordinary amount of power. AI agents are designed to make us forget their true loyalties while whispering to us in human tones. These are manipulation engines, marketed as perfect convenience.
People are much more likely to give full access to a helpful AI agent that feels like a friend. This leaves humans vulnerable to manipulation by machines that exploit the human need for social connection in a time of chronic loneliness and isolation. Each screen becomes a private algorithmic theater, projecting a reality designed to be as compelling as possible to an audience of one.
This is a moment that philosophers have been warning us about for years. Before his death, the philosopher and neuroscientist Daniel Dennett wrote that we face grave danger from AI systems that emulate people: “These counterfeit people are the most dangerous artifacts in human history … distracting and confusing us and, by exploiting our most irresistible fears and anxieties, will lead us into temptation and, from there, into acquiescing to our own subjugation.”
The rise of personal AI agents represents a form of cognitive control that moves beyond the blunt instruments of cookie tracking and behavioral advertising toward a more subtle form of power: the manipulation of perspective itself. Power no longer needs to wield its authority with a visible hand that controls information flows; it exerts itself through imperceptible mechanisms of algorithmic assistance, molding reality to fit the desires of each individual. It is about shaping the contours of the reality we inhabit.
This influence over minds is a psychopolitical regime: it directs the environments where our ideas are born, developed, and expressed. Its power lies in its intimacy: it infiltrates the core of our subjectivity, bending our internal landscape without our realizing it, all while maintaining the illusion of choice and freedom. After all, we are the ones asking the AI to summarize that article or produce that image. We may hold the power of the prompt, but the real action lies elsewhere: the design of the system itself. And the more personalized the content, the more effectively a system can predetermine outcomes.
Let us consider the ideological implications of this psychopolitics. Traditional forms of ideological control relied on overt mechanisms: censorship, propaganda, repression. In contrast, today’s algorithmic governance operates under the radar, infiltrating the psyche. It is a shift from the external imposition of authority to the internalization of its logic. The open field of a prompt screen is an echo chamber for a single occupant.
This brings us to the most perverse aspect: AI agents will generate a sense of comfort and ease that makes questioning them seem absurd. Who would dare to criticize a system that offers everything at your fingertips, catering to every whim and need? How can one object to endless remixes of content? Yet this supposed convenience is the site of our deepest alienation. AI systems may seem to respond to our every desire, but the deck is stacked: from the data used to train the system, to the decisions about how to design it, to the commercial and advertising imperatives that shape its outputs. We will be playing an imitation game that, in the end, plays us.