AI therapists might flatten humanity into patterns of prediction, and so sacrifice the intimate, individualized care that’s expected of traditional human therapists. “The logic of PAI leads to a future where we may all find ourselves patients in an algorithmic asylum administered by digital wardens,” Oberhaus writes. “In the algorithmic asylum there is no need for bars on the window or white padded rooms because there is no possibility of escape. The asylum is already everywhere—in your homes and offices, schools and hospitals, courtrooms and barracks. Wherever there’s an internet connection, the asylum is waiting.”
Chatbot Therapy:
A Critical Analysis of
AI Mental Health Treatment
Eoin Fullam
ROUTLEDGE, 2025
Eoin Fullam, a researcher who studies the intersection of technology and mental health, echoes some of the same concerns in Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment. A heady academic primer, the book analyzes the assumptions underlying the automated treatments offered by AI chatbots and the ways capitalist incentives might corrupt these kinds of tools.
Fullam observes that the capitalist mentality behind new technologies “often leads to questionable, illegitimate, and illegal business practices in which the customers’ interests are secondary to strategies of market dominance.”
That doesn’t mean that therapy-bot makers “will inevitably conduct nefarious activities contrary to the users’ interests in the pursuit of market dominance,” Fullam writes.
But he notes that the success of AI therapy rests on the inseparable impulses to make money and to heal people. In this logic, exploitation and treatment feed each other: Every digital therapy session generates data, and that data fuels a system that profits as unpaid users seek care. The more effective the therapy appears, the more the cycle entrenches itself, making it harder to distinguish between care and commodification. “The more the users benefit from the app in terms of its therapeutic or any other mental health intervention,” he writes, “the more they undergo exploitation.”
This sense of an economic and psychological ouroboros—the snake that eats its own tail—serves as a central metaphor in Sike, the debut novel from Fred Lunzer, an author with a research background in AI.
Described as a “story of boy meets girl meets AI psychotherapist,” Sike follows Adrian, a young Londoner who makes a living ghostwriting rap lyrics, in his romance with Maquie, a business professional with a knack for spotting lucrative technologies in the beta phase.

Sike
Fred Lunzer
CELADON BOOKS, 2025
The title refers to a splashy commercial AI therapist called Sike, loaded into smart glasses, that Adrian uses to interrogate his myriad anxieties. “When I signed up to Sike, we set up my dashboard, a wide black panel like an airplane’s cockpit that showed my daily ‘vitals,’” Adrian narrates. “Sike can analyze the way you walk, the way you make eye contact, the stuff you talk about, the stuff you wear, how often you piss, shit, laugh, cry, kiss, lie, whine, and cough.”