How do you build the perfect defense against cognitive blind spots?
More than any other, this is the question that undergirds the highly influential online “rationalist” movement. What sounds like an introductory philosophy exercise has, over the past decade, given rise to deep-pocketed research institutes, a spectacular financial collapse, at least one abusive cult, at least one murderous one, and national political tumult.
A currently sold-out science fiction paperback might offer the answer.
In late 2024, There Is No Antimemetics Division by “qntm” (the pen name of British programmer Sam Hughes) became an unusual kind of viral sensation: rationalists and tech workers began posting photos of their copies across social media. By January 2025, the $9 paperback edition had vanished from Amazon’s listings, and third-party sellers were asking hundreds of dollars for copies.
The novel tells the story of an agent at a secret organization dedicated to catching, categorizing, and containing ideas that resist being remembered. Unlike regular memes, which spread by sticking in people’s minds, “antimemes” erase themselves from both individual memories and institutional records. An antimeme might be a face you instantly forget or a room that vanishes from floor plans. Or it might be an entity seeking to enslave all human will that your organization keeps forgetting to fight because your organization keeps forgetting it exists.
The novel's central tension revolves around the Antimemetics Division’s repeated failures — its once-numerous agents keep being eliminated in the same ways, unable to learn from their mistakes because of the very nature of their work. Again and again, the Division's careful protocols and documentation systems become vectors for the very threats they are meant to contain.
In 2025, this premise reads more like prophecy than genre fiction. The spectacular collapses of recent rationalist or rationalist-adjacent projects all blare a common warning: The more elaborate our defenses against cognitive bias, the more vulnerable we become to new forms of delusion.
Like the Antimemetics Division, Sam Bankman-Fried’s empire was undone by vulnerabilities in its basic principles. The founder of the cryptocurrency exchange FTX and the quantitative trading firm Alameda Research cultivated an image as the poster child of “earning to give,” the effective altruism (EA) principle of maximizing one’s income in order to maximize one’s charitable impact. His stated goal was eventually to donate billions toward preventing global catastrophes, thus ensuring the survival of the human race.
But FTX’s implosion in November 2022 revealed a fundamental flaw in the EA framework. Tools designed to defend against irrationality, such as expected value calculations and utility maximization, had been weaponized to justify increasingly risky and ultimately criminal financial behavior.
In particular, the concept of "longtermism” — the view that what matters most is securing humanity's long-term future — functioned as the equivalent of an antimeme. It erased employees’ concerns about present-day ethical violations by making them seem insignificant compared to some imagined, astronomical future good. This utilitarian math constituted its own cognitive blind spot, allowing wrongdoing to hide in plain sight.
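To see how this arithmetic swallows present-day harms, consider a toy expected-value calculation. The sketch below is deliberately crude and hypothetical; the figures are invented for illustration and are not a model anyone at FTX is known to have used. The shape of the math is the point: a sufficiently large imagined future payoff outweighs any certain, present-day cost.

```python
# Toy illustration of naive expected-value reasoning with astronomical stakes.
# All numbers are invented for illustration; this is not anyone's actual model.

def expected_value(probability: float, payoff: float) -> float:
    """Expected value of a single outcome: probability times payoff."""
    return probability * payoff

# A certain, present-day harm: say, putting $8 billion of customer funds at risk.
present_harm = expected_value(probability=1.0, payoff=-8e9)

# A speculative future payoff: a one-in-a-million chance of securing a future
# "worth" $10 quadrillion (10^16) in this toy accounting.
speculative_good = expected_value(probability=1e-6, payoff=1e16)

# Naive utility maximization sums the terms and takes the gamble.
total = present_harm + speculative_good
print(f"{total:+.2e}")  # +2.00e+09: the math says the gamble comes out ahead
```

The numbers are arbitrary, but the structure is not: as long as the imagined payoff can be inflated faster than its probability shrinks, the present-day term can always be argued away.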
When the collapse finally came, the most telling response was confusion. Like Division agents discovering evidence of previous, forgotten failures, many EAs were genuinely shocked to learn their defenses had broken down. The postmortems were filled with talk of "improving heuristics," the rationalist version of the Division's endless protocols that never quite manage to prevent the next catastrophe.
Unlike FTX’s very public collapse, Leverage Research’s failures unfolded behind closed doors. Founded by Geoff Anders in 2011, Leverage allegedly sought to develop the “one true theory of psychology” through psychological and sociological experiments on group members in the Bay Area.
Like the Antimemetics Division's careful protocols, Leverage's “debugging” techniques were supposed to cognitively fortify the group. Former member Zoe Curzi describes how participants would engage in "2-6hr long group debugging sessions" aimed at improving their psyches. The goal was to become "a sort of Musk-level super-person" through self-modification.
Again, rationalist-adjacent methodologies were, over time, fully inverted. By 2019, former rationalists were "clearing their homes of bad energy using crystals" and conducting seances to call on "demonic energies." Their attempt to systematize psychology through rigorous frameworks had devolved into paranoia about mind reading, mind control, and magical attacks.
When that iteration of Leverage finally dissolved, members were left with confused narratives about what they had experienced, echoing Antimemetics Division personnel who can’t remember their previous encounters with dangerous entities. Leverage has since rebranded as a research institute conducting “revolutionary science” in fields like “quantum biology,” once more obscuring its troubled history.
The Zizians represent one of the most alarming rationalist-adjacent failure modes yet. Named after its leader, Jack “Ziz” LaSota, the group first made headlines in 2019 when LaSota and three accomplices descended on a Center for Applied Rationality (CFAR) retreat in Sonoma County. Dressed in black robes and Guy Fawkes masks, they barricaded exits and distributed flyers claiming CFAR did not “appreciably develop novel rationality/mental tech.”
But the Zizians' critique of mainstream rationalism evolved into something far more sinister. Members embraced increasingly extreme positions on artificial intelligence, decision theory, and animal rights — topics common in rationalist spaces but now twisted into justifications for violence.
By early 2025, members of the group had been implicated in multiple deaths across three states, with news reports suggesting an escalating pattern of violence that began with the 2022 stabbing of a Vallejo property owner and culminated in the fatal shooting of a Border Patrol agent in Vermont. Once again, a group that had begun by battling weaknesses in rationalist frameworks ended up developing its own reality-distorting logic.
In qntm’s novel, the Antimemetics Division is said to have been originally created to fight memetic contagions like Nazism. In a cruel irony, however, antimemetic entities eventually became the greater hazards. The Zizians followed the same pattern: the frameworks they built to improve their thinking became vectors for violence, proof that antimemetic infections can be as deadly as memetic ones.
Of course, no discussion of antimemetic threats could be complete without addressing the Department of Government Efficiency (DOGE), itself the logical conclusion of a rationalist-adjacent ideology that has been steadily advancing toward the halls of power. Venture capitalist Peter Thiel, the longtime mentor of Vice President JD Vance, spent years pushing the notion that democracy had failed. Political theory blogger Curtis Yarvin went further, arguing that the U.S. should be run like a corporation by a sovereign CEO.
Announced in late 2024 and created by executive order in January 2025, DOGE was the realization of that vision. The government was to be stripped down, “optimized,” and placed under a single executive’s control. Within weeks, DOGE slashed thousands of jobs, dismantled oversight, and terminated longstanding government contracts. When employees warned that entire departments were becoming nonfunctional, DOGE’s response was to fire more people. The agency treated institutional memory as a liability, forcing the government to forget itself.
The irony here is impossible to miss. DOGE, an agency whose very name is based on a meme, is now acting as a force of antimemetic destruction. Executive Order 14188, issued to suppress student protests, reinforced this dynamic. Titled “Additional Measures to Combat Anti-Semitism,” its numbering — whether by accident or design — echoes 1488, a known white supremacist reference. The blurring of irony and reality has long been a feature of online culture, but never before has it been so literal.
Sadly, the victims are real. The administration’s actions have been most keenly felt by those who lack the luxury of treating governance as a game. There’s Mahmoud Khalil, detained for pro-Palestinian activism despite his status as a legal permanent resident; Rümeysa Öztürk, seized over her student journalism at Tufts; Kilmar Ábrego García, mistakenly deported to El Salvador on the basis of his hand tattoos; and countless others. The debugging approach to statecraft has arrived at its ultimate expression: treating people as problems to be solved, then solving them by making them disappear.
The bleakest aspect of There Is No Antimemetics Division's popularity is that online rationalists and technocrats appear to have completely missed its central message. The title is the novel's thesis hiding in plain sight — there is no defense against cognitive blind spots. Every system designed to protect against memetic threats inevitably creates new vulnerabilities.
The esoteric message is just the exoteric one, but the novel’s insight remains unseen and unheeded precisely where the book is most celebrated. Whether qntm intended the critique or not, his novel functions as the perfect parable for our moment. The Division’s cyclical collapse, brought about by its own protocols, encapsulates the pattern we’ve seen at FTX, at Leverage Research, among the Zizians, and now in American governance.
What makes the antimemetic metaphor so powerful is that it lays bare the self-erasing nature of these failures. Each collapse is not followed by learning. Rather, the fundamental blind spots remain invisible to those operating within these frameworks. The fact that the book has become a cult favorite in rationalist circles may be the ultimate antimemetic phenomenon, its central truth somehow evaporating from the awareness of those who most urgently need to grasp it.
Stephanie Yue Duhem is an Austin-based poet and essayist. She is the author of Cataclysm Moves Me I Regret to Say (House of Vlad, 2025).
Hm. I don't think it's wrong to say that rationalist ideology gives smart people a way to paper over obvious-to-an-outsider moral concerns with their political project, but is that really a unique feature of rationalism? Or is that just a description of *every* political ideology? The 20th century, far as I can tell, was a series of clashes between competing excuses (nationalism, communism, fascism, domino theory) for mass murder; the Zizians' ideology eventually landing on "and thus we need to kill some people" is the least weird thing about them.
Interesting survey, but I think 'it's impossible to get rid of cognitive biases' is the wrong conclusion, because that's the same old original-sin, or witch-hunting, idea which motivated all this in the first place. Humans may believe a lot of silly things, but their reasoning is pretty sound on many other things, and they have the ability to improve it - not through struggle sessions or enumeration of 'biases,' but through humility, care, and learning.