BOOK REVIEW
False: How Mistrust, Disinformation, and Motivated Reasoning Make Us Believe Things That Aren’t True
by Joe Pierre, MD; Oxford University Press, 2025
256 pages; $8 (hardcover)
Reviewed by Awais Aftab, MD
In a moment when ideological polarization, conspiracy theories, and viral misinformation dominate public discourse, psychiatrist Joe Pierre, MD’s False is an intellectually rigorous, urgent exploration of why human beings believe things that are not true. Pierre, a professor of psychiatry at the University of California San Francisco and an experienced academic clinician, delivers a work that blends insights from psychiatry, psychology, cognitive science, and social theory. As a result, False is a deeply informed and accessible volume that is of interest to anyone concerned with the fragility of truth in contemporary society. The book is highly recommended for psychiatrists, psychologists, educators, and public health professionals.
Pierre sets himself a formidable task: to trace the full spectrum of false belief, from clinical delusions to everyday distortions, and to explain why a species capable of scientific accomplishments such as putting men on the moon also struggles to agree on basic facts. He draws on cognitive psychology, psychiatry, and media studies to produce a diagnosis of the “post-truth” condition. The book also pushes readers to recognize that susceptibility to falsehood is not an aberration of those with mental illness, but an everyday feature of the normal human mind.
Pierre begins with the clarification that false beliefs endemic to the human condition are not hallmarks of psychopathology. Drawing on his clinical experience with delusions, he frames belief formation along a continuum, from psychosis to cognitive distortions to socially shared misbeliefs. Various chapters of the book explore and explain key psychological phenomena related to misbeliefs. Chapters focus on topics such as overconfidence and the Dunning-Kruger effect, confirmation bias, motivated reasoning, identity-protective cognition, epistemic relativism, the erosion of authority in a fragmented media ecosystem, the “disinformation industrial complex,” and the exploitation of psychological vulnerabilities by state and corporate actors.
Pierre is at his best when integrating clinical insights with cognitive science. His use of case vignettes, particularly his long-standing engagement with patients suffering from delusional disorder, grounds the text in real world psychiatric experience. He also handles complex cognitive biases and behavioral economic theories (eg, base rate neglect, heuristic-driven errors) with clarity and pedagogical skill.
The book builds on a simple but unsettling idea: the very mental faculties that make us brilliant, such as pattern recognition, intuitive inference, and emotional salience, also render us systematically prone to error. And in the attention economy, these design features can generate all sorts of problematic dynamics. From harmless quirks to violent delusions, from UFO enthusiasts to QAnon, from climate change denial to vaccine refusal, these are expressions along a continuum of false beliefs with overconfidence at one end and psychosis at the other, with quantitative as well as qualitative differences between the 2 ends.
Drawing from years of clinical work with psychotic patients, Pierre outlines a characterization of belief pathology by degrees of conviction, preoccupation, extension, and distress. We all fall prey to cognitive distortions. We all tell ourselves stories that insulate identity from dissonance. What marks clinical delusions, he argues, is not their falsity per se, but their unshakeability and disruptive impact. On Pierre’s account, while both a QAnon adherent who withdraws from family and stockpiles supplies and a patient convinced they are under CIA surveillance might hold beliefs that are, in a loose sense, “delusion-like,” the 2 occupy qualitatively different territory. Delusions proper, Pierre argues, can be reliably distinguished from conspiracy-driven beliefs by features such as their self-referential structure and reliance on subjective rather than objective reasoning.
The first half of the book is dedicated to unpacking the psychological architecture that supports false belief. Pierre covers the usual suspects (confirmation bias, motivated reasoning, the Dunning-Kruger effect) with clinical texture and empirical grounding. He tackles concepts such as identity protective cognition, a term borrowed from legal scholar Dan Kahan, JD, which refers to our instinct to reason in ways that protect group identity rather than objective accuracy.
The metaphor of the flea market of opinion is a memorable frame for the contemporary information ecosystem. Here, amid algorithmic amplification, epistemic gatekeeping collapses. Loudness, confidence, and tribal resonance outperform nuance. In this bazaar, everyone’s an expert and no one is accountable. The result is a public sphere saturated with persuasive nonsense.
Chapters on disinformation, conspiracy theories, and “bullshit” (speech indifferent to truth) build momentum toward the book’s second half, which tackles remedies.
Pierre stresses structural and institutional reforms alongside personal virtues, but with regard to proposed solutions addressing cognitive failings, he focuses on what he calls the “Holy Trinity of Truth Detection”: intellectual humility, cognitive flexibility, and analytical thinking. These, he argues, are the dispositions most protective against the adoption of “epistemically suspect beliefs.” They can be seen as theoretical virtues and habits cultivated through education, metacognition in the style of cognitive behavioral therapy, and guided dialogue.
The prescriptions are laid out across 5 levels: individual, educational, media, platform, and civic. At the individual level, Pierre advocates for the trinity of virtues. At the educational level, he recommends media literacy curricula modeled after Finland’s widely cited national strategy. For information systems, he endorses prebunking (inoculation against falsehoods before exposure) rather than slow and mostly ineffective debunking. For platforms and policy, he calls for more content moderation, accountability mechanisms, and the flagging of repeat misinformation spreaders. Pierre insists that moderation is not censorship, though he knows this assertion can be politically fraught. For civic life, he recommends deliberative engagement across divides, creating space for reasoned disagreement and rehumanization.
These remedies are based on a theoretical conceptualization of the mechanisms at play while drawing support from empirical research; Pierre does a good job highlighting what works (eg, the modest but reproducible effects of prebunking) and what does not (eg, fact-checking alone). Because personal bias is entangled with structural dynamics, the book insists on both psychological and social reforms. However, even the social reforms discussed tend to converge on the end goal of developing epistemic humility and cognitive resilience in the individual.
Pierre’s account consistently situates misbelief at the intersection of normal cognition and environmental context. He rejects mass psychosis tropes and the tendency to pathologize individuals, instead stressing the interplay between personal cognitive habits and systemic factors like polarization, distrust, and media ecosystems. While the Holy Trinity of Truth Detection is aimed at individual epistemic resilience, Pierre acknowledges that deeper structural change is also needed, and that neither level of intervention can succeed without the other. In this way, False offers an integrated view of the cognitive terrain and the political and institutional landscapes in which misinformation thrives.
Pierre defines misinformation somewhat in passing: “misinformation—that is, information that’s wrong or false” (p 51). The definition appears in the context of differentiating between objective reporting and opinion. Opinions that are wrong or false are not necessarily misinformation in Pierre’s view, provided their status as opinions is acknowledged. The meaning of misinformation used by Pierre is that of false factual claims, which is narrower than “any information that happens to be wrong.” Pierre is less concerned with defending a particular notion of misinformation; he focuses instead on clear examples of false beliefs, such as the PizzaGate conspiracy theory and false claims that COVID-19 vaccines caused sterility or contained microchips.
Pierre also emphasizes the difference between misinformation and disinformation, defining disinformation as “information that’s deliberately created and spread for its false counternarrative” (p 65). Pierre recognizes that the line between misinformation and disinformation is blurred, since intentions and genuineness of beliefs are hard to judge, but maintains that this distinction is important, “amounting to the difference between believing things that are merely wrong and believing lies” (p 65). Pierre devotes an entire chapter to discussing the “disinformation industrial complex” and sets up the metaphors as follows1:
“Within the disinformation food chain, ‘apex predators’ who are part of a ‘disinformation industrial complex’ sit at the top, fomenting mistrust in institutions of epistemic authority and creating false information based on some underlying motive, usually to obtain either financial profit or political power… In the middle of the food chain are the ‘mesopredators’ or ‘prosumers’ who both consume disinformation and pass it on as misinformation while also creating novel disinformation themselves. Finally, the ‘prey’ or ‘pure consumers’ lie at the bottom of the food chain as passive receptacles of misinformation and disinformation alike, sharing it with other consumers.”
Examples of apex predators discussed by Pierre include Alex Jones of InfoWars (notorious in particular for his baseless claims that the 2012 Sandy Hook Elementary School shooting was a staged operation), the wellness and lifestyle company Goop, and Kelly Brogan, MD. In Pierre’s analysis, “from a societal standpoint, there’s good evidence that our current vulnerability to disinformation is more a result of mistrust than gullibility, ignorance, or stupidity as is often claimed” (p 66). While some of this mistrust is a consequence of the failures of mainstream institutions, breeding mistrust in facts and experts has been a specific strategy weaponized by political actors.
One way of appreciating the position False occupies in the contemporary intellectual discourse on the phenomena of misinformation and post-truth world is by comparison with the work of philosopher Dan Williams. Williams is best known for his work on the marketplace of rationalizations.2 I have enthusiastically followed his Substack blog “Conspicuous Cognition” since its launch in January 2024, and his writings have had a tremendous influence on how I personally think about these issues. It was natural then that while reading False, I often viewed Pierre’s arguments through the lens of Williams’s ideas.
Williams is dissatisfied with how “misinformation” and “disinformation” are typically defined and studied by scholars of misinformation.3 Williams’s argument is that clearly fabricated stories (à la Alex Jones) are relatively rare and their impact is confined to a small segment of society.4 On an expanded definition that includes any falsehood or even true but misleading information, “misinformation” indeed has more societal impact, but this also makes objective characterization of misinformation far harder, since truth (especially in politics) is complex, contested, and often uncertain. This creates a dilemma: researchers can either target the politically significant forms of misleading communication or preserve scientific impartiality, but not both.
I would add that I personally find the definition of “misinformation” as “information that’s wrong or false” or “false factual claims” to be philosophically inadequate, primarily because the domain of assertions that are wrong or false is far greater than the domain of things we can reasonably call misinformation. Our scientific understanding of the world is constantly progressing, and many scientific explanations that are considered true today will be proven wrong tomorrow. There are ways to add further qualifications to the definition to bolster it, and to Pierre’s credit, he goes to great lengths to explain how scientific understanding changes over time, “science is always ready to be wrong” (p 158), and why trusting science is not an argument from authority. I completely agree with Pierre when he writes, “scientific consensus based on such evidence shouldn’t be as easily discounted as it often is. It is one thing when a lone scientist or a handful of scientists makes a claim that’s debatable; it’s another when nearly the entire field is in agreement” (p 159). At the same time, these insights do not translate into an updated definition of misinformation. Fundamentally, there is no escaping the messiness of delineating the domain of false information in contexts where truth is complex, contested, evolving, and uncertain, and the problem is all the more acute in situations where an expert consensus does not exist.
This definitional problem, even if accepted, is not fatal to Pierre’s project. Pierre works with a relatively broad definition of misinformation, but his focus remains on clear instances of misinformation that are immune to boundary disputes. While Williams maintains that unmistakable instances of misinformation and disinformation affect a relatively small portion of society, Pierre argues that such misinformation and disinformation can have, and indeed already have had, an outsized influence in decisively shaping outcomes such as political elections. Pierre writes, “belief in modern conspiracy theories related to vaccines, the integrity of democratic elections, and climate change has a much greater practical relevance to our daily lives and a much greater potential for harm… [B]elief in disinformation—promoting conspiracy theories about these pivotal issues of our time is already contributing to human lives lost through neglect of proper medical care and failure to take action on attempts to halt or reverse global warming” (p 91).
Williams acknowledges that harmful disinformation exists and can sometimes be clearly identified, but stresses that misinformation research is not an objective or apolitical endeavor, and its experts share the same biases and fallibility as anyone else.5 As an example, he cites an exposé on the Global Disinformation Index’s disputed decision to label outlets like Unherd as disinformation sources.5 This does not contradict or undermine Pierre per se, whose examples are unambiguous and who never claims that the field of disinformation is free of bias, but it shows how implicitly portraying judgments of misinformation and disinformation as obvious and infallible can erode trust in experts, and that the field must embrace the same humility and self-awareness that Pierre recommends generally.
Both Pierre and Williams recognize in their own ways that cognitive biases and naive realism are the human default.1,6 Both authors agree that disinformation is a symptom of societal dysfunctions such as institutional distrust and political sectarianism.1,7 A core point from Williams is that most social epistemology frames the central puzzle around why people believe false things. Williams argues that just as poverty is the historical default and wealth is the true puzzle, ignorance and misperception are the human default; the real epistemic puzzle is how and why people can come to hold true beliefs.6 Williams would argue that the epistemic default is distorted, locally adaptive, self-serving cognition; accurate beliefs about complex, non-local matters arise only through rare alignments of institutions, norms, and incentives.
False takes for granted that we are living in a post-truth world, while Williams finds the very idea frustrating. “For the love of God, stop talking about ‘post-truth’” is the exasperated title of one of Williams’s posts, in which he argues, “There was never a golden age of objectivity, and today’s epistemological problems result from competing visions of reality, not a conflict over the value of truth.”8
False was written before Williams’s critiques of misinformation studies on Substack took off. What I want to convey here is that my reading of False was shaped in part by Williams’s influence, and as a reviewer, that seems like important information to communicate. By the time I picked up Pierre’s work, concepts like “misinformation” and “post-truth” had already been thoroughly problematized in my mind (for better or worse) thanks to Williams, stripped of the solidity Pierre seems to assume. That Pierre’s project remains stable, and his thesis retains both vitality and viability, is a testament to his scholarly rigor and the excellence of this book.
Dr Aftab is clinical assistant professor of psychiatry at Case Western Reserve University in Cleveland, Ohio. He is the editor of “Conversations in Critical Psychiatry” and writes online at Psychiatry at the Margins.
References
1. Pierre J. False: How Mistrust, Disinformation, and Motivated Reasoning Make Us Believe Things That Aren’t True. Oxford University Press; 2025.
2. Williams D. The marketplace of rationalizations. Economics and Philosophy. 2023;39(1):99-123.
3. Williams D. Misinformation researchers are wrong: there can't be a science of misleading content. Conspicuous Cognition. January 10, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/misinformation-researchers-are-wrong
4. Williams D. What is misinformation, anyway? Conspicuous Cognition. December 1, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/what-is-misinformation-anyway
5. Williams D. Debunking disinformation myths, part 2: the politics of big disinfo. Conspicuous Cognition. May 12, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/debunking-disinformation-myths-part-e14
6. Williams D. Why do people believe true things? Conspicuous Cognition. July 28, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/why-do-people-believe-true-things
7. Williams D. Misinformation is often the symptom, not the disease. Conspicuous Cognition. February 17, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/misinformation-is-often-the-symptom
8. Williams D. For the love of God, stop talking about “post-truth.” Conspicuous Cognition. October 11, 2024. Accessed August 25, 2025. https://www.conspicuouscognition.com/p/for-the-love-of-god-stop-talking