So often I hear experienced psychiatric colleagues, usually past Osler’s cutoffs, talk simply about the need to go back to ideas from decades ago: the biopsychosocial model is the most commonly cited, as if turning back the clock would solve the problems of today.6 They mistake the memories of youth for visions of the future. They don’t seem to realize that they haven’t changed their minds on any central idea of importance since Richard Nixon was president. Nor do they seem bothered by the fact that progress, at least in science, rarely happens by holding the same ideas forever.
I have already passed Osler’s first cutoff, and I can only hope to join McNamara and Wallace and Havens in changing my mind about something important in the future. In the meantime, I would like to propose a test of who among the old is fit to advise the young: before recommending the beliefs of your youth to today’s generation, think of something that you fervently believed while young and now realize is false. If you can’t come up with the latter, withhold the former. This is not to say that Darwin should have changed his mind about evolution in his 80s; but his enemies should have.
I can cite some personal examples of changing my mind, now that I’m in my mid-40s. Until the last few years, I believed that antidepressants were much more effective in major depressive disorder (MDD) than I can bring myself to believe now, after the results of studies like STAR*D. After I critiqued those results, interpreting them as showing relatively low antidepressant efficacy,7 a colleague commented to me: “Nassir, you can’t expect people who have spent their entire careers trying to show that antidepressants work very well to suddenly accept that they don’t work so well.” Another example: I assumed for a long time that the biopsychosocial model was a benign, pleasant philosophy. Over time, after studying it, I determined that it had been used for decades as a way to hide a certain postmodernist relativism about truth, an unwillingness to give science a higher priority than other opinions. That connects to a third recent example. I naively believed, until recently, that our DSM leaders based psychiatric diagnoses on science wherever possible, and I defended the DSM system, by and large, against those who were skeptical of it. In just the past few years, I’ve learned, based on their own confession,8 that science mattered least to past DSM leaders: they “pragmatically” made diagnoses up, based on their personal views about what was best for the profession.9 So I’ve changed my view: I’ve had to accept the criticism that DSM revisions are mostly unscientific. Furthermore, I’ve concluded that we have failed to progress in psychiatric knowledge for the past half century—not because of the complexity of mind and brain, but because of the “pragmatic” gerrymandering of our psychiatric diagnoses, which fail, naturally, to correlate with nature.
Keep going, our elders tell us: we are still on the right path, even though the path has gone nowhere. Ezra Pound’s World War I poem doesn’t exactly apply: “. . . Walked eye-deep in Hell, believing in old men’s lies, then unbelieving came home, home to a lie. . . .”10 The errors of our past leaders—whether in psychopharmacology or DSM or the biopsychosocial model—weren’t lies, because they weren’t intentional—which is worse. Napoleon once had a duke killed for plotting against him; the outcry that followed was more harmful to Napoleon than the plots of the duke. Said Talleyrand years later: It was worse than a crime; it was a mistake. So, too, with our past psychiatric leaders: they didn’t lie, or knowingly harm. It was worse: they made mistakes with the best of intentions—hence, they still don’t realize it.
Positions of power, even in our young nation, are held mostly by persons in their 50s and 60s; these are exactly the decades that resist novelty, as Osler rightly noted. Almost the entire biological establishment opposed Darwin. The great British psychiatrist Aubrey Lewis, so unjustly unappreciated by his American cousins, once noted that “in positions where freshness is all, the old are not left to clog and petrify affairs; for we have it on wise authority that men of age object too much, consult too long, adventure too little, repent too soon, and seldom drive business home to the full period, but content themselves with a mediocrity of success.”11 I might slightly correct Lewis to note that I see less and less repentance with age, and one might say that Lewis himself, having written that comment in his 40s, failed to prove it wrong in his 60s and 70s. In older age, he abetted an unwise attack on lithium, partly out of his lifelong attachment to social aspects of psychiatry as against drugs, and thereby he unjustly harmed acceptance of a drug that some consider our most effective medication ever, a drug that, to this day, is overly avoided by patients who don’t know better, and clinicians who should.12
The philosopher William James13 consciously took this attitude toward every new idea: he first accepted it wholeheartedly for as long as he could; only afterward did he begin to analyze or critique it. Most of us, by contrast, barely begin to hear a new idea before we automatically criticize it in our minds.
For those who are young now, realize this: you will never be as open to new ideas as you are now. Pay close attention to your attitude, and try to keep it forever, although all the forces of nature and of society will oppose you.
If only we could be as wisely naive as James, as children are—a capacity we increasingly lose as we get older, and less wise. Let us respect our elders—yes, but let’s also respect ourselves and, in honor of our youth, respect truth above all. Amicus Plato, sed magis amica veritas. Loosely translated: I love Plato, but I love truth even more.