How long is 20 years? A generation of human beings.
Turn to history. One historian of medicine claims1 that there has never been a now widely accepted medical advance (the germ theory of disease, anesthetic surgery, the disproof of bleeding for pneumonia, hand washing before delivering a baby, widespread antibiotic use) that was adopted without a delay of at least 40 years, and usually longer, between the existence of sufficient scientific evidence and widespread acceptance by the medical community.
How long is 40 years? Two generations of human beings.
The rule of thumb, borne out by science and history, is that each generation rejects the new truths of its day. Can we ever get beyond this depressing fact?
It’s not ageism; it’s honesty, the most brutal honesty, to admit that we, not others, are the problem, and even more difficult, to admit that our teachers and our leaders have been part of the problem more often than they have been part of the solution.
We can find another source of insight in another late-19th-century scientist, the physician William Osler, who got into trouble for a talk in which he jokingly recommended chloroform for those over age 60. He gave that speech when approaching that age himself, on retiring from Johns Hopkins, where he had founded and chaired the department of medicine. If he truly advocated euthanasia, he was advocating suicide. Osler’s comment was symbolic, not literal. Yet even symbolically, we don’t want to hear it. But it is worthwhile to listen:
“It may be maintained that all the great advances have come from men under 40, so the history of the world shows that a very large proportion of the evils may be traced to the sexagenarians—nearly all the great mistakes politically and socially, all of the worst poems, most of the bad pictures, a majority of the bad novels, not a few of the bad sermons and speeches.”4(pp382-383) Osler argued that almost all new and original ideas or projects are started by people in their 20s and 30s. Even if great work is produced after 40, it is almost always conceived, or started, earlier. Most persons think their great thoughts early in life and spend the rest of their lives proving, or expanding, or teaching those thoughts. It is rare for any human being to have a truly novel, important, original contribution to humankind after age 40, or 50, or 60, which he had never considered at all before those ages.
Or, to put it another way, as we get older, we stop changing our minds; our ideas become frozen; our minds become like museums, where the furniture doesn’t change, but is merely dusted off and spruced up. When we are younger, we have no past to defend; we are just beginning to furnish the houses of our mind; we take in new ideas, test them, experiment, accept, reject. Eventually, we choose the chairs and tables we like, and we settle down; our minds, made up of those belongings, settle down too. We are loath to make radical changes afterward.
But what about the benefits of experience? As we age, we gain more clinical experience, which is thought to reflect wisdom. Often, the reverse is the case. As we get older, we learn so much, we see so much, that eventually our minds are full—and then, we become blind to all that we have not already seen or learned. Osler calls it “mind-blindness”—the most tragic problem in human knowledge: “It is not . . . that some people do not know what to do with truth when it is offered to them, but the tragic fate is to reach, after years of patient search, a condition of mind-blindness in which the truth is not recognized, though it stares you in the face. . . . It is one of the great tragedies of life that every truth has to struggle to acceptance against honest but mind-blind students.”4(p398)
There are exceptions; there are always exceptions. Some of us like to move; we don’t settle down. We continue to have new and different ideas as we age; we even change our minds into our 80s, sometimes radically. Secretary of Defense Robert McNamara, the architect of the Vietnam War, admitted he had been wrong when he was in his 80s. Governor George Wallace, the standard-bearer of segregation, admitted he had been wrong when he was in his 60s. In addition to my father, Kamal Ghaemi, I’ve had excellent teachers, mentors, and friends who have retained their mental flexibility into their 60s and beyond—like Ross Baldessarini, Frederick Goodwin, Paul Roazen, Leston Havens, Kenneth Kendler, Jules Angst, Athanasios Koukopoulos, and Ronald Pies. Havens always taught us: “Take your theories lightly” and, to the end of his life, he was still rearranging the furniture of his mind. My friend, Dr Marc Agronin,5 has written about how some people manage to age well, including in the intellectual sense I’m describing: he relates examples from his experiences with Erik Erikson and Senator George McGovern, among others. An observer of the event once told me that he saw Erikson, in his 70s, walk up to a Harvard dean at the faculty club, and say, “Dean, can I be given a sabbatical for the next year? I have been thinking that most of my ideas may have been wrong.” These exceptions give us hope that some of us may evade Planck’s law; but they are, unfortunately, a minority. In contrast to these exceptions, over 2 decades of active involvement in our profession, I’ve known many times more psychiatric leaders who follow Planck’s law than who break it.
This is the problem of generations: new ideas tend to grow not because contemporaries are convinced, but because unborn generations are.
1. Wootton D. Bad Medicine: Doctors Doing Harm Since Hippocrates. New York: Oxford University Press; 2007.
2. Kuhn TS. The Structure of Scientific Revolutions. 2nd ed. Chicago: University of Chicago Press; 1970:150.
3. Tatsioni A, Bonitsis NG, Ioannidis JP. Persistence of contradicted claims in the literature. JAMA. 2007;298:2517-2526.
4. Osler W. Aequanimitas. Philadelphia: The Blakiston Company; 1948.
5. Agronin ME. How We Age: A Doctor’s Journey Into the Heart of Growing Old. New York: Da Capo Lifelong Books; 2011.
6. Ghaemi SN. The Rise and Fall of the Biopsychosocial Model: Reconciling Art and Science in Psychiatry. Baltimore: Johns Hopkins University Press; 2009.
7. Ghaemi SN. Why antidepressants are not antidepressants: STEP-BD, STAR*D, and the return of neurotic depression. Bipolar Disord. 2008;10:957-968.
8. Frances AJ. DSM5 should not expand bipolar II disorder. http://www.psychologytoday.com/blog/dsm5-in-distress/201004/dsm5-should-.... Accessed November 15, 2012.
9. Ghaemi N. Mood swings. DSM 5 and bipolar disorder: science versus politics. http://www.psychologytoday.com/blog/mood-swings/201004/dsm-5-and-bipolar... and http://www.psychiatrictimes.com/mood-disorders/content/article/10168/164.... Accessed November 15, 2012.
10. Pound E. Hugh Selwyn Mauberley. Whitefish, MT: Kessinger Publishing; 1920 (reprinted 2010).
11. Lewis A. The problem of ageing. Lancet. 1944;ii:569.
12. Shorter E. The history of lithium therapy. Bipolar Disord. 2009;11(suppl 2):4-9.
13. Simon S, ed. William James Remembered. Lincoln, NE: University of Nebraska Press; 1996.