
Planck's Law of Generations

How long is 20 years? A generation of human beings.

Turn to history. There has never been, claims one historian of medicine,1 a single historic medical advance now widely accepted (the germ theory of disease, anesthetic surgery, the disproof of the efficacy of bleeding for pneumonia, hand washing before delivering a baby, widespread antibiotic use) that did not involve a delay of at least 40 years, and usually longer, between the presence of sufficient scientific evidence and widespread acceptance by the medical community.

How long is 40 years? Two generations of human beings.

The rule of thumb, borne out by science and history, is that current generations always reject new truths. Can we ever get beyond this depressing fact?

It’s not ageism; it’s honesty, the most brutal honesty, to admit that we, not others, are the problem; and it is even more difficult to admit that our teachers and our leaders have been part of the problem more often than they have been part of the solution.

We can find a further source of insight in another late 19th-century scientist, the physician William Osler, who got into trouble for a talk in which he jokingly recommended chloroform for those over 60. He gave that speech as he himself neared that age, when retiring from Johns Hopkins, where he had founded and chaired the department of medicine. If he truly advocated euthanasia, he was advocating suicide. Osler’s comment was symbolic, not literal. Yet even symbolically, we don’t want to hear it. But it is worthwhile to listen:

“It may be maintained that all the great advances have come from men under 40, so the history of the world shows that a very large proportion of the evils may be traced to the sexagenarians—nearly all the great mistakes politically and socially, all of the worst poems, most of the bad pictures, a majority of the bad novels, not a few of the bad sermons and speeches.”4(pp382-383) Osler argued that almost all new and original ideas or projects are started by people in their 20s and 30s. Even if great work is produced after 40, it is almost always conceived, or started, earlier. Most persons think their great thoughts early in life and spend the rest of their lives proving, or expanding, or teaching those thoughts. It is rare for any human being to make a truly novel, important, original contribution to humankind after age 40, or 50, or 60 that he had never considered at all before those ages.

Or, to put it another way, as we get older, we stop changing our minds; our ideas become frozen; our minds become like museums, where the furniture doesn’t change, but is merely dusted off and spruced up. When we are younger, we have no past to defend; we are just beginning to furnish the houses of our mind; we take in new ideas, test them, experiment, accept, reject. Eventually, we choose the chairs and tables we like, and we settle down; our minds, made up of those belongings, settle down too. We are loath to make radical changes afterward.

But what about the benefits of experience? As we age, we gain more clinical experience, which is thought to reflect wisdom. Often, the reverse is the case. As we get older, we learn so much, we see so much, that eventually our minds are full—and then, we become blind to all that we have not already seen or learned. Osler calls it “mind-blindness”—the most tragic problem in human knowledge: “It is not . . . that some people do not know what to do with truth when it is offered to them, but the tragic fate is to reach, after years of patient search, a condition of mind-blindness in which the truth is not recognized, though it stares you in the face. . . . It is one of the great tragedies of life that every truth has to struggle to acceptance against honest but mind-blind students.”4(p398)

There are exceptions; there are always exceptions. Some of us like to move; we don’t settle down. We continue to have new and different ideas as we age; we even change our minds into our 80s, sometimes radically. Secretary of Defense Robert McNamara, the architect of the Vietnam War, admitted he had been wrong when he was in his 80s. Governor George Wallace, the paragon of segregation, admitted he had been wrong when he was in his 60s. In addition to my father, Kamal Ghaemi, I’ve had excellent teachers, mentors, and friends who have retained their mental flexibility into their 60s and beyond—like Ross Baldessarini, Frederick Goodwin, Paul Roazen, Leston Havens, Kenneth Kendler, Jules Angst, Athanasios Koukopoulos, and Ronald Pies. Havens always taught us: “Take your theories lightly” and, to the end of his life, he was still rearranging the furniture of his mind. My friend, Dr Marc Agronin,5 has written about how some people manage to age well, including in the intellectual sense I’m describing: he relates examples from his experiences with Erik Erikson and Senator George McGovern, among others. An observer once told me that he saw Erikson, in his 70s, walk up to a Harvard dean at the faculty club and say, “Dean, can I be given a sabbatical for the next year? I have been thinking that most of my ideas may have been wrong.” These exceptions give us hope that some of us may evade Planck’s law; but they are, unfortunately, a minority. Over 2 decades of active involvement in our profession, I’ve known many more psychiatric leaders who follow Planck’s law than who break it.

This is the problem of generations: new ideas tend to grow not because contemporaries are convinced, but because unborn generations are.
