BAM! A sharp thud on our little back deck, about a yard from me, the other day. I looked and saw a brick, lobbed over the fence by three kids in the alley. I yelled an obscenity and dashed for the gate. The kids took off and I gave chase, barefoot, indifferent to the shards of back-alley glass. The boys were young—between 9 and 12—brown-skinned. They outran me easily after a couple of blocks. But I got close enough to get a good look. They were clean and well-groomed. Nice-looking kids. They probably had moms who would give them a licking if they knew what their boys had done.

Fortunately, no damage was done. I didn’t get a concussion or a bone bruise. It didn’t total my laptop. It didn’t shatter a window. The event was not serious in the wider scheme of city crime. But it was an invasion, a violation. It pissed me off and I thought about it the rest of the day. I wondered whether their crime was racially motivated. They were black and I am white, and they probably wouldn’t have thrown that brick into a black family’s yard. Then I thought about it as motivated by class. Houses in our neighborhood are modest, but by those boys’ standards we are probably wealthy. I thought about how much violence lay behind the gesture. The beefy white cop who took my statement told me to dispose of the brick safely (lest it explode?) and suggested I work in a safer place than my back deck. The brick remains, as a reminder, and I continue to write in the garden. I will not be cowed by a nine-year-old. In the end, I concluded that class was more important than race—and mischief more important than class.

The incident was the more troubling because two days earlier, I had also been writing outside when helicopters began circling. We live near a hospital with a Medevac, and traffic copters occasionally make a few passes when there’s a jam or an accident on a nearby artery, so a couple of minutes of their drone is normal. But these persisted, and then I saw that they were black police choppers.
A few minutes later, a woman ran up our small one-way street screaming and wailing into her cell phone. We thought we heard her scream, “My baby!”
Shooting. 3600 block Old York Road. Adult female and juvenile reported to be shot.
It was about five blocks from my house, across the busy thoroughfare marking my neighborhood from the friendly but sketchier one to the east. It’s not “The Wire” sketchy. Just a lower-middle-class neighborhood, mostly black, with a higher-than-average unemployment rate, lots of families, and low-budget hipsters. Shootings are rare there, and broad-daylight gunplay is rare anywhere. But this particular afternoon, three-year-old McKenzie Elliott was playing on the porch. Caught a stray bullet. Was dead by sundown.

The piece I was trying to write that weekend was a review of several books on genetic and cultural theories of race. One is Nicholas Wade’s A Troublesome Inheritance, which received a satirical review on these pages. It is a pernicious book, a defense of white privilege on biological grounds, cloaked in the same phony tone of reason that eugenicists and anti-evolutionists have evoked for decades: I just want to talk about this issue. Science has to be able to investigate any question, no matter how unpopular. Help, help, the Political Correctness Police are trying to silence me. Blah blah blah.
In the early 1980s, I learned that the nature/nurture controversy was officially over. The Victorian polymath Francis Galton had coined the phrase “nature vs. nurture” a century before.
Everyone knows now that it’s a false dichotomy. Everything interesting is shaped by both genes and environment, and moreover, genes and environment mold one another. The relative influence of genetics on a trait is not fixed; the trait may be primarily genetic under some conditions, primarily environmental under others. Scientists know this. Science journalists know it. Scholars of science know it. We have moved past it. Twenty-first-century biology is about the interplay of heredity and environment: gene–gene, gene–environment, and environment–environment interactions.
Except it isn’t. Why else do we still have books like Wade’s? If anyone ought to be up on the latest findings in genetics, it ought to be him, a long-time reporter on the genetics beat for the New York Times. Yet instead of providing a fair survey of the field as he was trained to, he chose to be persuaded by a narrow slice of work that continues a long-discredited scientific tradition, one focusing on the biological race concept and its supposed connections with intelligence, sexuality, and other tinderbox issues. As Sussman shows, much of this research is sponsored by the blatantly white-supremacist Pioneer Fund. When it comes to those qualities we think of as quintessentially human, the basic question of nature or nurture seems independent of the state of scientific knowledge. The question returns with force whenever the trait is morally charged. Sexuality. Violence. Intelligence. Race.
Since the 1970s, the brilliant Marxist population geneticist Richard Lewontin has been arguing that the essence of using genetics as a social weapon is equating “genetic” with “unchangeable.” For decades, Lewontin has been pointing out examples of how that’s not true. It’s even less true now, with biotechnology such as prenatal genetic diagnosis and genome editing. Increasingly, the eugenicists’ dream—the control of human evolution—seems to be coming within our grasp. The new eugenicists want to give individuals the opportunity to make the best baby money can buy. No government control, they insist, no problem: if the free market takes care of it, the ethical problems disappear. Adam Smith’s invisible hand will guide us toward the light. As we take control of our own children’s genomes, the rich white people may have rich white babies, but, once we equalize access to whole genome sequencing, IVF, and prenatal genetic diagnosis, then poor black couples can have,…um…the smartest little black babies they can. And so can the Hispanics! And the Catholics who believe procreation shouldn’t require intervention, well they can produce “love children,” just like in GATTACA. It’ll all be fair and market-driven, once we socialize it a little bit.
So why are we even still talking about race and IQ? To Wade and others who say that it is a reasonable scientific question, that proper science has no politics and that the Morality Police have no business blocking scientific progress, I respond: What progress? What benefit? In order to frame this as a scientific question one has to define race, and any definition of race has a moral dimension. There is no way to ask whether racial associations with IQ are “real” without an agenda. The association of race and IQ is a legitimate historical question, but it must be acknowledged that even the most objective historian can only be interested in that question for moral reasons. If the scholarship is good, the agenda will be transparent, evaluable, debatable. But not absent. A good scholar (or reporter) will seriously investigate other viewpoints, present all sides. But he or she will not make pretense to absolute objectivity. The great danger of scientific investigations of questions such as race and IQ is just that pretense.
Science has immense cultural authority—it is the dominant intellectual enterprise of our time. Consider the state of funding or education for “STEM” (science, technology, engineering, mathematics) fields versus that for the humanities, social sciences, or arts. A good deal of science’s cultural authority stems from its claims to objectivity. Thus when a scientist investigates race and IQ, or a science journalist writes about it, they can invoke a cultural myth of science as having privileged access to The Truth. Not all do it—those with historical sensitivity recognize and teach the fallibility of science. But it’s common enough, even among experienced science educators and reporters, to be a crucial justification for the scholarly study of science as a social process. Science has a potent Congressional lobby. Like any industry, it needs watchdogs. Science is not just any industry. Aspects of it remain curiosity-driven, independent of the profit motive. It has an aesthetic side that unites it with the arts. And yet, for many types of questions, it provides a pleasingly rigorous set of methods for cutting through bias and pre-expectation. When scientific methods are pitted against superstition, belief, and prejudice, I side with science every time.
But when you study a lot of science; when you examine it over broad swaths of geography and time, rather than focusing on one particular tiny corner of it; when you study the trajectories of science; when you study the impact of science; when you examine the relationship of science to other cultural enterprises; you find that scientific truth is always contextual. The science of any given day is always superseded by the science of tomorrow. Despite popular myth, science does not find absolute Truth. “Science erases what was formerly true,” wrote the author John McPhee. When I was in college, brain-cell formation stopped shortly after birth. The inheritance of acquired characteristics was debunked nonsense. Genes were fixed and static. Humans had about 100,000 of them. IQ did not change over one’s lifetime. There were nine planets in our solar system. All of that was scientifically proven. None of it is true any more. Only a scientist ignorant of history can be confident that what she knows now will still be true a generation hence.
Which brings me back to the murder and the brick. On one level, the shootings a few blocks away were another incident of violence, probably drug-related, in a poor, predominantly black neighborhood. When they catch the bastard who shot that little girl, if they do a DNA test they might find genetic variants that occur with higher frequency in black males than in the population as a whole. If I catch the little punk who nearly beaned me with that brick, should he spit on my clothes and were I to have it analyzed, the lab might find SNPs in his DNA associated with a predisposition to violence. Whether those differences exist is a legitimate scientific question. But it is moot. The only reason to ask is to prove an innate predisposition that, historically, has tended to foster racism and hinder social change. Such questions may be scientifically legitimate, but they are stupid questions, and the motives of anyone who asks them are suspect. It’s not censorship to declare certain inquiries out of bounds. And people knowledgeable about science but outside the elite ought to be part of the process. Scholars. Journalists. Technicians. Students. Research funding should be less of a plutocracy, more of a representative democracy, so we can make better decisions about which questions are worth asking. In my case, the right questions are not “What biological differences account for that brick or that murder?” They are: Who is that brick-throwing kid’s mom? Can I, a “rich” white male, win her trust enough for her to let me into her house, to tell her my story in a way she can hear, so that she can discipline her child and get him back on a more positive path? What can we do to take our neighborhoods back, to make them not shooting galleries but communities again? How can we get people to know their neighbors, to keep their eyes open, to watch out for each other?
The other night, my wife took me along to an impromptu wake for the murdered girl, a five-minute bike ride away, near where the shootings occurred. In conventional racial terms, the crowd looked like Baltimore: about two-thirds black, one-third white (the latter mostly young), a sprinkling of Asians. But culturally, it was a black event, run by black women. The MC was the head of the neighborhood community association, a black woman. Words were said by the mayor, a state senator, a city councilwoman—all black women—and the governor, a white man. There was a prayer led by Sister Tina, a holy-rolling preacher who could make a middle-aged, over-educated, white atheist’s eyes well with her furious message of love and community. After the prayers and speeches, one young man threw down a Michael Jackson imitation, lip-synching and doing every move in Michael’s bag—full splits, knee-drops, and skids—on the coarse, hot Baltimore asphalt. The crowd whooped its approval. But the power that evening was held by the women. As we got ready to leave, I walked up and introduced myself to a few of those formidable, warm women. I threw my arms around Sister Tina and told her I thought she was amazing. She beamed and said she could see that the light of God was in me, she could see that I understood. And maybe I did. I know too much about evolution to believe in a literal god, but our mutual warmth and shared ideals are real. It may have been a culturally black event, but all were welcome. I understood in a new way how race matters in exactly the ways, to precisely the extent, that we want it to. Searching for the SNPs that make “them” and “us” different, seeking differences in test scores between the mixture of genes and culture Americans call “black” with those we call “white,” divides us. But here in this corner of this city, we have opportunities to celebrate each other’s cultures, and we have opportunities to share each other’s grief. 
The more I take those opportunities, the less value I see in the sciences of human racial difference.
A flurry of eugenics-related news over the last couple of weeks demonstrates that we have to stop considering eugenics a historical period and think about it more as an ever-present theme. In my book I called it “the eugenic impulse”—not to invoke some sort of misty, mystical force but rather simply to point to something that seems deeply part of our nature. Which is not to say part of our DNA. My research convinced me of two things:
1) Mixed with the chauvinism, intolerance, and paternalistic governmentality of Progressive-era eugenics was an impulse to prevent disease and disability using state-of-the-art knowledge of heredity.
2) Mixed with present-day impulses to prevent disease and disability using state-of-the-art knowledge of heredity is a great deal of hype motivated more by the desire for profits than by humanitarian concerns.
In short, I could not escape the conclusion that some aspects of contemporary genetic medicine—both good and bad—are indistinguishable from some aspects of Progressive-era eugenics—both good and bad.
The Science of Human Perfection is my attempt to wrestle with the question, “Is eugenics ever okay?” Because I have refused to come down on the side of the dogmatic anti-eugenicists, some pro-eugenics types, eager for recruits, have marshaled my words for their cause. At the same time, some antis have accused me of supporting the enemy. If I make the argument that modern medical genetics comes from the same rootstock as Progressive-era eugenics, they fear that anti-abortion fanatics will use my work as ammunition to repeal Roe v. Wade.
To those of you on both extremes, here’s my answer: No, eugenics is not okay. It scares the crap out of me, to be honest. But it’s happening anyway. No one—and certainly not a historian—is going to stop us from using genetic technology in the attempt to perfect the human race. The most intelligent response is to point out (and so hopefully avoid) the greatest risks.
For years, historians of eugenics have maintained that the term eugenics is no longer helpful. It is too loaded, they say; invariably, it invokes the Nazi past. Whatever programs in controlled breeding or self-directed evolution may be going on, it’s alarmist and a distraction, they say, to call them “eugenics.” For years, this was a reasonable and level-headed response, but it is no longer viable. Not because it’s less loaded, but because today’s historical actors are using it.
A growing number of commentators from within the scientific community are arguing for a reconsideration of eugenics:
“Seeing the bright side of being handicapped is like praising the virtues of extreme poverty. To be sure, there are many individuals who rise out of its inherently degrading states. But we perhaps most realistically should see it as the major origin of asocial behavior that has among its bad consequences the breeding of criminal violence.” (James Watson, “Genes and Politics,” 1997)
In 2001, the conservative theorist Richard Lynn published Eugenics: A Reassessment, which argues just what you think it does. In 2002, researcher DJ Galton (no relation to the founder of eugenics) considered the new genetics, test-tube babies, and genetic screening and called a spade a spade: Eugenics: The Future of Human Life in the 21st Century.
“Eugenics failed because it was not scientific enough…The role of eugenics in our time is in maximizing [hereditary] information and its availability to those who need it and minimizing the temptation to use the State as the means of enforcing eugenic ideals.” (Elof Carlson, “The Eugenic World of Charles Benedict Davenport,” 2008)
“A new interest in rational discourse about eugenics…should be our goal.” (Maynard Olson, “Davenport’s Dream,” 2008)
“Soon it will be a sin of parents to have a child that carries the heavy burden of genetic disease. We are entering a world where we have to consider the quality of our children.” (Bob Edwards [creator of first test-tube baby])
“Eugenics, once discredited as part of the first wave of social authoritarian progressives that trampled free will for women, handicapped people and minorities, is attempting a 21st century comeback.” (Hank Campbell, “Genetic Literacy Project on Neo-Eugenics,” 2012)
The most recent is Jon Entine, who runs the Genetic Literacy Project and writes regularly for the conservative money magazine Forbes. “Instead of being driven by a desire to ‘improve’ the species,” he writes, the “new eugenics is driven by our personal desire to be as healthy, intelligent and fit as possible—and for the opportunity of our children to be so as well.” (Jon Entine, “DNA Screening is Part of the New Eugenics—and That’s Okay,” 2013)
No, we are not trying to improve the species—just our children, and our children’s children, and our children’s children’s children,…
Talk of a new eugenics, then, is no longer idle hand-wringing. When our actors themselves are using the term, historians and philosophers need to take notice and help make sense of it.
The fact that Entine writes for Forbes, Ridley for the National Review, and Lynn for Mankind Quarterly suggests a linkage between the new eugenics and conservative ideologies. Eugenics has long had such associations. Some of the neo-eugenicists (e.g., Lynn) are ideologically linked to the old, discredited eugenic ideologies. But others (e.g., Ridley, Entine) are, I think, more complicated. Liberals and conservatives, of course, are a diverse lot. When critiquing neo-eugenics, we must consider whether someone is writing from a position of profit-making, preservation of the social status quo, libertarian individualism, or some other ideology.
Further, liberals can be eugenicists too. As Diane Paul showed years ago in “Eugenics and the Left,” political liberals were also deeply involved in eugenic schemes during the Progressive era. Most historians of eugenics agree that, to a first approximation, everyone in the Progressive era was a eugenicist. Sterilization legislation was democratically approved, and most sterilizations were carried out in state hospitals, under at least a premise of social benefit. There may well have been a conservative slant to Progressive-era eugenics, but it was only a slant, and by the 1930s eugenics probably had a liberal one.
Because of this political ecumenicalism, eugenics today makes for some strange political bedfellows. If some pro-eugenics advocates lean conservative, so do some antis. The Catholic Church—hardly a bastion of liberal fanaticism—opposes eugenics on grounds that it generally entails either abortion or embryo selection. Matt Ridley favors eugenics and is a pro-business conservative. Genetic screening can be seen as a liberal, feminist issue—an issue of women’s choice and empowerment. Or it can be seen as a tool of government social control. Finally, genetic screening and eugenics are not necessarily the same thing. The Center for Genetics and Society supports abortion and genetic screening but seeks to establish a critical biopolitics that can help shape policy to reap the benefits and avoid the risks of reproductive technologies—a position Entine constantly takes them to task over, presumably because they are not simple cheerleaders.
Eugenics, then, does not hew unswervingly toward either pole of the political spectrum. The eugenics question forces us to parse some traditionally liberal and conservative ideas in new ways. Favoring genetic technology is pro-business (conservative). Favoring prenatal genetic diagnosis with abortion is pro-choice (liberal). Fearing the power of genetic manipulation falling into the hands of totalitarian regimes: liberal. Favoring open markets and “consumer choice”: pro-business conservative. Sometimes this consumer-driven eugenics is even called “liberal eugenics.” Perhaps that’s a smokescreen, but maybe not entirely.
Political ideology, then, can’t help us make an easy decision on whether eugenics is ever okay. If the new eugenics has a conservative tilt it’s only a tilt, and there’s plenty of counterweight on the other side. Unfortunately, we’re going to have to make up our own minds.
To do that, we first have to accept that the eugenic train has left the station. Understood as “the self-direction of human evolution” (the slogan of the 1921 eugenics congress and, for me, still the most inclusive definition I’ve found), eugenics is going to happen. Is happening. Always happens. For now, it’s still mainly for elites who can afford expensive IVF and genetic screening, but the cost of those procedures is dropping rapidly, and more people gain access to them each year. Many people are in fact already making eugenic choices, from the wealthy who can afford prenatal genetic diagnosis with selective abortion to the Dor Yeshorim, who screen for and discourage marriage between carriers of Tay-Sachs and a range of other genetic diseases. On this much, I agree with folks like Entine. Where we part company is that I’m not nearly so sanguine about it as he seems to be.
Recognizing that we are grasping the reins of human evolution as fast as we can raises two sets of concerns. First, “What if it doesn’t work?” It has been argued for some time that our technological capacity greatly outstrips both our wisdom and our understanding. It is often argued that genetic choices have been made since the dawn of marriage, so opposition to techniques such as embryo selection is mere technophobia. But even age-old holistic breeding practices have unpredictable, undesired effects. Sweet-tempered Labrador retrievers tend to get hip dysplasia and eye problems. Great Danes’ hearts fail. Some quarter horses are prone to connective-tissue disorders or “tying-up” episodes related to their highly bred musculature. The European royal families are prone to hemophilia and polydactyly. Selecting for single genes, rather than for traits that involve suites of genes that have evolved together, seems likely to exacerbate such unintended consequences. The emerging science of systems biology holds that genes act—and hence evolve—in networks; selecting for particular genes rather than complex traits disrupts those networks and is likely to have unpredictable effects.
We in fact have very little idea how the genome works. The genome is like an ecosystem, a brain, or the immune system: an immensely complex, deeply interconnected system. Altering one or a few elements has effects that are not only unknown but in many cases unpredictable. Evolution, Darwin showed, is an immensely slow process, in which innumerable parts “negotiate” with one another to produce the best-adapted organisms in a given environment at a given time. In taking control over that process, we will be altering the “ecology” of the genome, and that is bound to have effects similar to our impact on the environment. With great wisdom, it might be handled safely, but experience does not give one much hope for collective human wisdom.
The second concern is, “What if it does work?” What if it does indeed become possible to select traits—health, height, complexion, intelligence—without creating cruel monsters? I have enough faith in technology to think this may eventually happen. Some unforeseen consequences will doubtless occur, but in time they will become correctable. So what do we do when this becomes possible? We need to keep in mind that this will be a tool of the upper strata of society for a good long time. The rich will do it more than the poor, and Americans and Europeans will do it more than Bangladeshis and Somalis. So it will be a way of inscribing socioeconomic status literally in our DNA. This is in fact a conservative application, because it will tend to reinforce the socioeconomic status quo.
Further, in most developed countries, it’s not government control we need to worry about; it’s corporate control and the tyranny of the marketplace. Advertisers will push certain genotypes. Ad campaigns, current styles, and the rapidly shifting current consensus on what is or is not healthy will shape people’s genetic decisions. And of course, you can’t shed your genome the way you can last year’s fashions. The concern here, then, is that the new eugenics harnesses long-term processes in the service of short-term goals. This too will have unpredictable effects. History shows without a doubt that societies are rarely wise; we have great trouble seeing several moves ahead, planning for the future, delaying gratification, or sacrificing some of next quarter’s earnings so that we may reap greater health and happiness some time in the future. Even more troubling than failures of technology, then, are failures of morality. And glib reassurances that we are beyond Nazi-style totalitarianism do little to comfort me. The age of self-interested individualism can be just as scary as that of communal self-sacrifice.
Most critical analyses of past eugenic efforts have centered on race, class, and gender. I think the greatest concern with the new eugenics will likely be the fourth member of the “big three”: disability. Another recent story concerns the stunning development of a method of “silencing” chromosomes. Every nucleated cell in a woman’s body does this naturally, turning off one of her two X chromosomes; otherwise, women would have a double dose of X-chromosome genes, which would lead to lots of problems. The advance lies in harnessing this mechanism so that it can be applied to non-sex chromosomes. Down syndrome results from an extra (third) copy of chromosome 21. The blogs and papers have been awash lately with speculation about “shutting off” the extra chromosome 21 in embryos, to prevent Down syndrome.
The problem is that the severity of Down’s is unpredictable. A family might well be happy to have a high-functioning Down’s baby, but a severely affected child suffers greatly, as do its parents. Who would take that chance? If (when) this technique becomes widely available medically, the frequency of Down syndrome will drop, simultaneously reducing the suffering of severely affected children and their families and the joy and love among those close to high-functioning Down’s patients. No humane person would wish, say, Down syndrome on a family not equipped to handle such a child. But nor would I want to live in a society lacking in people with Down syndrome, or little people, or the blind. It’s not a wish for suffering; we all suffer. But engineering our own evolution will likely have a normalizing effect. Intolerance of abnormality was, indeed, a common theme among Progressive-era eugenicists, and greater power over our genetic future is only likely to increase it. The movie GATTACA got this much right: genetic disease leads to suffering—but so does intolerance.
Is eugenics ever okay? On the individual scale, the choice not to raise a child with a debilitating disease is one I think we have no moral choice but to condone. When a prospective parent talks with a genetic counselor about whether to prevent a deformed or diseased baby from being born, that is in fact a form of eugenics. But my research made it undeniable that eugenics has always been simultaneously about individuals and populations. Individual choices lead to population changes, and individual choices are influenced by more than objective genetic knowledge. Although those parents’ choice is for their family rather than the race, they are simultaneously participating in the self-direction of human evolution; it is a choice that any Progressive-era eugenicist would have condoned. And, granting the right to abortion and embryo selection, it is an entirely moral choice.
But what influences that parent’s choice? The biomedical industry hides truly fantastic profits behind the cloak of “health.” Moving responsibly into this inevitable future demands that someone call out the self-interest of the diagnostics and pharmaceutical companies, the instrument-makers and laboratories, the hospitals, the advertisers, and the investors in this new age gold mine. It demands analysis of subtle forms of coercion. It demands a jaundiced eye. Skepticism isn’t Luddism, isn’t anti-choice, isn’t anti-health. It’s following the money.
Much as one might wish to do so, the genie can’t be stuffed back into the bottle. The new eugenics is here. This worries me greatly. But worry, by itself, solves nothing. The concerns it raises are too complex for either dogmatism or complacency. It comes with new, subtle kinds of coercion. Science alone cannot be our guide into this brave genetic world. The closer we come to guiding our own evolution, the more important a humanistic perspective—one that takes the long view of history and the broad view of social context—becomes in helping us make sense of it. The future is here, and, dammit, it’s complicated.
[Update 7/26/13 3:20 pm: Changed description of the Center for Genetics and Society to more accurately reflect their philosophy and agenda. H/t Alex Stern.]
I’m gathering my thoughts on this issue, so stay tuned. But two points immediately leap out at me. First, both Entine and Shah are either ignorant or Panglossian about the early history of eugenics. Entine writes that some imagined “negative wing” of the eugenics movement “was never widely embraced.” Historians of eugenics agree that on some level, almost everyone in the Progressive era was a eugenicist, in the sense of advocating or supporting eugenics. There was no “negative wing”—there was only positive and negative eugenics, which were seen as complementary.
And Shah writes that the Nazis’ use of eugenics “ended up undermining its credibility as a science.” Actually, its credibility as a science had been undermined for quite some time. By 1933, few seriously trained geneticists were willing to do more than sigh longingly for the day when we would know enough to direct our own evolution without wrecking the gene pool, society, or both. Its popularity as medicine and as population control rose steadily through and beyond the Nazi period. Indeed, the Nazis’ experiment in scientifically rationalized genocide coincided with the peak in sterilization and compulsory birth control of Americans and Scandinavians, and with explicitly eugenic programs ranging from immigration control to race- and class-based family planning on every inhabited continent of the globe.
The second point that immediately comes to mind is that these reports and commentaries suggest that my argument in the conclusion of The Science of Human Perfection, about eugenics regaining respectability in the post-genome age, is correct (see also my article “The Eugenic Impulse”). Scientists, at least, really do seem to be more comfortable with the term “eugenics” as a name for what they are trying to do. And what they’re trying to do, in a nutshell, is engineer ourselves a better future. To control human evolution.
The argument is that “Sure, it was done wrong before–but that was because we didn’t understand the science well enough.” That’s always the argument. Eugenicists have always said “Now we know enough to do it right.” And the next generation always comes along and clucks its tongue at the naivete and ignorance of its forebears.
No, it’s not because we didn’t understand the science. It’s because we didn’t understand society well enough before. And for all the remarkable technological advances of the last century, there’s scant evidence that we understand society much better now.
A recent post by Jon Entine on the Forbes website leads with a complimentary citation of my book, and then goes on to undermine its central thesis. He concludes:
Modern eugenic aspirations are not about the draconian top-down measures promoted by the Nazis and their ilk. Instead of being driven by a desire to “improve” the species, new eugenics is driven by our personal desire to be as healthy, intelligent and fit as possible—and for the opportunity of our children to be so as well. And that’s not something that should be dismissed lightly.
Well, first of all, as the recent revelations of coerced sterilization of prisoners in California show, “draconian, top-down” measures do still occur. Genetics and reproduction are intensely potent, and wherever we find abuse of power we should be alert to the harnessing of biology in the service of tyranny.
Second, there’s more than one kind of tyranny. Besides the tyranny of an absolute ruler, perhaps the two most potent and relevant here are the tyranny of the commons and the tyranny of the marketplace. The fact that they are more subtle makes them in some ways more dangerous. The healthcare industry does much good in the world, but it is naive to treat it as wholly benign.
Further, putting human evolution in the hands of humans means accepting long-term consequences for short-term goals. The traits we value–health, intelligence, beauty–are the result of the action of many genes interacting with each other and with a dynamic environment. The entire system is contingent, inherently unpredictable. Yet we treat it as simple and deterministic. Until now, technology has been the major obstacle to guiding human evolution. It may be that now the major obstacle is our reasoning ability, our capacity for grasping contingency and probability and change. We’re tinkering with the machinery of a system whose complexity is still unfolding before us. The probability of unforeseen consequences is 100%. The only question is how severe they will be. We will only know in retrospect.
If we now have the tools to meaningfully guide our own evolution–as eugenicists have always wanted to do–we cannot take a blithe and Panglossian attitude. We have to be alert to the risks and take them seriously. That is not traditionally science’s strong suit. The public face of science is sunny, optimistic, fun. It strides boldly into the future, laughing and making striking promises. The industries behind science and health are wealthy and politically powerful. Not everything they do is benign.
To be a critic of that public-relations machine–of hype, in other words–is not to be a critic of health or knowledge or progress. Genetic science has the potential to bring us enormous benefits in health and well-being, and as those benefits arrive, I stand in line with my fellow humans for my fair share. But that science also carries huge and unforeseeable risks, at the root of which, perhaps, lies arrogance, a flaw whose consequences are painfully evident in the historical record.
[UPDATE: Changed the link from the Sacramento Bee article to the longer report from cironline. h/t Alex Stern.] A quick note on today’s report from the Center for Investigative Reporting that at least 150 pregnant inmates in prisons in Corona and Chowchilla, CA, were sterilized against their will. Between 2006 and 2010. That’s TWO THOUSAND six. Another hundred or more may have been sterilized in the 10 or so years before that. (See also this HuffPo piece from last month.)
In an earlier post, I noted that when I applied for my marriage license in California, my betrothed and I received a state-sponsored booklet called “Your future together.” It was heavily gene-centered and mentioned that one can obtain free birth control and sterilization, paid for by the state. The historian Alexandra Minna Stern has written about the racial politics of California sterilization (see my review of her latest book–and then buy the book). Not surprisingly, those sterilized in the largest numbers are poor Mexicans, often illegal immigrants. Those surgeries, however, are at least nominally voluntary. Involuntary sexual surgery on prisoners sounds like something from the 1910s, not the 2010s.
In my book, The Science of Human Perfection, I note that eugenics is alive and well, though it often travels under an assumed name. The principles of informed consent can be–and as this report shows, are–used to mask persuasion. When that persuasion includes being made to “feel like [I] was a bad mother if I didn’t do it,” it grades into coercion. Further, the ethics of sterilizing minority women in prison are even more fraught than outside prison walls; one wonders, for example, how many of those women were impregnated by prison guards. We should not let the drawing of apparently bright ethical lines allow us to become complacent about the gray, unlit areas where that good ol’ time eugenics can still flourish.
“Personalized medicine” is both one of the hottest topics in biomedicine today and one of the oldest concepts in the healing arts. Taking the long view reveals some of the trade-offs in trying to personalize diagnosis and treatment—and suggests that truly personalized medicine will involve not only technological advance, but also moral choices.
Visionaries of the genome claim that molecular personalized medicine will eliminate “one size fits all” medicine, which treats the disease, and return us to an older approach, in which the patient was pre-eminent. The revolution, they say, will be “predictive, preventive, personalized, and participatory”—it will be possible to identify why this person has this disease now, and even to prevent disease before it starts.

Personalized medicine depends on the individual person. But the individual is not a constant. Over the centuries, the medical individual has evolved along with the increasing reductionism of biomedicine. Medicine has narrowed its scope, moving from the whole person, to part of the body, to proteins, DNA sequences, and single nucleotides. Underlying contemporary, genomic personalized medicine are two assumptions: first, that molecular medicine operates on a level that unites us all (indeed, all life), and thus is the best—even the true—way to explore and describe human individuality; and second, that understanding human individuality on a molecular level will lead willy-nilly to better care and a less alienating medical experience for patients.

I think a lot of benefit can come out of the study of genetic constitution and idiosyncrasy—and one can hardly oppose the idea of more personal care. But demonizing one-size-fits-all, promising a revolution, and making fuzzy connections between biochemistry and moral philosophy are risky propositions. Personalized medicine today is backed by money and larded with hype. Setting the medical individual in historical context, we can ask what personalized medicine can and cannot claim. In short, what is the difference between personalized and truly personal?
Seed and soil
The Hippocratic physicians, Aristotle, and Galen all used the concept of diathesis to describe the way a person responds to his environment. They used the term flexibly, to describe everything from a tendency to particular diseases to one’s general constitution or temperament. Around 1800, “diathesis” gained a more specific meaning: it came to signify a constitutional type that made one susceptible to a certain class of diseases. Some diatheses, such as scrofulous, cancerous, or gouty, were believed to be inherited. Others, such as syphilitic, verminous, or gangrenous, were understood as acquired. In its original sense, then, diathesis was related to heredity, but not synonymous with it.
Diathesis and constitution were often discussed in the form of a metaphor of seed and soil. In 1889, the physician Stephen Paget wrote in relation to breast cancer, “When a plant goes to seed, its seeds are carried in all directions; but they can only live and grow if they fall on congenial soil…Certain organs may be ‘predisposed’ to cancer.” The “soil,” he said, could be either a predisposition of certain organs, or diminished resistance. The seed and soil metaphor was also applied to infectious disease. Radical germ theorists such as Robert Koch often argued that the germ was both necessary and sufficient to cause disease. Critics observed that not everyone who was exposed to the germ developed the disease, and that the intensity of the disease often varied. Max von Pettenkofer quaffed a beaker of cholera bacteria and suffered only a bit of cramping. The seed-and-soil metaphor helped explain why. In 1894, the great physician William Osler argued:
As a factor in tuberculosis, the soil, then, has a value equal almost to that which relates to the seed, and in taking measures to limit the diffusion of the parasite let us not forget the importance to the possible host of combating inherited weakness, of removing acquired debility, and of maintaining the nutrition at a standard of aggressive activity.
It was a losing battle, though. The germ theorists were winning: diathesis and constitutionalism were already becoming outdated.
“One size fits all” medicine is a direct legacy of the germ theory of disease and of the notion that you can isolate the causative agent in any disease. This was a remarkable advance in medical history. It didn’t matter whether you were a princess or a hack driver, doctors could figure out what you had and make you better. The great legacy therapies of microbial medicine—salvarsan, penicillin, the polio vaccine—represented the first times in medical history that doctors actually cured anyone. One-size-fits-all medicine, then, was positively brilliant, a medical revolution, in an age and culture where infectious disease was the leading killer. But it always had critics, doctors and others who bemoaned the loss of complexity, artistry, and humanity from the medical arts. One of those critics was the London physician Archibald Garrod.
The case of the black nappy
Constitution, or soil, had always been associated with heredity; Garrod linked it to genetics. Garrod was a biochemically oriented doctor, interested in the physiological mechanisms of disease.
In 1898, a woman brought her newborn baby to his clinic. The baby seemed healthy, but she had noticed that its diapers turned an alarming black. The biochemically trained Garrod identified the condition as alkaptonuria, an exceedingly rare and essentially harmless condition believed at the time to be caused by a microbe. Garrod collected all the cases he could, mapped out pedigrees, and published a short note suggesting that the high frequency within the families of his study could hardly be due to chance.

The naturalist and evolutionist William Bateson read Garrod’s paper. He had long been interested in variation as the basis of evolution. Bateson had just discovered Gregor Mendel’s work and was emerging as Mendel’s greatest champion in the English language. He saw in Garrod’s alkaptonuria cases powerful ammunition against Mendelism’s critics: proving Mendelian inheritance in man would silence those who dismissed it as a primitive mechanism, restricted to plants and lower animals. Bateson and Garrod collaborated to show that alkaptonuria indeed follows a Mendelian recessive pattern. In 1902, Garrod summarized these and other analyses in “Alkaptonuria: a study in chemical individuality.” It was a classic paper, carefully researched, brilliantly argued, compassionate, rational–and widely ignored. Part of the reason for its lack of medical impact was Garrod’s resolutely non-clinical thrust. He was interested not in finding the genes for disease, but in discovering the harmless idiosyncrasies that make us unique. The 1902 paper began to elaborate a biochemical theory of diathesis, which Garrod developed over the succeeding decades. Predisposition to disease, and constitution generally, he said, were biochemical in nature.
“Just as no two individuals of a species are absolutely identical in bodily structure,” he wrote, “neither are their chemical processes carried out on exactly the same lines.” He proposed that physiological traits including responses to drugs would be similarly individual and presumably therefore genetic:
The phenomena of obesity and the various tints of hair, skin, and eyes point in the same direction, and if we pass to differences presumably chemical in their basis, idiosyncrasies as regards drugs and the various degrees of natural immunity against infections are only less marked in individual human beings and in the several races of mankind than in distinct genera and species of animals.
Obesity, racial features, drug idiosyncrasies, and sensitivity to infectious disease, of course, are now among the primary targets of genetic medicine.
Malaria, drugs, and race
They are also interrelated. For example, take primaquine. After WWI, Germany was cut off from the quinine plantations of Indonesia. German pharmaceutical companies such as Bayer developed synthetic antimalarial drugs, the best of which was primaquine. Primaquine was field-tested in malarial regions such as banana plantations in South America and the Caribbean, especially those run by United Fruit Company. Most of the banana-pickers were black Caribbeans. Researchers found that primaquine was effective, but that about one in ten blacks developed severe anemia when they took it. In whites, this response was extremely rare. This response became known as “primaquine sensitivity.” Today it is recognized as an expression of G6PD deficiency, the most common genetic disease in the world.
During WWII, the Indonesian quinine fields fell to the Japanese; now it was the US that needed synthetic antimalarial drugs. The Army set up research programs in several American prisons—the largest and best-run of these was at Stateville Prison in Illinois. Prisoners were given experimental malaria of different types, and then experimental drugs were tested on them. Race determined the prisoners’ roles in the experiments. Ernest Beutler, one of the researchers on the project, said in an interview:
We knew it was only the blacks who were primaquine sensitive. So that was very important. Second place, the blacks didn’t get malaria. They’re resistant to vivax. So we used black prisoners for studies of hemolysis, we used white prisoners usually for malaria.
Thus, skin color became a proxy for susceptibility to malaria. There are other examples, with other drugs and other diseases. Isoniazid was billed as a miracle drug for tuberculosis. But it was soon found that roughly half of all whites and blacks were extremely sensitive to the drug. Physiological studies showed that they metabolized the drug more slowly; their blood drug levels built up quickly, leading to adverse side effects. In such slow acetylators, isoniazid could trigger peripheral neuropathy and even a lupus-like autoimmune reaction. Interestingly, only 15 percent of Asians were slow acetylators. Another drug, succinylcholine, is a muscle relaxant used primarily as a premedication for electroconvulsive therapy. D. R. Gunn and Werner Kalow found that in about one in 2,500 Caucasians, it paralyzes the muscles of breathing.

By the mid-fifties, then, idiosyncratic drug response, susceptibility to infectious disease, and the “various tints of hair, skin, and eyes” were linked in the study of genetic individuality. In 1954, the brilliant and visionary geneticist JBS Haldane could write a small book on biochemistry and genetics. In concluding, he suggested:
The future of biochemical genetics applied to medicine is largely in the study of diatheses and idiosyncrasies, differences of innate make-up which do not necessarily lead to disease, but may do so.
The young physician Arno Motulsky, of the University of Washington, took that notion as a call to arms. In 1957, he reviewed the literature on drug idiosyncrasy and gave it both context and an audience. “Hereditary, gene-controlled enzymatic factors,” he wrote, “determine why, with identical exposure, certain individuals become ‘sick,’ whereas others are not affected. It is becoming increasingly probable that many of our common diseases depend on genetic-susceptibility determinants of this type.” His short article became a classic and is often cited as the founding paper of pharmacogenetics. The actual term wasn’t coined until two years later, and then in German, by the German researcher Friedrich Vogel: “Pharmakogenetik.” In 1962, Werner Kalow published a monograph (in English) on pharmacogenetics. The field soon stalled, however; little was published on the subject through the ‘60s and ‘70s. Pharmacogenetics only really gained traction after the development of gene cloning.
Molecular approaches to variation had been developing since the late 1940s. In 1949, using the new electrophoresis apparatus—biology’s equivalent of a cyclotron—the great physical chemist Linus Pauling found that the hemoglobin of sickle-cell patients had a different mobility from normal hemoglobin. He called sickle cell “a molecular disease.” It was also, of course, considered a “black disease,” for reasons connected to primaquine sensitivity. Pauling once suggested that carriers of sickle cell and other genetic diseases should have their disease status tattooed on their foreheads as a public health measure. It was the kind of step eugenicists of the Progressive Era might have applauded. Pauling was a brilliant and imaginative analyst, but he was not a visionary. He did not foresee that all diseases would become genetic.

The study of molecular diseases was greatly aided by technological developments in the separation, visualization, and purification of the proteins in biological fluids. Searching for alternative media to replace paper, researchers experimented with glass beads, glass powders, sands, gels, resins, and plant starch. Henry Kunkel used powdered potato starch, available at any grocery. Although inexpensive and compact enough to fit on a tabletop, the starch apparatus could analyze a fairly large sample. Still, electrophoresis with starch grains had its drawbacks. In 1955, Oliver Smithies, working in Toronto, tried cooking the starch grains, so that they formed a gel.
This not only made the medium easier to handle and stain; it made the proteins under study easier to isolate and analyze. Starch gel democratized electrophoresis. Immediately, all sorts of studies emerged characterizing differences in protein mobility; many of these correlated biochemistry with genetic differences. Bateson’s variation had been brought to the molecular level. Sickle cell, the black and molecular disease, continued to play a leading role in the study of genetic idiosyncrasy. In 1957, using both paper electrophoresis and paper chromatography, Vernon Ingram identified the specific amino acid difference between sickle cell hemoglobin and “normal” hemoglobin—specifying Pauling’s “molecular disease.” An idiosyncrasy—or diathesis—now had a specific molecular correlate.
Polymorphism, from proteins to nucleotides
Polymorphism is a population approach to idiosyncrasy. Imported from evolutionary ecology, as a genetic term polymorphism came to connote a regular variation that occurs in at least one percent of the population. Where the ecologist E. B. Ford had studied polymorphisms in moth wing coloration, the physician Harry Harris studied it in human blood proteins. Like Garrod, Harris wanted to know how much non-pathological genetic variation there was in human enzymes. He concluded that polymorphism was likely quite common in humans. Indeed, Harris identified strongly with Garrod; in 1963, he edited a reissue of Garrod’s Croonian Lectures of 1909, Inborn Errors of Metabolism. Together with Lionel Penrose, an English psychiatrist interested in the genetics of mental disorders, Harris headed up an informal “English school” of Garrodian individualism and biochemical genetics, based at the Galton Laboratory in London. Harris fulfilled Garrod’s vision by categorizing amino acid polymorphisms and relating them to human biology.
Through the ‘50s and ‘60s, young researchers interested in human biochemical genetics streamed through the Galton to learn at the knobby knees of Penrose and Harris. Arno Motulsky penned his 1957 review just after visiting. The Johns Hopkins physician Barton Childs also spent time at the Galton, and later went on to articulate a Garrodian “logic of disease,” based on Garrod’s and Harris’s principles of biochemical individuality. Childs’s vision is now the basis of medical education at Hopkins and elsewhere. Charles Scriver, of McGill University, also studied under Harris and Penrose. He is best known for his work on phenylketonuria, a hereditary metabolic disorder that leads to severe mental retardation, but which is treatable with a low-protein, phenylalanine-free diet.

In the 1970s, recombinant DNA and sequencing technologies helped bring polymorphism down to the level of DNA. In 1978, Y. W. Kan and Andrée Dozy returned once again to sickle cell disease, and showed that sickle cell hemoglobin could be distinguished from normal hemoglobin by DNA electrophoresis. The difference can be detected by chopping up the DNA with restriction enzymes, which cut at a characteristic short sequence. The sickle cell mutation disrupts one of those restriction sites, so that the enzyme passes it over, making that fragment longer than normal. Ingram’s single amino acid difference could now be detected by the presence of a particular band on a gel. This became known as a restriction fragment length polymorphism, or RFLP. It was a new way of visualizing polymorphism.

In 1980, David Botstein, Ray White, Mark Skolnick, and Ron Davis combined Harry Harris with Kan and Dozy. They realized that the genome must be full of RFLPs. They proposed making a reference map of them, a set of polymorphic mile markers along the chromosomes.
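The logic of RFLP detection can be sketched in a few lines of code. This is a toy illustration, not Kan and Dozy’s actual protocol: the enzyme and recognition site named below (MstII, which cuts at CCTNAGG and whose beta-globin site is destroyed by the sickle GAG→GTG substitution) come from a later, classic version of the trick, and the flanking sequences are invented placeholders.

```python
import re

# Toy sketch of RFLP detection. MstII cuts at CCTNAGG; the sickle
# mutation (GAG -> GTG at beta-globin codon 6) destroys one such site,
# so the digest yields one fewer, longer fragment.
# All sequences below are invented placeholders, not real genomic DNA.
MSTII_SITE = re.compile("CCT[ACGT]AGG")

def fragment_lengths(seq):
    """Cut seq at the start of every MstII site; return fragment lengths."""
    cut_points = [m.start() for m in MSTII_SITE.finditer(seq)]
    bounds = [0] + cut_points + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:]) if b > a]

flank = "GGGGGGGGGG"
normal = flank + "CCTGAGG" + flank + "CCTGAGG" + flank  # two intact sites
sickle = flank + "CCTGAGG" + flank + "CCTGTGG" + flank  # mutation kills one

print(fragment_lengths(normal))  # [10, 17, 17] -- three bands on the gel
print(fragment_lengths(sickle))  # [10, 34] -- two bands, one longer
```

The longer band in the “sickle” digest is exactly the diagnostic signal the text describes: a single-nucleotide change made visible as a fragment-length difference.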
“The application of a set of probes for DNA polymorphism to DNA available to us from large pedigrees should provide a new horizon in human genetics,” they wrote grandly. Medical geneticists were beginning to think in terms of databases. Further, Botstein & Co. recognized their method’s potential for medically singling out individuals: “With linkage based on DNA markers, parents whose pedigrees might indicate the possibility of their carrying a deleterious allele could determine prior to pregnancy whether or not they actually carry the allele and, consequently, whether amniocentesis might be necessary.” In other words, whether abortion might be indicated. One should also be able to determine, they continued, whether cancer patients are at risk in advance of symptoms. These are basic principles of personalized or genetic medicine. Further, they wrote, the method would be useful for determining population structure—i.e., identifying racial characteristics by geography and genetics. With RFLP mapping, polymorphism was now divorced from phenotype—it was a purely genetic construct.

Researchers then took polymorphism down to the level of single nucleotides. The first single nucleotide polymorphisms, or SNPs, were identified in the late ‘80s—an estimated 1 every 2,000 nucleotides. And late in 1998, a database, dbSNP, was created to pool all this data. Researchers imagined a high-resolution map of genetic variation—an estimated 10 million variants. It was the ultimate in Garrodian genetic variation. The vision was to use dbSNP to identify any individual’s sensitivities and resiliencies.

The end of race?

A romantic ideal emerged that the discovery of such enormous variation at the DNA level was not merely a scientific but a social triumph.
In 2000, on the announcement of the completion of the draft sequence of the human genome, Craig Venter proclaimed, “the concept of race has no scientific basis.” And NIH director Francis Collins strummed his guitar and sang (to the tune of This Land is Your Land),
We only do this once, it’s our inheritance,
Joined by this common thread — black, yellow, white or red,
It is our family bond, and now its day has dawned.
This draft was made for you and me.
Since then, the genome’s supposed refutation of a biological race concept has become a standard trope among scientists, journalists, and historians. But the SNP database turned out to be too data-rich. Human genetic diversity is far too great to be useful to our poor small brains and computers. Circa 2001, it was discovered that SNPs cluster into groups, or “haplotypes.” Most of the information in a complete SNP map lies in 5% of the SNPs. This brings the number from ten million down to half a million or so. Conveniently, haplotypes cluster by race.

In October 2002, an International HapMap Consortium convened. The HapMap project sampled humanity with 270 people drawn from four populations. There were thirty “trios”—father, mother, and adult child—from the Yoruba of Nigeria—part of an African diaspora widely and condescendingly noted for its literacy. Thirty more trios were white Utahns of European ancestry. Forty-five unrelated individuals were drawn from Japanese in Tokyo and Han Chinese in Beijing (China boasts more than fifty ethnic groups, the largest of which is the Han). No native Australians or Americans, North or South, were included.

Conceptually, the HapMap project was a mess. It claimed to explore human diversity while genetically inscribing and condensing racial categories—which in turn were defined by the project in terms of highly cosmopolitan and otherwise problematic groups. It implied that the Yoruba were representative of “Africans,” the Japanese and Chinese of “Asians.” Framed as a corrective to the disastrous but well-intentioned Human Genome Diversity Project of the 1990s, it nevertheless reinforced the racial categories that the genome project was supposed to have shattered. Indeed, Venter’s proclamations and Collins’s corny folksongs notwithstanding, the use of race has actually increased in studies of genetic polymorphism in response to drugs.
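The economy of haplotype tagging can be sketched with invented data. In the toy population below, the first SNP is in perfect linkage disequilibrium with the other four, so genotyping that one “tag” SNP recovers the whole haplotype; real haplotype blocks are statistical rather than exact, but this is the core of the data reduction the HapMap exploited.

```python
# Toy sketch of haplotype tagging. The 5-SNP haplotypes below are
# invented: in this population only two haplotypes exist, and the
# allele at position 0 (the "tag" SNP) determines the rest.
population = ["AAGTC", "AAGTC", "GCTAA", "GCTAA", "GCTAA"]

# Build a lookup from the tag allele to the full haplotype it marks.
tag_table = {h[0]: h for h in population}

def impute(tag_allele):
    """Infer the four untyped SNPs from the single typed tag SNP."""
    return tag_table[tag_allele]

print(impute("A"))  # AAGTC
print(impute("G"))  # GCTAA
```

Genotyping one SNP instead of five is the same move, writ small, that brought ten million SNPs down to half a million tags; and because tag frequencies differ between populations, it is also where racial categories slipped back in.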
I looked at the number of papers listed in PubMed that had “pharmacology” as a keyword, and the fraction of those papers that also had “race” as a keyword. That proportion held fairly steady at about a third of a percent from the ‘70s through the ‘90s. But it nearly tripled in the decade after we got the genome, to more than three quarters of a percent: from 34 to 359 publications. Once a sport, a rare mutation in the pharmacological literature, race is now approaching the frequency of a polymorphism. So either race does have a scientific basis after all, or scientists are using a social construct as if it were a biological variable. Either way, there’s a problem.
BiDil and BRCA2

As Beutler used skin color as a proxy for primaquine sensitivity and malaria susceptibility, so physicians today are using it as a proxy for haplotype. For example, take the heart-failure medication BiDil. In 1999, this drug was rejected by the American Food and Drug Administration, because clinical trials did not show sufficient benefit over existing medications. Investigators went back and broke down the data by race. Their study suggested that “therapy for heart failure might appropriately be racially tailored.” The licensing rights were bought by NitroMed, a Boston-area biotech company. Permission was sought and granted to test the drug exclusively in blacks, whose heart failure tends to involve nitric oxide deficiency more often than in people of European descent. On June 23, 2005, the FDA approved it for heart failure in black patients. As a result, it became the first drug to be marketed exclusively to blacks. The study’s author claims congestive heart failure is “a different disease” in blacks. This argument thus presupposes that “black” is an objective biological reality, and then identifies health correlates for it. Ethnicity is not so black-and-white.

Another well-known case is the gene BRCA2, a polymorphism of which increases the risk of breast cancer. Myriad Genetics, founded by Mark Skolnick, cloned BRCA1 and 2, and took out patents on tests to detect them. Myriad gets a licensing fee for all tests. A particular BRCA2 mutation is found mainly in Ashkenazi Jews. Due to the wording of the European patent, women being offered the test legally must be asked if they are Ashkenazi-Jewish; if a clinic has not purchased the (quite expensive) license, it can’t administer the test. Gert Matthijs, of the Catholic University of Leuven and head of the patenting and licensing committee of the European Society of Human Genetics, said, “There is something fundamentally wrong if one ethnic group can be singled out by patenting.” The case has been controversial.
The patent was challenged, and in 2005, the European Patent Office upheld it. The next year, the EU challenged Myriad. In 2010, an American judge invalidated the Myriad patents. This spring, opening arguments began in the appeal. No one can predict the outcome, but some investors are betting on Myriad.

The point is not hypocrisy but internal contradiction. As the ethicist Jonathan Kahn points out, “Biomedical researchers may at once acknowledge concerns about the use of race as a biomedical category, while in practice affirming race as an objective genetic classification.” There’s a deep cognitive dissonance within biomedicine between the public rhetoric and the actual methodology of fields such as pharmacogenetics over the question of whether or not race is real. And this of course has a strong bearing on the question of individuality. Which is it, doctor: are we members of a group, or are we individuals?
So although biomedicine claims to be moving from “one size fits all” to personalized medicine, in practice, researchers find that race is a necessary intermediate step in getting from the entire population to the individual.
The claim is that individuality is on the horizon—once the databases fill out and testing costs come down, medicine will be truly personalized. In the meantime, though, we’ll put you in a smaller group, which is better than treating everyone the same.
The history shows that treating the individual always involves putting that individual into one or another group. It is neither possible nor even desirable to treat everyone uniquely. Faced with the vastness of human variation, one is quickly overwhelmed by the complexity. One has to look for patterns in the data, to group people by their responses. In practice, this often seems to mean typing people according to familiar categories. These categories are of course drawn from the experience of the researchers: if you grow up in a culture where race is real, then those are the categories into which your data fall. Biomedicine is not separate from culture; so long as race exists in our society, it will imprint itself on our science. In this way, the drive for individualism often leads to its opposite: typology. Race becomes reified—it now has an empirical and apparently unbiased basis.
Does personalized equal personal?
Individuality in biomedicine, then, has long been an elusive concept. Biomedical researchers claim with justifiable pride that medicine is beginning to take the individual seriously once again. Specialties such as pharmacogenomics and personalized medicine are increasingly recognizing that not everyone responds the same way to a given disease or a drug. This is a good thing, and could both improve therapeutic effectiveness and reduce the incidence of idiosyncratic toxic responses. On the level of technical diagnostics and therapeutics, I see many benefits from tailoring care to whatever extent possible.

But that doesn’t make it personal. Science can’t eliminate the concept of race, let alone racial prejudice. It can’t make our doctor take us seriously and treat us respectfully. It’s at best naive and at worst cynical for clinicians and researchers to suggest otherwise. We should always be wary of claims that science & technology will solve social problems. Truly personalized medicine is more than a problem of technology, data collection, and computation. It has to be a moral choice.

Acknowledgments

This essay is based on a talk delivered to PhD Day in the Division of the Pharmaceutical Sciences, University of Geneva, and was shaped by questions, comments, and discussion afterward. Michiko Kobayashi provided valuable comments and criticisms on both the talk and the essay.
Ackerknecht, Erwin. “Diathesis: The Word and the Concept in Medical History.” Bull. Hist. Med. 56 (1982): 317-25.
Bateson, William. “An Address on Mendelian Heredity and Its Application to Man.” British Medical Journal (1906): 61-67.
Bearn, Alexander G. Archibald Garrod and the Individuality of Man. Oxford, U.K.: Clarendon Press, 1993.
Burgio, G. R. “Diathesis and Predisposition: The Evolution of a Concept.” Eur J Pediatr 155, no. 3 (1996): 163-4.
Childs, Barton. “Sir Archibald Garrod’s Conception of Chemical Individuality: A Modern Appreciation.” N Engl J Med 282, no. 2 (1970): 71-77.
Comfort, Nathaniel C. “The Prisoner as Model Organism: Malaria Research at Stateville Penitentiary.” Studies in History and Philosophy of Science, Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 40 (2009): 190-203. (Available at academia.edu: http://bit.ly/mjn2CJ.)
Comfort, Nathaniel C. “Archibald Edward Garrod.” In Dictionary of Nineteenth-Century British Scientists, edited by Bernard Lightman. London, Chicago: Thoemmes Press/University of Chicago Press, 2004.
Garrod, Archibald Edward. Inborn Errors of Metabolism: The Croonian Lectures Delivered before the Royal College of Physicians of London in June 1908. London: H. Frowde and Hodder & Stoughton, 1909.
———. The Inborn Factors in Disease; an Essay. Oxford: The Clarendon Press, 1931.
Hamilton, J. A. “Revitalizing Difference in the Hapmap: Race and Contemporary Human Genetic Variation Research.” The Journal of Law, Medicine & Ethics 36, no. 3 (Fall 2008): 471-7.
Jones, David S., and Roy H. Perlis. “Pharmacogenetics, Race, and Psychiatry: Prospects and Challenges.” Harvard Review of Psychiatry 14, no. 2 (2006): 92-108.
Kay, Lily E. “Laboratory Technology and Biological Knowledge: The Tiselius Electrophoresis Apparatus, 1930-1945.” Hist Philos Life Sci 10, no. 1 (1988): 51-72.
Nicholls, A. G. “What Is a Diathesis?” Canadian Medical Association Journal 18, no. 5 (May 1928): 585-6.
Slater, L. B. “Malaria Chemotherapy and the ‘Kaleidoscopic’ Organisation of Biomedical Research During World War II.” Ambix 51, no. 2 (Jul 2004): 107-34.
Snyder, Laurence H. “The Genetic Approach to Human Individuality.” Scientific Monthly 68, no. 3 (1949): 165-71.
Strasser, B. J., and B. Fantini. “Molecular Diseases and Diseased Molecules: Ontological and Epistemological Dimensions.” History and Philosophy of the Life Sciences 20 (1998): 189-214.
Strasser, Bruno J. “Linus Pauling’s ‘Molecular Diseases’: Between History and Memory.” American Journal of Medical Genetics 115, no. 2 (2002): 83-93.
Wailoo, Keith, and Stephen Gregory Pemberton. The Troubled Dream of Genetic Medicine: Ethnicity and Innovation in Tay-Sachs, Cystic Fibrosis, and Sickle Cell Disease. Baltimore: Johns Hopkins University Press, 2006.
 Paget, Stephen. “The Distribution of Secondary Growths of Cancer of the Breast.” The Lancet 133, no. 3421 (1889): 571-73.
 Various. “Discussion of the Advisability of the Registration of Tuberculosis.” Transactions and Studies of the College of Physicians of Philadelphia 16 (1894).
 Beutler, Ernest, interview with Andrea Maestrejuan, March 8, 2007, La Jolla, CA, Oral history of human genetics project (http://ohhgp.pendari.com/).
 Kalow, W., and D. R. Gunn. “The Relation between Dose of Succinylcholine and Duration of Apnea in Man.” J Pharmacol Exp Ther 120, no. 2 (Jun 1957): 203-14.
 Haldane, J. B. S. The Biochemistry of Genetics. London: George Allen & Unwin, 1954, p. 125.
 Motulsky, Arno G. “Drug Reactions, Enzymes and Biochemical Genetics.” JAMA 165 (1957): 835-37; Vogel, Friedrich. “Moderne Probleme der Humangenetik.” Ergeb. Inn. Med. U. Kinderheilk. 12 (1959): 52-125; Kalow, Werner. Pharmacogenetics: Heredity and the Response to Drugs. Philadelphia: W.B. Saunders Co., 1962. See also Price Evans, David A., and Cyril A. Clarke. “Pharmacogenetics.” British Medical Bulletin 17, no. 3 (1961): 234-40; Price Evans, David A. “Pharmacogenetics.” American Journal of Medicine 34 (1963): 639-62. See also Jones (2006) in Deep Background.
 The well-known malaria resistance conferred by a single dose of the sickle cell allele is in the same biochemical pathway as the glucose-6-phosphate deficiency involved in primaquine sensitivity. Those sensitive to artificial antimalarials are resistant to malaria anyway.
 Ingram, V. M. “A Specific Chemical Difference between the Globins of Normal Human and Sickle-Cell Anaemia Haemoglobin.” Nature 178, no. 4537 (Oct 13 1956): 792-4; “Gene Mutations in Human Haemoglobin: The Chemical Difference between Normal and Sickle Cell Haemoglobin.” Nature 180 (1957): 326-28.
 Botstein, D., R. L. White, M. Skolnick, and R. W. Davis. “Construction of a Genetic Linkage Map in Man Using Restriction Fragment Length Polymorphisms.” Am J Hum Genet 32, no. 3 (1980): 314-31.
 Venter, J. C. “Remarks at the Human Genome Announcement.” Functional & Integrative Genomics 1, no. 3 (Nov 2000): 154-5.
 Hamilton, J. A. “Revitalizing Difference in the Hapmap: Race and Contemporary Human Genetic Variation Research.” The Journal of Law, Medicine & Ethics 36, no. 3 (Fall 2008): 471-7; McElheny, Victor K. Drawing the Map of Life: Inside the Human Genome Project. New York: Basic Books, 2010.
On May 6, at the age of 80, the writer Horace Freeland Judson died. He didn’t pass away; he would have insisted that he simply die. We worked together for five years, from 1997 to 2002, at The Center for History of Recent Science, George Washington University. Here is my appreciation.*
He was six feet in his stockings, which were certainly silk. He dressed impeccably. His leather shoes were glossy, his slacks pressed, even on writing days. His collar was crisp, his cufflinks shone. His houndstooth jacket might be a little worn at the seams, as if from catching too many brambles, but it was always clean. He wore a pocketwatch, and spoke with a vaguely “U” (upper-class, to the English) accent. He was raised in Chicago. A black, broad-brimmed hat, molded just so and rakishly tilted, cravat, and even a cape were not unknown on special occasions. I once saw him open a bottle of New Year’s champagne with a sword. Horace Judson was more than a little vain. But judgment should be tempered by the knowledge of two facts: his preening both reflected and masked enormous effort; and he had terrific taste. He seemed to put thought into every stitch of clothing on his body, every stick of furniture in his house, every book in his library, every word in his books.
His meticulous appearance was, like a peacock’s tail, a display of vigor, for he zipped his pants, buttoned his shirt, oiled his hair, and tied his shoelaces with only one good hand. The left. Childhood polio had withered the other, leaving it floppy and florid. The handicap slowed him down a little—he reconfigured that slowness into a magisterial pace—but it kept him from almost nothing. He shook hands cross-handed (I soon learned to shake left-to-left with him). He opened wine bottles and cans of tomatoes, and drove a stick-shift, flamboyantly flouting his disability. He never asked for help. His pride was his strength.
Of course, he typed the quarter million or so words of The Eighth Day of Creation, and everything else he wrote, with one hand. The deliberation this required contributed to the precision and elegance of his prose. Though a journalist by training, he wrote not like a reporter but like an essayist or a novelist. His style was always formal, never slangy (though he was not averse to the occasional earthy vulgarity), and occasionally pompous. He used every rhetorical trick in Aristotle, and had a flair for the dramatic. But his prose was always muscular, and he regularly knocked off passages of such clarity, insight, and grace as to leave the attentive reader breathless.
Consider his definition of X-ray crystallography, an arcane science if ever there was one, from chapter 2 of The Eighth Day. He begins with an analogy simultaneously quantitative and yet immediately apprehensible: “The X-rays used in crystallography are shorter than visible light by some four thousand times.” That does what good writing must, especially when it is on a technical subject: it connects the unknown to the known. Any educated person has a rough idea about visible light, and a sense of how big four thousand times is. Judson then explains the one thing everyone wants to know about X-rays. “Their wavelength is 1.5 Angstroms, which is the same order of size as the spacing between atoms in many solids, and that is why X-rays go through substances that are opaque to the eye.” Did you know that? “Penetrated by a thin beam”—thin beam is nice, both as explanation and in sonority—“the orderly array of atoms in a crystal, layer upon layer, scatters the X-rays in an orderly way, causing a repeating series of overlapping circles of waves.” Note the repetition of orderly…orderly, prefiguring the repeating series of the following clause. Powerful rhetoric illustrates the underlying concept. “At intersection with the flat sheet of film”—note the alliteration, intercalated by the contrasting long vowel sound—“the troughs and peaks of these waves reinforce each other at some points and cancel each other out elsewhere.” The image recalls every child’s game of dropping two stones into a pond and watching the ripples crash. Here is the crucial insight: “The interference pattern that results is characteristic of the structure that produced it,” he writes, and he abruptly ends the lesson by waving a hand at the Nobel-winning mathematics behind it, which would confuse and bore his audience: “…though to figure back to the structure takes mathematics and experience.” That’s all you need to know, really, and not a word more than it takes to describe it.
He never matched that book. How many of us get even one book of that magnitude? Judson is by far the most frequently quoted non-PhD among historians and philosophers of twentieth-century biology, and scholars hate that fact. “Molecular biology is a discipline, a level of analysis, a kit of tools,” he wrote at the beginning of chapter 4. “Which is to say, it is unified by style as much as by content.” There’s a dissertation in that casual observation. He had hundreds like it. At the same time, the book is notoriously difficult to teach, and so few do. (I like to assign chapter 3. Students love it, but find it challenging.) Teachable books show their scaffolding and leave their beams exposed, so that students can skim for argument and get on to their orgo problem sets. Not so Judson. He draws you in, makes you read slowly. The paragraphs are meticulously constructed, but the themes are buried a sentence or two deep, and the arguments build incrementally. You end up highlighting every line.
That quality, combined with his pomp and occasional arrogance, alienated many historians of science. He could be difficult. Yet three things saved him for me. First, his intelligence and his skill as a writer and editor. Many times he annoyed me by ignoring my requests to comment on my green prose, only to read through it once and mark it up on the fly, effortlessly extracting the voice and meaning that I had buried in murk and flab and cliché. He still sits on my shoulder as I write, insisting that I justify the sequence of items in a list, put a twist on every cliché, and, unless I have a specific reason to do otherwise, narrate my story in strict chronological order. Second, his generosity. He often went far out of his way for people, with no expectation of recognition. Behind the formality was a genuinely warm person who loved sentiment but hated sentimentality. And third, his tragedy. He wanted to be accepted as a historian, and never quite was. I think he idealized the university as only someone can who doesn’t have the professorial press pass. Which is too bad, because he was better—gave more insight and more pleasure—than many credentialed scholars. His obituary in the Times listed him as a historian of science, which would have pleased him but which underestimated him. He was a writer.
The scientific study of human heredity has, and has always had, two types of practical application: relief of suffering and human improvement. Research programs with those ends in mind have existed at least since the beginning of the 20th century—maybe earlier, depending on how you define things. But by the Progressive Era (roughly 1890–1920), research in human heredity and genetics explicitly sought to reduce or eliminate human disease; to raise the average level of our intelligence, beauty, and longevity; and to improve our character.
For a long time, the only way to accomplish those goals was to regulate behavior. At the highest level—i.e., the least invasive of bodies but the most invasive of liberty—you regulate the relationship between people who might have children together. In the Progressive Era, many states passed laws prohibiting marriage between two people who were mentally retarded, or certifiably insane, or had tuberculosis (though its infectious nature was recognized, researchers also understood that there was an inherited predisposition). Immigration restriction laws, too, were a form of regulating behavior in the supposed interest of the national heredity (at least in part). They can’t breed if you don’t let them in in the first place.
Many people at the time saw surgical sterilization as much less invasive than marriage or immigration restriction. Advances in surgical technology and practice shifted the target of modification from the relationship to the individual. Modify the individual body and you can afford to be unconcerned with who that person marries or lives with or next to. From our perspective today, sterilization is an appalling invasion of autonomy, but in the 1930s, the heyday of eugenic sterilization—worldwide, by the way, not just in Germany—many people saw it, like abortion, as a way to loosen restrictions on the behavior of the sick, imperfect, and impure while still working toward improving society.
For a long time, then, “applied” human genetics was synonymous with what we think of as the worst excesses and sins of eugenics. Science historians and historically minded scientists have often written that human genetics got “tangled up” with eugenics because the researchers back then did not have sufficient knowledge. Now that we understand the science better, the argument runs, we can avoid the kinds of simplistic fallacies that drove the eugenics movement—fallacies such as the idea that there is a single gene for “feeblemindedness.” Or, ahem, the love of the sea.
But that argument gets it backward. Eugenicists resorted to marriage laws and sterilization for the same reason that there was so little reliable data on human genetics: genetics required sex. Because human geneticists couldn’t carry out breeding experiments, they couldn’t do backcrosses, self-fertilizations, and all the other kinds of matings that other geneticists could do. They could, though, control who mated with whom to some degree on a broad social scale.
The significance of DNA is that it made it possible to do genetics without sex. It wasn’t just DNA, of course—cell culture as well as lots of advances in biochemistry and microbial genetics also contributed—but by the 1960s DNA had emerged as the emblem of a “new genetics.” From the beginning, the DNA double helix had an iconic aspect. The first published image, in Watson and Crick’s first paper (the anniversary of which is the impetus for DNA Day), had a stripped-down, cartoonish quality, and was described in the figure legend as “purely diagrammatic.” Everyone understands DNA, then, to mean much more than “deoxyribonucleic acid.” It stands for the relationship between heredity and health.
This new, DNA-based molecular genetics meant that reducing or preventing disease no longer required controlling who married whom, or (more theoretically) even which babies got born. Technology made it possible to select which genomes made it into the next generation, and even, in principle, to alter and “correct” genes in the individual.
“DNA” thus solved the fundamental ethical problem of eugenics. State-level involuntary coercion of reproductive behavior simply makes no sense in a developed country with sophisticated biomedical facilities. It is pointless and paranoid to fear a “return to eugenics” if what you mean is good ol’-time Progressive eugenics.
In the DNA era, human genetics is still about relief of suffering and human improvement. The NIH touts the disease side of things, but what counts as a disease is heavily freighted with subjectivity, cultural bias, gender, and racial prejudice. Further, at the molecular level, the difference between preventing disease and genetic enhancement dissolves. If you up-regulate transcription of the human growth hormone gene, for example, it makes no difference technically whether you do it in a dwarf, a short person, or a person of normal stature. And the moral distinction between remediation and enhancement relies on soft, unsatisfying philosophical arguments that basically amount to “Ugh!”—in the same way that a conservative parent reacts when his child comes home with blue hair and a lip piercing.
In 1957, Julian Huxley—grandson of Darwin’s bulldog, a distinguished biologist in his own right, and an articulate, politically liberal eugenicist—coined the term “transhumanism.” He wrote, “The human species can, if it wishes, transcend itself—not just sporadically, an individual here in one way, an individual there in another way, but in its entirety, as humanity.” This is what he defined as transhumanism, and he intended us to accomplish it by a variety of means, but of course at the root of it would be the conscious, deliberate manipulation of the human germ line. Throughout the 1960s, geneticists fantasized about using the new knowledge of the genetic code to control human development and evolution, to tinker with the design of human beings. The overwhelming majority of this fantasizing was done with the noblest of intentions. Huxley, JBS Haldane, HJ Muller, Joshua Lederberg, Edward Tatum—these were not ignorant fools but rather some of the greatest, most sophisticated minds in biology. They wanted not to rule the world but to reduce suffering and improve happiness, compassion, and noble achievement.
Muller’s eugenic scheme was called “germinal choice.” We’ve all heard of the Nobel sperm bank that William Shockley (co-inventor of the transistor) so famously championed—that was Muller’s germinal choice. Present-day transhumanists prefer Muller’s term to “eugenics,” which is irritating because it requires so much explanation about how their eugenics isn’t the same eugenics as the bad old eugenics. But it’s eugenics. The only reason to deny it is the bad publicity the term gives you.
Transhumanists such as Gregory Stock and ScienceBlog’s own Eveloce tend to argue that genetic enhancement is coming whether we drag our feet or not, and they may be right. The sociotechnical power of contemporary biomedicine is astonishing—and on the rise. I’m not yet sure how I feel about this. I am inherently suspicious of any structure with such a concentration of technological and economic power, and power leads to hubris. It is a truism that 21st-century DNA science has the potential for enormous benefit as well as catastrophic harm.
The problem is that the largest benefits tend to be long-term, while the largest risks are in the short term. It is not paranoid to be worried about such a situation, nor is it inconsistent to enjoy and admire positive results as they come out while maintaining a healthy, grouchy skepticism about the larger project.
I’m actually encouraged by the fact that transhumanism has a significant overlap with the blue-dreads-and-lip-piercing set. I’m more comfortable with tweaking our genes to, say, be able to grow horns or have Mr. Spock ears than to make everyone tall, white, and smart. Sure, it can be trendy and pretentious, like other body modification subcultures such as the “modern primitives,” but at bottom these folks are interested in it as a form of expression, not social control. Anything that breaks down barriers rather than reinforcing them gets my vote.
Nice piece today from SciCurious, guest blogging over at Scientific American. The post is an analysis of a recent article in Nature claiming that knocking out serotonin signaling in two different ways (eliminating either neurotransmitter production or serotonin receptors) abolished sexual preference in mice. The mice apparently mounted either sex with equal frequency.
SciCurious does a beautiful job dissecting the assumptions in the Nature article, analyzing the data, presenting alternative hypotheses, and looking at the history of the research. For example, the authors might merely have lowered the threshold of sexual activity: an extension of “all girls get prettier at closing time.” Or perhaps the researchers influenced the perception of other cues, olfactory cues for example. “So does this paper prove that there are drastic increases in sexual behavior associated with low serotonin?” SciCurious writes. “Absolutely. Does it show that low levels of serotonin change sexual PREFERENCE? Well, that’s difficult to say.”
Also, Ed Yong looked at the article from a different but equally skeptical point of view. He points out how difficult it is to translate these kinds of behavioral findings from mice to humans. Further, he writes, “serotonin isn’t all about sex.” When I was a teaching assistant for the Neural Systems and Behavior course at Cornell back in the late 1980s, we used to drill into the students’ heads the idea that neurotransmitters do not have behaviors. They act in many regions of the brain and influence all sorts of behaviors in ways that are very far from straightforward.
Yong worries (rightly) that anti-gay groups will use findings like this to argue for a simple biological basis for homosexuality, perhaps even proposing serotonin therapy as part of their effort to “cure” it. And SciCurious links to news stories that soon appeared suggesting the researchers had “turned mice gay.”
Such stories illustrate a fundamental fallacy that is one of the gravest dangers of popularizing science. For the sake of argument, let’s say the researchers did in fact eliminate sexual preference. In what sense is “no sexual preference” the same as “gay”? Answer: only in a world so normative that strict, unwavering heterosexuality is the only behavior considered normal. Of course, there are lots of people who see it that way; I read about them all the time. But that view is blinkered, naive, and deeply chauvinistic.
Biological and especially genetic explanations of behavior are a double-edged sword. The gay community has oscillated in its support for research to find “gay genes” and other traces of the biological basis of homosexuality. If homosexuality is innate, the reasoning goes, then it is cruel and pointless to try to “cure” gays, in the same way it was cruel to “cure” left-handed people.
But “gay” is a cultural construct. There were no “gays” in ancient Rome or in 19th-century Paris, and there are no gays among the Foré of Papua New Guinea. Here and now, in our culture, we need the term in order to protect human rights that are trampled on by people unreflectively absorbing an outdated cultural taboo on homosexual activity. But in the long run, the ideal should be to get rid of the concept—for us all, in short, to be “gay-blind.”
Good skeptical science writing helps that cause, because it exposes fallacies in the ways we think about science. I’m fine with describing a physiological mechanism for a behavior, but we need to be careful not to equate mechanism with cause. I’m wary of science writing that talks about the “roots” or the “basis” of complex behavior or disease. It implies a hierarchy that blinds us to the many biological mechanisms that work in the other direction. In biology, cause and effect run in both directions: behavior changes gene activity as much as gene activity changes behavior. Studies purporting to examine the biological “basis” of behavior rely on cross-species analogies and make unsupportable assumptions about motivation.
My satirical post last week about scientists finding a gene for love of the sea was intended to make a point about how we view genomics today—and a historical point about how we smugly congratulate ourselves on being so much more sophisticated than early human geneticists and eugenicists. Most people got that it was a spoof, but I thought it would be worthwhile to discuss some of the deeper issues at stake.
Charles Davenport was a real scientist, and the quotes from him are real. Davenport was a geneticist in the first half of the 20th century and the leader of the American eugenics movement during the Progressive Era. He is often demonized as wrong-headed, misguided, and simple-minded. Indeed, he could be all of these things. Davenport really did believe there was a recessive, male-linked trait for the love of the sea. Thalassophilia has become a classic example of how eugenicists could ignore obvious environmental explanations in favor of hereditary ones. When I told my 11-year-old daughter about Davenport’s thalassophilia, she immediately saw the fallacy: the sons of ship captains learn their love of the sea, they don’t inherit it.
My larger point is that simplistic analyses like Davenport’s can be masked by numbers and fancy technology.
For years, medical genetics involved the search for genes underlying genetic disease: diseases caused by a defective gene, and not, say, by a germ or some other environmental factor. But that distinction has been erased. We used to think of genetic traits and non-genetic traits. Now, non-genetic traits are called “complex”—i.e., partly genetic and partly environmental. In other words, all diseases, and indeed all traits, are understood as partially genetic.
There are sound reasons for thinking this way. I’m not arguing that those genes don’t exist. I don’t question the data—I’m happy to believe that there really is a genetic association with all of these traits. Indeed, I think it’s becoming possible to find a real, verifiable genetic basis for almost anything you like.
The advent of genome-wide association studies (GWAS) has made it vastly easier to examine traits with smaller and smaller genetic contributions. In essence, you can pick your trait, sample the DNA of a large group of people, and scan their genomes for bits of shared sequence.
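The statistical core of such a scan can be sketched in miniature. Below is a toy, single-marker association test in Python; all the numbers are invented for illustration, and a real GWAS would test hundreds of thousands of markers while correcting for multiple testing and population structure:

```python
# Toy single-marker association test: is allele A more common among
# "cases" (people with the trait) than "controls"? Invented data;
# this is an illustration of the logic, not a real GWAS pipeline.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (1 d.f.) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return 0.0
    return n * (a * d - b * c) ** 2 / denom

# Hypothetical allele counts at one marker:
#            allele A, allele a
cases = (60, 40)      # people with the trait
controls = (45, 55)   # people without it

stat = chi_square_2x2(cases[0], cases[1], controls[0], controls[1])
print(round(stat, 3))  # values above ~3.84 are "significant" at p < 0.05 for 1 d.f.
```

Run at genome scale, this same comparison is repeated at every marker, which is why spurious hits are guaranteed unless the significance threshold is made far more stringent than the single-test 0.05.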
As a consequence, we have the recent bloom of studies describing the genetic component of all sorts of “complex” traits, from religiosity to getting drunk and beating people up. We’re only limited by our imaginations, and by the kinds of traits we’re interested in today.
Thinking about these recent studies, it occurred to me that these traits were not fundamentally different from Davenport’s old favorite, thalassophilia. I bet, I said to myself, that if sailing were as culturally important today as it was in 1919, people would be doing GWAS to find the genetic basis of sea-lust. And I bet they’d find it.
Of course, there are big differences between human genetics in 2011 and human genetics in 1919. Davenport advocated sterilization laws and immigration laws to manage and shrink what he saw as the swelling populations of the “unfit.” That would be inconceivable today. I don’t think we’re returning to a “new eugenics” in any meaningful sense.
But cutting across the cultural differences are some continuities. One of them is the desire to believe there is a simple genetic explanation for our tastes and talents. That I think is a dangerous view. So on the one hand, I think we should be careful to evaluate 1920s science by the standards of the day, rather than by those of the 21st century. And on the other, we must not delude ourselves that modern science is completely objective. Mechanistic explanations are not proof against cultural bias.
My spoof was intended as a word of caution, a way to inject a note of skepticism about genetic explanations of human nature. C. M. (“Call Me”) Ishmael, the journal Genetic Determinism Today, MysticGene, the 4C (“for sea”) variant, the salt-stained polo shirts and the sailing widows—all that was pure balderdash. As the motto of this site goes, “Here lies truth”—in roughly equal measure.
So, keep your heads up, folks—and watch for the keyword “Satire” in the Categories section of this blog. Thanks for reading.