A Whig History of CRISPR

“Shitstorm” would be one term of art for the reaction in the genome community to a commentary in Cell by Eric Lander, published on January 14. It presents itself as a definitive account of the discovery of CRISPR, the “gene editing” technique invented in 2013 that blasted onto the science pages this past year. CRISPR is likely to go down as the most important biotechnological invention since Kary Mullis invented the polymerase chain reaction (PCR).

But I prefer another phrase to describe Lander’s account: “Whig history.” The term comes from the Europeanist Herbert Butterfield. In a classic 1931 essay, Butterfield wrote that Whig history was “the tendency in many historians to write [English history] on the side of Protestants and Whigs, to praise revolutions provided they have been successful, to emphasize certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present.”

The term has become historical shorthand for one way to use history as a political tool. It rationalizes the status quo, wins the allegiance of the establishment, justifies the dominance of those in power. One immediate tip-off to a Whiggish historical account is the use of triumphalist or melodramatic terms such as “heroes” in the title.

Lander’s piece is called “The Heroes of CRISPR.”

In April 2014, the Broad Institute at Harvard and MIT—of which Lander is the director—was awarded the first patent for CRISPR technology. The team of Jennifer Doudna (UC Berkeley) and Emmanuelle Charpentier (Umeå University, Sweden) filed their own application seven months earlier, but the Broad’s Feng Zhang obtained fast-track approval. Much remains at stake over CRISPR: fat scientific prizes, almost certainly including a Nobel, as well as further patents. Who claims them will be decided in part by what version of history becomes accepted as “the truth.”

When Michael Eisen, the UC Berkeley/Howard Hughes Medical Institute biologist and astute commentator on genomics, read Lander’s article, he went ballistic. In a tweet-blast of righteous indignation, Eisen howled that Lander’s piece minimizes Doudna’s contributions to CRISPR and thus (I’m paraphrasing here) serves as a propaganda organ on behalf of the Broad’s claim to the patent rights. “The whole thing is about trying to establish Zhang paper as pinnacle of CRISPR work,” tweeted Eisen. He continued, “it’s a deliberate effort to undermine Doudna and Charpentier patent claims and prizeworthiness.” It is, he believes, “science propaganda at its most repellent.” “Eric Lander and @broadinstitute should be ashamed of themselves.”

Others have joined in to express their dismay. At the least, many in the community think, some sort of conflict-of-interest statement should have accompanied Lander’s article. A long thread at PubPeer is devoted to the kerfuffle.

Is Eisen right? I’ll leave analysis of the technical arguments over the relative merits of each group’s contributions to the biologists. What I can do is look at the paper itself. Good writers know how rhetoric can be used to persuade. Does Lander use writing techniques to advance a self-interested version of history?

On first read, Lander’s piece seems eminently fair, even generous. It “aims to fill in [the] backstory” of CRISPR, Lander writes; “the history of ideas and the stories of pioneers—and draw lessons about the remarkable ecosystem underlying scientific discovery.” He traces CRISPR’s origins all the way back to Francisco Mojica, a doctoral student at the University of Alicante, in Spain, in 1989. Mojica discovered a new class of repeating sequence that was present in diverse organisms, suggesting widespread taxonomic importance. These, of course, were the first CRISPR sites—clustered regularly-interspaced short palindromic repeats. By 2000, Mojica had found CRISPR loci in 20 different organisms.

By turning his lens on such unsung heroes, laboring away at universities well beyond the anointed labs of Harvard, MIT, UCSF, Johns Hopkins, and the like, Lander creates the impression of inclusiveness, of the sharing of credit among all the “heroes” of CRISPR.

But when he reaches Doudna and Charpentier’s chapter in the story, the generosity becomes curiously muted. Though Lander maintains his warm, avuncular tone, Doudna and Charpentier enter the story as merely two more brave soldiers, working shoulder to shoulder with others on the long journey to practical application of CRISPR. Some subtle techniques create a very definite impression.

For example, Lander narrates Charpentier’s story alongside that of the Lithuanian scientist Virginijus Siksnys. But Siksnys receives top billing. His name appears in the first line of two sections of the paper:

[Screenshots from Lander’s article: the opening lines of two sections, each beginning with Siksnys’s name]

Charpentier’s name, on the other hand, appears at the bottom of a paragraph devoted to a component of the CRISPR-Cas9 system called tracrRNA.

[Screenshot from Lander’s article: the tracrRNA paragraph, with Charpentier’s name at the end]

Jennifer Doudna is graciously given the epithet “world-renowned,” which may distract our attention from the fact that her first mention is buried in the middle of a paragraph, in the second half of a long sentence, as the direct object rather than the subject of the sentence:

[Screenshot from Lander’s article: the sentence in which Doudna is first mentioned]

Doudna and Charpentier go neck and neck with Siksnys through the next sections, but their contributions are repeatedly diminished. “Siksnys submitted his paper to Cell on April 6, 2012,” begins one paragraph. It was rejected without review; he revised and resubmitted to PNAS, where the paper appeared online on September 4. The Doudna-Charpentier paper, Lander writes, “fared better.” He takes care to note that it was “submitted to Science 2 months after Siksnys’s on June 8.” It “sailed through review,” he writes, and appeared online on June 28. His point: although Doudna and Charpentier published first, Siksnys submitted his paper two months before them. Equating the two papers, Lander writes, “both groups clearly recognized the potential for biotechnology.” This undermines Doudna and Charpentier’s claim to invention of the CRISPR-Cas9 technology and hence weakens their case for a patent.

Now, enter Feng Zhang and George Church of the Broad Institute. They receive the longest treatment of any actor in the story—a solid page out of nine pages of text. Zhang’s biographical sketch alone receives a long paragraph. Lander is enough of a writer to know that you indicate a character’s importance with the amount of space you devote to them; the longer the bio, the more important the person.

Then Doudna submits a key paper “with assistance from Church.” This and three other “short [i.e., minor] papers,” Lander makes sure to note, “were accepted soon after Zhang and Church’s papers were published in early January, 2013.”

Lander concludes his saga with words of benevolent wisdom, extolling the “ecology” of science that produces profound discoveries. History provides optimistic lessons about the idealistic world of pure science, carried out purely for the sake of furthering knowledge. One can almost see Lander dabbing away tears of joy as he writes,

The human stories behind scientific advances can teach us a lot about the miraculous ecosystem that drives biomedical progress—about the roles of serendipity and planning, of pure curiosity and practical application, of hypothesis-free and hypothesis-driven science, of individuals and teams, and of fresh perspectives and deep expertise.

So I think Eisen has a point in reading the paper as a crafty effort to establish Zhang and Church as the scientists who brought the relay race to the finish line—and to portray their principal competitors for patents and prizes, Doudna and Charpentier, as merely two in a long string of runners.

I’m glad to see other scientists, such as Mojica, receive credit in a major CRISPR narrative. Too often the early players and the scientists at lesser-known universities become lost to history altogether. But we should also recognize how Lander uses those actors to create a crowd in which to bury Doudna and Charpentier. It would have been possible to mention Mojica, Gilles Vergnaud, and others while still giving Doudna and Charpentier their due.

Why did he do it? The most obvious reason is the patents over the CRISPR technology. The Broad Institute, which Lander directs, is in a heated patent battle with UC Berkeley, Doudna’s home. Lander craftily undermines Doudna and Charpentier’s claim to both priority and originality, as well as to the recognition of CRISPR’s commercial potential. Whig history is written by the winners—and sometimes by the competitors.

Update 1/19: Both Doudna and Church have said Lander’s article contains factual errors. I’ll leave it to the experts to debate the technical details of the science. My argument—stimulated and shaped by Eisen’s tweets—is about the tone and style of the piece. Lander is a public-relations master. He’s a compelling speaker and a sophisticated writer. He’s a giant in the field: he has been a leader in the genome community since the early days of the Human Genome Project. He knows exactly what he’s doing.

A Nobel can be split at most three ways, and there are four principal actors. How will the prize be partitioned? Doudna, Charpentier, and Zhang? Doudna, Charpentier, and Mojica? Zhang, Church, and Lander? I have witnessed the steady PR campaigns of scientists who went on to win Nobels. The Prize is supposed to be wholly merit-based, but, we being humans, reputation and economics matter.

Update: One should also note the gender dynamics of the story. However conscious or unconscious it may be, efforts such as this underscore the often-subtle ways in which “history by the winners” still tends to end up being “history by the men.” The only way that stops is by saying it out loud. Tip o’ the pin to Anne Fausto-Sterling and Alondra Nelson for nudging me on this.

At its best, science is a model of human interaction: cooperative, open, focused on evidence and reason, unbiased by prejudice of ethnicity, gender, sexuality, or disability. But science is no longer done in monasteries. Competition, pride, ego, greed, and politics play all too great a role in determining who gets credit, who wins the prizes, and who gets into the textbooks. As Butterfield recognized, controlling the history is both a perk of coming out on top and, while the battle still rages, a way to cement your team’s role in the crystallizing master narrative.

When a scientific history promises an account of “heroes,” when it is filled with sentimental language about the “miraculous ecosystem” of “pure curiosity and practical application,” and when that history is written by an individual who has much to gain by the acceptance of his own account, the piece should come with a conflict-of-interest statement—or at least a road-sign reading, “Danger! Whig history ahead.”

PS: See also Dominic Berry’s take on the Lander article, also drawing on the history of science but framing it in terms of intellectual property, here: http://blogs.sps.ed.ac.uk/engineering-life/2016/01/18/crispr-in-the-history-of-science-and-intellectual-property/

[2017-12-06: Minor edits, mainly typos and style]

DNA as Commodity

Here’s a riddle: In the morning I was in the soup. At noon I was in a dish. In the afternoon I was in your gas tank. And at night I am in the bank. What am I?

The answer is DNA. From a natural object emerging, some say, from a primordial soup, to a laboratory object, to a cultural object, it has become data, a special string of computer code endowed with the power to foretell disease, identify criminals, and be leveraged, like software, as a product.

[Image from venturebeat.com]

Robert Resta, a genetic counselor and always a reliable source of depressing, ironic, frightening, and amusing stories about heredity and DNA, forwards this piece, by @alexlash, from Xconomy.com on how your DNA is becoming commodified. What’s happening is interesting not for how new it is, but for the way in which the exotic is becoming commonplace. Featuring the San Francisco biotech company Invitae, the piece shows how this small but highly capitalized company is taking on giants such as Illumina, 23andMe, and Myriad in a bid to monetize your sequence—and give you a small cut.

The troubling thing is how commonplace this is all becoming. Nothing Invitae is doing is really new. They want to persuade you to donate your genome to their database, where it can be analyzed to inform you about your health, contribute to research, and be sold to other companies who might use it for anything from curing cancer to targeted advertising. Invitae CEO Randy Scott says he is not bringing in new tests–only offering existing ones all in one place, an approach he calls “generic genetics.” (“Generic genomics” would be more accurate–this is about as far from Gregor Mendel as you can get with a double helix.) And he wants to include users in the process, so that if your sequence is bundled into some DNA-based product, you get a tiny royalty. Sort of like allowing ads on your blog.

For some time, historians and sociologists of science have been writing about “biological citizenship,” the idea that we’re coming to base our identity on our biological status rather than our labor. Many people today identify more as a cancer survivor, as living with depression, or as gluten-intolerant than as a carpenter, secretary, or professor.

DNA has been a big part of that shift to biological citizenship. It’s not the only thing, of course, but it’s a big one. DNA, a hypothetical Marxist historian (there are still a few!) might say, came to have “use value.” We hear every day suggestions that our genes make us who we are. Leaving aside for the moment whether they actually do—a challenge for me, as regular readers know—we believe our DNA to be the secret of life. And so, in a sense, it is.

What’s happening now is that use value is being converted into exchange value. DNA is becoming a currency. An investment account we’re all born with. What are you going to do with yours? Hide it under a mattress? Or make it work for you?

One could imagine a day when there’s a new kind of hereditary aristocracy. A group of nouveau riche whose wealth nevertheless was inherited. Those who, through no effort of their own, received a legacy of valuable SNPs (single-nucleotide polymorphisms). But it will take a go-getter to capitalize on that legacy. You’ll have to have ambition, street smarts, and at least a bit of lab smarts.

Marx also said that history was determined by the material reality of the individual. One might now say the molecular reality of the individual. But total self-awareness at the molecular level won’t lead to the end of exploitation of man by man. Indeed, it is only the beginning.

Human Theome Project

This is one of the most serious pieces ever posted on Genotopia. As the shooting at the French humor magazine Charlie Hebdo shows, satire can be dangerous business. It is also one of the most important forms of speech—both provocative and healing. We are reposting this feature, originally posted in 2011 and lightly edited, to lock arms with the staff and writers at Charlie Hebdo and with satirists everywhere. We hope it offends someone and are glad that our home address is not public.

Joe and Mary Juke are models of piety. They attend services twice a week, are active in faith-based charity organizations, and their house brims tastefully with Christian iconography and literature. They describe themselves as “fundamentalists,” although Joe is quick to emphasize, “We’re moderate fundamentalists—we don’t bomb clinics or anything.” They are planning to have a family, and they are making sure to create a pious environment for their children. They know that the setting in which a child is raised helps determine the kind of adult he or she becomes.

But for the Jukes, books, icons, and saying “Grace” are not enough. In what is being cited as a milestone in personal genomics, Joe and Mary have taken steps to ensure their baby is religious—by selecting its genes.

[Image: gods-dna-church.gif, from Poor Old Spike’s Photobucket]

Using preimplantation genetic diagnosis (PGD), a combination of genetic screening and in vitro fertilization (IVF), Joe and Mary are loading the genetic dice for their progeny, selecting embryos that carry the traits they want in little Joe Jr. (or mini-Mary). Modern techniques allow them to select for a wide range of qualities, from avoiding hereditary diseases, to selecting eye, hair, and skin color, to shaping aspects of personality. For example, choosing a combination of half a dozen genes allows them to add a cumulative 40 points to their unborn child’s IQ. Many of these tests have been available for years, although they have only recently begun to be available to consumers. But the most striking decision in their family-planning process was to expressly select for embryos that will grow up to be religious, because they carry the allele known colloquially as the “god gene.”

“It kind of gives a whole new meaning to the phrase ‘Chosen One,’” Mary says.

Sequencing the human theome

The gene, which was identified statistically in twins in a study published in 2005, was recently cloned and sequenced, as reported in the online journal Nature Theology. Dubbed yhwh1, the gene correlates strongly with feelings of religious fervor. Studies show that the gene encodes a protein that is expressed in a part of the brain called Chardin’s area 86, long associated with religious activity and, strangely, anterograde amnesia. One famous patient was Guineas Phage, a virologist who suffered an injury with a pipetteman that resulted in a plastic tube being driven precisely into area 86; he spent the last two decades of his life on a constant pilgrimage along US Route 66 between Kingman and Barstow, accompanied by his wife, Winona, whom he continually left behind at gas stations.

Particular expression of religiosity in a given individual varies according to environment; what is inherited is the capacity for intense religious experience and evangelism. First described in the Amish in a classic study of the 1960s, the trait was characterized as an autosomal recessive with high penetrance, and was linked to a rare inherited form of dwarfism. Recent analyses have also found the trait occurring at high frequency among charismatic ministers, shamans, and suicide bombers.

The yhwh1 allele is one of the latest findings in the burgeoning field of “theomics,” which aims to identify all genes associated with the practice of preaching, as well as general feelings of spirituality. The Human Theome Project overran its projected completion date of December 21, 2012—the date on which, according to ABC News, the world as we know it might have come to an end. All things considered, then, researchers are content with their progress.

Here are some of the most exciting new findings of the HTP:

▪   Scientists estimate that at least 400 genes are involved with religious feelings or activity. Thus far, more than 100,000 variants have been described. Easing the task of studying these genes is the fact that they cluster into “ecclesiotypes” — groups of religion genes that tend to segregate together. A research team at Mystic University in Connecticut is coordinating an effort, called the “EccMap project,” to characterize them.

▪   The EccMap project consists of 400 trios (deity, parent, and child) representing each of the five dominant world religions—christianism, judaishness, islamian, confusianity, and pastafar-I. Researchers say these explain 80% of genetic variation for religious belief.

▪   Polytheism is more common than had been thought. Ninety-five percent of all trios in the EccMap project have genes for at least two religions. Since phenotype is almost always monotheist, this suggests that the environment may play some role in fine-tuning religious beliefs.

▪   A related project seeks to uncover the epigenetics of evangelism, which is thought to be caused by methylation of regions of the X chromosome, a reversible process that can profoundly affect gene expression. Researchers hope this may provide therapeutic targets for drugs or gene therapy to “de-program” those who become convinced, against thousands of years of recorded history, that theirs is the only path up the spiritual mountain.

▪   A newly discovered kinase, called Bub666, is strongly correlated with atheism. It seems to be responsible for the breakdown of yhwh1, suggesting that biochemists are approaching a mechanistic explanation of religious experience.

▪   Rocker Ozzy Osborne has had his genome sequenced. Preliminary results show 85% homology with a Presbyterian minister from Des Moines.

“It’s tremendously exciting research,” said Mary Magdalene-Gohdtsdottir, a senior researcher in the University of Utah’s Department of Omics. “Just think of it: the genes for God! Isn’t that cool?” Indeed, the federal government thinks so. NIH Director Francois Colon, a molecular biologist and born-again Christian, has recently created a National Institute of the Molecular Biology of Yahweh (NIMBY), with an annual research budget of $400, as part of the government’s effort to support faith-based initiatives in biomedicine.

 

But is it science?

Some critics have called the Jukes’ actions a step toward eugenics, described in the 1920s as the “self-direction of human evolution.” They see religiosity as a gift, not something that can be ordered from a catalog. “This is an outrage,” said the Reverend Reginald S. Inkblot, of Southboro Baptist Church in Onan, Kansas. “Religion can’t be in your genes. Science can’t explain it. It’s just a part of who…you…um, are. It’s just in your…uh, yea…” He brightened and added, “If God had wanted us to be religious, he would have….oh, wait. Damn!—I mean, darn!”

Others are appalled that religion would receive serious consideration from scientific foundations at all. Dick Dorkins, President of the atheistic Society for the Prevention of Intelligent design, Theology, Or Other Nonsense (SPITOON), calls the entire effort a “travesty.” “If I must check my brain at the church-house door,” he said in a Skype interview, “then you must check your soul at the laboratory door. Come on—be fair.”

Dorkins worries that should the procedure become widespread, it could lead to persecution of the nonreligious. If those chosen by PGD tend to express genes such as yhwh1, scientists predict, it could lead to changes in gene frequency across the population. Dorkins envisions a dystopian scenario in which an atheistic underclass washes the wineglasses and polishes the pews for their genetic spiritual superiors. “It will be GATTACA crossed with The Ten Commandments,” Dorkins said, an audible quiver in his voice.

Evolution in religious hands

Some theologians have condemned in vitro fertilization because it normally results in the destruction of unused embryos. However, new gene therapy techniques make it possible to link a “suicide gene” to alternative forms of the desired genes in Joe’s sperm samples; thus, only sperm that carry the traits they want survive to fertilize Mary’s eggs. No embryos are destroyed in the process. This makes in vitro fertilization acceptable to many pro-life Christians.

Joe and Mary dismiss critics who say they are taking evolution into their own hands. “That’s just your theory,” says Joe. They view their decision to choose the religiosity of their unborn child as a command from above. “WWJC?” Mary asks. “Who would Jesus clone?”

Ironically, as Biblical literalists, the Jukes dismiss Darwinian evolution as “unproven.” To them, the earth is 4,000 years old, and all the types of animals in the world today were on Noah’s Ark. They see themselves as spearheading a Crusade of believers into biomedicine.

His eye acquiring that spark of evangelism that is a tell-tale sign of heavy methylation at Xq66, Joe deepened his voice and intoned, “The heresy of modern science will only be righted when human evolution is safely in the hands of people who do not believe in it.”

Genetic determinism: why we never learn—and why it matters

Here it is, 2014, and we have “Is the will to work out genetically determined?,” by Bruce Grierson in Pacific Standard (“The Science of Society”).

Spoiler: No.

The story’s protagonist is a skinny, twitchy mouse named Dean who lives in a cage in a mouse colony at UC Riverside. Dean runs on his exercise wheel incessantly—up to 31 km per night. He is the product of a breeding experiment by the biologist Ted Garland, who over 70 generations selected mice for the tendency to run on a wheel. Garland speculates that Dean is physically addicted to running—that he gets a dopamine surge that he just can’t get enough of.

Running in place

Addiction theory long ago embraced the idea that behaviors such as exercise, eating, or gambling may have effects on the brain similar to those of dependence-forming drugs such as heroin or cocaine. I have no beef with that, beyond irritation at the tenuous link between a running captive mouse and a human junkie. What’s troubling here is the genetic determinism. My argument is about language, but it’s more than a linguistic quibble; there are significant social implications to the ways we talk and write about science. Science has the most cultural authority of any enterprise today—certainly more than the humanities or arts! How we talk about it shapes society. Reducing a complex behavior to a single gene gives us blinders: it tends to turn social problems into molecular ones. As I’ve said before, molecular problems tend to have molecular solutions. The focus on genes and brain “wiring” tends to suggest pharmaceutical therapies.

To illustrate, Grierson writes,

File this question under “Where there’s a cause, there’s a cure.” If scientists crack the genetic code for intrinsic motivation to exercise, then its biochemical signature can, in theory, be synthesized. Why not a pill that would make us want to work out?

I have bigger genes to fry than to quibble over the misuse of “cracking the genetic code,” although it may be indicative of a naiveté about genetics that allows Grierson to swallow Garland’s suggestion about an exercise pill. Grierson continues, quoting Garland,

“One always hates to recommend yet another medication for a substantial fraction of the population,” says Garland, “but Jesus, look at how many people are already on antidepressants. Who’s to say it wouldn’t be a good thing?”

I am. First, Jesus, look at how many people are already on anti-depressants! The fact that we already over-prescribe anti-depressants, anxiolytics, ADHD drugs, statins, steroids, and antibiotics does not constitute an argument for over-prescribing yet another drug. “Bartender, another drink!” “Sir, haven’t you already had too much?” “Y’know, yer right—better make it two.”

Then, what if it doesn’t work as intended? Anatomizing our constitution into “traits” such as the desire to work out is bound to have other effects. Let’s assume Dean is just like a human as far as the presumptive workout gene is concerned. Dean is skinny and twitchy and wants to do nothing but run. Is it because he “wants to exercise” or is it because he is a neurotic mess and he takes out his anxiety on his wheel? Lots of mice in little cages run incessantly—Dean just does it more than most. His impulse to run is connected to myriad variables, genes, and brain nuclei, and the reported results say nothing about mechanisms. We now know that the physiological environment influences the genes as much as the genes influence the physiological environment. The reductionist logic of genetic determinism, though, promotes thinking in terms of a unidirectional flow of causation, from the “lowest” levels to the “highest.” The more we learn about gene action, the less valid that seems to become as an a priori assumption. The antiquated “master molecule” idea still permeates both science and science writing.

Further, when you try to dissect temperament into discrete behaviors this way, and design drugs that target those behaviors, side effects are sure to be massive. Jesus, look at all those anti-depressants, which decrease libido. Would this workout pill make us neurotic, anxious, jittery? Would we become depressed if we became injured or otherwise missed our workouts? Would it make us want to work out or would it make us want to take up smoking or snort heroin? In the logic of modern pharmacy, the obvious answer to side effects is…more drugs: anti-depressants, anxiolytics, anti-psychotics, etc. A workout pill, then, would mainly benefit the pharmaceutical industry. When a scientist makes a leap from a running mouse to a workout pill, he is floating a business plan, not a healthcare regimen.

And finally, what if it does work as intended? It would be a detriment to society, because a pill would remove yet another dimension of a healthy lifestyle from the realm of self-discipline, autonomy, and social well-being. It becomes another argument against rebuilding walkable neighborhoods and promoting public transportation and commuting by bicycle. A quarter-mile stroll to an exercise addict would be like a quarter-pill of codeine for a heroin junkie—unsatisfying. Not only is this putative workout pill a long, long stretch and rife with pitfalls, it is not even something worth aspiring to.

And that’s just one article. Scientific American recently ran a piece about how people who lack the “gene for underarm odor” (ABCC11) still buy deodorant (couldn’t possibly have anything to do with culture, could it?). Then there was their jaw-dropping “Jewish gene for intelligence,” which Sci Am had already taken down by the time it appeared in my Google Alert. I’d love to have heard the chewing out someone received for that bone-headed headline. Why do these articles keep appearing?

The best science writers understand and even write about how to avoid determinist language. In 2010, Ed Yong wrote an excellent analysis of how, in the 1990s, the monoamine oxidase A (MAOA) gene became mis- and oversold as “the warrior gene.” What’s wrong with a little harmless sensationalism? Plenty, says Yong. First, catchy names like “warrior gene” are bound to be misleading. They are ways of grabbing the audience, not describing the science, so they oversimplify and distort in a lazy effort to connect with a scientifically unsophisticated audience. Second, there is no such thing as a “gene for” anything interesting. Nature and nurture are inextricable. Third, slangy, catchy phrases like “warrior gene” reinforce stereotypes. The warrior gene was quickly linked to the Maori population of New Zealand. Made sense: “everyone knows” the Maoris are “war-like.” Problem was, the preliminary data didn’t hold up. In The Unnatural Nature of Science, the developmental biologist Lewis Wolpert observed that the essence of science is its ability to show how misleading “common sense” can be. Yet that is an ideal; scientists can be just as pedestrian and banal as the rest of us. Finally, Yong points out that genes do not dictate behavior. They are not mechanical switches that turn complex traits on and off. As sophisticated as modern genomics is, too many of us haven’t moved beyond the simplistic Mendelism that enabled the distinguished psychologist Henry H. Goddard to postulate—based on reams of data collected over many years—a single recessive gene for “feeblemindedness.” The best method in the world can’t overcome deeply entrenched preconception. As another fine science writer, David Dobbs, pithily put it in 2010, “Enough with the ‘slut gene’ already…genes ain’t traits.”

As knowledge wends from the lab bench to the public eyeball, genetic determinism seeps in at every stage. In my experience, most scientists working today have at least a reasonably sophisticated understanding of the relationship between genes and behavior. But all too often, sensationalism and, increasingly, greed induce them to oversell their work, boiling complex behaviors down to single genes and waving their arms about potential therapies. Then, public relations people at universities and research labs are in the business of promoting science, so when writing press releases they strive for hooks that will catch the notice of journalists. The two best hooks in biomedicine, of course, are health and wealth. The journalists, in turn, seek the largest viewership they can, which leads the less scrupulous or less talented to reach for cheap and easy metaphors. And even though many deterministic headlines cap articles that do portray the complexity of gene action, the lay reader is going to take away the message, “It’s all in my genes.”

Genetic determinism, then, is not monocausal. It has many sources, including sensationalism, ambition, poor practice, and the eternal wish for simple solutions to complex problems. Science and journalism are united by a drive toward making the complex simple. That impulse is what makes skillful practitioners in either field so impressive. But in clumsier hands, the simple becomes simplistic, and I would argue that this risk is multiplied in journalism about science. Science writing is the delicate art of simplifying the complexity of efforts to simplify nature. This is where the tools of history become complementary to those of science and science journalism. Scientists and science writers strive to take the complex and make it simple. Historians take the deceptively simple and make it complicated. If science and science journalism make maps of the territory, historians are there to move back to the territory, in all its richness—to set the map back in its context.

Studying genetics and popularization over the last century or so has led me to the surprising conclusion that genetic oversell is independent of genetic knowledge. We see the same sorts of articles in 2014 as we saw in 1914. Neither gene mapping nor cloning nor high-throughput sequencing; neither cytogenetics nor pleiotropy nor DNA modification; neither the eugenics movement nor the XYY controversy nor the debacles of early gene therapy—in short, neither methods, nor concepts, nor social lessons—seem to make much of a dent in our preference for simplistic explanations and easy solutions.

Maybe we’re just wired for it.

Why aren’t genome profiles free? A cynic’s view

Charles Seife’s piece over at sciamblogs the other day gave me one of those forehead-slapping, “if-it-was-a-double-helix-it-would-have-bit-me” moments. For 23andMe, the “test” or genome profile is small potatoes. The real product of 23andMe isn’t the saliva test.

It’s the database, stupid.

Think about gmail or Facebook. When you sign up for a “free” account you agree to allow them to send you targeted promotions, in the form of ads that appear in the margins. It’s hilarious how bad their algorithms are. When I rant about some bonehead right-wing politician, I start getting suggestions to follow Mitt Romney. When my wife posted wedding pictures, she got ads for wedding registries. My Google profile is amazingly bad–mixed in with a few genuine interests, it lists dozens of things I have no interest in whatsoever (fishing, dolls and accessories, apartments and residential rentals) as well as things so general they say nothing meaningful about me (consumer resources, search engines). I’ve often found that reassuring. Though I know the software is bound to get better, it’s comforting at least for the moment to know that they don’t actually know me that well. I’m not naive enough to think that will last forever.

What 23andMe is really about, says Seife, is doing that same kind of profiling with your genome. Historians and anthropologists of science have long been interested in “biologization” and “medicalization.” Ugly words, useful concepts. Biomedicine tends to shift our gaze from labor to biology (and especially health). Is violence a crime or a disease? Do you identify more strongly as a carpenter, soldier, or professor, vs. as a celiac, a person with PTSD, or a breast-cancer survivor? Conceiving some corner of our world in biological terms can have profound implications. If violence is a crime, we treat it with fines, incarceration, or death. If it is an organic disease, we treat it with drugs and counseling. Conceivably, we may someday treat it with gene therapy. As Foucault pointed out, both criminalization and medicalization involve behavior modification–just in different formats. Medicalization can be more humane, but it can also strip away one’s autonomy and subtly and dangerously shift power relationships.

Every time someone sends in their little vial of spit to 23andMe, the company adds to a large-and-growing database of genomic data linked to a broad range of personal tastes and behaviors. Like Google and Facebook, they make you agree to let them send you ads based on the data they collect, which is augmented by their social-media site. It is a genomic version of Google. Welcome to the all-volunteer biological surveillance state.

Seife points out that Anne Wojcicki was married to Google co-founder Sergey Brin, that Google is a heavy investor in 23andMe, and that the price of the saliva test has been dropping steadily and is now below $100.

My cynical view is that the company has an easy end-around available for the FDA letter demanding that they halt marketing their test: give it away. It’s not marketing if they’re not selling, right? Investopedia defines marketing as “The activities of a company associated with buying and selling a product or service.” It does go on to list advertising as one key aspect of marketing, but I wonder whether they could successfully argue that promoting a free product isn’t marketing per se.

Legal definitions aside, if Seife is right, my guess is that before long genome profiling will be like web browsers and email: something almost everyone does, and that except for a few willing to pay for a premium private service, something that provides so many benefits that we tolerate its ads as a necessary trade-off of modern life.

Is this any more insidious than gmail? If we say yes, we risk running headlong into the genetic determinism this blog rails against. We have to be careful about privileging biological information over social information. If Facebook’s suggestions are laughable, they’d likely seem prescient in comparison with what they’d predict based on my genomic profile. The “genes for” most things explain tiny amounts of variance and tend to have low penetrance. Other than a few strongly Mendelian diseases, a genome profile currently says very little about you, simply because it’s based on small probabilities of uncertain precision.
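To put rough numbers on that claim, here is a minimal sketch in Python. The figures are my own hypothetical illustration, not anything 23andMe reports: they simply show why a low-penetrance variant shifts an individual's absolute risk by very little.

```python
# Hypothetical illustration only: the numbers below are invented to show
# why a low-penetrance "risk variant" says so little about an individual.
baseline_lifetime_risk = 0.02   # assume a 2% lifetime risk of some disease
relative_risk = 1.2             # assume carriers face 1.2x the baseline risk

carrier_risk = baseline_lifetime_risk * relative_risk

print(f"non-carrier risk: {baseline_lifetime_risk:.1%}")  # 2.0%
print(f"carrier risk:     {carrier_risk:.1%}")            # 2.4%
# A seemingly impressive 20% relative increase moves absolute risk by
# less than half a percentage point.
```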

But like the algorithms analyzing your social profile, those combing your genomic profile will improve, and probably at a rate faster than any of us expect. Most importantly, your genomic profile will merge with your social profile, which will greatly enhance the accuracy of both. Your social profile will become biologized–rooted in and interwoven with your DNA.

The gradual way in which 23andMe is heading toward a free, ad-supported business model may simply reflect the high cost of getting the biotech version of Google Plus off the ground. As profits increase, they can afford to drop the price. When it hits zero–when they start giving away the test–rest assured that ad revenues will then be enough to keep the shareholders happy.

A blow for personal genome testing

Hey honey–remember when I accidentally left the chicken coop open and they all flew away? Well I think they’ve come back home to roost!

Last summer, we did an analysis of the 23andMe commercial promoting their genetic testing service and the egotistical identity politics it both taps into and contributes to. The ad was all about how your genes were “You” and knowing about them would enable you to predict your genetic future. Genetic profiling can in some cases give robust statistical estimates of likelihood of certain genetic conditions, but it is safe to say that we rarely know what that means. And it’s presented as though we do.

Now we find that FDA is ordering 23andMe to stop marketing their tests.

The 23andMe saliva sample kit, says FDA, is a “medical device,” “intended for use in the diagnosis of disease or other conditions or in the cure, mitigation, treatment, or prevention of disease, or is intended to affect the structure or function of the body.”

They cite the company’s claims to allow patients’ genome profiles to help them assess “health risks,” and “drug response,” and specifically as a “first step in prevention” that enables users to “take steps toward mitigating serious diseases” such as diabetes, coronary heart disease, and breast cancer.

[Screenshot from the 23andMe website]

This is not a shot across the bow–it’s the last straw. FDA has warned 23andMe repeatedly, going back to July 2012, that they were making health claims about their product that they couldn’t back up.

The company offers two types of products: a genealogical “panel” or profile, and a health panel. The genealogical panel is popular but is apparently considered a harmless hobby, or at least outside the purview of the Public Health Service. It is not clear whether FDA (which, like the National Institutes of Health and the Centers for Disease Control, falls under the sprawling PHS) will have any concerns about genealogical applications of the saliva test, but that would seem unlikely. The problem for 23andMe is that, as shown by the ad we analyzed earlier, they have been pushing the health panel very hard. Family trees are a hobby; health is where the real money is.

Direct-to-consumer medicine trails an appealing democratic, anti-authoritarian perfume that seems to make people slightly drunk. Mild intoxication can be pleasant, need not be dangerous, and sometimes can be a spur to creativity. But it can also impair your judgment. When you’ve gotta drive the kids home, you may need a couple cups of good strong regulatory coffee and a couple hours to sober up before getting behind the wheel.

A good deal of “preventive, participatory, personalized” medicine is profit-driven, and stockholders don’t necessarily have the public’s health foremost in mind. The FDA warning is a good illustration of why it’s important to balance the goal of stimulating innovation and economic growth with the goal of maximizing health. For the former, the free market can be a powerful tool. But for the latter, sometimes you need a little good old-fashioned meritocratic oversight.

h/t Robert Resta, Mark Largent

 

Neonatal genome screening: preventive medicine or prophylactic profiteering?

Thoughtful blog post over at Nature recently by Erika Check, on a $25M set of 4 studies that will sequence the exomes of 1500 neonates, whether ill or not. Called the Genomic Sequencing and Newborn Screening Disorders program, it is essentially a pilot study for universal newborn genome sequencing. One could see such a study coming down the pike. But if this is a direction in which medicine is heading, we should be moving like a wary cat, not like a bounding puppy.

The dominant rhetoric for whole-genome screening sketches a benevolent world of preventive care and healthier lifestyles. “One can imagine a day when every newborn will have their genome sequenced at birth,” said Alan Guttmacher, director of NICHD, which co-sponsors the program with the National Human Genome Research Institute. In his genotopian vision, a baby’s sequence “would become a part of the electronic health record that could be used throughout the rest of the child’s life both to think about better prevention but also to be more alert to early clinical manifestations of a disease.”

But deeper in her article, Check responsibly quotes a skeptic, Stephen Kingsmore of Children’s Mercy Hospital and Clinics in Kansas City, who estimates that the program is likely to find 20 false positives for every true positive. In other words, only around 5% of what will loosely be called “disease genes” will in fact lead to disease. One of the reasons for that low rate of true positives is that many of the disease alleles we can screen for concern diseases of old people: Alzheimer’s, various cancers, and so on. Life experience plays a large and still imperfectly understood role in such diseases. Sure, we can test at birth or even before for the SNPs we know correlate with those diseases, but, Check asks, what does that really tell us?
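For readers who want to see the arithmetic behind that figure, here is a back-of-the-envelope sketch. Kingsmore's 20-to-1 estimate is the only input; the code is just bookkeeping, not data from the study.

```python
# Back-of-the-envelope only: Kingsmore's estimate of 20 false positives for
# every true positive implies a positive predictive value of roughly 5%.
false_positives_per_true_positive = 20

# Of every 21 "positive" findings, only 1 is a true positive.
positive_predictive_value = 1 / (1 + false_positives_per_true_positive)

print(f"PPV = {positive_predictive_value:.1%}")  # -> PPV = 4.8%
```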

In Guttmacher’s sunny scenario about early prevention, the parents and later the child could be regularly reminded of this individual’s elevated risk. This itself has not only direct health risks but potentially a significant inadvertent impact on the patient’s social life. Everything from the child’s temperament (is she anxious by nature?) to family situation (ill siblings? Alcoholic parent? Suicide?) to many other factors could profoundly modulate how this genetic knowledge would affect the child. Social context matters.

But such an individualized, lifelong health-maintenance program is unlikely ever to be accessible beyond medicine’s most elite customers. Personalized medicine has been around since the ancient Greeks, and, logically enough, it’s expensive. Only the rich have ever been able to afford truly individualized care. “Personalized medicine” seems to have almost as many meanings as people who use the term, but if what you mean by personalized medicine is a physician who knows you as an individual and tracks your healthcare over a significant part of your lifetime, you’re talking about elite medicine.

Medicine for the middle and lower classes tends to be much more anonymous and impersonal. Throughout medical history, the headcount–if they can afford a doctor at all–get more routinized, generalized care. Even many in that fortunate segment of the population today who have health insurance attend clinics where they do not see the same doctor every time. In any given visit, their doctor is likely to know them only by their chart. No one asks, “Has your family situation settled down yet? Are you sleeping better? How’s your new exercise program going?” What you get is a 15-minute appointment, a quick diagnosis, and, usually, a prescription. Genomic technology is unlikely to change this situation. If anything, it will reinforce it.

For the hoi polloi, then, personalized medicine will likely mean personalized pharmacology. Some of those most excited about personalized medicine are biotech and pharma companies and their investors, because some of the most promising results from genomic medicine have been new drugs and tests. Should neonatal genome screening become part of routine medical care, middle and lower-class parents would likely be given a report of their child’s genome, the associated disease risks, and a recommended prophylactic drug regimen. Given an elevated risk of high cholesterol or heart disease, for example, you might be put on statins at an early age. A SNP associated with bipolar disorder or schizophrenia might prompt preventive anti-depressants or anti-psychotics. And so forth.

Such a program would be driven first by the principles of conservative medical practice. Medicine plays it safe. If there’s a risk, we minimize it. If you go to the ER with a bad gash, you’ll be put on a course of antibiotics, not because you have an infection but to prevent one. Second, it would be driven by economics. Drug companies obviously want to sell drugs. So they will use direct-to-consumer marketing and whatever other tools they have to do so. That’s their right, and in a comparatively unregulated market, arguably their duty.

But now recall Kingsmore’s figure of 20 false positives for every true positive. This may sound high, but again, medical practice is conservative: we’d rather warn you of a disease you won’t get than fail to notify you of a disease you will get. False positives, in other words, are preferable to false negatives. Add to that the scanty state of our knowledge of gene-environment interactions. We are rapidly accumulating mountains of data on associations between SNPs and diseases, but we still know little about how to interpret the risks. We needn’t invoke any paranoid conspiracy theory: that kind of data is devilishly hard to acquire. Science is the art of the soluble.

If Kingsmore is even in the ballpark, then, the more neonatal genome screening reaches into the population, the more unnecessary drugs people will be taking. Unnecessary medication of course can have negative effects, especially over the long term. Indeed, the long-term and developmental effects of many medications–especially psychiatric medications–are unknown.

The Genomic Sequencing and Newborn Screening Disorders program is purely an investigative study. Parents in this study won’t even be given their children’s genome reports. But the study is obviously designed to investigate the impact of widespread neonatal whole-genome screening. Currently, all 50 states administer genetic screening for phenylketonuria (PKU) and other congenital conditions. The historian Diane Paul has written a superb history of PKU screening. It’s not hard to imagine a similar scenario playing out, with one state leading the way with a bold new program of universal newborn exome screening and, in a decade or two, all other states following its lead.

“Personalized medicine” is a term that’s used increasingly loosely. It covers a multitude of both sins and virtues, from old-fashioned preventive regimens to corporate profiteering. From here, widespread neonatal genome screening looks like an idea that will benefit shareholders more than patients.

 

The gene for hubris


A recent post by Jon Entine on the Forbes website leads with a complimentary citation of my book, and then goes on to undermine its central thesis. He concludes:

Modern eugenic aspirations are not about the draconian top-down measures promoted by the Nazis and their ilk. Instead of being driven by a desire to “improve” the species, new eugenics is driven by our personal desire to be as healthy, intelligent and fit as possible—and for the opportunity of our children to be so as well. And that’s not something that should be dismissed lightly.

Well, first of all, as the recent revelations of coerced sterilization of prisoners in California show, “draconian, top-down” measures do still occur. Genetics and reproduction are intensely potent, and wherever we find abuse of power we should be alert to the harnessing of biology in the service of tyranny.

Second, there’s more than one kind of tyranny. Besides the tyranny of an absolute ruler, perhaps the two most potent and relevant here are the tyranny of the commons and the tyranny of the marketplace. The fact that they are more subtle makes them in some ways more dangerous. The healthcare industry does much good in the world, but it is naive to treat it as wholly benign.

Further, putting human evolution in the hands of humans means accepting long-term consequences for short-term goals. The traits we value–health, intelligence, beauty–are the result of the action of many genes interacting with each other and with a dynamic environment. The entire system is contingent, inherently unpredictable. Yet we treat it as simple and deterministic. Until now, technology has been the major obstacle to guiding human evolution. It may be that now the major obstacle is our reasoning ability, our capacity for grasping contingency and probability and change. We’re tinkering with the machinery of a system whose complexity is still unfolding before us. The probability of unforeseen consequences is 100%. The only question is how severe they will be. We will only know in retrospect.

If we now have the tools to meaningfully guide our own evolution–as eugenicists have always wanted to do–we cannot take a blithe and Panglossian attitude. We have to be alert to the risks and take them seriously. That is not traditionally science’s strong suit. The public face of science is sunny, optimistic, fun. It strides boldly into the future, laughing and making striking promises. The industries behind science and health are wealthy and politically powerful. Not everything they do is benign.

To be a critic of that public-relations machine–of hype, in other words–is not to be a critic of health or knowledge or progress. Genetic science has the potential to bring us enormous benefits in health and well-being, and as those benefits arrive, I stand in line with my fellow humans for my fair share. But that science also carries huge and unforeseeable risks, the root of which, perhaps, is arrogance, an arrogance whose consequences are painfully evident in the historical record.

 

Criminomics: stopping crime before it starts

Criminomics Bears Fruit: 2037 Murder Rate Lowest Since 1964

 

Dylan looks like any normal six-year-old. He is bright and a little mischievous, has many friends, and is praised by his teachers as a model student. But his normalcy is only skin deep. In his cells lies the DNA of a murderer.

Though Dylan has gene variants that give him a more than 90% chance of committing premeditated mass murder, he will never commit a crime. Thanks to early intervention by doctors, Dylan’s criminal tendencies were identified before birth. Rather than abort the fetus, however, Dylan’s parents agreed to an intensive program of medication and counseling that will all but ensure that Dylan will lead a happy, normal, peaceful life.

Dylan is one of the success stories of the Criminal Genome Project, or CGP, the effort to sequence the complete set of genes involved in murder and other antisocial behaviors. The controversial science on which this project is based—criminomics—is winning converts, now that the latest crime figures are in. Last year, the annual murder rates in 8 American cities dropped to double digits for the first time since the middle of the twentieth century. In Washington, DC, only 90 people died by gunshot last year, down from 103 in 2036. Experts attribute the drop to criminopathy, a medical and public-health approach to crime based on criminomics. The criminomic method uses high-speed genome sequencing to identify criminal tendencies at birth and begin treatment early in life. Clinical trials for criminal-gene therapy, which would eliminate antisocial tendencies permanently, are underway and, though preliminary, are showing promising early results. The first criminopathic patients are just hitting their 20s now—and the peace is deafening.

The CGP is run by Dr. Bart O’Day, a criminomicist at the Baler Agricultural and Behavioral University in the Republic of Texas. O’Day wrote the grant proposal that funded the project after a tragic shooting at an elementary school in Connecticut in 2012 in which 20 children and 6 adults were killed by a lone gunman. Such a crime has since become unimaginable, thanks to the efforts of O’Day and his colleagues.

“It was an obvious thing to do,” O’Day said. “In the years just before the CGP, we had sequenced the cancer genome, the influenza genome, the pseudome, the schizome, and the retardome. The criminome was just lying in wait for us. So the science was there. All we needed was the motivation.”

In fact, the motivation had been there for 150 years. In the 1870s, the Italian criminologist Cesare Lombroso defined a “criminal type,” characterized by distinctive facial features and, ironically, the excessive use of tattooing, a concept he used in one of the first systematic attempts to prevent crime by biological methods.[1] About the same time, the Victorian polymath Francis Galton developed “composite photography,” in which he superimposed images of faces as a means of identifying the “criminal type.” “If criminals are found to have certain special types of features, that certain personal peculiarities distinguish those who commit certain classes of crime,” observed Edmund DuCane, one of the leading criminologists of Victorian England, “the tendency to crime is in those persons born or bred in them, and either they are incurable or the tendency can only be checked by taking them in hand at the earliest periods of life.”[2]

With the creation of the science of genetics after the turn of the last century, vaguenesses such as “inborn tendencies” and “heredity” hardened into “genes.” In 1914, the American psychologist Henry H. Goddard wrote, “The criminal is not born; he is made.” Goddard traced criminality to mental retardation, or “feeble-mindedness,” to use the term of the day. By compassionately treating feeble-mindedness, Goddard believed one could prevent crime. The feeble-minded type, Goddard wrote, was “misunderstood and mistreated, driven into criminality for which he is well fitted by nature. It is hereditary feeble-mindedness not hereditary criminality that accounts for the conditions.”[3] Goddard believed he had found a single Mendelian gene for feeble-mindedness. By breeding it out of the population, he thought he could eliminate crime, as well as poverty, prostitution, and much illness. Though the feeble-mindedness gene has been discredited, Goddard’s belief that crime is a genetic disease rather than a perverse exercise of free will has transformed our criminal justice system.

The decisive step was in reframing crime in terms of public health rather than justice. In the early 1990s, the National Institute of Alcohol, Drug Abuse, and Mental Health (today subsumed under the National Institute of Genomics) undertook a massive Violence Initiative based on similar principles. It pursued a public health approach to urban crime, which, proponents recognized, was based on biology (and therefore, ultimately, genes).[4] Uncontroversial at first, the initiative soon drew mounting liberal opposition, which ultimately led to the cancellation of a scientific conference on genetic factors in crime in 1992.[5] This first Violence Initiative died a rather brutal and noisy death. Yet work on the biological basis of crime continued apace. In 1995, a Danish twin study identified the first crime gene, and more were identified shortly after the turn of the century.

But it was high-speed genome sequencing, combined with sophisticated methods of correlating complex behaviors with DNA sequence, that finally provided the technological breakthrough to stop crime before it starts. After the 2012 school shooting, it took a full year for O’Day’s team to sequence the criminal genome (today it could be done in an afternoon). But in 2014, they published a paper describing 112 gene variants that together account for more than 99% of predisposition to murder. The genes were patented and licensed to pharmaceutical companies, and seven new targeted therapies were quickly added to the standard psychiatric armamentarium of anti-depressants and anti-psychotics. The federal Violence Initiative was reinstated in 2015 as the Institute of Crime Prevention (ICP), a branch of the National Institute of Mental Health.

The first mandatory screening for criminal tendencies was put in place in Washington, DC, in 2018. Other jurisdictions quickly followed; today, only West Dakota and North Virginia lack screening laws. Convicted murderers were the first to be screened. The ICP then tied crime screening to the back-to-school vaccination requirements for students in secondary and primary schools. Most states now test babies at birth, with blood from the standard heel-stick. Babies born with a greater than 50% chance of committing murder have their standard RFID chips, implanted in every child at birth, encoded with the designation “Precrim.”

Individuals identified as precriminal are placed under the care of a criminopathic physician, assigned a health care worker, and given criminal prophylaxis: a treatment regimen tailored to their genetic and environmental circumstances. In all cases, this involves a combination of medications and counseling designed to maintain equanimity, promote sociality, and minimize the risk of triggers, including certain music and video games. Teachers and the parents of friends can discreetly scan the child and take steps to minimize conflict and quickly intervene should violence erupt. Most states now prohibit the guardians of precrims from keeping firearms in their homes. NRA members oppose such bans, pointing out that since precrims can be dosed so as to ensure docility with a wide margin of safety, prohibiting guns in precrim homes is overkill.

Combined, these methods have proven remarkably effective. Murder rates began dropping as soon as the programs were put in place, and as the first neonatal precrims hit their teens, rates began to plummet. The rates of other violent crimes have also begun to fall, though somewhat more slowly: rapes are down in most states, as are armed robberies and even graffiti and illegal dumping. Scientists at the CGP explain these results by hypothesizing that many criminal behaviors share a common genetic mechanism, possibly related to emotional intelligence.

For all its success, the program has its opponents. Eugene Galton, a member of the Galton dynasty of scientific criminologists, recognizes the benefits of the criminopathy program but thinks the social costs are too high. “Liberty is too high a price to pay for safety,” he says. “We’re ceding our free will to an iatrocracy—a government by the doctors.”

Such philosophical musings carry little weight with inner-city residents who now sleep more peacefully, without the constant pops of gunfire that once punctuated the night. Dylan’s mother sees safety as the best kind of freedom: “I prefer a war with drugs to a War on Drugs,” she says. “I love my son; I’d rather put chemical bars around his mind than steel ones around his body.”


[2] Galton, “Composite portraits,” p. 143.

[4] Extrapolating slightly from Breggin, Reclaiming Our Children, p. 52.

[5] New York Times, Sept. 5, 1992, front page. See also Garland E. Allen, “Modern Biological Determinism: The Violence Initiative, the Human Genome Project, and the New Eugenics,” in The Practice of Human Genetics (1999), pp. 1-23.

 

New findings suggest scientists not getting smarter

Certain critics of rigid genetic determinism have long believed that the environment plays a major role in shaping intelligence. According to this view, enriched and stimulating surroundings should make one smarter. Playing Bach violin concertos to your fetus, for example, may nudge it toward future feats of fiddling, ingenious engineering, or novel acts of fiction. Although this view has been challenged, it persists in the minds of romantics and parents–two otherwise almost non-overlapping populations.

If environmental richness were actually correlated with intelligence, then those who live and work in the richest environments should be measurably smarter than those not so privileged.  And what environment could be richer than the laboratory? Science is less a profession than a society within our society–a meritocracy based on an economy of ideas. Scientists inhabit a world in which knowledge accretes and credit accrues inexorably, as induction, peer review, and venture capital fuel the engines of discovery and innovation. Science has become the pre-eminent intellectual enterprise of our time–and American science proudly leads the world. The American biomedical laboratory is to the 21st century what the German university was to the 19th; what Dutch painting was to the 17th; the Portuguese sailing ship to the 16th; the Greek Lyceum to the minus 5th.

According to this view, then, scientists should be getting smarter. One might measure this in various ways, but Genotopia, being quantitatively challenged, prefers the more qualitative and subjective measure of whether we are making the same dumb mistakes over and over. So we are asking today: Are scientists repeating past errors and thus sustaining and perhaps compounding errors of ignorance? Are scientists getting smarter?

Yes and no. A pair of articles (1, 2) recently published in the distinguished journal Trends in Genetics slaps a big juicy data point on the graph of scientific intelligence vs. time–and Senator, the trend in genetics is flat. The articles’ author, Gerald Crabtree, examines recent data on the genetics of intelligence. He estimates that, of the 20,000 or so human genes, between 2,000 and 5,000 are involved in intelligence. This, he argues, makes human intelligence surprisingly “fragile.” In a bit of handwaving so vigorous it calls to mind the semaphore version of Wuthering Heights, he asserts that these genes are strung like links in a chain, rather than multiply connected, as nodes of a network. He imagines the genes for intelligence to function like a biochemical pathway, such that any mutation propagates “downstream”, diminishing the final product–the individual’s god-given and apparently irremediable brainpower.
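
To see why the “chain” picture sounds so dire, here is a back-of-the-envelope sketch of my own (the symbols are illustrative and do not come from Crabtree’s papers): if intelligence depends on N genes wired in series, and each gene independently acquires a function-damaging mutation with some small probability μ, then the chance that at least one link in the chain breaks is

$$P(\text{at least one broken link}) = 1 - (1 - \mu)^{N} \approx N\mu \qquad (\text{when } N\mu \ll 1).$$

With N in the range of 2,000 to 5,000, even a minuscule per-gene μ multiplies into a non-trivial aggregate risk. That arithmetic is essentially all the “fragility” claim amounts to, and it largely evaporates if the genes are treated instead as redundant nodes in a network, where a single hit can be buffered.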

Beginning in 1865, the polymath Francis Galton fretted that Englishmen were getting dumber. In his Hereditary Genius (1869) he concluded that “families are apt to become extinct in proportion to their dignity” (p. 140). He believed that “social agencies of an ordinary character, whose influences are little suspected, are at this moment working towards the degradation of human nature,” although he acknowledged that others were working toward its improvement. (1) The former clearly outweighed the latter in the mind of Galton and other Victorians; hence Galton’s “eugenics,” an ingenious scheme for human improvement through the machinations of “existing law and sentiment.” Galton’s eugenics was a system of incentives and penalties for marriage and childbirth, meted out according to his calculations of social worth.

This is a familiar argument to students of heredity. The idea that humans are degenerating–especially intellectually–persists independently of how much we know about intelligence and heredity. Which is to say, no matter how smart we get, we persist in believing we are getting dumber.

Galton was just one exponent of the so-called degeneration theory: the counter-intuitive but apparently irresistible idea that technological progress, medical advance, improvements in pedagogy, and civilization en masse in fact are producing the very opposite of what we supposed; namely, they are crippling the body, starving the spirit, and most of all eroding the mind.

The invention of intelligence testing by Alfred Binet in the first years of the 20th century provided a powerful tool for proving the absurd. Though developed as a diagnostic to identify children who needed a bit of extra help in school–an enriched environment–IQ testing was quickly turned into a fire alarm for degeneration theorists. When the psychologist Robert M. Yerkes administered a version of the test to Army recruits during the First World War, he concluded that better than one in eight of America’s Finest were feebleminded–an inference that is either ridiculous or self-evident, depending on one’s view of the military.

These new ways of quantifying intelligence dovetailed perfectly with the new Mendelian genetics, which was developed beginning in 1900. Eugenics—a rather thin, anemic, blue-blooded affair in Victorian England—matured in Mendelian America into a strapping and cocky young buck, with advocates across the various social and political spectra embracing the notion of hereditary improvement. Eugenics advocates of the Progressive era tended to be intellectual determinists. Feeblemindedness—a catch-all for subnormal intelligence, from the drooling “idiot” to the high-functioning “moron”—was their greatest nightmare. It seemed to be the root of all social problems, from poverty to prostitution to ill health.

And the roots of intelligence were believed to be genetic. In England, Cyril Burt found that Spearman’s g (for “general intelligence”)—a statistical “thing,” derived by factor analysis and believed by Spearman, Burt, and others to be what IQ measures—was fixed and immutable, and (spoiler alert) poor kids were innately stupider than rich kids. In America, the psychologist Henry Goddard, director of research at the Vineland Training School for the Feebleminded in New Jersey and the man who had introduced IQ testing to the US, published Feeble-Mindedness: Its Causes and Consequences in 1914. Synthesizing years of observations and testing of slow children, he suggested–counter to all common sense–that feeblemindedness was due to a single Mendelian recessive gene. This conclusion was horrifying, because it made intelligence so vulnerable–so “fragile.” A single mutation could turn a normal individual into a feebleminded menace to society.

As Goddard put it in 1920, “The chief determiner of human conduct is the unitary mental process which we call intelligence.” The grade of intelligence for each individual, he said, “is determined by the kind of chromosomes that come together with the union of the germ cells.” Siding with Burt, the experienced psychologist wrote that intelligence was “conditioned by a nervous mechanism that is inborn,” and that it was “but little affected by any later influence” other than brain injury or serious disease. He called it “illogical and inefficient” to attempt any educational system without taking this immovable native intelligence into account. (Goddard, Human Efficiency and Levels of Intelligence, 1920, p. 1)

This idea proved so attractive that a generation of otherwise competent and level-headed researchers and clinicians persisted in believing it, again despite it being as obvious as ever that the intellectual horsepower you put out depends on the quality of the engine parts, the regularity of the maintenance you invest in it, the training of the driver, and the instruments you use to measure it.

The geneticist Hermann Joseph Muller was not obsessed with intelligence, but he was obsessed with genetic degeneration. Trained at the knobby knees of some of the leading eugenicists of the Progressive era, Muller–a fruitfly geneticist by day and a bleeding-heart eugenicist by night–fretted through the 1920s and 1930s about environmental assaults on the gene pool: background solar radiation, radium watch-dials, shoestore X-ray machines, etc. The dropping of the atomic bombs on the Japanese sent him into orbit. In 1946 he won a Nobel prize for his discovery of X-ray-induced mutation, and he used his new fame to launch a new campaign on behalf of genetic degeneration. The presidency of the new American Society of Human Genetics became his bully pulpit, from which he preached nuclear fire and brimstone: our average “load of mutations,” he calculated, was about eight damaged genes–and growing. Crabtree’s argument thus sounds a lot like Muller grafted onto Henry Goddard.

In 1969, the educational psychologist Arthur Jensen produced a 120-page article that asserted that compensatory education–the idea that racial disparities in IQ correlate with opportunities more than innate ability, and accordingly that they can be reduced by enriching the learning environments of those who test low–was futile. Marshaling an impressive battery of data, most of which were derived from Cyril Burt, Jensen insisted that blacks are simply dumber than whites, and (with perhaps just a hint of wistfulness) that Asians are the smartest of all. Jensen may not have been a degenerationist sensu stricto, but his opposition to environmental improvement earns him a data point.

In 1994, Richard Herrnstein and Charles Murray published their infamous book, The Bell Curve. Their brick of a book was a masterly and authoritative rehash of Burt and Jensen, presented artfully on a platter of scientific reason and special pleading for the brand of reactionary politics that is reserved for those who can afford private tutors. They found no fault with either Burt’s data (debate continues, but it has been argued that Burt was a fraud) or his conclusions that IQ tests measure Spearman’s g, that g is strongly inherited, and that it is innate. Oh yes, and that intellectually, whites are a good bit smarter than blacks but slightly dumber than Asians. Since there is nothing we can do about our innate intelligence, they believed, our only hope is to “marry up” and try to have smarter children.

The Bell Curve appeared in the early years of the Human Genome Project. By 2000 we had a “draft” reference sequence for the human genome, and by 2004 “the” human genome was declared complete. Since the 1940s, human geneticists had focused on single-gene traits, especially diseases. One problem with Progressive era eugenics, researchers argued, was that they had focused on socially determined and hopelessly complex traits; once they set their sights on more straightforward targets, the science could at last advance.

But once this low-hanging fruit had been plucked, researchers began to address more complex traits once again. Disease susceptibility, multicausal diseases such as obesity, mental disorders, and intelligence returned to the fore. Papers such as Crabtree’s are vastly more sophisticated than Goddard’s tome. The simplistic notion of a single gene for intelligence is long gone; each of Crabtree’s 2,000-5,000 hypothetical intelligence genes hypothetically contributes but a tiny fraction of the overall effect. If you spit in a cup and send it to the personal genome testing company 23andMe, they will test your DNA for hundreds of genes, including one that supposedly adds 7 points to your IQ (roughly 6 percent for an IQ of 110).

Thus we are back around to a new version of genes for intelligence. Despite the sophistication and nuance of modern genomic analyses, we end up concluding once again that intelligence is mostly hereditary and therefore also racial, and that it’s declining.

Apart from the oddly repetitious and ad hoc nature of the degeneration argument, what is most disconcerting is this one glaring implication: that pointing out degeneration suggests a desire to do something about it. If someone were, say, sitting on the couch and called out, “The kitchen is sure a mess! Look at the plates all stacked there, covered with the remains of breakfast, and ick, flies are starting to gather on the hunks of Jarlsberg and Black Twig apples hardening and browning, respectively, on the cutting board,” you wouldn’t think he was simply making an observation. You’d think he was implying that you should get in there and clean up the damn kitchen. Which would be a dick move, because he’s just sitting there reading the Times, so why the heck doesn’t he do it himself. But the point is, sometimes observation implies action. If you are going to point out that the genome is broken, you must be thinking on some level that we can fix it. Thus, degeneration implies eugenics. Not necessarily the ugly kind of eugenics of coercive sterilization laws and racial extermination. But eugenics in Galton’s original sense of voluntary human hereditary improvement.

And thus, scientists do not appear to be getting any smarter. Despite the enriched environs of the modern biomedical laboratory, with gleaming toys and stimulating colleagues publishing a rich literature that has dismantled the simplistic genetic models and eugenic prejudices of yore, researchers such as Crabtree continue to believe the same old same old: that we’re getting dumber–or in danger of doing so.

In other words, sometimes the data don’t seem to matter. Prejudices and preconceptions leak into the laboratory, particularly on explosive issues such as intelligence and/or race, regardless of how heredity is constructed. Plenty of scientists are plenty smart, of course. But rehashing the degeneracy theory of IQ does not demonstrate it.