Lies, damned lies, and GWAS

[Blueprint cover]

[Edited, lightly, 10/30/18]

Well, I provoked another kerfuffle in the pages of Nature. And it’s a troubling one, because it suggests a widening culture gap between the sciences and humanities—a gap I’ve spent a quarter-century-and-counting trying to bridge.

The piece in question is my review of Robert Plomin’s new book, Blueprint. I want here to give some context and fuller explanation for that piece, without the editorial constraints of a prestigious journal.

Plomin is a distinguished educational psychologist, American-born-and-raised, who works in the UK, at King’s College London. I have nothing for or against Plomin personally. My editors at Nature and I worked hard to keep ad hominem attacks out of the piece. Plomin’s book, however, is terrible.

  • “DNA isn’t all that matters, but it matters more than everything else put together” (ix).
  • “Nice parents have nice children because they are all nice genetically” (83).
  • “The most important thing that parents give to their child is their genes” (83).
  • “The less than 1 per cent of these DNA steps that differ between us is what makes us who we are as individuals” (9).
  • “These findings call for a radical rethink about parenting, education and the events that shape our lives” (9).

I could go on—it’s teh Interwebs; people do—but you get the idea. However accurately he may represent the statistics, his statistics misrepresent the biology. With apologies to Benjamin Disraeli, there are three kinds of lies in genomics: Lies, damned lies, and GWAS.

*

Unless you’re some sort of third-rate hack, riding out your tenure on the strength of your rotation in a genetics lab in the mid-sixties, you know better than to say things like this. I presume that you, Gentle Reader, are not such a person. Plomin is not a hack scientifically, but as a writer he is. On the page, he is all exclamation points and pom-pons, thwacking the bass drum with his heel as he dances for DNA. DNA is everything. DNA is the Word. DNA is Love.

That’s a weirdly retrograde view in the postgenomic world. Again, see the mainstream sciences discussed below. Either:

  1. Plomin is such a poor writer that he fails utterly to get across his putatively more nuanced understanding of the genome’s role in biology;
  2. Plomin is a cynic, writing things he doesn’t believe in support of a repressive ideology that he does believe; or
  3. He is naive and genuinely doesn’t realize (or care) how this powerful new science is used.

I went with Door #3. As a reviewer of his book I took him at face value as a writer, and I take him at his word when he says he is center-left, politically. (No conflict there with eugenic values; there’s a long tradition of leftist eugenics, going back to the 19th century.) I decided it would be unproductive to simply brand him an ideologue. Instead, I took the middle road: I presumed innocence of intent, but naïveté. That naïveté seems to me an odd negligence on his part, given his express interest in using PGS to shape social policy. Seems to me that if you’re engaging with social policy you should be capable of thinking in terms of social policy.

Why is the book dangerous? Because it enables and encourages social policy of biological control. That shit scares me. And it should scare you. Plomin is so caught up in his DNA delirium that he says environmental interventions toward human betterment are futile. Parents, teachers, government officials: relax, there’s nothing you can do that really makes a difference. DNA will out. Kids will be what they will be, regardless, so don’t waste your money and time. Let’s take that apart.

*

First, short-changing environmental and GxE (gene-by-environment) effects in human personality is literally the entire point of Plomin’s book. I’m not talking about his published, peer-reviewed studies; I’m talking about his latest book. Nor am I damning all sociogenomics work—not by a long shot. I am damning the simplistic message of this particular book: “DNA makes you who you are.”

Second, then, hidden agenda or not, Plomin’s argument is socially dangerous. Sure, genes influence and shape complex behavior, but we have almost no idea how. As of this writing (late 2018), it’s the genetic contributions to complex behavior that look mostly random and unsystematic. Polygenic scores may suggest regions of the genome in which one might find causal genes, but we already know that the contribution of any one gene to complex behavior is minute. Thousands of genes are involved in personality traits and intelligence—and many of the same polymorphisms pop up in every polygenic study of complex behavior. Even if the polygenic scores were causal, it remains very much up in the air whether looking at the genes for complex behavior will ever tell us much about those behaviors.
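For readers who haven’t met one: a polygenic score is nothing more exotic than a weighted sum. Here is a minimal sketch in Python; the SNP names and effect sizes are invented for illustration, not drawn from any real GWAS.

```python
# Minimal sketch of a polygenic score: a weighted sum of allele counts.
# SNP IDs and effect sizes below are hypothetical, for illustration only.

# Per-variant effect sizes (betas) from some GWAS. Real scores sum over
# thousands of variants, each contributing a tiny amount.
effect_sizes = {"rs0001": 0.020, "rs0002": -0.010, "rs0003": 0.005}

# One individual's genotype: count of effect alleles (0, 1, or 2) per SNP.
genotype = {"rs0001": 2, "rs0002": 0, "rs0003": 1}

# The score is just the dot product of effect sizes and allele counts.
pgs = sum(beta * genotype[snp] for snp, beta in effect_sizes.items())
print(f"polygenic score: {pgs:+.3f}")  # a population-level rank, not a destiny
```

Note what the arithmetic gives you and what it doesn’t: a relative ranking that correlates with a trait across a population, not a mechanism, and not a causal story about any individual.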

In contrast—and contra Plomin—we have very good ideas about how environments shape behaviors. Taking educational attainment as an example (it’s a favorite of the PGS crowd—a proxy for IQ, whose reputation has become pretty tarnished in recent years), we know that kids do better in school when they have eaten breakfast. We know they do better if they aren’t abused. We know they do better when they have enriched environments, at home and in school.

We also know that DNA doesn’t act alone. Plomin neglects all post-transcriptional modification, epigenetics, microbiomics, and systems biology—sciences that show without a doubt that you can’t draw a straight line from genes to behavior. The more complex the trait in question, the truer that sentence becomes. And Plomin is talking about the most complex traits there are: human personality and intelligence.

Plomin’s argument is dangerous because it minimizes those absolutely robust findings. If you follow his advice, you go along with the Republicans and continue slowly strangling public education and vote for that euphemism for separate-but-equal education, “school choice.” You axe Head Start. You eliminate food stamps and school lunch programs. You go along with eliminating affirmative action programs, which are designed to remediate past social neglect; in other words, you vote to restore neglect of the under-privileged. Those kids with genetic gumption will rise out of their circumstances one way or another…like Clarence Thomas and Ben Carson or something, I guess. As for the rest, fuck ’em.

These environmental factors are “unsystematic” in exactly the same way that genetic factors are: They do not act in the same way with every child. A few kids will always fall on the long tails of the bell curve: Some will do well in school no matter what you throw at them; others will fail, no matter what they have for breakfast. But the mean shifts in a positive direction. The same is true of genetics. That is the whole point of polygenic scores! Every single one of the many thousands of variables that shape something like educational attainment is probabilistic. Genes or environment, we’re talking population averages and probabilities, not certainties. There is no certainty—Plomin himself makes this point repeatedly (and then promptly jumps back on the DNA wagon). To venerate genetics and derogate environment on grounds of being “unsystematic” is at best faulty reasoning and at worst hypocritical.
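A toy simulation makes the point. The numbers here (a 15-point spread, a 3-point average boost) are mine, invented for illustration; they are not estimates from any study.

```python
# Toy simulation: an environmental intervention shifts the population mean
# while individual outcomes still vary. All numbers are invented for
# illustration; nothing here is calibrated to real data.
import random

random.seed(1)
N = 100_000
baseline = [random.gauss(100, 15) for _ in range(N)]  # pre-intervention scores
treated = [x + random.gauss(3, 5) for x in baseline]  # +3 points on average

print(f"mean shift: {sum(treated) / N - sum(baseline) / N:+.2f}")  # ~ +3.00
# The tails persist: plenty of treated kids still land below the old mean,
# and some untreated kids soar above it. Probabilities, not certainties.
print(f"treated still below 100: {sum(t < 100 for t in treated) / N:.1%}")
```

The mean moves; the bell curve keeps its tails. That is all anyone claims for breakfast programs, and it is all anyone can honestly claim for polygenic scores.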

Plomin is spreading a simplistic and insidious doctrine that says “environmental intervention is futile.” I don’t care whether Plomin himself, in his heart of hearts, wants to ban public education; he gives ammunition to people who do want to ban it. “Race realists” and “human biodiversity” advocates—modern euphemisms for white supremacy—read this stuff avidly. I watched them swarm around the discussion of my review on Twitter, many of them newly created accounts, favoriting tweets from my critics, saving those messages for later arguments.

“But does that mean that EVERYONE using PGS is a white supremacist?” people ask me, their keyboards dripping with sarcasm. No, dummies: It means it’s a risk. I’m giving you a qualitative risk score, a probability. Can’t you apply your own logic to other situations? Again, whether you critics are being disingenuous or naive, the effect is the same.

“So does that mean we CAN’T DO this science? Are you a fascist, trying to stifle scientific inquiry?!?” others gently query me. Again, no, dummies: I’m saying if you do this stuff, a) get your genetic bias out of the way and look at genes and environment, and b) be candid and explicit about your intentions and the risks of misinterpreting the data.

*

The last point I want to make is about historical thinking. A lot of critics said things about me, and called me things, that initially puzzled me. I had no evidence, they said. It wasn’t a review that I wrote, they complained. I didn’t engage with the book but merely promoted my ideology, they protested.

Eventually it dawned on me: These people don’t understand historical reasoning. They fail to see a historical argument as being evidentiary! After all these years, I still find it surprising that people with PhDs should fail to acknowledge an entire branch of knowledge, a fundamental way of explaining the world. But okay, communication, not war, is what I’m after, so let me lay it out, at least as I see it.

By and large, experimental science explains the world in terms of mechanisms, more or less eternal and independent of time and context. Historical reasoning explains the world through the three C’s: context, contingency, and continuity and change. Context means that science doesn’t occur in a vacuum. It mattered that Nazi Germany arose after Progressive-era Americans had advanced a scientific program of sterilization and institutionalization of the defective. The Germans, sensing the power of rigid social control founded on scientific authority and finding that authority in American eugenics, modeled their infamous sterilization law of 1933 on Harry Laughlin’s “Model Sterilization Law” of 1922. Contingency means that it matters who did what when. Contingency says, “It could have been otherwise: Why did things turn out as they did instead of some other way?” And continuity and change means that a historian tries to understand the present in terms of the past: what persists, and what is genuinely new.

In short, my review used historical evidence to put Plomin’s book in historical and social context rather than scientific context, and a number of critics cried “Foul!”, failing even to see historical evidence as evidence.

It’s not foul when a reviewer examines the central claims and aims of a book; you critics just aren’t familiar with the way I did it. That goes double for a field such as sociogenomics, which defines itself as interdisciplinary and aims to shape society. You may not read Plomin’s book in its social context, but that reading is a legitimate and, as the many plaudits the piece has also received attest, important one. Learn your damn history.

Historical thinking is not anti-scientific. Darwin was a historical thinker par excellence. Evolutionary biology, cosmology, geology, paleontology—these are historical sciences. To reason historically is by no means to refute the scientific method. In fact, I would wager that no one gets through life without employing historical reasoning. To forgo it is to refuse to learn from experience.

Because I did not primarily use scientific evidence in my review, some misguided and uncritical critics peg me as a radical relativist. You got the wrong guy, pal. I studied sociobiology and neuroethology as science for years, under some of the greats. Radical relativists do exist in the humanities and social sciences—I argue against them all the time.

Others, like Stuart Ritchie, bash away at me boneheadedly, picking at details of the piece that make no difference to my argument, without making the slightest effort to understand what I am in fact saying. Playing “gotcha” is easier than real debate, but it wins you points with your yes-friends, I guess. Sad!

Historical thinking pulls your eye away from the microscope to look at science as an enterprise evolving over time. For really abstruse sciences, doing so may be mostly an intellectual exercise. But the more social salience a branch of science has, the more important it is to take the long view once in a while. And when you’re explicitly advocating using your science to shape society, it’s incumbent on you to do so.

I identify with distinguished, politically alert scientist-critics like Jonathan Beckwith and Richard Lewontin, who have critiqued their science (even their own science!) in order to make it better. I believe that in science as in politics, dissent is the sincerest form of patriotism.

*

In short, I don’t think sociogenomics is wrong; I think it’s being done wrong, and written about wrong, by people like Plomin. Some people are doing it right. In a forthcoming piece in the MIT Technology Review, I discuss the work of Graham Coop at UC Davis. I won’t go into detail here, but for examples of honest, candid, historically sensitive discussion of PGS research, see this and this. This post is my attempt to do for my critique of PGS what Coop did for his own PGS research (which, for the record, I admire). By and large, I think the biologists have been doing better at avoiding deterministic talk than the social scientists like Plomin—although some social scientists, like the sociologist Dalton Conley at Princeton, do at least have complex positions worth taking seriously.

Plomin does none of this. Instead, he gives us a simplistic and distorted view of the role of heredity in behavior that causes much social mischief. We can watch some of that mischief in real time as the white supremacist trolls swarm around this debate like yellow-jackets around an open soda. Other risks we can infer from careful comparison with historical examples—looking at both the similarities and the differences between Blueprint and previous attempts to use heredity to shape social policy. The argument in Blueprint—that “DNA makes us who we are” and that environment “is important but it doesn’t matter”—is grossly wrong on many levels. It’s not supported by the evidence and it’s socially dangerous.

And someone’s got to call bullshit on it.

New findings suggest scientists not getting smarter

Certain critics of rigid genetic determinism have long believed that the environment plays a major role in shaping intelligence. According to this view, enriched and stimulating surroundings should make one smarter. Playing Bach violin concertos to your fetus, for example, may nudge it toward future feats of fiddling, ingenious engineering, or novel acts of fiction. Although this view has been challenged, it persists in the minds of romantics and parents–two otherwise almost non-overlapping populations.

If environmental richness were actually correlated with intelligence, then those who live and work in the richest environments should be measurably smarter than those not so privileged. And what environment could be richer than the laboratory? Science is less a profession than a society within our society–a meritocracy based on an economy of ideas. Scientists inhabit a world in which knowledge accretes and credit accrues inexorably, as induction, peer review, and venture capital fuel the engines of discovery and innovation. Science has become the pre-eminent intellectual enterprise of our time–and American science proudly leads the world. The American biomedical laboratory is to the 21st century what the German university was to the 19th; what Dutch painting was to the 17th; what the Portuguese sailing ship was to the 16th; what the Greek Lyceum was to the minus 5th.

According to this view, then, scientists should be getting smarter. One might measure this in various ways, but Genotopia, being quantitatively challenged, prefers the more qualitative and subjective measure of whether we are making the same dumb mistakes over and over. So we are asking today: Are scientists repeating past errors and thus sustaining and perhaps compounding errors of ignorance? Are scientists getting smarter?

Yes and no. A pair of articles (1, 2) recently published in the distinguished journal Trends in Genetics slaps a big juicy data point on the graph of scientific intelligence vs. time–and Senator, the trend in genetics is flat. The articles’ author, Gerald Crabtree, examines recent data on the genetics of intelligence. He estimates that, of the 20,000 or so human genes, between 2,000 and 5,000 are involved in intelligence. This, he argues, makes human intelligence surprisingly “fragile.” In a bit of handwaving so vigorous it calls to mind the semaphore version of Wuthering Heights, he asserts that these genes are strung like links in a chain, rather than multiply connected, as nodes of a network. He imagines the genes for intelligence to function like a biochemical pathway, such that any mutation propagates “downstream”, diminishing the final product–the individual’s god-given and apparently irremediable brainpower.
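To see how much work the chain assumption is doing, try the back-of-envelope version. The per-gene mutation probability below is my own assumption, chosen for illustration; it is not Crabtree’s number.

```python
# Back-of-envelope fragility under a "chain" model, in which a damaging hit
# to any one gene degrades the trait. The per-gene probability is an assumed
# value for illustration, not an estimate from Crabtree or anyone else.
mu = 1e-4  # assumed chance that a given gene carries a damaging variant

for n_genes in (2_000, 5_000):
    p_intact = (1 - mu) ** n_genes  # every link in the chain must hold
    print(f"{n_genes} genes: P(all links intact) = {p_intact:.2f}")
# ~0.82 for 2,000 genes, ~0.61 for 5,000. Swap the chain for a redundant,
# buffered network and the same hits barely dent the trait. The model, not
# the mutation count, is carrying the "fragility" conclusion.
```

Same genes, same mutation rates; only the wiring diagram changes. That is why the chain-versus-network assertion deserves more than a semaphore wave.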


Beginning in 1865, the polymath Francis Galton fretted that Englishmen were getting dumber. In his Hereditary Genius (1869) he concluded that “families are apt to become extinct in proportion to their dignity” (p. 140). He believed that “social agencies of an ordinary character, whose influences are little suspected, are at this moment working towards the degradation of human nature,” although he acknowledged that others were working toward its improvement. (1) The former clearly outweighed the latter in the mind of Galton and other Victorians; hence Galton’s “eugenics,” an ingenious scheme for human improvement through the machinations of “existing law and sentiment.” Galton’s eugenics was a system of incentives and penalties for marriage and childbirth, meted out according to his calculations of social worth.

This is a familiar argument to students of heredity. The idea that humans are degenerating–especially intellectually–persists independently of how much we know about intelligence and heredity. Which is to say, no matter how smart we get, we persist in believing we are getting dumber.

Galton was just one exponent of the so-called degeneration theory: the counter-intuitive but apparently irresistible idea that technological progress, medical advance, improvements in pedagogy, and civilization en masse in fact are producing the very opposite of what we supposed; namely, they are crippling the body, starving the spirit, and most of all eroding the mind.

The invention of intelligence testing by Alfred Binet at the turn of the 20th century provided a powerful tool for proving the absurd. Though developed as a diagnostic to identify children who needed a bit of extra help in school–an enriched environment–IQ testing was quickly turned into a fire alarm for degeneration theorists. When the psychologist Robert M. Yerkes administered a version of the test to Army recruits during the First World War, he concluded that better than one in eight of America’s Finest were feebleminded–an inference that is either ridiculous or self-evident, depending on one’s view of the military.

These new ways of quantifying intelligence dovetailed perfectly with the new Mendelian genetics, which was developed beginning in 1900. Eugenics—a rather thin, anemic, blue-blooded affair in Victorian England—matured in Mendelian America into a strapping and cocky young buck, with advocates across the various social and political spectra embracing the notion of hereditary improvement. Eugenics advocates of the Progressive era tended to be intellectual determinists. Feeblemindedness—a catch-all for subnormal intelligence, from the drooling “idiot” to the high-functioning “moron”—was their greatest nightmare. It seemed to be the root of all social problems, from poverty to prostitution to ill health.

And the roots of intelligence were believed to be genetic. In England, Cyril Burt found that Spearman’s g (for “general intelligence”)—a statistical “thing,” derived by factor analysis and believed by Spearman, Burt, and others to be what IQ measures—was fixed and immutable, and (spoiler alert) poor kids were innately stupider than rich kids. In America, the psychologist Henry Goddard, superintendent of the Vineland School for the Feebleminded in New Jersey and the man who had introduced IQ testing to the US, published Feeble-Mindedness: Its Causes and Consequences in 1914. Synthesizing years of observations and testing of slow children, he suggested–counter to all common sense–that feeblemindedness was due to a single Mendelian recessive gene. This conclusion was horrifying, because it made intelligence so vulnerable–so “fragile.” A single mutation could turn a normal individual into a feebleminded menace to society.

As Goddard put it in 1920, “The chief determiner of human conduct is the unitary mental process which we call intelligence.” The grade of intelligence for each individual, he said, “is determined by the kind of chromosomes that come together with the union of the germ cells.” Siding with Burt, the experienced psychologist wrote that intelligence was “conditioned by a nervous mechanism that is inborn,” and that it was “but little affected by any later influence” other than brain injury or serious disease. He called it “illogical and inefficient” to attempt any educational system without taking this immovable native intelligence into account. (Goddard, Efficiency and Levels of Intelligence, 1920, p. 1)

This idea proved so attractive that a generation of otherwise competent and level-headed researchers and clinicians persisted in believing it, again despite its being as obvious as ever that the intellectual horsepower you put out depends on the quality of the engine parts, the regularity of the maintenance you invest in it, the training of the driver, and the instruments you use to measure it.

The geneticist Hermann Joseph Muller was not obsessed with intelligence, but he was obsessed with genetic degeneration. Trained at the knobby knees of some of the leading eugenicists of the Progressive era, Muller–a fruitfly geneticist by day and a bleeding-heart eugenicist by night–fretted through the 1920s and 1930s about environmental assaults on the gene pool: background solar radiation, radium watch-dials, shoestore X-ray machines, etc. The dropping of the atomic bombs on the Japanese sent him into orbit. In 1946 he won a Nobel prize for his discovery of X-ray-induced mutation, and he used his new fame to launch a new campaign on behalf of genetic degeneration. The presidency of the new American Society of Human Genetics became his bully pulpit, from which he preached nuclear fire and brimstone: our average “load of mutations,” he calculated, was about eight damaged genes–and growing. Crabtree’s argument thus sounds a lot like Muller grafted onto Henry Goddard.

In 1969, the educational psychologist Arthur Jensen produced a 120-page article asserting that compensatory education–premised on the idea that racial disparities in IQ reflect opportunity more than innate ability, and accordingly that they can be reduced by enriching the learning environments of those who test low–was futile. Marshaling an impressive battery of data, most of which were derived from Cyril Burt, Jensen insisted that blacks are simply dumber than whites, and (with perhaps just a hint of wistfulness) that Asians are the smartest of all. Jensen may not have been a degenerationist sensu stricto, but his opposition to environmental improvement earns him a data point.

In 1994, Richard Herrnstein and Charles Murray published their infamous book, The Bell Curve. Their brick of a book was a masterly and authoritative rehash of Burt and Jensen, presented artfully on a platter of scientific reason and special pleading for the brand of reactionary politics that is reserved for those who can afford private tutors. They found no fault with either Burt’s data (debate continues, but it has been argued that Burt was a fraud) or his conclusions: that IQ tests measure Spearman’s g, that g is strongly inherited, and that it is innate. Oh yes, and that intellectually, whites are a good bit smarter than blacks but slightly dumber than Asians. Since they believed there is nothing we can do about our innate intelligence, our only hope is to “marry up” and try to have smarter children.

The Bell Curve appeared early in the era of the Human Genome Project. By 2000 we had a “draft” reference sequence for the human genome, and in 2003 “the” human genome was declared complete. Since the 1940s, human geneticists had focused on single-gene traits, especially diseases. One problem with Progressive-era eugenics, researchers argued, was that the eugenicists had focused on socially determined and hopelessly complex traits; once researchers set their sights on more straightforward targets, the science could at last advance.

But once this low-hanging fruit had been plucked, researchers began to address more complex traits once again. Disease susceptibility, multicausal diseases such as obesity, mental disorders, and intelligence returned to the fore. Papers such as Crabtree’s are vastly more sophisticated than Goddard’s tome. The simplistic notion of a single gene for intelligence is long gone; each of Crabtree’s 2,000–5,000 hypothetical intelligence genes hypothetically contributes but a tiny fraction of the overall trait. If you spit in a cup and send it to the personal genome testing company 23andMe, they will test your DNA for hundreds of genes, including one that supposedly adds 7 points to your IQ (roughly 6 percent for an IQ of 110).

Thus we are back around to a new version of genes for intelligence. Despite the sophistication and nuance of modern genomic analyses, we end up concluding once again that intelligence is mostly hereditary and therefore also racial, and that it’s declining.

Apart from the oddly repetitious and ad hoc nature of the degeneration argument, what is most disconcerting is this one glaring implication: that pointing out degeneration suggests a desire to do something about it. If someone were, say, sitting on the couch and called out, “The kitchen is sure a mess! Look at the plates all stacked there, covered with the remains of breakfast, and ick, flies are starting to gather on the hunks of Jarlsberg and Black Twig apples hardening and browning, respectively, on the cutting board,” you wouldn’t think he was simply making an observation. You’d think he was implying that you should get in there and clean up the damn kitchen. Which would be a dick move, because he’s just sitting there reading the Times, so why the heck doesn’t he do it himself. But the point is, sometimes observation implies action. If you are going to point out that the genome is broken, you must be thinking on some level that we can fix it. Thus, degeneration implies eugenics. Not necessarily the ugly kind of eugenics of coercive sterilization laws and racial extermination. But eugenics in Galton’s original sense of voluntary human hereditary improvement.

And thus, scientists do not appear to be getting any smarter. Despite the enriched environs of the modern biomedical laboratory, with gleaming toys and stimulating colleagues publishing a rich literature that has dismantled the simplistic genetic models and eugenic prejudices of yore, researchers such as Crabtree continue to believe the same old same old: that we’re getting dumber–or in danger of doing so.

In other words, sometimes the data don’t seem to matter. Prejudices and preconceptions leak into the laboratory, particularly on explosive issues such as intelligence and/or race, regardless of how heredity is constructed. Plenty of scientists are plenty smart, of course. But rehashing the degeneracy theory of IQ does not demonstrate it.