People fear change.
To know this, you don’t have to venture very far back in history. In the 1940s, people worried television would upend American family life and “vulgarize” the culture. In the 1980s, the term “computerphobia” was born. In the 1990s, I cried when my grandmother got a haircut.
According to a recent Pew Research Center survey, this fear of change—of the unknown and the unknowable—is largely responsible for shaping the way the public views emerging science and technology, such as the gene editing tool CRISPR.
The survey asked 4,726 people how they feel about gene editing and other human-enhancing technologies. More than 60% said that they were “very” or “somewhat” worried about technologies that could make us smarter, healthier and stronger, including gene editing to reduce the risk of disease in babies, brain chips for cognitive enhancement, and synthetic blood transfusions to improve speed, strength and stamina.
In focus groups held by Pew, the public referenced dystopian sci-fi thrillers such as “I, Robot” and “Gattaca” when explaining their views on the new technologies.
“Are we becoming robots, is that what the whole society’s going to become?” one 50-year-old Hispanic woman said at a focus group in Phoenix. “And then pretty soon someone will hack the computer system that you hook up to and throw a little virus in your brain and then what? You lose your identity as a person.”
Americans don’t just fear that such technologies will lead to a degradation of our culture. They fear a “Hitler-like” sci-fi future, a future where technology is “used to make supermen,” exacerbating the gap between the haves and have-nots.
In the American imagination, our terrifying, genetically engineered future looks something like a cross between “Gattaca” and the last season of “True Blood.” (Think of Silicon Valley billionaire Peter Thiel injecting himself with young people’s blood.)
Interestingly, though, the smaller or less permanent the change, the easier people found it to swallow.
Only 28% of adults, for example, said a synthetic blood transfusion would be an appropriate use of technology if it produced enhancements “far above that of any human known to date.” But a significantly larger 47% said synthetic blood products would be okay if the magnitude of change was smaller. More than a third of respondents said implants that could enhance cognitive ability would be more acceptable if they could be turned on and off. More than half said brain chips would be less acceptable if the results were irreversible.
As a population, we are not very risk-tolerant. We’re not just afraid of change; we’re afraid of big, consequential changes that we won’t be able to undo.
“People are less accepting of enhancements that produce extreme changes in human abilities,” the survey’s authors wrote. “And, if an enhancement is permanent and cannot be undone, people are less inclined to support it.”
The researchers note that while those surveyed seemed to express excitement about technology in a general sense, the fear seemed to creep in when the details got more specific.
Will technology leave the poor further behind while the rich become immortal superhumans, focus group participants wondered. Will the government abuse it? At what point do we become not human anymore?
“I just started to think about ‘I, Robot’ and those type of movies where you have people out of control just because they [have] all these superpowers all of a sudden,” a 38-year-old black man in Baltimore told Pew.
For all of the technologies discussed, the problem was less a moral one than one of uncertainty. Fewer than half of survey respondents said that editing a baby’s genes to prevent disease was meddling with nature, yet 68% said that they were “very” or “somewhat” worried about the technology. Those who felt it was immoral were nearly tied with those who thought it was fine, and 40% responded that they weren’t sure either way. Interestingly, three in ten adults were both enthusiastic and worried.
The anxiety here is clearly that of the unknown. About three-quarters of adults said they thought the technology would be used before the health effects are fully understood. Unsurprisingly, among those who were already aware of gene editing technology, a higher percentage (57%) said that they were inclined to give it a whirl.
The research was aimed in part at understanding where the public might “draw the line” on human enhancements. The survey suggests we’d be most comfortable if society didn’t advance at all.
“Fear has always been there, but there’s a particular moment of time when it leaps from one domain to another,” Genevieve Bell told my colleague Felix Salmon last fall, of the origins of technophobia.
To fear how technology might change us is an anxiety as old as the most rudimentary technologies themselves. The ancient Greeks told stories of Daedalus, who made wings for himself and his son Icarus, who used those wings to fly catastrophically close to the sun, and of Prometheus, who stole fire from the gods and was eternally chained to a rock.
The good news is that as people become more familiar with such technologies, the fear may subside. A 2014 Pew Research Center survey asked people’s views on genetically modifying babies under two circumstances and found that the public was slightly less willing to accept the idea than in this more recent survey: then, 50% said that modifying a baby’s genes to prevent disease was “taking medical advances too far,” but today that number has dropped to 46%.
Technologies like the gene-editing technique CRISPR offer a glimpse of a dazzling future. Our fears of them are rooted in what we do not know, rather than what we do. What we do know is that genetic engineering offers the possibility of curing many diseases, solving food shortages and restoring ecosystems. Already, researchers have used CRISPR to make wheat resistant to a damaging blight and to repair a gene in a human embryo that causes a fatal blood disorder, and they have embarked upon the first human trial that aims to simply edit out cancer.
Change is inevitable. But progress is not.