Ms. A., a woman in her 40s who previously worked as an anesthesia technician, opposes Covid vaccination. Referring to the contents of vaccines, and invoking a familiar suspicion of profit-driven actors, she asked, “Are those good for the human body or just good for someone’s pocket?” She described the corruption of the pharmaceutical industry and her related belief that physicians are all pawns, beholden to large corporations, unable to speak the truth even if we wanted to. Accordingly, she expressed both pity and disdain for people who blindly accept the recommendations of the scientific community. The essence of her approach to medical decisions echoed a refrain I hear often: “I need to do my own research.”
The first patient who said that to me was a relatively young man for whom I had recommended an implantable cardioverter–defibrillator. I thought, “Why do you need to do your own research when there have been well-conducted randomized, controlled trials, incorporated into guidelines, suggesting that this intervention will prolong your life?” But in the spirit of shared decision making and patient empowerment, I respected his decision and his right to make it. I have often wondered since, however, what doing your own research actually means in a world where being informed can so readily degenerate into being misinformed. Ms. A.’s research on vaccines, for instance, confirmed her suspicion that they contain impurities, including “human DNA from aborted babies” and antifreeze. Though these claims are patently false, they crystallize the startling discrepancy between the time, money, and effort behind a scientific recommendation and the ease with which it can be discredited. How has science become so vulnerable to such undoing?
Vaccine-confidence expert Heidi Larson frames the problem historically. Whereas during the Enlightenment, science was perceived as a way to liberate people from religious dogma, she says, “Today, science has become the new dogma.”3 Larson emphasizes that science can’t separate itself from culture, values, inequities, and power struggles.4 “People are craving a bit more emotion, a bit more religion, something they can put passion into,” she observes. Science, in her view, has become devoid of feeling.5
Though science with feeling may seem entirely unscientific, Larson isn’t advocating a diminution in scientific rigor as much as a contextualization of science to make it feel more relevant to people’s lives. Rather than simply telling people they should get a Covid vaccine, for instance, Larson recommends beginning conversations by asking people how they’ve been coping or what they miss most. “We’re fraying at the edges,” she says, “and you want people to know we are in this together.”4 A narrow focus on getting shots into arms may overlook the fact that people’s lives have been undone in ways that vaccines can’t fix. In that sense, I think Larson is alluding to a more fundamental tension in the relationship between science and society that the pandemic has magnified: science may tell us what’s true, but it can’t tell us what’s meaningful.
We exhort the public to “follow the science” because, for instance, hundreds of thousands of people are dying and science has found effective ways to mitigate viral spread. But if you’re a restaurant owner facing bankruptcy because of closures, a mother whose career is on hold because your children’s schools have closed, a man who was prevented from holding the hand of your wife of 50 years as she died, what does following the science mean to you? If we want people to follow science, we also need to acknowledge where science ends and values begin. The destructive forces of science denialism — magnified by the pandemic — have made it difficult to maintain this distinction. But many people who hesitate to follow scientific recommendations may not be rejecting science as much as they are responding to different values and priorities. For this group, what might make science feel more compelling?
Maybe it’s time to focus as much on the messenger as we do on the message. In an essay published in December 2019, Harvard history-of-science professor Steven Shapin captures the crucial difference between knowing science and believing people who know science.6 Focusing on the three most salient examples of scientific contention — climate change, vaccine safety, and evolution — Shapin argues that what we’ve been calling a “Crisis of Truth” is really a crisis of trust. He admits, for instance, that his own understanding of climate change is less about knowing the details of the science than about knowing where “science lives.” So what if he doesn’t know the statistical means of determining global temperature and establishing its rate of change, as long as he knows how to vet the institutions and people who purport to have figured it out? Being a “knowledgeable person,” Shapin writes, “may mean knowing a lot of stuff, but it certainly means knowing who knows and who does not know.”
Medical science, which once seemed to live only with physicians, now seems to live everywhere and nowhere at once. Conducting robust scientific research is as critical as ever; the rapid development of highly effective Covid vaccines — possible only because of the decades of sound science that preceded it — speaks to the sanctity of the scientific process. Yet the unwillingness of a substantial proportion of the population to undergo vaccination reminds us of modern medicine’s paradox: as science’s capacity to improve population health has rapidly increased, so has its fragility. With a few clicks online, what we know can be rendered meaningless. Once the purveyors of knowledge, we now must learn to be its curators as well.
Because so many social, political, and historical forces feed this fragility, when I fail to make science compelling to patients, I often find myself blaming factors beyond my control. But reading Shapin made me wonder whether this sense of futility is an excuse for avoiding a deeper responsibility. When my patient said he wanted to do his own research on defibrillators, for instance, I assumed he meant he’d do a Google search. Because I’d already described the relevant trials, I felt my job was done. In retrospect, I suspect his skepticism was less about the evidence than about whether I, and the institutions I represent, could be trusted to look out for his best interests. Science alone can’t overcome a lack of trust. And in that sense, my job was only beginning.