Blackmailing a Utilitarian

Often in a TV show (notably in 24 and Arrow), the bad guys kidnap a family member, and our character has no choice but to cooperate with the criminals. “I Have Your Wife” is the name of the TV Tropes entry. Fearing harm to the one hostage, a “good” character often puts a hundred or a thousand or a million people at mortal risk to save that one person. Of course, the bad case where a million people die never happens, and the good guys almost always manage to save the city as well as the hostage, all while never considering a sacrifice for the greater good. Even in the most dire circumstances, they claim, “There is always a way.”

This trope has always seemed super irrational to me. The most famous thought experiment in ethics is the trolley problem: a trolley is about to run over five people, and you can flip a switch to divert it onto another track, where it will run over one different person instead. It seems like you should almost always save the five, even if the one person were your family. As a utilitarian, I find that the blackmail dilemma makes little sense.

The only case in which I would let the five die is if the one person on the other track were clearly doing far more good for humanity than the average person. Maybe you could argue that Elon Musk has enough of a climate-change-mitigating effect on the world that it’s better that five random people die than Musk, since he might arguably avert the loss of a hundred thousand lives to environmental degradation.

I have often claimed that even if the one person were family, I’d save the five people in a trolley problem.

“But wait,” multiple people have told me when I express that idea. “You might claim to be a utilitarian, but when you actually have someone you care about, or children someday, there’s no way you would actually decide to abandon them.”

The fact that everyone says that is definitely an update to my belief. But at the same time, I can’t help but think, “Sure, I understand from evolutionary biology and evolutionary psychology why the base desire to protect one’s genes is so strong. But a lot of human civilization and progress has come from suppressing base desires encoded in our genes, such as the urges toward murder and tribal warfare. If we’ve learned not to murder, it’s also possible that we’re capable of reasoning rationally about which person to save, rather than just deciding as we’re biologically programmed to.”

“That sounds reasonable,” they respond. “But until you’ve held your own child in your arms, you don’t know what it means. You can’t possibly understand.”

“Sounds right,” I say. “But even knowing that you think that, I can pre-commit to making the utilitarian decision in the trolley problem.”

In the blackmail trope, the character is often made to directly do something that endangers the lives of many people. In some season of 24, a dock worker helps terrorists smuggle in a nuclear bomb(?). The worker wasn’t held at gunpoint; they had a full day to go to the police, and yet they did nothing.

Would we be better off if everyone claimed to be a utilitarian? Probably not, since the claim wouldn’t be very credible. If there are five powerful people in some organization and one of them needs to be blackmailed for some purpose, criminals can simply target the person they deduce is least utilitarian. So a few people converting to utilitarianism doesn’t help much. However, if blackmail were more common, it would probably nudge people toward being more utilitarian, since credibly signaling that one is a utilitarian deters criminals from kidnapping one’s family members in the first place.

Lightly Held Identities and Ideological Turing Tests

Here is a brilliant passage on identity from Julia Galef’s The Scout Mindset: Why Some People See Things Clearly and Others Don’t:

The problem with identity is that it wrecks your ability to think clearly. Identifying with a belief makes you feel like you have to be ready to defend it, which motivates you to focus your attention on collecting evidence in its favor. Identity makes you reflexively reject arguments that feel like attacks on you or the status of your group.

To counteract this, Galef says to have lightly held identities:

Holding an identity lightly means thinking of it in a matter-of-fact way, rather than as a central source of pride and meaning in your life. It’s a description, not a flag to be waved proudly…

Holding an identity lightly means treating that identity as contingent, saying to yourself, “I’m a liberal, for as long as it continues to seem to me that liberalism is just.” Or “I’m a feminist, but I would abandon the movement if for some reason I came to believe it was causing net harm.” It means maintaining a sense of your own beliefs and values, independent of the tribe’s beliefs and values, and acknowledging—at least in the privacy of your own head—the places where those two things diverge.

So a belief that has become part of one’s identity is hard to abandon, because to abandon the belief is to abandon the identity. If, instead, a belief is held lightly, it is easy to update upon seeing contradictory evidence.

One example I saw was in the old New Atheism culture wars of the late 2000s and early 2010s, and I have to admit I used this argument myself. It went something like, “Shouldn’t Muslims be more willing to leave their religion, after seeing repeated evidence of suicide bombers professing their faith in Islam right before killing themselves along with dozens of civilians?” Another variant was, “Shouldn’t liberal Muslims be more willing to leave their religion, given that they personally support women’s rights and LGBT rights, while Islamic countries around the world have the most sexism and anti-LGBT discrimination?” Needless to say, this was an unconvincing argument.

The reason it wasn’t convincing is that it was psychologically equivalent to an attack on Muslims’ identity. In the New Atheists’ minds, they were trying to help people move to a better belief system, saying, “Hey, I know you live your life following parts of the wisdom in this ancient book, but look, here are some people who are really, really into following the same ancient book (the people who think about this book all the time), and they commit atrocities that stem very directly from their following of this book, so you should probably believe in this book a little less than you did before.” But since religion is often not just a belief but an identity, this kind of argument reinforces the belief instead of weakening it.

Religion aside, I’ve always been fascinated by how communism remains an acceptable belief among educated people; I encountered this in both high school and college. Certainly communism isn’t a core part of many people’s identity, though maybe anti-capitalism is? When I mentioned the famines that killed tens of millions as a result of communist economic policies in the Soviet Union and China, the response was not to update toward more skepticism of communism but instead something like, “Well, they weren’t doing communism correctly; if we did it correctly, it wouldn’t kill tens of millions of people.” That last argument gets a lot of sympathy from well-educated people, yet none of them would sympathize with “Fascism is inherently good because it builds national strength; it’s just that the Germans did it incorrectly by tacking antisemitism and eugenics onto it; if we did it correctly…”

What identities would I self-describe as? If you had asked me ten years ago, “liberal” would probably have been at the top of my list. But in 2015, I became quite disenchanted with the left, starting with the now-standard response to the Charlie Hebdo shooting: “I think the shooting is wrong, but…” followed by something about how France is a racist, colonialist empire and therefore somewhat deserved it. (Which isn’t far from the more recent, “I think antisemitism is wrong, but….”) I thought I had a lot in common with liberals until the liberal discussion focused mostly on what the West had done wrong to deserve a terror attack. This was basically the line from Galef’s book: “I’m a liberal, for as long as….” (There are some mental gymnastics available here, such as arguing, “Well, I’m still a liberal in the definitional sense; it’s the other people claiming to be liberals who are abandoning the core tenets of liberalism,” but I’ll leave those aside.)

At this point I don’t carry around much identity, because in my experience, the people most focused on their identities seem to be the least amenable to reason. I guess if I had to pick one, I would consider myself an epistemic rationalist, in the sense that I think using logic and probability is the best way to form accurate beliefs about the world.

Relatedly, Galef mentions in her book the Ideological Turing Test, an idea from Bryan Caplan: to truly understand an opposing viewpoint, you should be able to convince someone that you actually hold that opposing belief. This is hard to do; the book contains an example of someone claiming to do it and failing horribly. From Galef:

Measured against that standard [the Ideological Turing Test], most attempts fall obviously short. For a case in point, consider one liberal blogger’s attempt at modeling the conservative worldview. She begins, “If I can say anything at this bleak hour, with the world splitting at its seams, it’s this: conservatives, I understand you. It may not be something you expect to hear from a liberal, but I do. I understand you.” An earnest start, but her attempt to empathize with conservatives quickly devolves into caricature. Here’s her impression of the conservative take on various subjects:

On capitalism: “Those at the top should have as much as possible. That’s the natural order . . . It’s not a secret; just don’t be lazy. Why is everyone so poor and lazy?”

On feminists: “Those women make noise, make demands, take up space . . . Who do they think they are?”

On abortion: “What a travesty . . . women making these kinds of radical decisions for themselves.”

On queer and transgender people: “They shouldn’t exist. They’re mistakes. They must be. But wait, no. God doesn’t make mistakes . . . Oh dear. You don’t know what’s happening anymore, and you don’t like it. It makes you feel dizzy. Out of control.”

It’s hardly necessary to run this text by an audience of conservatives to be able to predict that it would flunk their ideological Turing test. Her “conservative” take on capitalism sounds like a cartoon villain. The language about women taking up space and making decisions for themselves is how liberals frame these issues, not conservatives. And her impression of a conservative suddenly realizing his views on transgender and queer people are internally inconsistent (“They’re mistakes . . . But wait, no. God doesn’t make mistakes.”) just looks like a potshot she couldn’t resist taking.

[…]

The ideological Turing test is typically seen as a test of your knowledge: How thoroughly do you understand the other side’s beliefs? But it also serves as an emotional test: Do you hold your identity lightly enough to be able to avoid caricaturing your ideological opponents? Even being willing to attempt the ideological Turing test at all is significant. People who hold their identity strongly often recoil at the idea of trying to “understand” a viewpoint they find disgustingly wrong or harmful. It feels like you’re giving aid and comfort to the enemy. But if you want to have a shot at actually changing people’s point of view rather than merely being disgusted at how wrong they are, understanding those views is a must.

I’ve encountered this so many times: people, myself included, holding a ridiculous notion of what the other side of some ideological divide actually thinks. I’m not sure what the solution is. One thing I’ve tried recently and would recommend is to subscribe to a subreddit, but, importantly, not one for an identity group you belong to; pick one you don’t belong to. On its home page, you can probably find at least one post that you don’t totally disagree with. If you really internalize that post, you’ll be on your way to passing an Ideological Turing Test. (It doesn’t have to be Reddit. Follow someone you disagree with on Twitter, etc.)

That said, it is still incredibly hard to empathize with opposing views—as Galef said in the quote just above, “People who hold their identity strongly often recoil at the idea of trying to ‘understand’ a viewpoint they find disgustingly wrong or harmful.” It seems like people should, on the margin, hold their ideas less strongly.