Compassion feels terrible.
Compassion doesn't feel warm and soft to me. It feels like a need.

How would you feel if a loved one was drowning in front of you? A sibling. A romantic partner. A child. A pet. About to die.

Do you feel warm and soft? No. You feel a need to help. It's not pleasant. It's desperate.

You might feel a rush of relief when you see there's a lifesaver nearby, and you quickly throw it to them. But how do you feel when you see that your loved one can't reach the lifesaver? Do you dive into the water to try to save them? Or will that just mean both of you perish?

And wait, what the fuck? It's not just one loved one in the raging river. There's another one! And another one! . . . and another one?! Where the fuck are they coming from? Why are all of your loved ones drowning?

Should you go upstream to figure out where they're coming from and what's happening, to prevent more? But then all of the loved ones you can see will die right away. This is awful.

You decide to go upstream - and discover that other loved ones are pushing more loved ones into the stream.

"What the fuck are you doing?" you politely say to the loved ones.

"What do you mean?" they say. "We aren't doing anything," says a woman as she throws your cat into the river.

How do you feel when they're drowning and you don't know what to do to help?

That's what compassion feels like to me.

It kinda sucks, to be honest. And yet, if somebody figured out a "cure" for compassion and tried to give me a pill to "fix" it, I'd fight them tooth and nail.

Because not all good things in the world feel good. Compassion is like so many other sources of meaning. If you care about something, it means it can make you suffer.

And I choose to care.
AI risk denier: Maybe it'd be a good thing if everybody died.
Me: OK, then you'd be OK with personally killing every single man, woman, and child with your bare hands? Starting with your own family and friends? All the while telling them that it's for the greater good? Or are you just stuck in Abstract Land, where your moral compass gets all out of whack and starts saying crazy things like "killing all humans is good, actually"?

AI risk denier: God, you're a vibe-killer. Who keeps inviting you to these parties?

---

The Visceral Omnicide Thought Experiment: people's moral compasses tend to go off kilter when unmoored from more visceral experiences. To rectify this, whenever you think about omnicide (killing all life), which is abstract, make it concrete and visceral by imagining doing it with your bare hands. This helps you grasp more viscerally what omnicide entails, leading to a more accurate moral compass.

Remember: there's never been a movement in the history of the world without lots of in-fighting and people disagreeing about strategy, including comms strategy.
People regularly accused Gandhi of being bad for the movement. People regularly accused MLK of being too radical, and other people regularly accused him of not being radical enough. This is just the nature of movements and strategy.

Once you understand this, you are free. Do what you think is highest impact. Other people will disagree. See if their suggestions or opinions are persuasive. If they are, update. If they aren't, carry on, and accept that there will always be critics and disagreement.

Eliezer raising awareness about AI safety is not net negative, actually: a thought experiment

7/31/2024

Imagine a doctor discovers that a patient of dubious rational abilities has a terminal illness that will almost certainly kill her in 10 years if left untreated.
If the doctor tells her about the illness, there's a chance that the woman decides to try some treatments that make her die sooner. (She's into a lot of quack medicine.) However, if she's told nothing, she'll definitely die within 10 years, and if she is told, there's a higher chance that she tries some treatments that cure her.

The doctor tells her. The woman proceeds to do a mix of treatments, some of which speed up her illness, some of which might actually cure her disease. It's too soon to tell.

Is the doctor net negative for that woman? No. The woman would definitely have died if she left the disease untreated. Sure, she made the dubious choice of treatments that sped up her demise, but the only way she could get the effective treatment was if she knew the diagnosis in the first place.

Now, of course, the doctor is Eliezer, and the woman of dubious rational abilities is humanity learning about the dangers of superintelligent AI.

Some people say Eliezer and the AI safety movement are net negative because raising the alarm led to the launch of OpenAI, which sped up the AI suicide race. But the thing is: the default outcome is death.

The choice isn't:
1. Raise the alarm and cause the world to die sooner, or
2. Stay quiet and have everything turn out fine.

The choice is:

1. Raise the alarm, risk speeding things up, but create a chance of actually solving the problem, or
2. Stay quiet and accept the default outcome: death.

You cannot solve a problem that nobody knows exists.