Eliezer joins the trend of condemning "political" violence with confidence on the far end of the Dunning-Kruger curve: https://www.lesswrong.com/posts/5CfBDiQNg9upfipWk/only-law-can-prevent-extinction
I've already mocked this attitude downthread and in the previous weekly thread, so I'll try to keep my mockery to a few highlights...
He's admitting that "nuke the data centers" is in fact violence!
> It would be beneath my dignity as a childhood reader of Heinlein and Orwell to pretend that this is not an invocation of force.
But then he carves out a special case for it:
> But it's the sort of force that's meant to be predictable, predicted, avoidable, and avoided. And that is a true large difference between lawful and unlawful force.
I don't think Eliezer has checked the news if he thinks the US government carries out violence in predictable or fair or avoidable ways! Venezuela! (It wasn't fair before Trump, or avoidable if you didn't want to bend over for the interests of US capital, but under Trump it's blatantly obvious.) The entire lead-up to Iran consisted of ripping up Obama's attempts at treaties and trying to obtain regime change through surprise assassination! Also, if the Stop AI doomers used some clever cryptography scheme to make their policy of property destruction (and assassination) sufficiently predictable and avoidable, would that count as "Lawful" in Eliezer's book? ~~If he kept up with the DnD/Pathfinder source material, he would know Achaekek's assassins are actually Lawful Evil.~~
> The ASI problem is not like this. If you shut down 5% of AI research today, humanity does not experience 5% fewer casualties. We end up 100% dead after slightly more time.
His practical argument against non-state-sanctioned violence is that we need a total ban (and thus the authority of the state behind it), because otherwise someone with 8 GPUs in a basement could invent strong AGI and doom us all. This is a dumb argument, because even most AI doomers acknowledge you need a lot of computational power to make the AGI God. And they think slowing down AGI (whether through violence or other means) might buy time for some more permanent solution (like the "solve alignment" idea Eliezer originally promised them). Lots of LessWrong posts regularly speculate on how to slow down the AI race and how to make use of the time bought; this isn't even outside the normal window of LessWrong discourse!
> Statistics show that civil movements with nonviolent doctrines are more successful at attaining their stated goals
Sources cited: 0
One of the comments also pisses me off:
> Which reminds me about another point: I suspect that "bomb data centers" meme causal story was not somebody lying, but somebody recalling by memory without a thought that such serious allegation maybe is worthy to actually look up it and not rely on unreliable memory.
"Drone strike the data centers even if starts nuclear war" is the exact argument Eliezer made and that we mocked. It is the rationalists that have tried to soften it by eliding over the exact details.
