The "Cancel ChatGPT movement" doesn't appear to be mentioned in the article, but other outlets say hashtags like #CancelChatGPT are trending on X.
I think you're right that stock trading has enabled a lot of harm and perhaps shouldn't have been allowed, at least not on a scale beyond a single town or county. Paper certificates for money may have been a bad idea too. Even the use of a common currency like gold may have been a net negative. I think a barter system has advantages over a common currency in that it requires people to work together and form communities.
I understand that lethality makes a pathogen less effective at spreading. But that won't hold as much for artificial pathogens specifically designed to spread undetected until they suddenly activate and kill. Being on the lookout, staying indoors, and washing your hands definitely won't save you if the pathogen is sufficiently well designed; that didn't even prevent most people from getting COVID. I was imagining a pathogen that can infect plants and animals as well as humans, so even a person stranded on a remote island would catch it. And noticing the disease won't matter if there's no cure, which is to be expected if the pathogen comes out of nowhere and appears totally harmless at first, until people suddenly start dying all at once. Especially if it's also designed to survive an immune system that has been vaccinated with a deactivated version.
Even if it is impossible for a pathogen to do all that because we are able to immediately develop a 100% effective vaccine for any pathogen we discover (which is very unrealistic imo), we'd have to mass inoculate people every time some psycho releases a new potentially dangerous pathogen. We wouldn't have time to test these vaccines for safety, and no doubt there would be adverse health effects from injecting so many vaccines. People would also have to put a great deal of trust in whoever is making and providing these vaccines (probably the government), as a malicious entity could use the excuse of a new pathogen to persuade or coerce people into taking harmful substances. These could reduce fertility, and in the future such substances could probably be used to alter behaviors or even deliver nanoparticles that can be controlled remotely to deliver electric shocks or biological changes. There are just so many ways for this to go horribly wrong that I don't think it can possibly end well if pathogen modification becomes feasible for individuals or small groups (using AI or other means).
And pathogen modification is just one of the ways we're at risk of going extinct. There are also the others I mentioned, ones I didn't mention (like mirror life), and probably a lot more we haven't thought about. When atomic weapons were being developed, there was a concern they could ignite the atmosphere. It turned out nuclear weapons don't ignite the atmosphere, but maybe some other technology could, or could cause oxygen depletion some other way. Or maybe there's a way to generate so much ionizing electromagnetic radiation that it damages all DNA on Earth to the point where our fertility drops and we go extinct within three generations.
But even if we stick to just the ways we already know about, it's almost certain that we will soon have technology capable of killing everyone, which nobody would be able to defend against. The only protection therefore is to limit its availability. But some technologies are very hard to limit the availability of - such as AI, which any intelligent person with access to AI research papers and the ability to write computer programs could build. And why limit its availability to governments and big corporations that can abuse it to subjugate the public (and, based on experience and incentives, will abuse it)? Surely it's much better to limit the availability to nobody. Hence the project of Stop Tech.
Right
We've seen it many times. TV, teflon, social media, cars, AI...
as a biologist: the “100% extermination virus” is impossible
Have you heard of Clarke's three laws? Specifically the first one?
no known ways
So there could be a way we don't know of yet? Isn't that what science would discover for us? What law of the universe prevents such a thing being possible? Why couldn't we program a virus to have a long incubation period once we can use DNA/RNA like we use programming languages?
The rest of your comment seems to ignore what I already covered in my essay. Yes, it's about access, but you either have wide availability and we all die, or narrow availability and totalitarianism. Materials and equipment costs also go down with improvements in production, and once AI is able to design its own equipment from first principles it may be possible to have AI robots build all the equipment from raw materials.
All this reads like a fearful response to change
If the change we're talking about is humans being replaced as the dominant species on the planet, and the invention of weapons that can kill us all, I'd say being unafraid is completely irrational. It's wishful thinking to say it will work out when all the trends and incentives say it won't.
Agreed but I haven't heard of that
Or usury
Have you seen Gattaca?
I wouldn't trust those funds myself. Plenty of oil companies say they're all about reducing CO2, and as I remember, ESG ratings were playing favorites rather than reflecting carbon emissions. Even companies that are trying to reduce emissions can still be invading people's privacy, lobbying (bribing) for bad legislation, and doing other evil things.
It is reality. And unfortunately any "ethical" funds usually just focus on avoiding oil companies or military companies but are just fine with AI companies, surveillance companies, eugenics companies and so on. Nobody agrees on what is ethical I'm afraid. One man's unethical practice is another man's unethical-to-avoid practice.