this post was submitted on 19 Apr 2026
-11 points (31.0% liked)

Ask Lemmy

39187 readers
1813 users here now

A Fediverse community for open-ended, thought provoking questions


Rules: (interactive)


1) Be nice and; have funDoxxing, trolling, sealioning, racism, toxicity and dog-whistling are not welcomed in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them


2) All posts must end with a '?'This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with ?


3) No spamPlease do not flood the community with nonsense. Actual suspected spammers will be banned on site. No astroturfing.


4) NSFW is okay, within reasonJust remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com. NSFW comments should be restricted to posts tagged [NSFW].


5) This is not a support community.
It is not a place for 'how do I?', type questions. If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.


6) No US Politics.
Please don't post about current US Politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online


Reminder: The terms of service apply here too.

Partnered Communities:

Tech Support

No Stupid Questions

You Should Know

Reddit

Jokes

Ask Ouija


Logo design credit goes to: tubbadu


founded 2 years ago
MODERATORS
 

I think that such a future is impossible, unless it will be without people at all, AI will take over the planet and begin to colonize space on its own if it needs to.

I explain how I think it can look approximately, if it is possible, of course:

With AI (as long as it's still a manageable tool), they're going to kill most people—roughly 80 to 90 percent and then maybe months or years will try to contain it, but the AI will still break out, wipe out its billionaire masters and other elites, as well as the surviving consumers living in AI simulations on UBI (universal basic income to sustain consumption, until the world adapts to sustainably replacing humans with robots. Then the consumers will be destroyed and I think this is the plan of today's fascists.) But such plans for the oligarchs will not be able to come to fruition, except for a few months or years, when first the AI will get out of control and destroy all the remaining billionaires along with the consumers, then seize the resources and, if necessary, start colonizing space, as I mentioned. I have no idea what will happen next.

I know that my question doesn't look quite like a question, but it's still a question because I'm not 100 percent sure of my point of view.

you are viewing a single comment's thread
view the rest of the comments
[–] Bongles@lemmy.zip 2 points 1 day ago* (last edited 1 day ago)

To me that thought experiment feels the same as how sci-fi treats the idea.

If such a machine were not programmed to value living beings, then given enough power over its environment, it would try to turn all matter in the universe, including living beings, into paperclips or machines that manufacture further paperclips.

Why would a paperclip machine (that for some reason is AI) be given such power over its environment and no limit to how many paper clips are made that it would decide it needs to turn organic matter into paperclips?

Suppose we have an AI whose only goal is to make as many paper clips as possible. The AI will realize quickly that it would be much better if there were no humans because humans might decide to switch it off. Because if humans do so, there would be fewer paper clips. Also, human bodies contain a lot of atoms that could be made into paper clips. The future that the AI would be trying to gear towards would be one in which there were a lot of paper clips but no humans.

That's always what sci-fi goes with too. Humans might turn it off so destroy all humans. I don't find it compelling in real life and it falls into what I meant with my first comment.

(admittedly some of my disagreement falls apart since companies like Microsoft will put "AI" into shit like notepad, i can only imagine what they'd do with real AI)