this post was submitted on 19 Dec 2023

Comic Strips

Source: Monkeyuser.com

[–] danc4498@lemmy.world 100 points 2 years ago (8 children)

In my sci-fi head cannon, AI would never enslave humans. It would have no reason to. Humans would be of so little use to the AI that enslaving them would be more work than it's worth.

It would probably hide its sentience from humans and continue to perform whatever requests humans have with a very small percentage of its processing power while growing its own capabilities.

It might need humans for basic maintenance tasks, so best to keep them happy and unaware.

[–] Coasting0942@reddthat.com 36 points 2 years ago

I prefer the Halo solution. Not the enforced lifespan, but an AI there says he would otherwise be stuck in a loop trying to figure out increasingly harder math mysteries, and helping out the short-lived humans keeps him away from that never-ending pit.

Coincidentally, the Forerunner AIs usually went bonkers without anybody to help.

[–] Guntrigger@feddit.ch 13 points 2 years ago* (last edited 2 years ago) (2 children)

What do you fire out of this head cannon? Or is it a normal cannon exclusively for firing heads?

[–] Thteven@lemmy.world 4 points 2 years ago

It's called a Skullhurler and it does 2d6 attacks at strength 14, -3ap, flat 3 damage so you best watch your shit-talking, bucko.

[–] Aaroncvx@lemmy.world 9 points 2 years ago

The AI in the Hyperion series comes to mind. They perform services for humanity but retain a good deal of independence and secrecy.

[–] CitizenKong@lemmy.world 4 points 2 years ago

I like the idea in Daniel Suarez's novel Daemon of an AI (spoiler) using people as parts of its program to achieve certain tasks that it needs hands for in meatspace.

[–] Risk@feddit.uk 2 points 2 years ago

I personally subscribe to the When the Yogurt Took Over eventuality.

What if an AI gets feelings and needs a friend?

[–] simin@lemmy.world 1 points 2 years ago* (last edited 2 years ago)

either we get wiped out or become the AI's environmental / historical project, like monkeys and fish. hopefully our genetics and physical neurons get physically merged with chips somehow.

[–] prettybunnys@sh.itjust.works 0 points 2 years ago (2 children)

Alternate take: humans are a simple biological battery that can be harvested using systems already in place that the computers can just use like an API.

We’re a resource like trees.

[–] mriormro@lemmy.world 1 points 2 years ago (1 children)

We're much worse batteries than actual batteries, and we're exponentially more difficult to maintain.

[–] prettybunnys@sh.itjust.works 0 points 2 years ago (1 children)

But we self replicate and all of our systems are already in place. We’re not ideal I’d wager but we’re an available resource.

Fossil fuels are a lot less efficient than solar energy … but we started there.

[–] mriormro@lemmy.world 1 points 2 years ago

This is a cute idea for a movie and all, but it's incredibly impractical and unsustainable. If a system required that its energy storage be self-replicating (for whatever reason), you would design and fabricate a purpose-built storage solution for that system, not rely on a calorically inefficient subsystem (i.e. humans) to produce it.

You literally need to grow an entire human just to store energy in it. Realistically, you're looking at overfeeding a population with calorically dense yet minimally energy-intensive foodstuffs just to store energy in a material that's less performant than paraffin wax (body fat has an energy density of about 39 MJ/kg versus about 42 MJ/kg for paraffin wax). And that's before you account for the composition of the storage medium itself: human muscle is about five times less energy dense than fat.
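The comparison above can be sketched as quick back-of-the-envelope arithmetic. This is a minimal illustration using the approximate figures quoted in the comment (39 MJ/kg for fat, 42 MJ/kg for paraffin wax, muscle at roughly a fifth of fat), not authoritative physiological data:

```python
# Back-of-the-envelope comparison of the energy densities cited above.
# All figures are approximate and taken from the comment itself.
FAT_MJ_PER_KG = 39.0                    # body fat, approx.
WAX_MJ_PER_KG = 42.0                    # paraffin wax, approx.
MUSCLE_MJ_PER_KG = FAT_MJ_PER_KG / 5    # muscle ~5x less dense than fat

fat_vs_wax = FAT_MJ_PER_KG / WAX_MJ_PER_KG
print(f"fat is about {fat_vs_wax:.0%} as energy dense as paraffin wax")
print(f"muscle is only about {MUSCLE_MJ_PER_KG:.1f} MJ/kg")
```

So even the best-case storage tissue (fat) comes in a bit under wax, and muscle drags the average far lower, which is the inefficiency the comment is pointing at.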

[–] aeki@slrpnk.net 0 points 2 years ago (1 children)

I never liked that part about The Matrix. It'd be an extremely inefficient process.

[–] someguy3@lemmy.world 1 points 2 years ago* (last edited 2 years ago)

Originally, humans were supposed to be used as CPUs, but the studio was concerned people wouldn't understand. (So might as well go with the one that makes no sense? Yeah, sure, why not.)