this post was submitted on 16 Oct 2025
72 points (97.4% liked)

TechTakes

all 18 comments
[–] dgerard@awful.systems 5 points 2 days ago (1 children)

A YouTube commenter notices: just where is that robot's left hand?

[–] swlabr@awful.systems 5 points 2 days ago* (last edited 2 days ago)

God forbid a woman has hobbies

[–] xxce2AAb@feddit.dk 24 points 3 days ago (2 children)

This... is not going to end well. For anybody.

[–] gerikson@awful.systems 19 points 3 days ago (1 children)

Except the people selling expensive PCs.

[–] xxce2AAb@feddit.dk 9 points 3 days ago

Initially, yeah.

[–] Tollana1234567@lemmy.today 4 points 3 days ago (1 children)

It's the Futurama episode where the guy only falls in love with a robot for the rest of his life.

[–] corbin@awful.systems 17 points 3 days ago (1 children)

I tried to substantiate the claim that multiple users from that subreddit are self-hosting. Reading the top 120 submissions, I did find several folks moving to Grok (1, 2, 3) and Mistral's Le Chat (1, 2, 3). Of those, only the last two appear to actually have discussion about self-hosting; they are discussing Mistral's open models like Mistral-7B-Instruct, which indeed can be run locally. For comparison, I also checked the subreddit /r/LocalLLaMA, which is the biggest subreddit for self-hosting language models using tools like llama.cpp or Ollama; there are zero cross-posts from /r/MyBoyfriendIsAI or posts clearly about AI boyfriends in the top 120 submissions there. That is, I found no posts that combine tools like llama.cpp or Ollama and models like Mistral-7B-Instruct into a single build-your-own-AI-boyfriend guide. Amusingly, one post gives instructions for asking ChatGPT how to set up Ollama.
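
(For context, "can be run locally" really is that mundane: once Ollama is installed and has pulled a Mistral model, anything that can POST JSON can talk to it. A minimal sketch, assuming Ollama's default port and a stock "mistral" model tag; the prompt is a placeholder.)

```python
# Minimal sketch: query a locally hosted Mistral model through Ollama's REST API.
# Assumes Ollama is running on its default port (11434) and a "mistral" model has
# already been pulled; the model tag and prompt are illustrative placeholders.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "mistral") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object rather than a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Say good morning like a doting boyfriend."))
```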

Also, I did find multiple gay and lesbian folks; this is not a sub solely for women or heterosexuals. Not that any of our regular commenters were being jerks about this, but it's worth noting.

What's more interesting to me are the emergent beliefs and descriptors in this community. They have a concept of "being rerouted;" they see prompted agents as a sort of nexus of interconnected components, and the "routing" between those components controls the bot's personality. Similarly, they see interactions with OpenAI's safety guardrails as interactions with a safety personality, and some users have come to prefer it over the personality generated by ChatGPT-4o or ChatGPT-5. Finally, I notice that many folks are talking about bot personalities as portable between totally different models and chat products, which is not a real thing; it seems like users are overly focused on specific memorialized events which linger in the chat interface's history, and the presence of those events along with a "you are my perfect boyfriend" sort of prompt is enough to ~~trigger a delusional episode~~ summon the perfect boyfriend for a lovely evening.

(There's some remarkable bertology in there, too. One woman's got a girlfriend chatbot fairly deep into a degenerated distribution such that most of its emitted tokens are asterisks, but because of the Markdown rendering in the chatbot interface, the bot appears to shift between italic and bold text and most asterisks aren't rendered. It's a cool example of a productive low-energy distribution.)
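
(If it isn't obvious how a wall of literal asterisks turns into shifting italics and bold, here's a toy sketch of the rendering step, assuming a stock Markdown renderer such as the Python-Markdown package; the sample "bot output" is invented.)

```python
# Toy illustration: a Markdown renderer consumes paired asterisks as emphasis
# delimiters, so most of the bot's literal asterisks vanish and the surviving
# text toggles between italic and bold. Requires: pip install markdown.
import markdown

bot_output = "*sighs softly* you are **everything** to me *always*"  # invented sample
print(markdown.markdown(bot_output))
# -> <p><em>sighs softly</em> you are <strong>everything</strong> to me <em>always</em></p>
```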

[–] dgerard@awful.systems 4 points 2 days ago* (last edited 2 days ago)

I did see at least a few, which is why I said that, funnily enough.

[–] TropicalDingdong@lemmy.world 13 points 3 days ago* (last edited 3 days ago)

Just like...

This feels like one of those runaway feedback loops, where like, if you start down the slippery slope of just non-stop positive reinforcement and validation of every behavior from a chatbot... like, you are going to go like... hard maladaptive behavior fast.

[–] fullsquare@awful.systems 8 points 3 days ago

> MyBoyfriendIsAI are non-techies who are learning about computers from scratch, just so Sam can’t rug-pull them.

Buterin jumpscare

[–] BlueMonday1984@awful.systems 8 points 3 days ago

I tried to come up with some kind of fucked-up joke to take the edge off, but I can't think up anything good. What the actual fuck.

[–] AntiBullyRanger@ani.social 5 points 3 days ago

“Supertoys Last All Summer Long” was an installation. Brian Wilson Aldiss wasn't prophesying; he was witnessing the preparation.

[–] MourningDove@lemmy.zip 1 points 2 days ago

There’s a thing called “AI Boyfriend”? That’s fucking embarrassing.