this post was submitted on 20 Feb 2026
223 points (99.6% liked)

Not The Onion

[–] deadbeef79000@lemmy.nz 29 points 3 days ago* (last edited 3 days ago) (2 children)

To be fair, a trained LLM was probably better at identifying DEI than whatever musky chump they had driving it.

The whole premise is evil, but this possibly was more efficient.

[–] foggy@lemmy.world 13 points 3 days ago (1 children)

100% not true if they were using a single session to check multiple grants.

Every prompt you send carries your entire conversation with the chatbot along with it. Once that history exceeds the chatbot's context window, its answers become less and less relevant.

You'll notice this if you've ever had a chatbot guide you through something for an hour or more. It eventually gets something wrong, takes you down a rabbit hole, and goes in a big circle. At this point, it can be very difficult to get the chatbot to simply respond to your prompt. E.g., if you say "you know what, let's talk about _______ instead," it will keep talking about whatever you were talking about before, staying in your dumb rabbit hole loop.
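The mechanics above can be sketched in a few lines. This is a toy simulation, not a real LLM API: the 40-"token" window, the word-count tokenizer, and the message names are all made up for illustration. The point is just that every request re-sends the full history, and once the history outgrows the window, the oldest messages (like the original instructions) silently drop out of what the model ever sees.

```python
# Toy sketch: a 40-"token" context window, with one word standing in
# for one token. No real model or API is involved.
CONTEXT_WINDOW = 40  # real models have thousands, but the effect is the same


def count_tokens(message):
    # crude stand-in for a real tokenizer
    return len(message["content"].split())


def build_prompt(history):
    """Every request re-sends the history; once it exceeds the window,
    the oldest messages fall out of what the model sees."""
    kept, used = [], 0
    for msg in reversed(history):  # keep the most recent messages first
        used += count_tokens(msg)
        if used > CONTEXT_WINDOW:
            break
        kept.append(msg)
    return list(reversed(kept))


# hypothetical session: instructions first, then one grant after another
history = [{"role": "user", "content": "grant review instructions: flag DEI only if ..."}]
for i in range(12):
    history.append({"role": "user", "content": f"check grant number {i} for DEI please"})

visible = build_prompt(history)
print(history[0] in visible)  # → False: the instructions are gone from the prompt
print(len(visible))           # → 5: only the five most recent grants fit
```

Real APIs don't usually truncate this crudely (some summarize, some just error out), but the upshot is the same: a single long session quietly loses its earlier context.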

So if they did this with multiple grants, eventually it would basically realize they're looking for "yes, that's DEI" and just respond with different versions of that ad nauseam.

[–] HobbitFoot@thelemmy.club 1 points 3 days ago

Yeah, but if the people who are hired to review grants are checking for DEI, are they smart enough to understand what they're reading?

[–] green_red_black@slrpnk.net 13 points 3 days ago (1 children)

Unfortunately it wouldn't be better. Rather, it would be a coin flip: sometimes it would use the genuine definition, other times it would use the BS definition.

[–] Quacksalber@sh.itjust.works 13 points 3 days ago (1 children)

And 100% of the time it will agree with the user. So if they ever asked "Are you sure this isn't DEI?", it would agree with them.

[–] shneancy@lemmy.world 9 points 3 days ago

"Good observation! The concept of breathing is associated with DEI by some circles of LGBTQ people. As they say — queer people need air 🌪"

or something like that idk i don't speak AI