this post was submitted on 07 Feb 2025
1086 points (99.5% liked)

People Twitter

7062 readers
1084 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
[–] stevedice@sh.itjust.works 52 points 3 months ago (24 children)

PSA: Don't spray your cats. It not only strains your relationship, it's useless.

[–] Texas_Hangover@sh.itjust.works 16 points 3 months ago (10 children)

That's all lies. I've sprayed my cat like 5 times in his life, and lo and behold, he doesn't do the dumbass things that got him sprayed anymore.

[–] dependencyinjection@discuss.tchncs.de -5 points 3 months ago* (last edited 3 months ago) (8 children)

edit:

I removed the output from ChatGPT since people weren't happy with that.

I've spent some time reading all the results from Google on this topic, and the overwhelming consensus is that it is not good to spray your cat with water.

This is true even if you use an unbiased search term.

[–] Magnus@lemmy.dbzer0.com 27 points 3 months ago (1 children)
[–] dependencyinjection@discuss.tchncs.de -2 points 3 months ago (2 children)

😂.

That made me chuckle. Naughty LLM.

On some level, though, I don't really get the disdain for them, as search is a nightmare now and it's a lot easier to just have the LLM do it for you.

[–] Whats_your_reasoning@lemmy.world 12 points 3 months ago* (last edited 3 months ago) (2 children)

Except when it hallucinates, draws from biased sources, or straight-up responds with false information.

I'd rather look through the available links myself and research the direct source things came from. AI isn't trained to look specifically for factual information. Unfortunately, a lot of people aren't trained for that, either. But we can still educate ourselves. Relying on a bot is putting one more space between the information you receive and the source that created it.

I'd rather get my information from as close to the original source as possible. Only then can I determine if the source is even worth trusting in the first place.

Thanks for replying. I prefer when people actually articulate their disapproval to something than just downvote it, as it allows the other person to understand more.

Your comment is very reasonable, and it makes me think that I perhaps give them too much credit when it's a subject I'm not an expert in. We have embraced LLMs at work as software engineers at a small company, and it allows us to save so much time on the stuff we do over and over again. But that's because we know the subject matter, and it's much easier to see when they're hallucinating. I should be more cautious when using them for stuff I'm not familiar with.

I work for a good company, and we save so much time making enterprise software using LLMs as tools that we recently got a pay rise and a reduction of hours on the same day.

[–] nednobbins@lemm.ee 2 points 3 months ago

When I use LLMs for search, I always ask for sources and then follow up.

[–] zarkanian@sh.itjust.works 1 points 3 months ago

Why is search a nightmare now?
