Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
TLDR: The output of current LLMs will be very bandlimited and one-directional.
I hope that means something to you; if not, I'll try to explain this very specific thing, though I'm afraid I might not be able to express it in very understandable terms (sorry):
Firstly, one-directionality: when a human wants to write a story, we usually think about the plot twist beforehand and then pave the way by hinting at the upcoming twist without giving too much away. It's just nice when a first-time reader is surprised, but on a second read wonders how they missed all the obvious clues.
This process requires a lot of back-and-forth while writing. Humans do this naturally. LLMs and other transformer networks have a huge problem with this. I often hear LLMs referred to as text prediction machines. This is not entirely accurate, but it's close enough. And to keep with this analogy: text prediction doesn't really work backwards to suggest a better start to the sentence, does it? LLMs take one path, from start to finish, sometimes in great detail, but that's it. There's no setup. It's very flat writing.
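To make the one-directional point concrete, here's a deliberately tiny toy sketch (a hand-made bigram table stands in for an LLM; the table and names are made up for illustration, not how any real model is implemented): each new token is chosen only from the prefix so far, and nothing generated later can reach back and revise the opening.

```python
# Toy autoregressive generator. A made-up bigram lookup table plays
# the role of the "text prediction machine": generation only ever
# conditions on what has already been written, and earlier tokens
# are never revised.

NEXT = {  # hypothetical bigram table, purely for illustration
    "once": "upon",
    "upon": "a",
    "a": "time",
    "time": ".",
}

def generate(prompt: str, max_tokens: int = 10) -> list[str]:
    tokens = prompt.split()
    for _ in range(max_tokens):
        nxt = NEXT.get(tokens[-1])  # depends ONLY on the prefix (here: the last token)
        if nxt is None:
            break
        tokens.append(nxt)          # append-only: no going back to fix the setup
    return tokens

print(generate("once"))  # ['once', 'upon', 'a', 'time', '.']
```

A human drafting the same sentence could decide the ending first and rewrite the opening to foreshadow it; this generator structurally can't, which is the point.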
Secondly, bandlimiting: over time, LLMs tend to mush different characterizations and continuity into a smooth paste, leaving little grit to it. I really struggle not to say the word derivative (like in math). But LLMs just write average characters who do average things in an average way. And then they spell out how everything was totally unpredictable, important and meaningful, while using superficially eloquent language. Nothing just *is*; everything *serves as*. It's a poor writing style that often misses the appropriate tone while trying to sound sophisticated.
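A toy numeric analogy for that "smooth paste" effect (the numbers and trait labels are mine, invented for illustration, not anything from a real model): average several sharply distinct character profiles together and you get a profile with no standout trait at all.

```python
# Analogy for bandlimiting: averaging sharply distinct trait vectors
# yields a flat, average vector with nothing distinctive left.
# All values are made up for illustration.

profiles = [
    [1.0, 0.0, 0.0],  # a purely brave character
    [0.0, 1.0, 0.0],  # a purely cunning character
    [0.0, 0.0, 1.0],  # a purely kind character
]

# Element-wise mean across the three profiles.
average = [sum(col) / len(profiles) for col in zip(*profiles)]
print(average)  # every trait lands at ~0.33: nobody in particular
```

Three vivid characters in, one interchangeable gray one out.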
I should point out that there were a ton of mediocre writers who wrote just like that before the advent of LLMs. You're describing good writing, but there's nothing uniquely LLM about failing at it.