this post was submitted on 06 Oct 2025
862 points (99.7% liked)
Microblog Memes
A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.
Created as an evolution of White People Twitter and other tweet-capture subreddits.
Rules:
- Please put at least one word relevant to the post in the post title.
- Be nice.
- No advertising, brand promotion or guerrilla marketing.
- Posters are encouraged to link to the toot, tweet, etc., in the description of posts.
Can't it be trained to do that?
Sort of. There's a relatively new class of LLM called "tool-aware" (or tool-calling) LLMs, which you can instruct to use tools like a calculator or some other external program. As far as I know, though, the LLM has to be told to go out and use that external thing; it can't make that decision itself.
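A minimal sketch of what that loop looks like, with the model mocked out. The JSON tool-call format, the `mock_model` function, and the calculator helper are all assumptions for illustration, not any vendor's actual API:

```python
import ast
import json
import operator

# Arithmetic operators the toy calculator tool will accept.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expr):
    """Evaluate basic arithmetic without using eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

def mock_model(prompt):
    # Stand-in for the LLM: instead of guessing the answer, it emits a
    # structured tool call because its instructions told it to.
    return json.dumps({"tool": "calculator", "expression": "1234 * 5678"})

def run_with_tools(prompt):
    reply = mock_model(prompt)
    call = json.loads(reply)
    if call.get("tool") == "calculator":
        # The wrapper, not the model, executes the tool and feeds the
        # result back into the final reply.
        result = safe_eval(call["expression"])
        return f"The answer is {result}."
    return reply

print(run_with_tools("What is 1234 * 5678?"))
```

The key point the sketch shows: the decision to call the tool lives in the surrounding harness and the instructions, not in the model's weights.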
Can the model itself be trained to recognize mathematical input and invoke an external app, parse the result and feed that back into the reply? No.
Can you create a multi-layered system that uses some trickery to achieve this effect most of the time? Yes. That's what OpenAI and Google are already doing: they recognize certain features of the user's input and change the system prompt, forcing the model to output Python code or Markdown notation that your browser then renders using a different tool.
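The routing layer described above can be sketched in a few lines. The regex heuristic and the prompt wording here are purely illustrative assumptions; real systems use far more sophisticated classifiers:

```python
import re

# Crude heuristic: input containing "number op number" looks like arithmetic.
MATH_PATTERN = re.compile(r"\d+\s*[-+*/]\s*\d+")

BASE_PROMPT = "You are a helpful assistant."
MATH_PROMPT = (
    BASE_PROMPT
    + " When the user asks an arithmetic question, respond only with a"
    " Python expression that computes the answer; do not compute it yourself."
)

def build_system_prompt(user_input):
    # The outer system, not the model, inspects the input and swaps in a
    # prompt that forces code output for a separate tool to execute.
    if MATH_PATTERN.search(user_input):
        return MATH_PROMPT
    return BASE_PROMPT

print(build_system_prompt("what is 12 * 34?"))
```

From the user's perspective it looks like the model "decided" to do math properly, but the decision was made by this dispatch step before the model ever ran.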