this post was submitted on 15 Oct 2025
1234 points (99.0% liked)
Microblog Memes
Not strictly LLMs, but neural nets are really good at protein folding, something that very much directly helps understanding cancer, among other things. I know an answer doesn't magically pop out, but it's important to recognise the use cases where NNs actually work well.
I'm trying to guess what industries might do well if the AI bubble does burst. I imagine there will be huge AI datacenters filled with so-called "GPUs" that can no longer even do graphics. They don't even do floating point calculations anymore, and I've heard their integer matrix calculations are lossy. So, basically useless for almost everything other than AI.
One of the few industries that I think might benefit is pharmaceuticals. I think maybe these GPUs can still do protein folding. If so, the pharma industry might suddenly have access to AI resources at pennies on the dollar.
Integer calculations are "lossy" only because they're integers: there is nothing extra there. Those GPUs have plenty of uses.
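To make that concrete (a toy sketch with made-up values, not how any particular GPU works): the "loss" is just the usual rounding when values are quantized to integers. The integer arithmetic itself is exact; the precision goes away in the rounding step, not in the math:

```python
import numpy as np

# Hypothetical example values, symmetric int8 quantization.
x = np.array([0.12, -0.98, 0.55, 0.031], dtype=np.float32)
scale = np.abs(x).max() / 127.0          # map the largest magnitude to 127
q = np.round(x / scale).astype(np.int8)  # the lossy step: rounding to integers
x_hat = q.astype(np.float32) * scale     # dequantize back to floats

# The int8 values are exact integers; the only error is from rounding,
# bounded by about half the scale factor.
print(q)
print(np.abs(x - x_hat).max())
```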
I don't know too much about it, but according to people who do, these things are ultra-specialized and essentially worthless for anything other than AI-type work:
https://weird.autos/@rootwyrm/115361368946190474
AI isn't even the first or the twentieth use case for those operations.
All the "FP" quotes are about floating point precision, which matters more for training and finely detailed models, especially FP64. Integer based matrix math comes up plenty often in optimized cases, which are becoming more and more the norm, especially with China's research on shrinking models while retaining accuracy metrics.
But giving all the resources to LLMs slows/prevents those useful applications of AI.