this post was submitted on 15 Oct 2025
1234 points (99.0% liked)
Microblog Memes
you are viewing a single comment's thread
Either (you genuinely believe) you are 18 (or 24, or 36, it doesn't matter) months away from curing cancer, or you're not.
What would we as outsiders observe if they had told their investors two years ago that they were 18 months away, and now the cash runs out in 3 months?
Now I think the current iteration of AI is trying to get to the moon by building a better ladder, but what do I know.
The thing about AI is that it is very likely to improve roughly exponentially¹. Yeah, it's building ladders right now, but once it starts turning rungs into propellers, the rockets won't be far behind.
Not saying it's there yet, or even 18/24/36 months out, just saying that the transition from "not there yet" to "top of the class" is going to whiz by when the time comes.
¹ Logistically, actually, but the upper limit is high enough that for practical purposes "exponential" is close enough for the near future.
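The footnote's point, that a logistic curve is practically indistinguishable from an exponential one early on and only diverges near its ceiling, can be sketched in a few lines of Python. The parameters here (carrying capacity `K`, growth rate `r`, starting value `x0`) are purely illustrative, not a claim about actual AI progress:

```python
import math

def logistic(t, K=1000.0, r=1.0, x0=1.0):
    # Standard logistic curve: grows like an exponential at first,
    # then saturates at the carrying capacity K.
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

def exponential(t, r=1.0, x0=1.0):
    # Pure exponential growth with the same rate and starting value.
    return x0 * math.exp(r * t)

# Early on the two curves are nearly identical; much later the
# logistic has flattened out while the exponential keeps climbing.
for t in (1, 3, 20):
    print(t, round(exponential(t), 2), round(logistic(t), 2))
```

With these numbers, at t=1 the two values differ by well under 1%, while by t=20 the logistic has saturated near 1000 and the exponential has blown past a hundred million, which is the footnote's "close enough for the near future" in miniature.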
Then it doesn't make sense to include LLMs in "AI." We aren't even close to turning rungs into propellers or rockets; LLMs will not get there.
The problem with that is they can't actually point to a metric where, once the number passes some threshold, we'll have ASI. I've seen graphs with a dotted line labeled ape intelligence and, a bit higher up, another dotted line labeled human intelligence. But there's no meaningful way they could have actually placed human intelligence on a graph of AI complexity, because brains are not AI and shouldn't be on the graph at all.
So even if things increase exponentially there's no way they can possibly know how long until we get AGI.