this post was submitted on 06 Nov 2025
804 points (98.1% liked)

[–] ImmersiveMatthew@sh.itjust.works 4 points 2 days ago (3 children)

It is a gamble against innovation for sure, and a blind one too. I say this because it is clear right now that scaling up LLMs, while very effective at substantially improving many AI metrics, has not had much impact on logic. I have been calling this the Cognitive Gap, and it is really holding back AI.

Clearly the big LLM companies do not have a solution to this gap despite efforts like the reasoning models, which likely means we need an entirely different technology to front-end LLMs or to replace them.

That raises the question: who has a line of sight on how to scale up logic? As near as I can tell, the answer right now is no one. Maybe there is something in a lab somewhere, or even with just a small team or an individual, but it is not presently visible. It could come out any day and make all those data center investments worthwhile, or it may take years before we see the Cognitive Gap close, which would leave those same data centers completely out of alignment with the value they bring.

Shorting the AI industry is a roll of the dice, but less so than the blind investments still happening in data centers despite no clear path to improving logic and closing the Cognitive Gap. In fact, shorting seems like the safer bet.

It is going to be interesting: if the Cognitive Gap is not closed for years to come, those data center investments are never going to pay off, because the value just will not be there. The entire US economy seems tied to AI right now, so this roll of the dice is perhaps the biggest risk/reward bet in history.

[–] Alpha71@lemmy.world 5 points 2 days ago* (last edited 2 days ago) (3 children)

I don't follow the AI bubble trend at all, but all of a sudden I have been seeing a lot of videos about it popping up in my recommendations. Who knows.

[–] BarneyPiccolo@lemmy.today 3 points 2 days ago (3 children)

I just saw a TV commercial for a special series the local news is doing on traffic issues, and it used several quick background graphics, and at least one quick film clip, that were obviously AI-generated.

You can argue over the appropriateness of AI, but most people would agree that it doesn't belong anywhere near a news broadcast. If you don't have footage, it is highly unethical to create footage to juice up your news reports.

[–] Pulsar@lemmy.world 2 points 2 days ago* (last edited 2 days ago) (3 children)

I have been trying to make sense of all the AI capex announcements for a while, and I don't get it. So please help me out if you know the answer.

US investment: ~$400B in 2024, ~$500B in 2025, ~$600B in 2026, and global investment in 2026 alone is projected at roughly $1.5–2.2T. Let's say $2T of US investment by the end of 2026. Investment will continue into the future, but assume for a moment that it does not, and that the GPUs will be obsolete in 5 years. That leaves 60 months to recoup $2T plus ~10% ROI, so about $40B a month. The US has a labor force of about 170M, so the AI industry needs the equivalent of roughly $240 per month from every employed person. I don't see myself or my employer paying that for AI any time soon.
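
A minimal Python sketch of that back-of-envelope math, using the commenter's rough assumptions (the $2T figure, ~10% ROI, 5-year GPU lifetime, and 170M labor force are assumptions from the comment, not measured data):

```python
# Back-of-envelope check of the capex-recovery math above.
# All inputs are the commenter's rough assumptions, not measured data.

capex = 2.0e12            # assumed cumulative US AI capex by end of 2026, in dollars
roi = 0.10                # assumed target return on that investment
lifetime_months = 5 * 12  # assume the GPUs are obsolete after ~5 years

required_total = capex * (1 + roi)                  # $2.2T to recoup in total
required_per_month = required_total / lifetime_months

labor_force = 170e6       # approximate US labor force
per_worker_per_month = required_per_month / labor_force

print(f"Required revenue per month: ${required_per_month / 1e9:.0f}B")
print(f"Per US worker per month:    ${per_worker_per_month:.0f}")
# Prints roughly $37B a month and about $216 per worker per month,
# the same ballpark as the ~$40B and ~$240 figures quoted above
# (the comment rounds the monthly figure up to $40B before dividing).
```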
