this post was submitted on 20 Jun 2025
Hardware
[–] themoonisacheese@sh.itjust.works 12 points 1 week ago (3 children)

Sure grandpa, let's get you to bed.

I work in semiconductors, and the very best "AI" has to offer in this space is searching documentation. It is fundamentally useless at anything silicon-design related, much to the dismay of EDA tool vendors, who would really like you to buy more licenses for shit you don't need.

[–] nutt_goblin@lemmy.world 1 point 5 days ago

These articles tend to overstate "using AI to design" technical systems. Conceptually, what they're really doing is fuzzing the performance characteristics of different variations of an algorithm against a reference implementation, and validating each variant's correctness along the way.

I'm coming from software, so I'm not sure how much simulation/verification can be done automatically in chip design, but I'd bet money this sort of process is what they're talking about, similar to the AlphaEvolve paper from a while back, rather than going "hey chatgpt how do I design this chip??"
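To make that concrete, here's a minimal sketch of the loop being described: fuzz candidate variants against a trusted reference for correctness, then benchmark the survivors. Everything here is hypothetical illustration (in a real AlphaEvolve-style system the variants would be model-generated, not hand-written), and the function names are made up for the example.

```python
import random
import timeit

# Reference implementation: the trusted baseline every variant is checked against.
def reference_sum_of_squares(xs):
    return sum(x * x for x in xs)

# Candidate "variants" -- stand-ins for what would be generated mutations.
def variant_loop(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

def variant_buggy(xs):
    # Deliberately wrong: drops the last element.
    return sum(x * x for x in xs[:-1])

def variant_map(xs):
    return sum(map(lambda x: x * x, xs))

def is_correct(candidate, trials=100):
    # Fuzz the candidate against the reference on random inputs.
    for _ in range(trials):
        xs = [random.randint(-1000, 1000) for _ in range(random.randint(1, 50))]
        if candidate(xs) != reference_sum_of_squares(xs):
            return False
    return True

def benchmark(candidate):
    data = list(range(1000))
    return timeit.timeit(lambda: candidate(data), number=200)

candidates = [variant_loop, variant_buggy, variant_map]
valid = [c for c in candidates if is_correct(c)]   # correctness gate first
best = min(valid, key=benchmark)                   # then pick the fastest survivor
print(best.__name__)
```

The key point is the ordering: correctness is a hard filter, and performance only ranks the variants that pass it, which is a search procedure, not "AI designing the chip."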
