this post was submitted on 25 Feb 2026
907 points (99.2% liked)

Technology

[–] XLE@piefed.social 15 points 2 days ago* (last edited 2 days ago) (2 children)

In the original announcement that they'd added translation, they didn't call it AI. They didn't even call it machine "learning" or machine translation.

They just called it local, automated translation.

Maybe you should take your own advice about reading, and double-check my comment ;)

[–] tigeruppercut@lemmy.zip 4 points 2 days ago (1 children)

When I turned it off, the translation thingy went away, so I'm not sure whether it was AI all along and they were lying about it or not. Just as well; there's an extension that works fine, and it doesn't reload the page every time I toggle it like the built-in one did.

[–] XLE@piefed.social 7 points 2 days ago* (last edited 2 days ago) (1 children)

The translation is technically AI, but it's a distant cousin to the LLMs and image generators that have repulsed so many people. (The term AI is such a broad and vague umbrella that Netflix recommendations count as AI.) And, even more notably, this is before Mozilla started marketing things as AI.
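(To make the "Netflix recommendations count as AI" point concrete: even a toy collaborative-filtering recommender falls under the textbook definition of AI. The sketch below uses invented names and made-up ratings purely for illustration; it's the kind of thing recommendation engines were built on long before LLMs.)

```python
from math import sqrt

# Toy viewing history: user -> {title: rating}. All data is made up.
ratings = {
    "ana":  {"Alien": 5, "Arrival": 4, "Up": 1},
    "ben":  {"Alien": 4, "Arrival": 5, "Up": 2},
    "cara": {"Up": 5, "Coco": 4, "Alien": 1},
}

def cosine(a, b):
    """Cosine similarity over the titles two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    na = sqrt(sum(a[t] ** 2 for t in shared))
    nb = sqrt(sum(b[t] ** 2 for t in shared))
    return dot / (na * nb)

def recommend(user):
    """Suggest unseen titles, weighted by similarity to other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for title, rating in their.items():
            if title not in ratings[user]:
                scores[title] = scores.get(title, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana"))  # -> ['Coco']
```

No neural network anywhere, yet "AI" covers it; that's how broad the umbrella is.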

It was also a joint non-profit venture with a university, rather than today's weird gimmicks or for-profit partnerships.

[–] ricecake@sh.itjust.works 1 points 1 day ago (1 children)

It's less a vague umbrella and more an academic category. It just feels odd to call it vague; in the same way, you wouldn't call "chemistry" vague despite it having applications ranging from hand soap to toxic waste.

[–] XLE@piefed.social 2 points 1 day ago (1 children)

In this case, the vagueness of the term AI is abused by its fans. "Aha, you claim to hate AI, and yet..." they say. They should know better.

"Chemicals" is actually a great example. If someone said "Chemicals are coming out of that factory", you'd rightfully cringe if a factory manager said "well actually soap is made of chemicals too"

[–] ricecake@sh.itjust.works 1 points 1 day ago (1 children)

I take your point. :)

It's worth mentioning in my opinion though, because if someone were to say "we should ban chemicals" it'd be worthwhile to point out what that actually means.

I don't actually think the broadness of the category is intentionally abused; it's just incredibly common to remove anything explicable from the AI category.

I feel slightly more Hanlon's-razor about it, since people in my city are talking about and petitioning on the popular notion of banning all data centers from the state, and how awful it would be if a data center came here. I know what they mean, but it's not what they're trying to get the law to do, and our city already has six data centers I can name off the top of my head. The language drift is fine, but when it starts to collide with policy it's another issue.

[–] XLE@piefed.social 1 points 1 day ago (1 children)

Considering data centers tend to harm the local communities, yeah, I can't blame them for wanting data centers out of their community. Make sure they don't break the law to poison the air of local communities like Elon Musk's data center did. Fix the other electricity loopholes. Make sure they don't destroy local water tables. Tax them appropriately. Don't let them lie about employment opportunities... And maybe then we can talk about whether they should be built.

[–] ricecake@sh.itjust.works 2 points 1 day ago (1 children)

I think the part you're missing is that 1) it's my community too, 2) they're not talking about AI data centers, or new data centers, or anything like that; they're petitioning to ban all data centers, and 3) we have multiple data centers in the city already that no one complained about until AI data centers became something people felt concerned about.

There's a major difference between a two-square-mile hyperscale AI data center that needs a nuclear reactor and a full water treatment plant to cool it, and a two-acre, air-cooled data center that produces no more ground pollution than any other parking lot and is essentially a warehouse.
The state government has at least two in the city, for processing electronic tax records and applications and for hosting service sites. We have a few national insurance companies that need to process all the things they process. A research university and a web hosting company round out the list of the ones I know about.

This is my entire point about why sometimes it's really necessary to point out that what someone is referring to is only a small part of what the words they're using describe. The language being imprecise doesn't matter until someone proposes a law outlawing chemicals, shuttering all data centers, or banning AI.

LLMs are problematic. My fancy rice maker isn't.

[–] XLE@piefed.social 1 points 1 day ago (1 children)

Touché. It is interesting that in this case, the distinction between "AI data center" and "non-AI data center" is almost as important as the distinction between "bad chemicals" and "chemicals" in general. I was already familiar with the harm of living close to a Bitcoin mining farm, but a conventional data center, not so much.

[–] ricecake@sh.itjust.works 2 points 1 day ago

Yeah, the conventional ones still draw a good chunk of power, and they're not clean, but they're not dirty either. Same as how a grocery store isn't good for the environment, but it's not the first place you'd look to clean up.

They tend to be boring, and are usually not a public thing, just something a company owns to house its computers. The only reason I know about the ones near me is that I used to work at one, and people would move jobs to or from the others. (As an aside, a data center is a great place to nap if you like white noise.)

For a sense of scale:

This is the site of an OpenAI data center. The yellow square is about 1 square mile and mostly encompasses the area they plan to fill or have filled.

That angle shows more build out.

This photo has two normal data centers in it. The yellow square is also about 1 square mile. I've highlighted the data centers in red. One is to the left of the square, near the middle, and the other is down from the right side, near the big piles of what looks like rocks. (Spoiler: it's rocks. They make asphalt.) The sprawling complex in the upper right is a refrigerated grocery distribution complex. The building in the middle, on the other side of the block from the asphalt, is a coal power plant.

Of the things in this picture, I'm most upset about the giant freeway interchange. Coal is shit, but it's a modern plant, so it's not belching soot, just CO2, and the utility is phasing it out anyway. The grocery traffic is mostly dead except between midnight and 7am, when they do restocks.
I can hear the freeway if I go outside.

[–] Orygin@sh.itjust.works 0 points 1 day ago* (last edited 1 day ago) (1 children)

And?
Because the term AI wasn't in vogue at the time, even though it's clearly the same technology, it doesn't count? It's literally packaged under the same umbrella now.

Anyway, the big issue is still tech people thinking their viewpoint is the only valid one, and that every generic user will have the same exact needs as them.

[–] XLE@piefed.social 1 points 1 day ago (1 children)

I already addressed all of these arguments in another comment in this thread...

[–] Orygin@sh.itjust.works 0 points 1 day ago* (last edited 1 day ago) (1 children)

Not all these arguments, no.
You're defending your position that this AI feature is not really AI, so it's OK, but the others are all bad because of the two letters of the devil.
Still, AI is a marketing term; it always has been. AI in the form of machine learning has been around for more than a decade, and lots of things already use it.
The knee-jerk reaction of tech circles saying Mozilla will sell their soul because there is no "kill switch" is so fucking dumb. Even dumber is thinking no other users may want any of these features. Unless you work at Mozilla, and/or do product research for browsers, chances are you have no idea how people will want to use these features day to day.
Even when working on one's own product at a company, few people really understand users' needs and wants, especially technical people.
I can guarantee you, the weird gimmick you don't understand is crucial to some.

[–] XLE@piefed.social 2 points 1 day ago (1 children)

You're defending your position that this AI feature is not really AI so it's ok

I literally say "The translation is technically AI," so no. I give reasons why the other features are different, which you seem to at least partly acknowledge.

the weird gimmick you don't understand is crucial to some

Can you describe how to access the gimmick and which people find it crucial? I'm pretty confident in my understanding of it and how hilariously unhelpful it is.

[–] Orygin@sh.itjust.works 1 points 1 day ago (2 children)

Saying something is "technically" X implies it's not really X, or should be considered apart from the group.

The "gimmick" is proposing alt text based on the image when editing PDFs. I don't see how that's unhelpful. I'm not into editing PDFs in Firefox, but I do use it to read them.
Encouraging editors to include alt text for accessibility seems like the ideal use case for this tech. The human still has to review and approve the generated text.
Unless I missed something (I can't try the feature right now), it seems to me a great application of AI: augmenting humans in their work, and for a useful cause.
Image classification and description is "old" tech now, and I already use it in my work to auto-tag images so editors can find them more easily later. Nothing crazy.
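(For what it's worth, the auto-tagging workflow described above is usually just an image classifier feeding a tag index. Here's a minimal sketch of that plumbing with a stub in place of a real model; the function names, the stub's labels, and the filenames are all invented. In practice you'd swap `stub_classifier` for something like a pretrained image-classification model.)

```python
from collections import defaultdict

def stub_classifier(image_path: str) -> list[str]:
    """Stand-in for a real image classifier; keys off the filename only."""
    name = image_path.lower()
    tags = [label for label in ("cat", "beach", "chart") if label in name]
    return tags or ["untagged"]

def build_tag_index(image_paths, classify=stub_classifier):
    """Map each predicted tag to the images it appears in,
    so editors can search by tag later."""
    index = defaultdict(list)
    for path in image_paths:
        for tag in classify(path):
            index[tag].append(path)
    return dict(index)

index = build_tag_index(["cat_on_beach.jpg", "sales_chart.png"])
print(index)
# {'cat': ['cat_on_beach.jpg'], 'beach': ['cat_on_beach.jpg'],
#  'chart': ['sales_chart.png']}
```

The model proposes, the index stores, and a human editor still curates; same review-and-approve shape as the alt-text feature.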

[–] XLE@piefed.social 1 points 1 day ago

Therein lies my point. Mozilla added a feature that is buried so deep in Firefox that you don't even know if it's actually there.

It is there, believe it or not. I criticize these things, but I also tested it the day it arrived.

But despite the way Mozilla buried it, the code is still there. It still makes Firefox more complex to maintain, and Mozilla still spent time and money putting it in. Imagine if Mozilla spent those resources actually helping people, instead of treating AI the way companies used to treat blockchain: as a solution looking for a problem.

[–] XLE@piefed.social 0 points 1 day ago (1 children)

The "gimmick" is proposing alt text based on the image when editing PDFs. I don't see how it's unhelpful.

A gambling toolbar that links to Polymarket could be helpful. But I think we both said "crucial".

If you know someone who uses Firefox to add images to PDFs so often that the alt-text generation would be crucial to them, or even more than a gimmick, please introduce me to them. I have so many burning questions, several of them variations on "why not a dedicated PDF editor?!"

[–] Orygin@sh.itjust.works 1 points 1 day ago (1 children)

Please never develop any software for other humans without first developing some kind of compassion or empathy for others.
You are the stereotypical nerd who doesn't understand that people may have different needs than you, so I have to justify how a feature related to accessibility can be useful...

[–] XLE@piefed.social 1 points 1 day ago

I don't know which angle is more interesting: the fact that your "compassion and empathy" started with implying that people who disagree with you are stupid, or the fact that we both know a hidden feature tangentially "relating to" accessibility isn't remotely the same as something crucial...