It's always interesting seeing the hatred of AI clash with the hatred of Reddit.
FaceDeer
I'm "pro-AI" in the sense that I don't knee-jerk oppose it.
I do in fact use AI to summarize things a lot. I've got an extension in Firefox that'll do it to anything. It generally does a fine job.
And when I saw the reply it already had plenty of downvotes, because this is technology@lemmy.world and people are quick to pounce on anything that sounds like it might be pro-AI. You're doing it yourself now, eyeing me suspiciously and asking if I'm one of those pro-AI people. Given all those downvotes, and the ambiguity of your comment, my interpretation shouldn't be surprising.
It just so happens that I am a Wikipedia editor, and I'm also pro-AI. I think this would be a very useful addition to Wikipedia, and I hope they get back to it when the dust settles from this current moral panic. I'm disappointed that they're pausing an experiment because that means that the "discussion" that will be had now will have less actually meaningful information in it. What's the point in a discussion without information to discuss?
No Wikipedia editor has to work on anything; if they don't want to interact with those summaries, they don't have to.
And no, it wasn't quite obvious that that's what you were talking about. You said "Looks like the vast majority of people disagree D:". Since you were directly responding to a comment that had been heavily downvoted by the technology@lemmy.world community it was a reasonable assumption that those were the people you were talking about.
Disabling would necessarily mean disabling it wiki-wide,
No it wouldn't, why would you think that? Wikipedia has plenty of optional features that can be enabled or disabled on a per-user basis.
Did Anthropic accept the ToS? Reddit's publishing their information on a public website that anyone can visit and read without agreeing to any terms. If they didn't accept the ToS then the only thing regulating what you can do with that public information is the usual copyright. AI training has yet to be shown to be a violation of copyright.
I'm not talking about them at all. I'm talking about the technology@lemmy.world Fediverse community. It's an anti-AI bubble. Just look at the vote ratios on the comments here. The guy you responded to initially said "Finally, a good use case for AI" and he got close to four downvotes per upvote. That's what I'm talking about.
The target of these AI summaries isn't Wikipedia editors, it's Wikipedia readers. I see no reason to expect that target group to be particularly anti-AI. If Wikipedia editors don't like it there'll likely be an option to disable it.
Miguel's claims are:
- The summaries are factually inaccurate
- Generating the summaries is environmentally damaging.
- Summarization is "largely already being done by someone"
There's an anecdote in a talk page about one summary being inaccurate. A talk page anecdote is not a usable citation.
Survey results aren't measuring environmental impact.
And the whole point of AI is to take the load off of someone having to do things manually. That's assuming the work is actually being done at all - even in this thread there are plenty of complaints about articles on Wikipedia that lack basic summaries and jump straight into detailed technical content.
A lot of stubs should be deleted until they are expanded
How does one expand a deleted article?
Wikipedia is not intended to present a finished product; it's an eternal work in progress. A stub is the start of an article. If you delete an article whenever it gets started, that seems counterproductive.
You realize this is just a proposal at this stage? Their proposed next step is an experiment:
If we introduce a pre-generated summary feature as an opt-in feature on the mobile site of a production wiki, we will be able to measure a clickthrough rate greater than 4%, ensure no negative effects to session length, pageviews, or internal referrals, and use this data to decide how and if we will further scale the summary feature.
Note: an opt-in feature whose clickthrough they intend to monitor, both for information on how to implement features like this and on whether to implement them at all. As befits Wikipedia, they're planning to base these decisions on evidence.
If "they're gathering evidence and making proposals" is the threshold for you to jump ship to some other encyclopedia, I guess you do you. It's not going to be much of an exodus though since nobody who actually uses Wikipedia has seen anything change.
The problem is that the bubble here are the editors who actually create the site and keep it running
No it isn't, it's the technology@lemmy.world Fediverse community.
What an unbiased view. Got any citations?
A real thing.