this post was submitted on 18 May 2025
238 points (94.7% liked)

Ask Lemmy

A Fediverse community for open-ended, thought-provoking questions



Lots of people on Lemmy really dislike AI’s current implementations and use cases.

I’m trying to understand what people would want to be happening right now.

Destroy gen AI? Implement laws? Hoping all companies use it for altruistic purposes to help all of mankind?

Thanks for the discourse. Please keep it civil, but happy to be your punching bag.

top 50 comments
[–] AsyncTheYeen@lemmy.world 12 points 2 days ago (1 children)

People have negative sentiments towards AI under a capitalist system, where the most successful means the most profitable, and that does not translate into the most useful for humanity.

We have the technology to feed everyone, and yet we don't. We have the technology to house everyone, and yet we don't. We have the technology to teach everyone, and yet we don't.

Capitalist democracy is not real democracy.

[–] Randomgal@lemmy.ca 2 points 2 days ago (1 children)

This is it. People don't have feelings for a machine. People have feelings for the system and the oligarchs running things, but said oligarchs keep telling you to hate the inanimate machine.

[–] psion1369@lemmy.world 14 points 2 days ago

I want disclosure. I want a tag or watermark to let people know that AI was used. I want to see these companies pay dues for the content they used, in a similar vein to how we have to pay for higher learning. And we need to stop calling it AI as well.

[–] HeartyOfGlass@lemm.ee 8 points 2 days ago

My fantasy is for "everyone" to realize there's absolutely nothing "intelligent" about current AI. There is no rationalization. It is incapable of understanding & learning.

ChatGPT et al are search engines. That's it. It's just a better Google. Useful in certain situations, but pretending it's "intelligent" is outright harmful. It's harmful to people who don't understand that & take its answers at face value. It's harmful to business owners who buy into the smoke & mirrors. It's harmful to the future of real AI.

It's a fad. Like NFTs and Bitcoin. It'll have its die-hard fans, but we're already seeing the cracks - it's absorbed everything humanity's published online & it still can't write a list of real book recommendations. Kids using it to "vibe code" are learning how useless it is for real projects.

[–] Bytemeister@lemmy.world 13 points 2 days ago

I'd like to have laws that require AI companies to publicly list their sources/training materials.

I'd like to see laws defining what counts as AI, and then banning advertising non-compliant software and hardware as "AI".

I'd like to see laws banning the use of generative AI for creating misleading political, social, or legal materials.

My big problems with AI right now are that we don't know what info has been scooped up by them, and that companies are pushing misleading products as AI while constantly overstating the capabilities and under-delivering, which will damage the AI industry as a whole. I'd also want to see protections to keep stupid and vulnerable people from believing AI-generated content is real. Remember, a few years ago we had to convince people not to eat Tide Pods. AI can be a very powerful tool for manipulating the ranks of stupid people.

[–] fmstrat@lemmy.nowsci.com 16 points 2 days ago

Lots of copyright comments.

I want those building it at scale to stop killing my planet.

[–] Treczoks@lemmy.world 19 points 2 days ago (1 children)

Serious investigation into copyright breaches committed by AI creators. They ripped off images and texts, even whole books, without the copyright owners' permission.

If any normal person broke the laws like this, they would hand out prison sentences till kingdom come and fines the size of the US debt.

I just ask for the law to be applied to all equally. What a surprising concept...

[–] Susurrus@lemm.ee 8 points 2 days ago* (last edited 2 days ago)

We are filthy criminals if we pirate one textbook for studies. But when Facebook (Meta) pirates millions of books (anywhere between 30 million and 200 million ebooks, depending on their file size), they are a brilliant and successful business.

[–] Pulptastic@midwest.social 10 points 2 days ago

Reduce global resource consumption with the goal of eliminating fossil fuel use. Burning nat gas to make fake pictures that everyone hates is just the worst.

[–] calcopiritus@lemmy.world 8 points 2 days ago (4 children)

Energy consumption limit. Every AI product has a consumption limit of X GJ. After that, the server just shuts off.

The limit should be high enough to not discourage research that would make generative AI more energy efficient, but it should be low enough that commercial users would be paying a heavy price for their waste of energy usage.

Additionally, data usage consent for generative AI should be opt-in. Not opt-out.

[–] Adderbox76@lemmy.ca 6 points 2 days ago

I don't have negative sentiments towards A.I. I have negative sentiments towards the uses it's being put towards.

There are places where A.I. can be super exciting and useful, namely places where the ability to quickly and accurately process large amounts of data can be critically life-saving, e.g. air traffic control, language translation, emergency response preparedness, etc.

But right now it's being used to paint shitty pictures so that companies don't have to pay actual artists.

If I had a choice, I'd say no AI in the arts; save it for the data processing applications and leave the art to the humans.

[–] jjjalljs@ttrpg.network 25 points 3 days ago (11 children)

Other people have some really good responses in here.

I'm going to echo that AI is highlighting the problems of capitalism. The ownership class wants to fire a bunch of people and replace them with AI, and keep all that profit for themselves. Not good.

[–] Retro_unlimited@lemmy.world 6 points 2 days ago

I was pro AI in the past, but seeing the evil ways these companies use AI just disgusts me.

They steal their training data, and they manipulate the algorithm to manipulate the users. It’s all around evil how the big companies use AI.

[–] Paradachshund@lemmy.today 163 points 4 days ago (14 children)

If we're going pie in the sky I would want to see any models built on work they didn't obtain permission for to be shut down.

Failing that, any models built on stolen work should be released to the public for free.

[–] venusaur@lemmy.world 1 points 1 day ago (1 children)

Genuine curiosity. Not an attack. Did you download music illegally back in the day? Or torrent things? Do you feel the same about those copyrighted materials?

[–] Paradachshund@lemmy.today 1 points 1 day ago

Nah not really. I think piracy is a complex issue though, with far less wide reaching collateral damage. I wouldn't compare the two, personally.

[–] pelespirit@sh.itjust.works 66 points 4 days ago

This is the best solution. Also, any use of AI should have to be stated and watermarked. If they used someone's art, that artist has to be listed as a contributor and you have to get permission. Just like they do for every film, they have to give credit. This includes music, voice and visual art. I don't care if they learned it from 10,000 people, list them.

[–] BackgrndNoize@lemmy.world 6 points 2 days ago

Make it unprofitable for the companies peddling it: pass laws that curtail its use, sue them for copyright infringement, socially shame and shit on AI-generated anything on social media and in person, and vote with your money to avoid anything related to it.

[–] Furbag@lemmy.world 30 points 3 days ago (3 children)

Long, long before this AI craze began, I was warning people as a young 20-something political activist that we needed to push for Universal Basic Income because the inevitable march of technology would mean that labor itself would become irrelevant in time and that we needed to hash out a system to maintain the dignity of every person now rather than wait until the system is stressed beyond its ability to cope with massive layoffs and entire industries taken over by automation/AI. When the ability of the average person to sell their ability to work becomes fundamentally compromised, capitalism will collapse in on itself - I'm neither pro- nor anti-capitalist, but people have to acknowledge that nearly all of western society is based on capitalism and if capitalism collapses then society itself is in jeopardy.

I was called alarmist, that such a thing was a long way away and we didn't need "socialism" in this country, that it was more important to maintain the senseless drudgery of the 40-hour work week for the sake of keeping people occupied with work but not necessarily fulfilled because the alternative would not make the line go up.

Now, over a decade later, generative AI has completely infiltrated almost all creative spaces, nobody except tech bros and C-suite executives is excited about that, and we still don't have a safety net in place.

Understand this - I do not hate the idea of AI. I was a huge advocate of AI, as a matter of fact. I was confident that the gradual progression and improvement of technology would be the catalyst that could free us from the shackles of the concept of a 9-to-5 career. When I was a teenager, there was this little program you could run on your computer called Folding At Home. It was basically a number-crunching engine that used your GPU to fold proteins, and the data was sent to researchers studying various diseases. It was a way for my online friends and me to flex how good our PC specs were with the number of folds we could complete in a given time frame, and also we got to contribute to a good cause at the same time. These days, they use AI for that sort of thing, and that's fucking awesome. That's what I hope to see AI do more of - take the rote, laborious, time-consuming tasks that would take one or more human beings a lifetime to accomplish using conventional tools and have the machine assist in compiling and sifting through the data to find all the most important aspects. I want to see more of that.

I think there's a meme floating around that really sums it up for me. Paraphrasing, but it goes: "I thought that AI would do the dishes and fold my laundry so I could have more time for art and writing, but instead AI is doing all my art and writing so I have time to fold clothes and wash dishes."

I think generative AI is both flawed and damaging, and it gives AI as a whole a bad reputation because generative AI is what the consumer gets to see, and not the AI that is being used as a tool to help people make their lives easier.

Speaking of that, I also take issue with the fact that we are more productive than ever before, and AI will only continue to improve that productivity margin, but workers and laborers across the country will never see a dime of compensation for that. People might be able to do the work of two or even three people with the help of AI assistants, but they certainly will never get the salary of three people, and it means that two out of those three people probably don't have a job anymore if demand doesn't increase proportionally.

I want to see regulations on AI. Will this slow down the development and advancement of AI? Almost certainly, but we've already seen the chaos that unfettered AI can cause to entire industries. It's a small price to pay to ask that AI companies prove that they are being ethical and that their work will not damage the livelihood of other people, or that their success will not be born off the backs of other creative endeavors.

[–] noxypaws@pawb.social 17 points 3 days ago

Admittedly very tough question. Here are some of the ideas I just came up with:

Make it easier to hold people or organizations liable for mistakes made because of haphazard reliance on LLMs.

Reparations for everyone ever sued for piracy, and completely do away with intellectual property protections for corporations, but let independent artists keep them.

A public service announcement campaign aimed at making the general public less trustful of LLMs.

Strengthen consumer protection such that baseless claims of AI capabilities in advertising or product labeling are legally dangerous to make.

Fine companies for every verifiably inaccurate result given to a customer or end user by an LLM.

[–] drmoose@lemmy.world 10 points 2 days ago

I'm generally pro-AI but agree with the argument that having big tech hoard this technology is the real problem.

The solution is easy and right there in front of everyone's eyes. Force open source on everything. All datasets, models, model weights and so on have to be fully transparent. Maybe even hardware firmware should be open source.

This will literally solve every single problem people have other than energy use which is a fake problem to begin with.

[–] Tehdastehdas@lemmy.world 8 points 2 days ago* (last edited 2 days ago)

We're making the same mistake with AI as we did with cars: not planning for humanity's future.

Cars were designed to atrophy muscles, and polluted urban planning and the air.
AI is being designed to atrophy brains, and pollutes the air, the internet, public discourse, and more to come.

We should change course towards AI that makes people smarter, not dumber: AI-aided collaborative thinking.
https://www.quora.com/Why-is-it-better-to-work-on-intelligence-augmentation-rather-than-artificial-intelligence/answer/Harri-K-Hiltunen

[–] november@lemmy.vg 90 points 4 days ago (24 children)

I want people to figure out how to think for themselves and create for themselves without leaning on a glorified Markov chain. That's what I want.

[–] Bwaz@lemmy.world 16 points 3 days ago

I'd like there to be a web-wide expectation by everyone that any AI-generated text, comment, story or image be clearly marked as being AI, and that people would feel incensed and angry when it wasn't labeled, rather than wondering whether there was a person with a soul producing the content, or losing faith that real info could be found online.

[–] BertramDitore@lemm.ee 70 points 4 days ago (15 children)

I want real, legally-binding regulation, that’s completely agnostic about the size of the company. OpenAI, for example, needs to be regulated with the same intensity as a much smaller company. And OpenAI should have no say in how they are regulated.

I want transparent and regular reporting on energy consumption by any AI company, including where they get their energy and how much they pay for it.

Before any model is released to the public, I want clear evidence that the LLM will tell me if it doesn’t know something, and will never hallucinate or make something up.

Every step of any deductive process needs to be citable and traceable.

[–] wetbeardhairs@lemmy.dbzer0.com 13 points 3 days ago

I want the LLMs to be able to determine their source works during the query process, so that the source copyright owners get paid some amount. That way if you generate a Miss Piggy image, it pays the Henson Workshop some fraction of a penny. Eventually it would add up.

AI overall? Generally pro. LLMs and generative AI, though, I'm "against", mostly meaning that I think it's misused.

Not sure what the answer is, tbh. Reining in corporations would be good.

I do think we as a society need to radically alter our relationship to IP law. Right now we 'enforce' IP law in a way that benefits corporations but not individuals. We should either get rid of IP law altogether (which would protect people from corporations abusing the laws) or we should enforce it more strictly, and actually hold corporations accountable for breaking it.

If we fixed that, I think gen AI would be fine. But we aren't doing that.

[–] Taleya@aussie.zone 25 points 3 days ago* (last edited 3 days ago) (1 children)

What do I really want?

Stop fucking jamming it up the arse of everything imaginable. If you asked for a genie wish, make it illegal to be anything but opt-in.

[–] daniskarma@lemmy.dbzer0.com 21 points 3 days ago

I'm not against it as a technology. I use it for my personal use, as a toy, to have some fun or to whatever.

But what I despise is the forced introduction of it into everything: AI-written articles and forced AI assistants in many unrelated apps. That's what I want to disappear, the way they force it into lots of places.

[–] mad_djinn@lemmy.world 7 points 2 days ago

force companies to pay for the data they scraped from copyrighted works. break up the largest tech conglomerates so they cannot leverage their monopolistic market positions to further their goals, which includes the investment in A.I. products.

ultimately, replace the free market (cringe) with a centralized computer system to manage resource needs of a socialist state

also force Elon Musk to receive a neuralink implant and force him to hallucinate the ghostly impressions of spongebob squarepants laughing for the rest of his life (in prison)

[–] umbraroze@slrpnk.net 31 points 3 days ago (1 children)

The technology side of generative AI is fine. It's interesting and promising technology.

The business side sucks, and the AI companies are just the latest continuation of the tech grift: trying to squeeze as much money as possible from the latest hyped tech, laws or social or environmental impact be damned.

We need legislation to catch up. We also need society to be able to catch up. We can't let the AI bros continue to foist more "helpful tools" on us, grab the money, and then just watch as it turns out to be damaging in unpredictable ways.

[–] Saleh@feddit.org 20 points 3 days ago* (last edited 3 days ago) (5 children)

First of all stop calling it AI. It is just large language models for the most part.

Second: an immediate carbon tax, in line with the current damage expectations for emissions, on the energy consumption of datacenters. That would be around $400/tCO2 iirc.

Third: Make it obligatory by law to provide disclaimers about what it is actually doing. So if someone asks "is my partner cheating on me?", the first message should be: "This tool does not understand what is real and what is false. It has no actual knowledge of anything, in particular not of your personal situation. This tool just puts words together that seem more likely to belong together. It cannot give any personal advice and cannot be used for any knowledge gain. This tool is solely to be used for entertainment purposes. If you use the answers of this tool in any dangerous way, such as for designing machinery, operating machinery, financial decisions or similar, you are liable for it yourself."

[–] boaratio@lemmy.world 20 points 3 days ago* (last edited 3 days ago)

For it to go away just like Web 3.0 and NFTs did. Stop cramming it up our asses in every website and application. Make it opt in instead of maybe if you're lucky, opt out. And also, stop burning down the planet with data center power and water usage. That's all.

Edit: Oh yeah, and get sued into oblivion for stealing every copyrighted work known to man. That too.

Edit 2: And the tech press should be ashamed of how much they've been fawning over these slop generators. They gladly parrot press releases, claim it's the next big thing, and generally just suckle at the teat of AI companies.

[–] justOnePersistentKbinPlease@fedia.io 43 points 4 days ago (16 children)

They have to pay for every piece of copyrighted material used in the model whenever the AI is queried.

They are only allowed to use data that people opt into providing.
