this post was submitted on 18 Dec 2023
236 points (97.6% liked)

Technology

you are viewing a single comment's thread
[–] thanks_shakey_snake@lemmy.ca 66 points 2 years ago (3 children)

This is going to get soooo much more treacherous as this becomes ubiquitous and harder to detect. Apply the same pattern, but instead of wood carvings, it's an election, or a sexual misconduct trial, or a war.

Our ability to make sense of things that we don't witness personally is already in bad shape, and it's about to get significantly worse. We aren't even sure how bad it is right now.

[–] KISSmyOS@lemmy.world 27 points 2 years ago (3 children)

Imagine an image like Tank Man, or the running Vietnamese girl with napalm burns on her skin, but AI-generated at the right moment.
It could change the course of nations.

[–] Damage@feddit.it 23 points 2 years ago (2 children)

As lies always could.

There were no WMDs in Iraq.

[–] cygnus@lemmy.ca 13 points 2 years ago

Sure, but now you can make a video of Saddam giving a tour of a nuclear enrichment facility.

[–] usualsuspect191@lemmy.ca 3 points 2 years ago

Sure, but now you'll be able to sway all those people who were on the fence about believing the lie until they see the "evidence".

[–] duncesplayed@lemmy.one 7 points 2 years ago (1 children)

It's already happening to some extent (I think still a small extent). I'm reminded of this Ryan Long video making fun of people who follow wars on Twitter. I can say the people he's making fun of are definitely real: I've met some of them. Their idea of figuring out a war, or which side to support, basically comes down to finding pictures of dead babies.

At 1:02 he specifically mentions people using AI for these images, which has definitely been cropping up here and there in Twitter discussions around Israel-Palestine.

[–] thanks_shakey_snake@lemmy.ca 5 points 2 years ago

And it almost certainly will. Perhaps it already has.

[–] otter@lemmy.ca 12 points 2 years ago (2 children)

And the flip side is also a problem:

Now legitimate evidence can be dismissed as "AI-generated".

[–] thanks_shakey_snake@lemmy.ca 5 points 2 years ago (1 children)

Exactly-- They're two sides of the same coin. Being convinced by something that isn't real is one type of error, but refusing to be convinced by something that is real is just as much of an error.

Some people are going to fall for just about everything. Others are going to be so apprehensive about falling for something that they never believe anything. I'm genuinely not sure which is worse.

[–] Anticorp@lemmy.ml 3 points 2 years ago (1 children)

We already saw that with nothing more than two words. Trump started the "fake news" craze, and now 33% of Americans dismiss anything that contradicts their views as fake news, without giving it any thought or evaluation. If a catchphrase is that powerful, imagine how much more powerful video and photography will be. Even in 2019 there was a deepfake floating around of Biden with a Gene Simmons tongue, licking his lips, and I personally know several people who thought it was real.

[–] thanks_shakey_snake@lemmy.ca 2 points 2 years ago

Great example. Yeah, I've had to educate family members about deepfakes because they didn't even know that they were possible. This was on the back of some statement like "the only way to know for sure is to see video." Uh... Sorry fam, I have some bad news...

[–] DrPop@lemmy.ml 4 points 2 years ago

Analog is the way to go now

[–] Bitrot@lemmy.sdf.org 1 points 2 years ago* (last edited 2 years ago)

It's already happening. Adobe is selling them, but even if they weren't, it's not hard to do.

I think the worst of it is going to be places like Facebook, where people already fall for terrible and obvious Photoshopped images. They won't notice if there are mistakes, and AI keeps getting better, so there are fewer mistakes to notice (DALL-E used to be awful at hands; it's not so bad now). But even smart folks will fall for these.