this post was submitted on 26 Aug 2025
1203 points (96.9% liked)

interestingasfuck

8023 readers

founded 2 years ago
 
(page 2) 50 comments
[–] Matriks404@lemmy.world 4 points 5 days ago

People will still create deepfakes (you know what I mean, lol). Although they'd need to be stupid to share them online.

[–] Artisian@lemmy.world 4 points 5 days ago (5 children)

I for one would like much less copyright law; it really hasn't been good to me.

[–] corsicanguppy@lemmy.ca 3 points 5 days ago* (last edited 5 days ago)

A tool used against you isn't always a bad tool; sometimes it needs to be used better.

[–] pyre@lemmy.world 2 points 4 days ago (1 children)

I thought that was already covered by copyright law? Isn't that why you can't photograph people without model release forms?

[–] FatCrab@slrpnk.net 2 points 4 days ago

No, that's due to likeness rights and privacy concerns. Copyright protects creative expression, and your face and body are not themselves creative expressions; they just are. This is why you also don't get copyright protection over purely statistical data.

[–] daniskarma@lemmy.dbzer0.com 2 points 4 days ago

I don't get it. Deepfakes were already illegal, as an offense against honor and fabricated defamation, and training would still fall under "fair use" like any other copyrighted media. What's changed?

[–] ConstantPain@lemmy.world 2 points 4 days ago (1 children)

Until techs put waivers in the EULAs...

[–] TeddE@lemmy.world 1 points 4 days ago

Well, lawyers, but yes this would need more teeth to be effective. They need to introduce some friction to slow business down on that front.

I admittedly don't understand how it has somehow become easy for people, especially young people, to utilize this stuff. You can do image generation with a sufficiently strong GPU, but training requires real power and VRAM.

As far as I can tell it's also largely limited to Nvidia (although apparently the image stuff works on AMD under Linux?), so it's expensive. You have to do so many things to set up even simple image generation, and I imagine training on a particular person (or anything else) is harder still to set up.
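To put rough numbers on the inference-versus-training gap described above, here is a hypothetical back-of-envelope sketch. The model size and per-parameter byte costs are illustrative assumptions, not measurements of any particular model:

```python
# Rough VRAM estimate: why inference can fit on a consumer GPU while
# full training often does not. All numbers are illustrative assumptions.
def vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Convert a parameter count and per-parameter byte cost to GiB."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# Inference: fp16 weights only (2 bytes per parameter).
inference = vram_gb(2.6, 2)   # ~4.8 GiB for a hypothetical 2.6B-param model
# Full fine-tuning with Adam: fp32 weights + gradients + two optimizer
# states (4 bytes each = 16 bytes per parameter), before activations.
training = vram_gb(2.6, 16)   # ~38.7 GiB
print(f"inference ~{inference:.1f} GiB, full training ~{training:.1f} GiB")
```

This ignores activations and batch size, which only widen the gap; techniques like LoRA and 8-bit optimizers shrink the training figure, which is part of why fine-tuning on consumer hardware has become more feasible than the raw numbers suggest.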

Otherwise deepfakes are just doing what Photoshop always did? Arguably Photoshop was a cheaper and easier method of creating them.

I have this feeling that generative AI is being used to normalize the idea of weaponizing it. "It took people's jobs! It made people naked and created libelous things!" Or as a means to crack down on hardware used for... Video games?

I could just be insane, but it always seems like when something seems bad, something worse is behind it.

[–] houstoneulers@lemmy.world 2 points 5 days ago

Feels like this covers more than just deepfakes too. Would it apply to assholes recording people in public without their consent?

[–] chuso@fedia.io 2 points 5 days ago

Didn't that already exist as the right to one's own image?

At least here in Spain, such a right is recognized in section 18.1 of the Constitution of 1978 and was developed by a 1982 law banning the capture, reproduction, use, or publication, by photograph, film, or any other means, of a person's voice or image.

I would expect similar laws to exist in other countries. Having control over your own image, and not allowing anyone to take your voice and likeness and make their own public use of them, seems like too basic a right not to have been regulated already before GenAI appeared.

Actually, limiting it to GenAI and framing it as intellectual property or copyright sounds quite narrow. Do you mean that as long as I don't use it for GenAI, or use it for purposes not covered by copyright, I can still publicly use your image in Denmark? I wouldn't expect so. The right to one's own image is rooted in human dignity, privacy, and autonomy, which go beyond what a copyright law can protect.

[–] ChonkyOwlbear@lemmy.world 2 points 5 days ago

Pretty sure we don't even own our DNA.
