FaceDeer

joined 2 years ago
[–] FaceDeer@fedia.io 1 points 7 months ago

Well, yes, why would you believe something without seeing it? But given how litigious the publishing industry is about this kind of thing, I don't see it as likely that they wouldn't fight.

[–] FaceDeer@fedia.io 4 points 7 months ago

They'll compare the amount the publishers are demanding against how much it would cost to lawyer up and prevent that payment and any future ones. Meta's heavyweight enough that they can use "lobbying their way out of the law" (i.e. changing the law so that they're not violating it at all) as a strategy.

If they do simply pay the publishers off, oh well; at least that's just the status quo. But I don't see a reason to assume that's the way this is going to go. Other countries have already carved out explicit exceptions to copyright for AI training, and Meta would be in favor of that kind of thing.

[–] FaceDeer@fedia.io 7 points 7 months ago (3 children)

You think Meta will just roll over and hand over whatever penalties the publishers demand of them?

Meta isn't going to be defending us. It's going to be defending itself. Because it is now one of us.

[–] FaceDeer@fedia.io 44 points 7 months ago (12 children)

I think this is still going to be a net benefit to us, though. Meta may not have contributed much bandwidth, which amounts to leeching in the short term, but in the long term they're now forced to contribute something much more important: lawyer power. Meta is going to have to fight to defend piracy.

[–] FaceDeer@fedia.io 2 points 7 months ago (7 children)

Amazon is not a startup.

[–] FaceDeer@fedia.io 35 points 7 months ago

Oh, wow. Better than I imagined.

[–] FaceDeer@fedia.io 6 points 7 months ago (1 children)

Trump imagines himself a real estate magnate who builds luxury vacation properties, so this might genuinely be one of his own ideas this time. It's terrible enough that I'd believe it.

[–] FaceDeer@fedia.io 1 points 7 months ago

AI would be able to do a good first pass on it. Except that an AI that was able to reliably recognize child porn would be a useful tool for creating child porn, so maybe don't advertise that you've got one on the job.

[–] FaceDeer@fedia.io 15 points 7 months ago

I expect that a lot of this isn't "bowing down to Trump"; it's just people going "oh good, I can stop wasting money on initiatives I didn't really believe in without facing backlash over it."

[–] FaceDeer@fedia.io 1 points 7 months ago

generative ai though? absolutely not, we need to burn it down.

If it's really not useful then there's no need to burn it down. It's expensive to run so anyone using it must just be burning money themselves.

That's not true, though. I know it's not true because I'm making extensive use of AI myself, and it is indeed useful. I even run local models for some of the tasks I use AI for. I can assure you it's not going away, because I have all the tools I need to keep on using it indefinitely even if, for some reason, the companies producing this stuff all shut down or stopped right this moment.

It may not be useful to you, and that's fine - use it or don't, it's up to you. But it's not going away because other people do want to use it for various things.

[–] FaceDeer@fedia.io 2 points 7 months ago (2 children)

Abolition is simply not going to happen, though; it's not a realistic goal. AI has proven to be useful, and enough of the technology has been released as open source that it's going to continue to be developed even if the big obvious targets like OpenAI stop.

[–] FaceDeer@fedia.io -1 points 7 months ago (4 children)

Okay, so you don't collaborate with them, and they carry on developing AI their own way without your input. Probably not going to lead to the outcome you hope for.

This is a problem I see with a lot of the stridently anti-AI commenters I've encountered both here and on Reddit: all they want is for AI to not exist, and they refuse to engage in any way beyond that. But AI does exist and it's not going to "go away", so by approaching it that way they give up any opportunity to influence it.
