forkDestroyer

joined 10 months ago
[–] forkDestroyer 2 points 2 days ago

"if you do it correctly" holds a lot of weight to this argument. I'd be worried of anyone who wants to start from scratch, instead of building on the current foundation.

You make good points, though. I'm just more skeptical than you.

[–] forkDestroyer 10 points 5 days ago

I regret to inform you that salaries in tech are not as glorious as I thought they'd be. I'd be surprised if I could afford a farm any time soon.

Would be nice to be able to afford a house, though.

[–] forkDestroyer 3 points 1 week ago (2 children)

instead voting for independent anarchist parties which try to get rid of as many laws and government institutions [as possible] and also nationalize anything the people will be better served by, under collective ownership.

Once the laws are gone and things are deregulated, the corpos will likely take over. Nationalizing sounds good, but it probably won't end well without regulations, imo.

[–] forkDestroyer 22 points 1 week ago (5 children)

I'm being a bit extra but...

Your statement:

The article headline is wildly misleading, bordering on being just a straight up lie.

The article headline:

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It

The general story in reference to the headline:

  • He found CSAM in a known AI dataset, which he had stored in his account.
  • Google banned him for having this data in his account.
  • The article mentions that he tripped the automated monitoring tools.

The article headline is accurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "csam").

The article headline is inaccurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "reporting csam").

I read it as the former, because the act of reporting isn't mentioned in the headline at all.

^___^

[–] forkDestroyer 1 point 1 week ago

Google isn't the only service checking for CSAM. Microsoft (and likely other file-hosting services) also has methods to do this. That doesn't mean they host CSAM themselves in order to detect it. I believe their checks compare hash values to determine whether a picture has already been flagged as being in that category.

PhotoDNA has existed since 2009 and is used for detecting all sorts of bad-category images; the link below gives good insight on the topic:

https://technologycoalition.org/news/the-tech-coalition-empowers-industry-to-combat-online-child-sexual-abuse-with-expanded-photodna-licensing/
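
For a rough sense of how hash matching works, here's a minimal sketch in Python, assuming a hypothetical set of known-flagged hashes. Note that PhotoDNA actually uses a proprietary perceptual hash that tolerates resizing and re-encoding; the cryptographic hash below is just an exact-match stand-in for illustration.

```python
# A minimal sketch of hash-based image screening, assuming a hypothetical
# set of known-flagged hashes. Real systems like PhotoDNA use a robust
# perceptual hash that survives resizing/re-encoding; SHA-256 here is
# only an exact-match stand-in.
import hashlib
from pathlib import Path

# Hypothetical hex digests of images already flagged by reviewers.
KNOWN_FLAGGED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: Path) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: Path) -> bool:
    """True if the file's hash matches a known-flagged image."""
    return file_sha256(path) in KNOWN_FLAGGED_HASHES
```

The key design point is that the service only ever stores hashes, never the images themselves, which is why detection doesn't require hosting the material.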

[–] forkDestroyer 1 point 1 week ago (1 child)

I lived a block away from a Catholic church for several years. The bells never rang for half an hour straight, though I do remember them ringing a little longer on some holidays.

I guess other Catholic churches handle it differently?

[–] forkDestroyer 7 points 1 week ago (3 children)

Idk, bells sounding the hour are basically like any clock to me (I like clocks, so I'm biased). Spoken prayer over a loudspeaker is much more invasive imo.

[–] forkDestroyer 4 points 1 week ago (1 child)

There's no point in taking moderation seriously. It's a volunteer position and the tools to handle slop aren't there.

I hear they're bringing back digg.

I also hear it's a former Reddit C-suite exec who's bringing it back, so it'll probably be more of the same.

[–] forkDestroyer 10 points 1 month ago (1 children)

And his lyrics for Osmosis Jones indicate he might be singing for another type of fan.

[–] forkDestroyer 0 points 1 month ago

I dunno, but their Discord has been locked for some time, and I used to like it for the memes.

[–] forkDestroyer 2 points 1 month ago

Surprised this system wasn't already in place, honestly.

[–] forkDestroyer 7 points 1 month ago

The last thing I want is someone remoting into it to set up a bunch of Home Alone traps while I'm at work.
