FaceDeer

joined 2 years ago
[–] FaceDeer@fedia.io 2 points 1 year ago (2 children)

It's complicated, but this might be considered a war crime. A key quote from the article:

A booby trap is defined as “any device designed or adapted to kill or injure, and which functions unexpectedly when a person disturbs or approaches an apparently harmless object,” according to Article 7 of a 1996 adaptation of the Convention on Certain Conventional Weapons, which Israel has adopted. The protocol prohibits booby traps “or other devices in the form of apparently harmless portable objects which are specifically designed and constructed to contain explosive material.”

The prohibition is presumably intended to make it less likely that a civilian or other uninvolved person will be injured or killed by one of these seemingly harmless objects. If you're booby-trapping military equipment or military facilities, that's not a problem, since civilians wouldn't be using those.

[–] FaceDeer@fedia.io 2 points 1 year ago

Ignoring the weird pill-related part, the rest of your comment is actually sound. There are genuine medical benefits to be had, at least for males. I don't know if there are equivalents for women, but I recall reading a study that found that regular ejaculation significantly reduces the chances of prostate cancer later in life.

Everybody should be free to feel comfortable with their own bodies, IMO. Society's concerns should only matter when it comes to interactions with others.

[–] FaceDeer@fedia.io 2 points 1 year ago

Given that this is the Internet, it's a relatively safe assumption to go "aw no, he's going to be made fun of."

I'm pleasantly surprised that this doesn't appear to be the case this time, though. Every once in a while there's a good surprise.

[–] FaceDeer@fedia.io 17 points 1 year ago

It's actually cromulent technical terminology to call those extra degrees of freedom "dimensions"; it's only in common parlance that "dimension" is restricted specifically to spatial dimensions. Having hundreds or even thousands of dimensions is not unknown in data science.
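For a concrete illustration (the 768 here is an arbitrary but typical embedding size I picked for the example, nothing from the article), a single data point in a high-dimensional space is just a long vector of numbers, each entry being one dimension:

```python
import numpy as np

# One "point" in a 768-dimensional space, like the embedding vector a
# language model might produce for a single sentence. Each entry is an
# independent degree of freedom, i.e. a "dimension" in the data-science sense.
rng = np.random.default_rng(0)
point = rng.normal(size=768)

print(point.shape)  # (768,) -- 768 dimensions, none of them spatial
```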

[–] FaceDeer@fedia.io 2 points 1 year ago

I'm Canadian. I would say that I don't think much about it in terms of current events; I haven't heard much in the news about it in recent years, and my assumption from that is that it's probably a good sign. There used to be a steady stream of bad news, and "no news" lies along the path between "bad news" and "good news."

I did see a video recently about Iraq's plans for a giant new port facility on that little tidbit of Persian Gulf shoreline it has, with a road/rail link from it up through Turkey and thence onward into Europe. It sounded like a very optimistic development if it can be seen through to fruition, opening an alternative trade corridor to the Suez Canal. Anything that diversifies a country's economy is a good thing, and anything that removes single points of failure in global shipping networks is also a good thing. I can't imagine the Houthi obstruction of the Red Sea will still be a problem by the time that route opens up, but at least it'll be an option if something like it happens again.

[–] FaceDeer@fedia.io 2 points 1 year ago

I'll gladly take the karmic hit on your behalf and wish it on Kissinger twice. Once going out, then again going back in.

[–] FaceDeer@fedia.io 9 points 1 year ago (2 children)

Albertan here. A couple of years back my brother and my dad both died of cancer (an unrelated coincidence) and I had the same experience: there was never a moment of stress about money. It also never felt like there were any untoward delays; when a situation was urgent we were able to jump straight to the surgery/MRI/whatever. There were a few times where we had to wait a few weeks for an appointment, but those were always the low-priority or followup things.

I know a lot of people think of Alberta as "North Texas" and imagine it's an American-style hellscape, but even if it might be a little below the general Canadian standard on some things, it's nowhere near that bad. It's important to be aware of the baselines that things are measured against.

[–] FaceDeer@fedia.io 7 points 1 year ago

Yet another case of the Russian military apparatus looking impressive but turning out to be made of papier-mâché and corruption when put to the actual test.

I assume that some of those bunkers had nice mixtures of explosives and incendiaries in them, and when they went off they fountained themselves all over their neighbors.

[–] FaceDeer@fedia.io 0 points 1 year ago

> Also, what do you mean by synthetic data? If it's made by AI, that's how collapse happens.

But that's exactly my point. Synthetic data is made by AI, but it doesn't cause collapse. The people who keep repeating the "AI fed on AI inevitably dies!" headline are ignorant of how this actually works, of the details that actually matter when it comes to what causes model collapse.

If people want to oppose AI and wish for its downfall, fine, that's their opinion. But they should do so based on actual real data, not an imaginary story they pass around among themselves. Model collapse isn't a real threat to the continuing development of AI. At worst, it's just another checkbox that AI trainers need to check off on their "am I ready to start this training run?" checklist, alongside "have I paid my electricity bill?"

> The problem with curated data is that you have to, well, curate it, and that's hard to do at scale.

It was, before we had AI. Turns out that that's another aspect of synthetic data creation that can be greatly assisted by automation.

For example, the Nemotron-4 AI family that NVIDIA released a few months back is specifically intended for creating synthetic data for LLM training. It consists of two LLMs, Nemotron-4 Instruct (which generates the training data) and Nemotron-4 Reward (which curates it). It's not a fully automated process yet but the requirement for human labor is drastically reduced.
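As a rough sketch of that generate-then-curate loop (the function names and the threshold here are made up for illustration, not the actual Nemotron API):

```python
def build_synthetic_dataset(prompts, generator, reward_model, threshold=0.8):
    """Generate candidate training examples and keep only the ones the
    reward model scores highly. `generator` plays the role of an instruct
    model, `reward_model` the role of a quality judge."""
    dataset = []
    for prompt in prompts:
        candidate = generator(prompt)            # synthesize an example
        score = reward_model(prompt, candidate)  # rate its quality
        if score >= threshold:                   # curate: keep only the good ones
            dataset.append((prompt, candidate))
    return dataset
```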

> the only way to guarantee training data isn't from its own model is to make it yourself

But that guarantee isn't needed. AI-generated data isn't a magical poison pill that kills anything that tries to train on it. Bad data is bad, of course, but that's true whether it's AI-generated or not. The same process of filtering good training data from bad training data can work on either.

[–] FaceDeer@fedia.io 6 points 1 year ago

It's not wrong for either to draw inspiration from the other. It's the hypocrisy that's wrong.

[–] FaceDeer@fedia.io 1 points 1 year ago

I've made similar points in the past in discussions about robot soldiers going to war. There's an upside to these things that people insist on overlooking: they follow their programming. If you program a robot soldier to never shoot at an ambulance, then it will never shoot at an ambulance, even if it's having a really bad day. Same here: if the security robot has been programmed never to leave the public sidewalk, then it'll never leave the public sidewalk.

It's always possible for these sorts of things to be programmed to do the wrong things, of course. But at least now we have the ability to audit that sort of thing.
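To make that concrete, a hard constraint like "never leave the sidewalk" can literally be a geofence check that runs before every movement command. This is a toy sketch with made-up coordinates, not any real robot's code, but the point is that the rule is explicit, readable code:

```python
ALLOWED_ZONE = [(0, 0), (0, 100), (5, 100), (5, 0)]  # the "public sidewalk"

def point_in_polygon(x, y, polygon):
    """Standard ray-casting point-in-polygon test."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def approve_move(target_x, target_y):
    # Refuse any movement command that would leave the allowed zone.
    return point_in_polygon(target_x, target_y, ALLOWED_ZONE)

print(approve_move(2, 50))   # True: inside the sidewalk polygon
print(approve_move(50, 50))  # False: off the sidewalk, command refused
```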

[–] FaceDeer@fedia.io 0 points 1 year ago

Are you suggesting that the same amount of crime is happening but they're deciding not to report it because there's a robot there? That's the measure they're touting, the reduction in crime reports.
