tetris11

joined 2 years ago
[–] tetris11@lemmy.ml 1 points 11 months ago

That's why John Kent had to die in that Superman film, so that his son would know how to tie

[–] tetris11@lemmy.ml 2 points 11 months ago (2 children)

One of these is real: "Jungle Road", "Arctic Blast", "Alpine Machete", "Lost at Sea", "Mayday Mayday", "Why is nobody reading these and calling for help"

[–] tetris11@lemmy.ml 1 points 11 months ago

still smell better than anything men's deodorants put out.

Want to have a musk like a spiced up skunk in a damp forest? Then right this way, gents!

[–] tetris11@lemmy.ml 2 points 11 months ago

Just your daily reminder that "Goodenough" is a real last name

[–] tetris11@lemmy.ml 2 points 11 months ago (1 children)

Fair enough, I was basing my opinion on what some of the FAANG companies were doing to get rid of veteran staff by giving them the WFH ultimatum.

[–] tetris11@lemmy.ml 0 points 11 months ago* (last edited 11 months ago)

(Ah, the joyful tantrum). Educate yourself on how a simple JPEG works and exactly how few features are needed to produce an image that is almost indistinguishable from the source.
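The JPEG point can be sketched in a few lines: JPEG transforms each 8x8 pixel block with a DCT and throws away most high-frequency coefficients, yet the reconstruction stays close to the source. A minimal toy demonstration (synthetic smooth block, not a real JPEG codec, and the 4x4 cutoff is an arbitrary choice for illustration):

```python
import numpy as np
from scipy.fft import dctn, idctn

# A smooth 8x8 "image" block (JPEG operates on 8x8 blocks)
x = np.linspace(0, 1, 8)
block = np.outer(x, x) * 255

coeffs = dctn(block, norm="ortho")

# Keep only a 4x4 corner of low-frequency coefficients (16 of 64 features)
mask = np.zeros_like(coeffs)
mask[:4, :4] = 1
approx = idctn(coeffs * mask, norm="ortho")

# Mean absolute pixel error stays small despite discarding 75% of the features
err = np.abs(approx - block).mean()
print(round(err, 3))
```

Keeping a quarter of the coefficients still reproduces the block to within a few grey levels, because for natural (smooth) content almost all of the energy sits in the low frequencies.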

[–] tetris11@lemmy.ml 5 points 11 months ago (3 children)

after employees with decades of experience left the company for remote work jobs.

Corporate still won. Those were the most expensive employees, and companies are proving time and time again that they just want output and not quality.

[–] tetris11@lemmy.ml 9 points 11 months ago (2 children)

The median would be interesting. Just in case there's one guy out there stocking a 12 footer.

[–] tetris11@lemmy.ml 2 points 11 months ago (2 children)

...first time I've seen it, but I'm likely subbed to different communities

[–] tetris11@lemmy.ml 0 points 11 months ago* (last edited 11 months ago) (2 children)

(nice ad hominem) Christ. When you reduce a high-dimensional object into an embedded space, yes, you keep only the first N features, but those N features are the most variable ones, and the loadings they contain can be used to map back to a (very good) approximation of the source images. It's akin to reversing a very lossy compression into something that (very strongly) resembles the source image (otherwise feature extraction wouldn't be useful), and it's entirely doable.
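The mapping-back claim is just PCA reconstruction. A minimal sketch with synthetic low-rank "image" data (not any particular model's pipeline; the dataset shape and N=5 are assumptions for illustration): project onto the top N principal components, then multiply the scores back through the component loadings to land in pixel space.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fake "image" dataset: 200 samples, 64 pixels, intrinsically low-rank
latent = rng.normal(size=(200, 5))
basis = rng.normal(size=(5, 64))
images = latent @ basis + 0.01 * rng.normal(size=(200, 64))

# PCA via SVD on the centered data
mean = images.mean(axis=0)
centered = images - mean
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

N = 5                                  # keep only the N most variable components
scores = centered @ Vt[:N].T           # the embedding ("loadings" per sample)
recon = scores @ Vt[:N] + mean         # map back to pixel space

# Relative reconstruction error is tiny when the data is genuinely low-rank
rel_err = np.linalg.norm(recon - images) / np.linalg.norm(images)
print(round(rel_err, 4))
```

With data that really lives near an N-dimensional subspace, the N retained components recover almost everything; the "lossy compression" only discards what the embedding deemed least variable.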

[–] tetris11@lemmy.ml 1 points 11 months ago (4 children)

(thanks for the insult, stay classy) so the network training stage was pulled out of thin air then? Huh, I didn't know these models could bootstrap themselves out of nothing.
I guess inverting models to do a tracing attack is impossible. Huh.
