Newtra

joined 2 years ago
[–] Newtra@pawb.social 16 points 2 years ago (2 children)

Replace mindless entertainment with enriching entertainment. E.g. YouTube video essays, lectures, history podcasts, DuoLingo, Anki, Brilliant, artsy/niche movies/games, etc. Always be learning something, even if you'll never need it. Try to limit yourself to memorable, unique, or mind-opening content.

It's no fix, but it trains your brain to be able to wait just a little longer for its dopamine. Also you get to feel like you're sort-of achieving something, not just losing time every time your impulsive brain takes over.

I'm intermediate level in 3 languages, know a shit ton of science, and have played thousands of unique indie games. Is any of this useful? lol no. But do I feel accomplished and in control of some big parts of my life? Hell yes.

[–] Newtra@pawb.social 13 points 2 years ago* (last edited 2 years ago)

I had to unsub from Luke Stephens after hearing him throw that insult about hbomberguy needing more testosterone. Most toxic thing I've heard all week...

I love the playlist of worthy queer creators he gave. Not only was it an almost 4-hour video, but now I have to watch another 12ish hours of random video essays to decide who else I want to follow!

[–] Newtra@pawb.social 2 points 2 years ago (1 children)

I just did a big cleanup! I think it's down to about 40 on my work computer and 70ish on my personal computer.

Don't ask about the phone I'm typing this on... It's a lost cause.

[–] Newtra@pawb.social 1 points 2 years ago

Yeah, I was over-enthusiastic based on their cherry-picked examples. SeamlessExpressive still leaves a lot to be desired.

It has a limited range of emotions and can't change emotion in the middle of the clip. It can't produce the pitch shifts of someone talking excitedly, making the output sound monotonous. Background noise in the input causes a raspy, distorted output voice. Sighs, inter-sentence breaths, etc. aren't reproduced. Sometimes the sentence pacing is just completely unnatural, with missing pauses or pauses in bad places (e.g. before the sentence-final verb in German).

IMO their manual dataset creation is holding them back. If I were in this field, I'd try to follow the LLM route: start with a next-token predictor trained indiscriminately on large-scale speech+text data (e.g. TV shows, movies, news radio, all with subtitles, even if the subs need to be AI-generated), fine-tune it for specific tasks (mainly learning to predict and generate based on "style tokens": speaker, emotion, accent, pacing), then generate a massive "textbook" synthetic dataset. The translation aspect could be almost completely outsourced to LLMs or multilingual subtitles.
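To make the style-token idea concrete, here's a toy sketch of how the training sequences could be laid out (all token names and IDs are invented for illustration; this is the generic decoder-only recipe, not anything Meta actually does):

```python
# Toy layout for conditioning a next-token predictor on "style tokens".
# Vocabulary, IDs, and token names are invented for this sketch.

STYLE_VOCAB = {
    "<speaker:1234>": 50001,    # speaker identity
    "<emotion:excited>": 50002,
    "<accent:de>": 50003,
    "<pace:fast>": 50004,
}
AUDIO_START = 50000             # separator between text and audio tokens

def build_training_sequence(styles, text_ids, audio_codec_ids):
    """One training example: [style prefix][subtitle text][audio tokens].

    A decoder-only model trained with plain next-token prediction on
    sequences like this learns to generate audio conditioned on both
    the text and the style prefix.
    """
    return (
        [STYLE_VOCAB[s] for s in styles]
        + text_ids              # tokenized subtitle/transcript
        + [AUDIO_START]
        + audio_codec_ids       # discretized audio (EnCodec-style units)
    )

# At inference time, swap the style prefix to control the output voice:
# feed [style tokens] + text + [AUDIO_START] and let the model continue
# generating audio tokens from there.
```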

[–] Newtra@pawb.social 0 points 2 years ago (1 children)

This is so exciting!

I can't wait to see how well the Expressive model does on anime and foreign films. I wouldn't be surprised if this was the end of terrible dubs.

This is gonna be great for language learning as well. Finally being able to pick any media and watch it in any language. It might even be possible to rig it up to an LLM to tune the vocab to your exact level...
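The vocab tuning could be as simple as piping each subtitle line through a chat model before display. A minimal sketch, assuming an OpenAI-style chat API (the model name, prompt, and CEFR-level framing are my own placeholders):

```python
# Hypothetical: rewrite a subtitle line down to a learner's vocabulary level.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def simplify_subtitle(line: str, level: str = "B1") -> str:
    """Ask the model to keep the meaning but cap the vocabulary level."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat model would do
        messages=[
            {"role": "system",
             "content": f"Rewrite the subtitle line using only vocabulary "
                        f"suitable for a CEFR {level} learner. Keep the "
                        f"meaning and keep it short."},
            {"role": "user", "content": line},
        ],
    )
    return resp.choices[0].message.content
```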

[–] Newtra@pawb.social 3 points 2 years ago (1 children)

Oh awesome, I hadn't started looking out for Brandon. Thanks for the recommendation.

[–] Newtra@pawb.social 3 points 2 years ago (3 children)

David Cronenberg, especially his stuff from the '80s and '90s. He's made so many movies that just got stuck in my brain. Everything's weird, but memorable-weird. eXistenZ was my favorite movie for years.

[–] Newtra@pawb.social 44 points 2 years ago (11 children)

In two languages I'm learning, German and Chinese, I've found it to suffer from "translationese". It's grammatically correct, but the sentence structure and word choice feel like the answer was first written in English then translated.

No single sentence is wrong, but overall it sounds unnatural and has none of the "flavor" of the language. That also makes it bad for learning: it avoids a lot of sentence patterns you'll see and hear in day-to-day life.

[–] Newtra@pawb.social 2 points 2 years ago* (last edited 2 years ago) (1 children)

Thanks! That's a well-written paper. I don't know why I keep falling for science journalism's simplified explanations.

I've only skimmed it so far, but to answer my question: they find light dark matter to be the simplest case (I didn't see a specific range, but they used 250 keV as an example), though they also considered a scenario where "dark-zillas" (mass >> 10^10 GeV) are plausible. At least that still narrows the search space a bit 😅

[–] Newtra@pawb.social 2 points 2 years ago* (last edited 2 years ago) (3 children)

Sadly archive.li seems to be stuck in a broken CAPTCHA loop, so I can't see the full article. However, I'm struggling to imagine a fundamental, universe-spanning interaction that triggers weeks after the big bang, given that within 20 minutes the universe had already expanded and cooled enough to stop fusing nuclei. If there is evidence of a Dark Matter big bang weeks after the ordinary-matter one, surely that must have some extreme implications for the possible mass range of DM particles?

One thing nobody seems to be talking about: just like with String Theory, the more new phenomena the Dark Matter model needs in order to work, the further we stray from the edge of Occam's Razor. All the research into detecting hypothetical particles has been fun to follow, but I can't help feeling we're just a few equations away from discovering that the universe is actually pretty MONDane.

[–] Newtra@pawb.social 0 points 2 years ago

With teacher hours, isn't that still often over 40 hours a week?

[–] Newtra@pawb.social 1 points 2 years ago

Keto-ade can be great for taking the edge off keto flu. It's basically a home-made concentrated sports drink: artificial sweetener, potassium and magnesium salts, and a lot of water. All the fluid and electrolytes help your kidneys filter out the by-products of fat metabolism that would otherwise linger in your bloodstream.
