Evinceo
I was trying to see if Paul Graham was in the Epstein files (seems to mostly be due to Twitter spam) but then I found this email from 2016 with Scooter's powerword:
https://www.justice.gov/epstein/files/DataSet%209/EFTA00824072.pdf
The context is that AI guy Joscha Bach wants to "have a brainstorm" on "forbidden research" (you best believe IQ is in there, but also climate change prepping, which is phrased in a particularly ominous fashion) and there's a long list of people at the end. Besides slatescott it includes
Epstein himself, Paul Graham, Max Tegmark, Stephen Wolfram, Steven Pinker (ofc), and Reid Hoffman.
It's unclear if this brainstorm ever happened or if Astral Scottdex was even contacted. The next email features Epstein chastising Joscha Bach for not shutting up in a discussion with Noam Chomsky and Bach's last email is just groveling and trying to smooth over the relationship with his benefactor.
I think this is (at least a little bit) interesting because it's back in 2016, a year before 'intellectual dark web' was coined and that whole ball got rolling.
Has Scooter addressed his presence in the files the way other-scott did?
people who think ‘it kinda works’ is the standard to aim for
I swear that this is a form of AI psychosis or something because the attitude is suddenly ubiquitous among the AI obsessed.
Wasn't he on YouTube trying to convince people that Nuclear Energy is Fine Actually? Figures.
Looks like cypherpunk slop
Market can stay irrational longer than you can stay solvent/etc.
but with Haskell replaced with a much worse ML with a much less coherent type system
Urbit moment
You can't take TSMC by force. Any fighting there would trash the fabs, and anyway you need imported equipment to keep it running. So if China did invade Taiwan and wreck it, there'd be little point in trying to take it back.
I genuinely wonder how I'm gonna be a programmer long term because the industry has been so thoroughly infested with this nonsense.
"They might as well be Latvians" is straight out of Arrested Development.
Eliezer is a major contributor to the rationalist attitude of venerating super-forecasters and super-predictors, and of promoting the idea that rational, smart, well-informed people should be able to put together super accurate predictions!
This is a necessary component of his imagined AGI monster. Good thing it's bullshit.
When someone says they can do this, I try to say 'ok, well can you do it right now to show me?' and so far the answer has always been deflection.