Every time I see a headline like this I'm reminded of the time I heard someone describe the modern state of AI research as equivalent to the practice of alchemy.
Long before anyone knew about atoms, molecules, atomic weights, or electron bonds, there were dudes who would just mix random chemicals together in an attempt to turn lead into gold, or create the elixir of life or whatever. Their methods were haphazard, their objectives impossible, and most of them probably poisoned themselves in the process, but those early stumbling steps eventually gave rise to the modern science of chemistry and all that came with it.
AI researchers are modern alchemists. They have no idea how anything really works and their experiments result in disaster as often as not. There's great potential but no clear path to it. We can only hope that we'll make it out of the alchemy phase before society succumbs to the digital equivalent of mercury poisoning because it's just so fun to play with.
Not sure if you're referencing the same thing, but this actually came from a presentation at NeurIPS 2017 (the largest and most prestigious machine learning/AI conference) for the "Test of Time Award." The presentation is available here for anyone interested. It's a good watch. The presenter/awardee, Ali Rahimi, talks about how, over time, rigor and fundamental knowledge in the field of machine learning have taken a backseat to empirical work that we continue to build upon yet don't fully understand.
Some of that sentiment is definitely still true today, and unfortunately, understanding the fundamentals is only going to get harder as empirical methods get more complex. It's much easier to iterate on empirical results by just throwing more compute at a problem than it is to analyze something mathematically.