Is the drop all due to AI?
LocalLLaMA
Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.
Get support from the community! Ask questions, share prompts, discuss benchmarks, and get hyped about the latest and greatest model releases! Enjoy talking about our awesome hobby.
As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.
Rules:
Rule 1 - No harassment or personal character attacks of community members. I.e., no name-calling, no generalizing entire groups of people that make up our community, no baseless personal insults.
Rule 2 - No comparing artificial intelligence/machine learning models to cryptocurrency. I.e., no comparing the usefulness of models to that of NFTs, no claiming the resource usage required to train a model is anything close to that of maintaining a blockchain or mining crypto, no implying it's just a fad/bubble that will leave people with nothing of value when it bursts.
Rule 3 - No comparing artificial intelligence/machine learning to simple text prediction algorithms. I.e., no statements such as "LLMs are basically just simple text prediction like what your phone keyboard autocorrect uses, and they're still using the same algorithms since <over 10 years ago>."
Rule 4 - No implying that models are devoid of purpose or potential for enriching people's lives.
The fast drop yes, but really it’s been in decline for around a decade before that.
Interesting! When I first read your comment, I looked at the chart and thought “it looks to me like the drop starts at the end of 2022. Isn’t that before LLMs started being used broadly?”
Nope. Looks like ChatGPT was released in November 2022. It doesn't feel like it's been around that long, but I guess it has.
They also announced their AI stuff in July 2023 https://stackoverflow.blog/2023/07/27/announcing-overflowai/
The drop starts in 2013, but people were certainly ready to all bail at once by the time LLMs came around.
That sucks, is there an alternative people are using? Seems like it would still be a useful knowledge base to have.
The common alternative is to just ask ChatGPT your software questions, get false information from the AI, and then try and push that horrible code to production anyway if my past two jobs are any indicator.
Stack Overflow is still useful to find old answers, but fucking sucks to ask new questions on. If you aren't getting an AI answer to your question, then you're getting your question deleted for some made up reason.
The real answer that everyone hates is: If you have a question about something, read the documentation and experiment with it to figure that something out. If the documentation seems wrong, submit an issue report to the devs (usually on GitHub) and see what they say.
The secondary answer is that almost everything FOSS has a Slack channel or even sometimes Discord channels. Go to the channels and ask people who use/make whatever tool you need help with.
The common alternative is to just ask ChatGPT your software questions, get false information from the AI, and then try and push that horrible code to production anyway if my past two jobs are any indicator.
If you have developers pushing bad and broken code to production your problem isn't AI.
That and they cover up half the fucking page when you try to view it. Google login, giant cookie popup, etc.
I believe it’s more of a generational shift.
The age groups who used to rely on SO are now skilled enough not to rely on it as much (or they more often have the types of questions SO can’t answer).
Younger age groups probably prefer other means of learning (like ChatGPT, Discord and YouTube videos).
Yeah, I'm working in some niche and there is a Stack Overflow that they refer newbies to because "no developer support on their Discord". But if you ask a question there, no one will ever answer; otoh, if you know where and how to ask, you'll actually get help on Discord. I feel like SO is pretty much dead for anything where change happens quickly.
There’s also only so many ways to ask how to sort a list or whatever and SO removes duplicate questions. So at some point the number of unique questions asked begins to plateau. I think that explains the slow drop before LLMs came on the scene.
I assumed it was because stackoverflow already had all the answers I needed except for the things too obscure to search for that result in my crying and trying to piece it together from scraps of info on 50 different tabs.
Yes. But not just in the "obvious" way.
I first started to contribute back when LLMs first appeared. Then SO became LLM training grounds, which made me stop contributing instantly.
I guess a not-insignificant number of people stopped answering questions, which means fewer search results, which ends in less traffic.
I'm sure the fall wouldn't be as big as it is if they didn't allow LLMs to train on their data.
How do you disallow LLMs to train on their data while still allowing humans to train on their data?
If they can charge for it, they can block it. https://www.wired.com/story/stack-overflow-will-charge-ai-giants-for-training-data/
You can also rate-limit, and blacklist known scraper IPs.
And if that doesn't work, you make signing in mandatory, which makes rate-limiting way easier.
The rate of human data consumption is much lower than an LLM's. The humans won't even notice that they have a rate limit. At most they would notice the need to create a Stack Overflow account.
It probably started off when reddit/discord became a friendly place for troubleshooting (code among other things), then the AI dropped it off the cliff.
Rammed it off the cliff
Nah, that drop comes WELL before AI answers. Look at the dates. They've had a culture of people overly aggressively closing new questions for pointless/irrelevant reasons as well as being generally nasty to new users for ages. Sure, it started dropping way faster post 2020 because of AI, but the problem was already there.
It’s not LLMs — see the peak at 2013. They aggressively started closing any “duplicate” questions around then. The whole premise was that experts were supposed to answer questions for clout that would bolster their resume, but after getting silenced a few times, why would they come back? And anyone with the temerity to ask a question that was asked (with or without a good answer, ten years ago) would also never come back after getting shut down.
They couldn’t decide if they were a forum or Wikipedia and became neither.
Note that the decline began well before "AI" stuff became a thing. Stack Overflow has had a major culture problem as well as not treating their users with respect for ages.
For the part about respecting users, they have a history of ignoring Meta (their site specifically for talking about Stack Overflow site itself) while acting like they use it.
In the future we will be dependent on LLMs for everything because the only people with enough money to maintain libraries of data which are untainted by LLMs will be the people who own the LLMs.
Step 1: Steal all of the data (including copyrighted stuff)
Step 2: Poison the well
Step 3: Profit
Posted once in stack overflow in college and got absolutely destroyed. I was not let down lmao
Not surprised, even without the LLM boom, Stack Overflow was doomed for the same reason reddit is doomed: power-tripping bastards, gatekeeping everything which is not part of their narrow-minded world.
I'm not surprised. StackOverflow has moderated itself out of relevance. Ask a question and get flamed. DDG a question plus "stackoverflow" and get something that may well have been correct and useful in 2012 but tech moves on and it's now archaic trivia, somewhat akin to facts about punched cards. "Help me StackOverflow, you're my only hope" hasn't been true for quite some time now.
They moderated themselves out of relevance because when you ask new questions that aren't duplicates they still close them as duplicates.
I keep getting the Cloudflare checks