I’ve read, of course, about the decline of StackOverflow now that people pose their coding-related questions to LLMs (or let the LLM write the code itself). But out of curiosity I logged into my account for the first time in a while to see for myself, and boy is it true. There are questions there that were asked several hours ago and have had no replies, no votes, and just a couple of dozen views. And those are not obviously dumb questions. Before November 2022, posts like that would have received multiple upvotes, downvotes, edits, and replies within minutes.
I think they were cooked even before 2022 due to a culture of closing relevant questions because they were too close to a question from 15 years ago that had since been overtaken by changing technology. I stopped using it around that time for that reason.
Before that, it was really useful for finding specific answers to specific questions. But as the answers aged and were not allowed to be updated, the information became less and less useful. In the year or so before AI really took off, my first resource had gone back to being the tech companies’ own websites rather than SO.
The problem is that the AIs themselves trained on Stack Overflow; if Stack Overflow dries up, the AI responses will start to dry up too.
Yeah, I stopped using it because of the craptacular asshole power hungry moderators, not because of AI.
I deleted my accounts last week, and the main thing I felt was not nostalgia but relief at not having to deal with their stupid moderation anymore. Good riddance.
I don’t think so. The AIs have gotten so good at ingesting first-party documentation, code, and comments, along with GitHub, MCPs, purpose-made Markdown files, etc., that they have already far exceeded what Stack ever was even at its best. And an AI will never half-read your question and close it because something sort of similar was asked fifteen years ago.
Maybe in the GPT-2 era that was true, but the AIs have long since moved beyond that. And these days they are often trained on synthetic data. Code, unlike much of the real world, can be generated and tested for correctness and performance without external sources. It’s an ideal case for LLMs even without Stack.
And besides, there’s always reddit…
Man, is this ever true. If I wanted to do something a little odd, I’d find some ideas at Stack Overflow, maybe buried in a reply that almost addressed what I was trying to do, other code on Reddit, etc.
Now I ask Claude and get the full answer immediately, with code better than I could write. I can imagine that SO is in real trouble.
But part of StackOverflow was people who had personally hit a problem, fixed it, and documented it nowhere, until they saw someone else ask the question and wrote up an answer.
For example I had a problem with a Windows Service installation, tinkered with it for about a day and found a workaround.
Months later I was searching StackOverflow for a related issue and saw a question by someone who had that exact problem, so I answered it with the solution I had found.
That kind of thing will not happen if StackOverflow dies.
Traffic was on the decline even before GPT-3, but a big use case for SO was “What’s the code/command for this thing I’ve done a dozen times but will never commit to memory?” Another common use case is just finding ideas to try while troubleshooting, which AI is also fine at. But overall, the whole internet is getting enshittified, and SO was doomed even without AI because Google is getting so bad. The best way to use the internet now is to use Google to find Reddit threads, and that’s sad. RIP internet.
True, that part was helpful.
I’m not too worried about that part. Stack was good for a few years, but it was in decline for a while already, especially once it was bought out. Other forums and Q&A sites existed before it and more will pop up, if they can find a solution to prevent AI spam.
I’d rather see smaller organic communities rather than big centralized websites anyway. The convenience of centralized answers was not, in the end, worth the kind of egotistical moderation and power games that kind of platform always inevitably seems to draw. Speaking for myself only.
In theory, the idea of closing questions as duplicates would be that any new answers should go on the existing question. But that rarely seemed to happen.
What I saw was a bunch of technically minded people who didn’t think much about the social aspects. It never seemed to occur to them that “closing” a question feels unwelcoming and pushes people away, or that pushing people away was not the best way to handle things.
People make fun of AI for being too obsequious. But that attitude is far more welcoming to people. Combine that with the lower cognitive load of not having to search for answers, and it’s no wonder it’s eating their lunch.
The idea of gamifying Q&A was good: give “rewards” to people who give better answers and to those who ask better questions that attract good answers. I wonder if there could be a way to integrate that idea with AI.
Because, yeah, I do think AI relies a lot on humans having answered various questions before. Documentation only gets you so far. And even a lot of that is not very good, once you get outside the things everyone uses.
Stack Exchange isn’t all about coding, but I worry losing StackOverflow would kill the rest. I actually read very little of SO.
Is it, though? I have mixed feelings about that.
Usenet was a treasure trove in its day, as is the SDMB today. Gamification can help increase the visibility of good questions and answers, yes, but it can also artificially drive behavior towards karma farming and trolling. On the whole, I think it was a good idea in the beginning that quickly lost its luster once it was met with the real-world hordes of unwashed masses… and bots.
On karmaless sites, posters have to rely on the merit of each individual post (or, at most, their reputation by name) rather than on more easily gamed post histories. All the best forums I’ve seen rely less on points than on human curation and moderation. Not having karma also makes account farming and resale (which happens on Reddit, for example) much less enticing.