It's not that developers are switching to AI tools; it's that Stack Overflow is awful and has been for a long time. The AI tools are simply providing a better alternative, which really demonstrates how awful Stack Overflow is, because the AI tools aren't even that good.
TBH asking questions on SO (and most similar platforms) fucking sucks, so it's no surprise that users jump at the first opportunity to get answers another way.
Removed. Someone else already said this before. Also, please ensure you stick to the style guides next time, and be less ambiguous. SO could mean a plethora of things.
Last time this question was answered was for several years older software versions, and the old solutions don't work anymore. Whoops!
SO PTSD is real.
I was in the middle of making a reply like this but yours is better. Closed as duplicate.
I will never forget the time I posted a question about why something wasn't working as I expected, with a minimal example (≈10 lines of Python, no external libraries) and a description of the expected behaviour and observed behaviour.
The first three or so replies were instant comments insisting that the code did in fact work the way I expected, and that the observed behaviour I described wasn't what the code would produce. A day later, some highly rated user left a friendly note pointing out that I had a typo that just happened to trigger this very unexpected error.
Basically, I was thrashed by the first replies, and the people replying hadn't even run the code. It felt extremely good to be able to reply that they were asshats for saying the code didn't do what I said it did when they hadn't even run it.
I post there every 6-12 months in the hope of receiving some help or intelligent feedback, but usually just have my question locked or removed. The platform is an utter joke and has been for years. AI was not entirely the reason for its downfall imo.
Not common I'm sure, but I once had an answer I posted completely rewritten for grammar, punctuation, and capitalization. I felt so valued. /s
According to a Stack Overflow survey from 2025, 84 percent of developers now use or plan to use AI tools, up from 76 percent a year earlier. This rapid adoption partly explains the decline in forum activity.
As someone who participated in the survey, I'd recommend everyone take anything regarding SO's recent surveys with a truckful of salt. The recent surveys have been unbelievably biased, with tons of leading questions that force you to answer in specific ways. They're basically worthless as statistics.
Realistically though, asking an LLM what’s wrong with my code is a lot faster than scrolling through 50 posts and reading the ones that talk about something almost relevant.
It's even faster to ask your own armpit what's wrong with your code, but that alone doesn't mean you're getting a good answer from it.
If you get a good answer just 20% of the time, an LLM is a smart first choice. Your armpit can't do that. And in my experience it's much better than 20%, though it really depends a lot on the code base you're working on.
Also depends on your level of expertise. If you have beginner questions, an LLM should give you the correct answer most of the time. If you’re an expert, your questions have no answers. Usually, it’s something like an obscure firmware bug edge case even the manufacturer isn’t aware of. Good luck troubleshooting that without writing your own drivers and libraries.
How do you know it's a good answer? That requires prior knowledge that you might not have. My juniors repeatedly demonstrate they have no ability to tell whether an LLM solution is a good one or not. It's like copying from SO without reading the comments, which they quickly learn not to do because it doesn't pass code review.
Honestly just funny to see. It makes perfect sense, based on how they made the site hostile to users.
I was contributing to SO in 2014-2017 when my job wanted our engineers to be more "visible" online.
I was in the top 3% and it made me realize how incredibly small the community was. I was probably answering like 5 questions a week. It wasn't hard. For some perspective, I'm making like 4-5 posts on Lemmy A DAY.
What made me really pissed was how often a new person would give a really good answer, and then some top-1% chucklefuck would literally take that answer, rewrite it, and have it appear as the top answer. That happened to me constantly. But again, I didn't care, since I was just doing this to show my company I was a "good lil engineer".
I stopped participating because of how they treated new users. And around 2020(?), SO made a pledge to be not so douchy and actually allow new users to ask questions. But that 1% chucklefuck crew was still allowed to wave their dicks around and stomp on people's answers. So yeah, less "Duplicate questions", more "This has been answered already [link to their own answer that they stole]".
So they removed the toxic attitude around asking questions, but not the toxicity around answering. SO still let the sweatiest people control responses, including editing and deleting them. You can't grow a community like that.
This is not because AI is good at answering programming questions accurately, it’s because SO sucks. The graph shows its growth leveling off around 2014 and then starting the decline around 2016, which isn’t even temporally correlated with LLMs.
Sites like SO where experienced humans can give insightful answers to obscure programming questions are clearly still needed. Every time I ask AI a programming question about something obscure, it usually knows less than I do, and if I can’t find a post where another human had the same problem, I’m usually left to figure it out for myself.
Reported for duplicate.
LLMs won't be helping, but SE/SO have been fully enshittifying themselves for years.
It was amazing in the early days.
It was a vast improvement over expert sex change, which was the king before SO.
expertSEXchange dot com hahahaahahahahahahahaha oh, that brought back some dreadful memories! Thanks for the laugh and the chills
Even before AI, I stopped asking questions, or even answering for that matter, within the first few months of using that website. It just wasn't worth the hassle of dealing with the mods and the neckbeard-ass users, and I didn't want my account suspended over some BS in case I really needed to ask an actual question in the future. Now I can't remember the last time I've been to any Stack site, and it doesn't even show up in Google search results anymore. They dug their own grave.
The humans of StackOverflow have been pricks for so long. If they fixed that problem years ago they would have been in a great position with the advent of AI. They could've marketed themselves as a site for humans. But no, fuckfacepoweruser found an answer to a different question he believes answers your question so marked your question as a duplicate and fuckfacerubberstamper voted to close it in the queue without critically thinking about it.
Yeah, because you either get a "how dumb are you?" or nothing at all.
Locking this comment. Duplicate of https://lemmy.world/comment/21433687
I've posted questions, but I usually don't need to, because someone else has posted the same thing before. That's probably the reason AI is so good at answering these types of questions.

The trouble now is that there's less of a business incentive to have a platform like Stack Overflow where humans share knowledge directly with one another, because the AI is just copying all the data and delivering it to users somewhere else.
Works well for now. Wait until there's something new that it hasn't been trained on. It needs that Stack Exchange data to train on.
The hot concept around the late 2000s and early 2010s was crowdsourcing: leveraging the expertise of volunteers to build consensus. Quora, Stack Overflow, Reddit, and similar sites came up in that time frame, where people would freely lend their expertise on a platform because that platform had a pretty good rule set for encouraging that kind of collaboration and consensus building.
Monetizing that goodwill didn't just ruin the look and feel of the sites: it permanently altered people's willingness to participate in those communities. Some, of course, don't mind contributing. But many do choose to sit things out when they see the whole arrangement as enriching an undeserving middleman.
imho the experience is miserable. They went out of their way to strip all warmth from messages (there's a whole automated system to remove greetings and anything else considered superfluous), and there are so many incentives to score points by answering that, frankly, I find it sad. It doesn't look like a forum where people exchange ideas; it looks like a permanent race to answer and grow your point total.
Stackexchange sites aren't intended as forums, they're supposed to be "places to find answers to questions".
The further you get from Stack Overflow itself, the worse the sites get, though, because anything beyond "how can I fix this tech problem" doesn't necessarily have an answer at all, much less a single best one.
Oh no, poor AI won't know where to feed anymore. Anyway...
Even before the LLMs, posting over there was a last resort for me. The desperation move. It was too toxic, and I would always get pissed when my question got closed for being too similar, or too easy, or whatever. Hey, I wasted 15 minutes typing that; if the other question had solved my problem, I wouldn't have posted...
In the beginning it wasn't like that...
I went to look at my Stack Overflow account, and almost all of the first questions I posted (which earned me 2000 karma) would have been rejected and removed today.
But what will the mods close for arbitrary reasons before there are any responses?
I'm sorry, but I've had to close your comment because it was too speculative.
"Search before asking!" - Stack Overflow
go ai, go broke
What are the odds the classic "expertsexchange" ends up outlasting Stack Exchange?
What? People would rather have their balls licked by AI than have some neckbeard moderator change the entire language of their question and not answer shit? Fuck SO. That shit was so ass to interact with.
