this post was submitted on 20 Nov 2025
378 points (99.0% liked)
Technology
you are viewing a single comment's thread
And as per usual, those hating AI the most are the ones who don’t use it, don’t understand it, and/or hate it out of some misguided ideology.
Imitation is fine, great even, in software development. You don’t need to reinvent the wheel. Programming languages, class libraries, and so on all exist to give you standard, working ways to do things the way they’re supposed to be done.
It’s funny that developers the world over absolutely loved and embraced tools like ReSharper, which was basically AI 0.5 for devs, yet now that AI is the evolution of that, everyone’s losing their minds.
Knowledge of AI tools absolutely will and should be a part of developer competencies that are evaluated during interviews in the near future, and that includes being able to explain why and when you would/would not use specific AI tools.
I'm a software engineer and I use AI on a regular basis.
This shit isn't fit to take on the vast majority of jobs that dipshit CEOs, or the pseudointellectuals who fondle their balls, claim it can.
It's fine as a tool for software engineers to untangle confusing code syntax, or to generate an example of some not-so-complicated code.
It is fucking unreliable for full software development, which is what these tech oligarchs are trying to put it in charge of.
And AI is shit at producing full implementations, let alone objectively or even rationally testing its own work. If it doesn't recognize an error while writing the code, why the hell would we trust it to recognize that same error in testing?
Because dumb fucks in power think AI is this magical tool that can do no wrong and do everything humans can do and better.
We are FAR AWAY from that being a reality for the reasons I already covered, and more.
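The self-verification problem above can be sketched in a few lines (a hypothetical toy example, not taken from any real model's output): when the generated code and the test it writes for itself share the same misconception, the test passes while the behavior is still wrong.

```python
# Hypothetical illustration: a buggy function plus a self-written test
# that encodes the same flawed mental model, so the test passes anyway.

def clamp(value, low, high):
    # Intended: restrict value to the range [low, high].
    # Bug: the return values are swapped, so out-of-range inputs
    # get "clamped" to the wrong bound.
    if value < low:
        return high   # should be low
    if value > high:
        return low    # should be high
    return value

def test_clamp():
    # A test written from the same misunderstanding only exercises
    # in-range inputs, so it never hits the swapped branches.
    assert clamp(5, 0, 10) == 5
    assert clamp(0, 0, 10) == 0
    assert clamp(10, 0, 10) == 10

test_clamp()             # passes without complaint
print(clamp(42, 0, 10))  # prints 0; a human reviewer would expect 10
```

A human reviewer spots the swapped branches in seconds; a test suite generated from the same blind spot never will.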
Also, absolutely no company worth a damn has ever pushed anything from ReSharper or AI to its millions of customers without human verification first. CEOS WANT TO ELIMINATE THAT HUMAN VERIFICATION! THAT'S A PROBLEM!
Except, and I want you to pay close attention to this,
CEOS WANT TO TOTALLY ELIMINATE THE HUMAN FACTOR FROM SOFTWARE DEVELOPMENT ENTIRELY
Not partially
Not kinda sorta
ENTIRELY
Because they simply, fundamentally, do not understand what AI is, nor its limitations.
And it's very clear you don't either.