this post was submitted on 20 May 2025
391 points (97.8% liked)

Programming

20663 readers
225 users here now

Welcome to the main community in programming.dev! Feel free to post anything relating to programming here!

Cross-posting is strongly encouraged in the instance. If you feel your post or another person's post makes sense in another community, cross-post it there.

Hope you enjoy the instance!

Rules

  • Follow the programming.dev instance rules
  • Keep content related to programming in some way
  • If you're posting long videos, try to add some form of TL;DR for those who don't want to watch them

Wormhole

Follow the wormhole through a path of communities !webdev@programming.dev



founded 2 years ago
top 50 comments
[–] ooo@sh.itjust.works 92 points 2 weeks ago (3 children)

Ironically, processing large amounts of data and making soft decisions and planning based on such data makes AI ideal for replacing C-suite members.

[–] sirdorius@programming.dev 49 points 2 weeks ago (1 children)

Let's make a community-powered, open source project to do this and watch them squirm when investors demand that million-dollar CEOs get replaced with AI for higher returns.

[–] masterspace@lemmy.ca 33 points 2 weeks ago* (last edited 2 weeks ago)

Pointing this out in company-wide meetings is a fun pastime.

[–] taco@piefed.social 6 points 2 weeks ago

Not to mention the cost savings difference. Developer salaries make a ChatGPT subscription look like a bargain. C-level salaries make racks of dedicated hardware to run local models look like one.

[–] 30p87@feddit.org 60 points 2 weeks ago (1 children)

it means more ambitious, higher-quality products

No ... the opposite actually.

[–] masterspace@lemmy.ca 10 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Read the article before commenting.

The literal entire thesis is that AI should maintain developer headcounts and just let them be more productive, not reduce headcount in favour of AI.

The irony is that you're putting less effort and critical thought into your comment than an AI would.

[–] hallettj@leminal.space 33 points 2 weeks ago

To give the benefit of the doubt: it's possible to simultaneously understand the thesis of the article and to hold the opinion that AI doesn't lead to higher-quality products. That would likely involve agreeing with the premise that laying off workers is a bad idea, but disagreeing (at least partially) with the reasoning for why it's a bad idea.

[–] pezhore@infosec.pub 19 points 2 weeks ago (4 children)

I get what you're saying, but the problem is that AI seems to need way more hand-holding and double-checking before it can be considered ready for deployment.

I've used copilot for Ansible/Terraform code and 40-50% of the time it's just... wrong. It looks right, but it won't actually function.

For easy, entry programs it's fine, but I wouldn't (and don't) let it near complex projects.

[–] pastel_de_airfryer@lemmy.eco.br 46 points 2 weeks ago (3 children)

My theory is that C-suites are actually using "AI efficiency gain" as an excuse for laying off workers without scaring the shareholders.

"I didn't lay off 10% of the workforce because the company is failing. It's because... uhmmmm... AI! I have replaced them with AI! Please give us more money."

[–] mesamunefire@piefed.social 14 points 2 weeks ago

It's the next RTO.

[–] hperrin@lemmy.ca 39 points 2 weeks ago (12 children)

I don’t honestly believe that AI can save me time as a developer. I’ve tried several AI agents and every single one cost me time. I had to hold its hand while it fumbled around the code base, then fix whatever it eventually broke.

I’d imagine companies using AI will need to hire more developers to undo all the damage the AI does to their code base.

[–] Flamekebab@piefed.social 18 points 2 weeks ago

I've found it can just about be useful for "Here's my data - make a schema of it" or "Here's my function - make an argparse interface". Stuff I could do myself but find very tedious. Then I check it, fix its various dumb assumptions, and go from there.

Mostly though it's like working with an over-presumptuous junior. "Oh no, don't do that, it's a bad idea because security! What if (scenario that doesn't apply)" (when doing something in a sandbox because the secured production bits aren't yet online and I need to get some work done while IT fanny about fixing things for people that aren't me).

Something I've found it useful for is as a natural language interface for queries that I don't have the terminology for. As in "I've heard of this thing - give me an overview of what the library does?" or "I have this problem - what are popular solutions to it?". Things where I only know one way to do it and it feels like there's probably lots of other ways to accomplish it. I might well reject those, but it's good to know what else exists.

In an ideal world that information would be more readily available elsewhere but search engines are such a bin fire these days.
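To make the "Here's my function - make an argparse interface" example above concrete, here is roughly the kind of boilerplate involved. This is a minimal sketch with a made-up function (resize_image), not code from the thread or from any particular assistant:

```python
import argparse

def resize_image(path: str, width: int, height: int, keep_aspect: bool = True) -> None:
    """Hypothetical function the assistant is asked to wrap; real logic omitted."""
    print(f"resizing {path} to {width}x{height} (keep_aspect={keep_aspect})")

def main() -> None:
    # The tedious-but-mechanical argparse wiring an assistant can stub out quickly,
    # and which still needs a human pass over defaults, types and help text.
    parser = argparse.ArgumentParser(description="Resize an image file.")
    parser.add_argument("path", help="path to the input image")
    parser.add_argument("--width", type=int, required=True, help="target width in pixels")
    parser.add_argument("--height", type=int, required=True, help="target height in pixels")
    parser.add_argument("--no-keep-aspect", dest="keep_aspect", action="store_false",
                        help="do not preserve the aspect ratio")
    args = parser.parse_args()
    resize_image(args.path, args.width, args.height, keep_aspect=args.keep_aspect)

if __name__ == "__main__":
    main()
```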

[–] AlecSadler@sh.itjust.works 9 points 2 weeks ago (1 children)

I was in the same boat about...3mos ago. But recent tooling is kind of making me rethink things. And to be honest I'm kind of surprised. I'm fairly anti-AI.

Is it perfect? Fuck no. But with the right prompts and gates, I'm genuinely surprised. Yes, I still have to tweak, but we're talking entire features being 80% stubbed in sub 1 minute. More if I want it to test and iterate.

My major concern is the people doing this and not reviewing the code and shipping it. Because it definitely needs massaging...ESPECIALLY for security reasons.

[–] cornshark@lemmy.world 8 points 2 weeks ago

Which tools are you finding success with?

[–] Carol2852@discuss.tchncs.de 4 points 2 weeks ago (1 children)

I mostly use AI as advanced autocomplete. But even when I use it just for documentation, it's wrong so often that I don't use it for anything more complex than tutorial-level stuff.

I got pretty far with cursor.com on basic stuff where I'd otherwise spend more time looking up documentation than writing code, but I wouldn't trust it with complex use cases at this point.

I check back every 6 months or so, to keep track of the progress. Maybe I can spend my days as a software developer drinking cocktails by the pool, yelling prompts into the machine, but so far I'm not concerned I'll be replaced anytime soon.

[–] MagicShel@lemmy.zip 39 points 2 weeks ago (20 children)

That middle graph is absolute fucking bullshit. AI is not fucking ever going to replace 75% of developers, or I've been working way too fucking hard for way too little pay these past 30 years. It might let you cut staff 5-10% because it enables folks to accomplish certain things a bit faster.

Christ on a fucking crutch. Ask developers who are currently using AI (not the ones working for AI companies) how much time and effort it actually saves them. They will tell you.

[–] nullPointer@programming.dev 23 points 2 weeks ago (1 children)

I use it here and there. It just seems to shift effort from writing code to reading and fixing code. The "amount" of work is about the same.

[–] Flamekebab@piefed.social 7 points 2 weeks ago

I hear that. Given I need practice in refactoring code to improve my skills, it's not useless to me right now but overall it doesn't seem like a net gain.

[–] Zenith@lemm.ee 9 points 2 weeks ago (1 children)

It doesn't have to make sense or make the outcome better; the only thing it has to do is make the company look better on paper to its shareholders. If something can make the company look better on paper it will be done; the quality of the work is not relevant.

[–] BlameTheAntifa@lemmy.world 32 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Forward-thinking companies should use AI to transform each developer into a "10x developer,"

Developer + AI ≠ Developer x 10

At best it means 1.25 x Developer, but in most cases it will mean 0.5 x Developer, because AI cannot be trusted to generate safe, reliable code.

[–] helopigs@lemmy.world 9 points 2 weeks ago (2 children)

I think 10x is a reasonable long term goal, given continued improvements in models, agentic systems, tooling, and proper use of them.

It's already close for some use cases; for example, understanding a new code base with the help of cursor agent is kind of insane.

We've only had these tools for a few years, and I expect software development will be unrecognizable in ten more.

[–] ZILtoid1991@lemmy.world 9 points 2 weeks ago

It also depends on the use case. It can likely help you throw webpages together from zero, but it will fall apart once it has to generate code for lesser-discussed things. Someone once tried to solve an OpenGL issue I had with ChatGPT: first it suggested using SDL2 or GLFW instead, then it spat out barely working code that was the same as mine, and still wrong.

A lot of it instead (from what I've heard from industry connections) is that employees are being pushed to use AI so hard that they're threatened with firings, so they use most of their tokens to amuse themselves with stuff like rewriting the documentation in pirate style or Old English. And at the very worst, they're now working constant overtime, because people were fired, contracts were not extended, etc.

[–] rayquetzalcoatl@lemmy.world 28 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Genuinely a bit shocked to see the number of robolovers in these comments. Very weird, very disheartening. No wonder so much shit online doesn't work properly lol

[–] mesamunefire@piefed.social 17 points 2 weeks ago (1 children)

Also, is Substack the new Medium? I can't keep up with these freemium WordPress/blog clones.

[–] onlinepersona@programming.dev 14 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Why do people always have to use some freemium offering when there's an open source, self-hosted or already-hosted variant out there? I don't get it. Just riding the wave, I guess.

Anti Commercial-AI license

[–] mesamunefire@piefed.social 8 points 2 weeks ago (2 children)

My guess? The freemium stuff gives the promise of $$ after a certain level of popularity. And they make it VERY easy to use.

Personally, I've been thinking of using writefreely for its seamless fediverse integration... but I really don't have a lot to say in the traditional space, i.e. screaming at the wailing wall (or at least it feels like screaming at the wailing wall).

[–] Aceticon@lemmy.dbzer0.com 15 points 2 weeks ago* (last edited 2 weeks ago)

Even if AI is an actual tool that improves the software development speed of human developers (rather than something whose time savings in automatically writing code end up being eaten by the time spent reviewing, correcting and debugging the AI-generated code), it's been my experience in almost 30 years as a Software Engineer that every single tooling improvement that makes us capable of doing more in the same amount of time is eaten up by increasing demands on the capabilities of the software we make.

Thirty years ago user interfaces were either CLI or pretty simple with no animations. A software system was just a software application - it ran on a single machine with inputs and outputs on that machine - not a multi-tiered octopus involving a bunch of back-end data stores, then control and data-retrieval middle tiers, then another tier doing UI generation using a bunch of intermediate page definition languages, and a frontend rendering those pages to a user and getting user input, probably with some local code thrown into the mix. Ditto for how cars are now mostly multiple programs running on various microcontrollers, with one or more microprocessors in the mix, all talking over a dedicated protocol. Ditto for how your frigging "smart" washing machine talking to its dedicated smartphone app probably involves a third machine in the form of some server from the manufacturer, with the whole thing running over TCP/IP and the Internet (hence depending on a lot more machines with their own dedicated software, such as routers and DNS servers) rather than some point-to-point direct protocol (such as Serial) like in the old days.

Anyways, the point being that even if AI actually delivers more upsides than downsides as a tool to improve programmer output, that gain is going to be eaten up by increasing demands on the complexity of the software we build, same as the benefits of better programming languages were, the benefits of better IDEs were, of the widespread availability of pre-made libraries for just about everything, of templating, of the ease of finding solutions to the problem at hand from other people on the Internet, of better software development processes, of source control, of collaborative development tools, and so on.

Funnily enough, for all those things there were always people claiming they would make the life of programmers easier, when in fact all they did was raise the expectations on the software being implemented, often just in terms of bullshit that's not really useful (the "smart" washing machine using networking to talk to a smartphone app, so that the manufacturer can save a few dollars by not putting as many physical controls on it, is probably a good example).

[–] Carol2852@discuss.tchncs.de 12 points 2 weeks ago (2 children)

This assumes it is about output. 20 years of experience tell me it's not about output, but about profits and those can be increased without touching output at all. 🤷‍♂️

[–] Flocklesscrow@lemm.ee 10 points 2 weeks ago* (last edited 2 weeks ago)

*specifically short-term profits. Executives only care about the next quarter and their own incentives/bonuses. Sure the company is eventually hollowed out and left as a wreck, but by then, the C Suite has moved on to their next host org. Rinse and repeat.

[–] SoleInvictus@lemmy.blahaj.zone 4 points 2 weeks ago

Often they only want the illusion of output, just enough to keep the profits eternally rising.

[–] thingsiplay@beehaw.org 12 points 2 weeks ago

It's not as dumb as you think, it's way dumber.

[–] OmegaLemmy@discuss.online 11 points 2 weeks ago* (last edited 2 weeks ago)

I'm 90% sure it's something to do with the stock market, buybacks, and companies having to do cryptic shit to keep up a fake value for their shares.

[–] NotMyOldRedditName@lemmy.world 10 points 2 weeks ago

~~Developers~~ ~~developers~~ ~~developers~~ ~~developers~~, ~~developers~~ ~~developers~~ ~~developers~~ ~~developers~~ AI

[–] Kissaki@programming.dev 8 points 2 weeks ago

AI-assisted coding […] means more ambitious, higher-quality products

I'm skeptical, based on my own (limited) experience, my use cases and projects, and the risks of using code that may include hallucinations.

there are roughly 29 million software developers worldwide serving over 5.4 billion internet users. That's one developer for every 186 users,

That's an interesting way to look at it, and that would be a far better relation than I would have expected. Not every software developer serves internet users though.
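For what it's worth, the quoted ratio does check out as rough arithmetic; here's a quick sanity check (not from the article):

```python
developers = 29_000_000          # "roughly 29 million software developers"
internet_users = 5_400_000_000   # "over 5.4 billion internet users"
print(round(internet_users / developers))  # -> 186 users per developer
```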

[–] MTK@lemmy.world 5 points 2 weeks ago (1 children)

What do you expect? Half of these decision makers are complete idiots who are just good at making money and think that means they're smarter than anyone who makes less than them. Then they see some new hyped-up tech, they chat with ChatGPT, and they're dumb enough to be floored by its "intelligence"; now they think it can replace workers, and since it's still early, they assume it will quickly surpass the workers. So in their mind, firing ten programmers and saving like two million a year, while only spending maybe a few tens of thousands a year on AI, will be a crazy success that shows how smart they are. And as time goes on and the AI gets better, they'll save even more money. So why spend more money to help the programmers improve, when you can just fire them and spend a fraction of it on AI?
