this post was submitted on 06 May 2025
570 points (96.4% liked)

Programmer Humor

top 50 comments
[–] [email protected] 16 points 6 days ago (1 children)

No one can predict the future. One way or the other.

The best way not to be left behind is to stay flexible about whatever may come.

[–] [email protected] 9 points 6 days ago

Can't predict the future, but I can see the past. Specifically the part of the past that used standards-based implementations and boring technology. Love that I can pull up HTML with elements in ALL CAPS and table-aligned content. It looks like a hot mess, but it still works, even on mobile. Plain text keeps trucking along. SQLite will outlive me. Exciting things are exciting, but the world is made of boring.

[–] [email protected] 6 points 6 days ago (2 children)

What the fuck is Silverlight

[–] [email protected] 18 points 6 days ago (3 children)

Microsoft Flash. Netflix used it for a while. I don't remember anything else using it.

[–] [email protected] 10 points 6 days ago (1 children)

A bunch of Disney movie sites did for a while, back in the day when every movie had its own website with trailers, promo material, and a link to buy tickets and/or the DVD release.

[–] [email protected] 2 points 6 days ago

Ahh good times

[–] [email protected] 5 points 6 days ago (1 children)

The League of Legends launcher used it at one point. Not sure if it still does.

[–] [email protected] 2 points 6 days ago

I was going to say there's no way they still are, since Silverlight was discontinued by Microsoft in 2013, but it is Riot Games, so ¯\_(ツ)_/¯

[–] [email protected] 1 points 6 days ago

EA Tiburon in Orlando used Flash for a while to do the menus in Madden and other sports games.

[–] [email protected] 3 points 5 days ago

Be glad you never had to interact with that 'technology'. I had to, at an internship in 2016.

[–] [email protected] 2 points 5 days ago

What a glorious site. I wish every webpage looked something like this

[–] [email protected] 2 points 6 days ago

Once both major world militaries and hobbyists are using it, it's over. You can't close Pandora's box. Whatever you want to call the current versions of "AI", it's only going to get better. Short of major world catastrophes, I expect it to drive not only technological advances but also energy/efficiency advances as well. The big internet conglomerates are already integrating it into search, and I fully expect that within the next 5 years search will be transformed into an assistant-like chatbot (or something of the sort).

I think it's shortsighted not to see the potential of accumulating society's knowledge and being able to present that to people in an understandable way.

I don't expect it to happen overnight. I'm not expecting I, Robot or android levels of consciousness any time soon, but the world is progressing toward the automation of many things - driven by Capital(ism) - which is powerful in itself.

[–] [email protected] 94 points 1 week ago* (last edited 1 week ago) (1 children)

(let me preach a little, I have to listen to my boss gushing about AI every meeting)

Compare AI tools: now vs 3 years ago. All those 2022 "Prompt engineer" courses are totally useless in 2025.

Extrapolate into the future and realize that you're not losing anything valuable by not learning AI tools today. The whole point of them is that they don't require any proficiency. They "just work".

Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.

[–] [email protected] 11 points 1 week ago* (last edited 1 week ago) (1 children)

The key skill is being able to communicate your problem and requirements, which turns out to be really hard.

[–] [email protected] 7 points 6 days ago (1 children)

It’s also a damn useful skill whether you’re working with AI or humans. Probably worth investing some effort into that regardless of what the future holds.

[–] [email protected] 3 points 6 days ago

Though it's more work with current AI than with another team member, at least for now - the AI can't be given access to a lot of the context due to data security rules.

[–] [email protected] 80 points 1 week ago (3 children)

As an old fart you can’t imagine how often I heard or read that.

[–] [email protected] 53 points 1 week ago (1 children)

You should click the link.

[–] [email protected] 39 points 1 week ago

Hehe. Damn, absolutely fell for it. Nice 😂

[–] [email protected] 29 points 1 week ago (1 children)

Yeah but it's different this time!

[–] [email protected] 14 points 1 week ago* (last edited 1 week ago) (6 children)

I do wonder about inventions that actually changed the world or the way people do things, and whether there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but never reached mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.

load more comments (6 replies)
[–] [email protected] 20 points 1 week ago* (last edited 1 week ago) (3 children)

I'd love to read a list of those instances/claims/tech

I imagine one of them was low-code/no-code?

/edit: I see such a list is what the posted link is about.

I'm surprised low-code/no-code isn't in that list.

[–] [email protected] 13 points 1 week ago

"We're gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!"

Several months later...

"Well that was a complete waste of time."

load more comments (2 replies)
[–] [email protected] 68 points 1 week ago (9 children)

Remember when "The Cloud" was going to put everyone in IT out of a job?

[–] [email protected] 26 points 1 week ago (2 children)

I don't think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who worked with physical servers, and now they're gone. You could say the new SREs are their replacement, but it's a different set of skills, more similar to SDE than to system administration.

[–] [email protected] 1 points 6 days ago

I just think this is patently false. Or at least there are/were orgs where the cloud costs so much more than running their own servers, which are tended by maybe 1 FTE spread across a bunch of admins who mostly do other tasks.

Let me just point out one recent comparison - we were considering cloud backup for a couple petabytes of data, with a few hundred GB changing, being added, or being restored every week or less. I think the best deal, holding the software costs equal, was $5/TB/month.

That is catastrophically more expensive over the 10-year lifespan of a server or two and a small/mid-sized LTO-9 tape library and tapes. For one thing, we'd have paid more than the server etc. in about a year. After that, tape prices have always tended down over time, and the storage cost for us for tape is basically $0 once it's in archive storage. We put it in a cabinet in another building - and you can fit A LOT of data on these tapes in a small room. That'll cost basically $0 additional for 20 years, forget about 10. So let's add in electricity etc. - I still doubt those will come to over ~$100k over the lifetime of the project. Labor is about a wash, because you still need people to manage the backups to the cloud, and I think actually moving tapes might be ~0.05 FTE in our situation. Literally anyone can be taught how to do it once the backup admin puts the tapes in the hopper or tells them which serial number to put in the hopper.
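For the curious, here's a rough back-of-envelope version of that comparison, using the numbers from the comment above; the hardware and electricity figures are assumptions for illustration, not quotes:

```typescript
// Back-of-envelope: cloud backup vs. on-prem tape for ~2 PB over 10 years.
// All figures come from (or are assumed based on) the comment above.
const dataTB = 2_000;              // ~2 PB of backup data
const cloudPerTBPerMonth = 5;      // the quoted $5/TB/month deal
const years = 10;                  // assumed lifespan of the server + tape library

const cloudTotal = dataTB * cloudPerTBPerMonth * 12 * years; // $1,200,000

const tapeHardware = 120_000;      // assumption: servers + LTO-9 library + tapes
const powerAndMisc = 100_000;      // assumption: the ~$100k electricity/misc upper bound
const tapeTotal = tapeHardware + powerAndMisc;               // ~$220,000

console.log(`Cloud over ${years} years: $${cloudTotal.toLocaleString()}`);
console.log(`Tape  over ${years} years: $${tapeTotal.toLocaleString()}`);
// The cloud bill passes the assumed tape hardware cost (~$120k) after about
// a year, which lines up with "we'd have paid more than the server etc. in
// about a year".
```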

I also think that many companies are finding something similar for straight servers - at least it was in the news quite a bit for a while. Now, if you can be entirely cloud native, maybe it washes out, but for large groups of people that's still not possible, either because they control hardware (think factory, scientific, etc.) or because they rely on existing desktop software for which the cloud isn't really a replacement and throughput isn't great (think Adobe products, video, scientific, financial, etc. data).

load more comments (1 replies)
[–] [email protected] 18 points 1 week ago

Naming it "The Cloud" and not "Someone else's old computer running in their basement" was a smart move though.

It just sounds better.

[–] [email protected] 17 points 1 week ago

Many of our customers store their backups in our "cloud storage solution".

I think they'd be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.

load more comments (6 replies)
[–] [email protected] 48 points 1 week ago

This technology solves every development problem we have had. I can teach you how with my $5000 course.

Yes, I would like to book the $5000 Silverlight course, please.

[–] [email protected] 29 points 1 week ago (1 children)

I still think PWAs are a good idea compared to needing to download an app on your phone for every website. Like, for example, PWAs can easily replace most banking apps, which are already just PWAs with added tracking.
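For anyone who hasn't touched them: a PWA is mostly just a regular web page plus a manifest and a service worker. A minimal sketch of the registration step, assuming a hypothetical sw.js served from the site root:

```typescript
// Minimal sketch: the service-worker registration that makes a plain web
// page installable and offline-capable as a PWA.
// Assumes the site serves a (hypothetical) /sw.js and a web app manifest.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', async () => {
    try {
      const registration = await navigator.serviceWorker.register('/sw.js');
      console.log('Service worker registered with scope:', registration.scope);
    } catch (err) {
      console.error('Service worker registration failed:', err);
    }
  });
}
```

Everything else - caching, offline behavior, push - lives inside sw.js; the page itself stays an ordinary website.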

[–] [email protected] 15 points 1 week ago

They're great for users, which is why Google and Apple are letting them die from lack of development, so native apps can keep making them money.

[–] [email protected] 29 points 1 week ago (3 children)
[–] [email protected] 20 points 1 week ago (4 children)

Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don't feel like looking up. Any significant generation tends to go off the rails fast.

[–] [email protected] 2 points 6 days ago

If you use it basically like you'd use an intern or junior dev, it could be useful.

You wouldn't allow them to check anything in themselves. You wouldn't trust anything they did without carefully reading it over. You'd have to expect that they'd occasionally completely misunderstand the request. You'd treat them as someone completely lacking in common sense.

If, with all those caveats, you can get this assistance for free or nearly free, it might be worth it. But, right now, all the AI companies are basically setting money on fire to try to drive demand. If people had to pay enough that the AI companies were able to break even, it might be so expensive it was no longer worth it.

load more comments (3 replies)
load more comments (2 replies)
[–] [email protected] 26 points 1 week ago* (last edited 1 week ago)

Thanks for summing it up so succinctly. As an aging dev, I've seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.

[–] [email protected] 23 points 1 week ago* (last edited 1 week ago) (2 children)

it's funny, but also holy moly do I not trust a "sign in with GitHub" button

load more comments (2 replies)
[–] [email protected] 20 points 1 week ago (2 children)

I'm skeptical of the author's credibility and vision of the future if he hasn't even reached blink-tag technology in his progression.

load more comments (2 replies)
[–] [email protected] 20 points 1 week ago

Good thing I hate web development

[–] [email protected] 11 points 1 week ago

10/10. No notes.

[–] [email protected] 10 points 1 week ago (2 children)

It pains me so much when I see my colleagues pay OpenAI to do programming assignments... they find it faster to ask GPT than to learn it properly. Sadly, I can't say anything to them, or I'd risk worsening relations with them.

[–] [email protected] 16 points 1 week ago (1 children)

I'm glad they do. This is going to generate so many work opportunities to undo their messes.

[–] [email protected] 13 points 1 week ago (1 children)

Except that they're research students in a PhD program, so it will just exacerbate the code messiness in research paper codebases.

[–] [email protected] 12 points 1 week ago

Or open source projects..

[–] [email protected] 10 points 1 week ago

You should probably click the link
