this post was submitted on 26 Dec 2025
255 points (95.1% liked)

Showerthoughts

A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.


Windows 11 often requires new hardware. But for a while, new hardware will either be extremely pricey or come with very little RAM.

I don't believe a single competent person works at Micro$oft anymore, but maybe this could push them to make a less shitty OS?

And garbage software like Adobe Creative Cloud too?

They obviously don't care about users, but the pain could become too big to ignore.

top 50 comments
[–] orbitz@lemmy.ca 2 points 29 minutes ago

If you've ever watched CinemaSins (or related videos): hahaha hahahaha... haha. (No offense meant, it just made me do that laugh in my head.)

I mean, I wish it would, but unfortunately programmers aren't going to become more memory efficient just because hardware got pricier.

The laugh was good-natured, not aimed at you but at the idea of a company optimizing to save on hardware costs. Technically I guess games did, otherwise we'd be waiting half an hour for a render, but for the most part, as long as it works without that half-hour render, it's probably fine with some settings adjustments.

They'll just design with current specs in mind for longer... well, once they realize people can't afford better hardware.

[–] rumba@lemmy.zip 4 points 1 hour ago

Nah, you'll get 8 GB and swap on NVMe. Or you'll get to rent a terminal server slot for just $30 a month.

[–] michaelmrose@lemmy.world 4 points 6 hours ago

It still costs more to rewrite all your existing code sooo no.

[–] fenrasulfr@lemmy.world 14 points 8 hours ago

Naaaah, you're just going to have to run it in the cloud, optimised by AI, for the low low price of both your kidneys, so Bezos, Mark, and Elon can continue partying.

[–] Cocodapuf@lemmy.world 6 points 11 hours ago* (last edited 11 hours ago)

Wouldn't that be nice! Yeah I think it'll totally work.

Hey, I think I see someone right now, they're switching from writing in Python to writing in assembly! "Hey buddy, don't forget to clear that register! And don't forget you'll need to write this all over from scratch to get it to work on any other platform!"

[–] HexesofVexes@lemmy.world 11 points 13 hours ago

One of those little truisms folks forget is that optimising software takes a LOT longer than making something that just works.

[–] lichtmetzger@discuss.tchncs.de 8 points 13 hours ago* (last edited 13 hours ago)

I'm currently running Fedora Linux with Firefox and YouTube opened up. The whole system uses ~4GB of memory. That's totally fine and I couldn't care less about what Microsoft is doing with their OS.

With that said, I don't think we'll see a lot of optimizations in commercial software. Maybe a few here and there, but a lot of developers nowadays don't even know how to optimize their code, especially people working in web development or with adjacent frameworks. Let's just throw hundreds of npm packages into one project and bundle them up with webpack, here's your 12 MB of JavaScript - take it or leave it. Projects like this aren't the exception, they are the norm.
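
A rough sketch of what that bloat looks like at the import level (illustrative only; lodash is just a stand-in for any big utility dependency, and the actual savings depend on your bundler's tree-shaking):

```typescript
// Pulls the entire utility library into the bundle, whether or not the
// rest of it is ever used. Bundlers often can't tree-shake CommonJS-style
// packages imported like this.
import _ from "lodash";
export const slugFromTitle = (title: string) => _.kebabCase(title);

// Importing only the one module you need keeps the unused 99% of the
// library out of the shipped JavaScript.
import kebabCase from "lodash/kebabCase";
export const slugFromTitleLean = (title: string) => kebabCase(title);
```

Multiply that pattern by a few hundred packages and the 12 MB bundle writes itself.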

Even if the devices that can run that code without running out of memory get more expensive, companies will just pay for those and write them off on their taxes. And if not, more apps will just get pushed into the cloud.

[–] DeathByBigSad@sh.itjust.works 34 points 18 hours ago* (last edited 18 hours ago) (1 children)

🤣 Nah, they'll enforce mandatory cloud computing.

You'll just have a "terminal"

[–] ragebutt@lemmy.dbzer0.com 18 points 15 hours ago (1 children)

It’s crazy that people don’t see this is where computers are heading.

The day tech bros realized they could squeeze recurring monthly subscriptions out of you for increasingly banal shit, the writing was on the wall. The end game is that you have a Chromebook with 800 subscriptions to streaming services for your OS, music, movies, TV, games, image editing software, music DAWs, plugins for both of the aforementioned, subscriptions for hardware associated with the software (e.g. drawing tablets or MIDI keyboards), etc., covering every niche you can possibly think of, not just graphic art and music.

And when you bitch about it, tech bros and weird alphas and young zoomers who were raised on this ecosystem and indoctrinated by it will go "well, you see, it's fair because updates cost money to develop", as if the old system of expecting bug fixes and security patches to be free, but not necessarily feature updates, was unfair. Like, if I buy a car and it's fucked up, I expect it to be fixed for free, but I don't expect them to feature-match the next model year.

Tech workers are disproportionately highly paid and so whiny when they have to provide even a modicum of support, because then they might have to cut into that disproportionately high pay. Like, "oh no, I make $80-150,000+ a year, but if I support this I'll have to work more without generating sales and will maybe only make $60-130,000+. The horror!" Fuck those libertarian shitstains who are literally overthrowing an entire government (and possibly more) with technofascism so they can justify their "I know Python, I should be able to earn as much as I want, fuck ethics, I never emotionally matured past 16" bullshit.

[–] Surp@lemmy.world 6 points 13 hours ago

Username checks out, and I love it.

[–] melsaskca@lemmy.ca 1 points 10 hours ago (1 children)

Just like the early days of programming when you really had to manage your memory, often down to the last bit. Those were the days when programming was more difficult. Now it's mostly just point and click for middle-schoolers.

[–] michaelmrose@lemmy.world 5 points 6 hours ago (1 children)

I'm presuming you know nothing about programming because this is complete and utter nonsense.

[–] TranquilTurbulence@lemmy.zip 1 points 2 hours ago (1 children)

Well, the point and click part was a bit extreme. Still true in some rare cases, but actual programming still requires a keyboard.

However, the RAM thing is interesting. I haven't actually written any code in the '70s and '80s, but from what I've heard from people who did, RAM was a huge bottleneck. Well, pretty much everything was. Even the bandwidth between your terminal and the mainframe was a bottleneck that made you suffer.

Back in those days, programmers were painfully aware of the hardware limitations. If you wanted your code to run within a reasonable amount of time, you absolutely had to focus on optimizing it.

[–] michaelmrose@lemmy.world 1 points 1 hour ago

It's not a "bit extreme" its absolute nonsense

[–] myfunnyaccountname@lemmy.zip 29 points 21 hours ago (1 children)

Not when AI is writing the code.

[–] ryannathans@aussie.zone 6 points 19 hours ago (1 children)

Maybe it'll write native apps instead of garbage web/electron/chrome apps

[–] Threeme2189@sh.itjust.works 9 points 16 hours ago

Narrator:

'It didn't'

[–] tomkatt@lemmy.world 22 points 21 hours ago (1 children)

There's plenty of "unbloated" software available. It's just not on Windows.

[–] Cevilia@lemmy.blahaj.zone 7 points 17 hours ago (1 children)

Which unbloated browser do you use?

(This isn't a dig or a gotcha, I'm serious, I'm looking to switch browsers)

[–] Yoshi@futurology.today 6 points 13 hours ago (1 children)

Shouldn't Firefox or a fork of Firefox Mike Waterfox or ZenBrowser be fine?

[–] Squirrelanna@lemmy.blahaj.zone 3 points 8 hours ago

Michael Waterfox is pretty chill yeah

[–] roofuskit@lemmy.world 39 points 1 day ago (2 children)

No, everything will just become subscription based.

[–] someacnt@sh.itjust.works 4 points 16 hours ago

Why is this painful truth not the top comment? Maybe people are still hopeful after all this time?

[–] Witchfire@lemmy.world 19 points 1 day ago

And powered by the cloud

[–] ColeSloth@discuss.tchncs.de 18 points 23 hours ago (1 children)

Linux Mint Cinnamon is pretty easy to move to.....

[–] Bamboodpanda@lemmy.world 5 points 17 hours ago

As someone who recently made the switch with zero Linux experience, I completely agree.

[–] CMDR_Horn@lemmy.world 166 points 1 day ago (4 children)

Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.

[–] ScreaminOctopus@sh.itjust.works 4 points 9 hours ago

Even returning to JVM languages would be huge over the current JS-based Electron slop. Things are so bad that "optimized software" doesn't need to mean C++ or Rust.

[–] jimmy90@lemmy.world 1 points 14 hours ago

With Rust getting popular, the architecture is there to make huge savings without having to be a rocket scientist.

The rocket scientists are also getting involved, and they're regularly outperforming even optimised C code.

[–] potatopotato@sh.itjust.works 24 points 1 day ago

Yeah, the systems in place right now took 40 years to build

[–] riskable@programming.dev 11 points 1 day ago

Big AI is a bubble but AI in general is not.

If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.

I suspect that as more software gets AI-assisted development we'll actually see less efficient software at first, but eventually more efficient software as adoption of AI coding assist becomes more mature (and probably more formalized/automated).

I say this because of experience: If you ask an LLM to write something for you it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base to make it more efficient, it often does a great job. The dichotomy is due to the nature of AI prompting: It works best if you only give it one thing to do at a time.
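
As a sketch of that "one thing at a time" idea, something like the following (hypothetical helper; the openai npm client is real, but the model name and prompt wording are just assumptions):

```typescript
import OpenAI from "openai";
import { readFileSync } from "node:fs";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// One narrow task per request: not "rewrite this", not "add features",
// just "suggest optimizations for this existing code".
async function suggestOptimizations(path: string): Promise<string> {
  const source = readFileSync(path, "utf8");
  const resp = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumption - any capable model works here
    messages: [
      {
        role: "user",
        content:
          "Analyze this code and suggest memory/CPU optimizations only. " +
          "Do not add features or change behavior:\n\n" + source,
      },
    ],
  });
  return resp.choices[0].message.content ?? "";
}
```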

In theory, if AI code assist becomes more mature and formalized, the "optimize this" step will likely be built-in, rather than something the developer has to ask for after the fact.

[–] mushroommunk@lemmy.today 98 points 1 day ago* (last edited 1 day ago) (15 children)

It's not just garbage software. So many programs are just Electron apps, which is about the most inefficient way of making them. If we could start actually making programs again, instead of shipping a webpage and a browser bundled together, you'd see resource usage plummet.
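
For anyone who hasn't poked at one, this is roughly the whole trick (the Electron API calls are real; the URL is a placeholder): a full Chromium plus a Node runtime gets shipped and booted just to render one web page.

```typescript
import { app, BrowserWindow } from "electron";

// Every Electron "native app" ships its own copy of Chromium and Node
// to do essentially this: open a window and load a web page into it.
app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1200, height: 800 });
  win.loadURL("https://example.com"); // placeholder for the bundled web UI
});
```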

In the gaming space, even before the RAM shortage, I'd seen more developers start doing optimization work again thanks to the prevalence of the Steam Deck and the like, so the precedent is there and I'm hopeful other developers start considering lower-end hardware.

[–] mycodesucks@lemmy.world 21 points 1 day ago

It's a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they'll likely just double down.

Nobody reassesses their dogma just because the justification for it is no longer valid. That's not how people work.

[–] Camille_Jamal@lemmy.zip 9 points 1 day ago

No, they don't care about users or whether they're literally cooking RAM. They'll keep it bloated, and probably make it more bloated.

[–] flandish@lemmy.world 28 points 1 day ago (1 children)

There is no “shortage”, just capitalism testing the limits of various bubbles.

[–] ChillPC@programming.dev 29 points 1 day ago (2 children)

You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs to have Discord, Slack, 100 Chrome tabs, Word, and every other Electron app open simultaneously, he will just get through his work. He may not be happy about it, but he'll continue without changing his habits.

But to be honest, I goddamn hope you are right!

[–] kboos1@lemmy.world 17 points 1 day ago

The "shortage" is temporary and artificial, so that's a hard NO. The ram shortage doesn't present any incentive to make apps more efficient because the hardware and software that is already in people's homes won't be effected by the shortage and people who currently use the software won't be affected by the shortage. The very small percentage of people that will be affected by the temporary shortage wouldn't justify making changes to software that is currently in development.

There's no incentive for software companies to make their code more efficient until people stop using their software so stop using it and it will get better. Just as an example Adobe reader is crap, just straight up garbage, but people still use it so the app stopped getting improvements many years ago. Then Adobe moved to a subscription based system, and cloud service for selling your data but guess what, it's still the same app that it was 10 years ago, just more expensive.

[–] CarbonatedPastaSauce@lemmy.world 20 points 1 day ago (5 children)

Found the silver lining guy.

Love the optimism but yeah, the impact on software dev will be minimal, if there even is one.

[–] shiroininja@lemmy.world 4 points 1 day ago (2 children)

Do people really use that much RAM in normal use? I rarely even fill my 16 GB, even with gaming, etc. I mean, I just don't leave 16 tabs open in a browser because that feels really disorganized. And I turn my computer off every night and start fresh every day.

[–] ryannathans@aussie.zone 3 points 19 hours ago (1 children)

Each web-based program now typically uses 2-5 GB of RAM.

[–] shiroininja@lemmy.world 1 points 16 hours ago

Yeah I don’t use any of that really

[–] pantherina@feddit.org 7 points 23 hours ago (1 children)

Yes. 16 GB is the bare minimum for regular usage on Windows. On Linux, it's the minimum for "regular to advanced" usage (i.e. more than five complex programs open at once, Flatpak, Electron apps).

[–] shiroininja@lemmy.world 1 points 16 hours ago

Oh. See, I don't use a lot of web apps or Electron. No streaming/cloud services.
