
With DDR5 RAM prices skyrocketing, some mid-range laptops could soon ship with budget-level specs. TrendForce expects companies like Dell and Lenovo to stock more notebooks with 8GB of memory. These reasonably priced options may no longer handle intense office and gaming tasks.

top 42 comments
[–] nesc@lemmy.cafe 57 points 3 days ago* (last edited 3 days ago) (2 children)

This is hell: not only are they soldering the fucking RAM, you can't even buy a laptop with a viable amount of memory.

[–] markz@suppo.fi 9 points 3 days ago

The first invention from AGI was manufactured e-waste.

[–] boonhet@sopuli.xyz 4 points 3 days ago (1 children)

Now that we're getting CAMM2 modules, we might see more non-soldered RAM again.

LPDDR just couldn't be done with DIMMs. LPDDR saves power, and laptop manufacturers want to compete on battery life. CAMM makes socketed LPDDR possible.

[–] nesc@lemmy.cafe 2 points 2 days ago

Hope this is true, but it really makes no sense for them to let people upgrade. The endgame here is to make laptops like phones.

[–] Valmond@lemmy.world 39 points 3 days ago (1 children)
[–] sundray@lemmus.org 29 points 3 days ago (1 children)

Instead of AGI, they've invented a time machine that makes technology go backwards.

[–] Tollana1234567@lemmy.today 7 points 3 days ago* (last edited 2 days ago)

Cheapflation, basically: going from a superior product to an inferior one that doesn't even work as well.

[–] rbos@lemmy.ca 22 points 3 days ago (1 children)

Maybe this will finally be the impetus for developers to start optimizing their apps' memory use.

Hard to say that with a straight face.

[–] Rekall_Incorporated@piefed.social 10 points 3 days ago* (last edited 2 days ago) (4 children)

I am not a developer, so this is just speculation, but I think the current development community (outside of individuals with a personal interest in the topic) is largely incapable of developing efficient, well-optimized applications. It's not that individual developers lack the ability; the broader industry ecosystem (on the consumer side) just doesn't support efficient application development.

[–] Croquette@sh.itjust.works 2 points 3 days ago* (last edited 3 days ago)

It's a self-fulfilling disaster. The vast majority of developers learn their skills at a company. Companies don't want to pay for documentation, so developers never learn to document properly.

When I work with a client that doesn't have a constraining standard for their product, they never ever want to pay for documentation.

When I work with a client that has to meet constraining standards, they want the bare minimum that will get them their cert.

Edit: forgot to add my closing point.

Good documentation makes coding a lot more efficient and easier, and the resulting code is usually decent.

[–] boonhet@sopuli.xyz 2 points 3 days ago* (last edited 3 days ago) (1 children)

It's all about velocity. Electron lets you ship bullshit to multiple platforms REALLY fast, because you only develop for the web but get Windows, Linux, and macOS as a bonus.

Nobody wants to do C++ anymore; otherwise you could ship most things with Qt, get way better performance, and still stay cross-platform.

I understand that, and I have no illusions about things changing (short of a major policy break in the EU recognizing that you can't beat the Americans at their own game and that a novel approach they can't compete with is needed).

My counterargument is an application like qBittorrent. It's an open-source app with no budget, it's cross-platform (including a CLI and web UI, though macOS support seems subpar for lack of developers), and it's very efficient.

In the non-open-source and/or Windows-only sphere, there are Mp3Tag, Notepad++, FastStone Image Viewer, and Media Player Classic BE.

All are very snappy applications with a huge range of features and options (by the standards of consumer software), and they can handle heavy workloads.

[–] GreenKnight23@lemmy.world 1 points 3 days ago

the current development community (outside of individuals with a personal interest in the topic) is largely incapable of developing efficient, well-optimized applications.

as a developer I approve of this message. from my own experience, 70% of devs write just about the worst code you could imagine.

and now, it's even worse with AI.

[–] Die4Ever@retrolemmy.com 1 points 3 days ago* (last edited 3 days ago) (1 children)

A big problem is that developers just make stuff work on their own machine, and they all have high-spec machines lol

If a company forced all their developers to use dual core CPUs and 8GB RAM, you'd see more efficient code

[–] boonhet@sopuli.xyz 2 points 3 days ago (1 children)

It's the companies forcing everything to be done super fast in the first place. You think any developers go out thinking "Hmmm, today I'll create a really slow Electron application"? No, it's management going "We need an MVP in 2 weeks, and then new features shipped every week after that". So shortcuts are taken.

[–] squaresinger@lemmy.world 3 points 3 days ago (1 children)

This.

Developers aren't the ones making those decisions.

Try telling management that you need not one team of devs but six (one each for the Windows, Mac, Linux, Android, and iOS apps, plus the web app), because you don't want to just ship one Electron app that runs everywhere.

I work on the backend for a large retailer's apps. Since it's an old product, we do have dedicated Android, iOS, and web apps. Since it's a retailer, at least we don't have to maintain Windows/Mac/Linux apps.

We have three separate frontend teams, about 20 people in total, because we essentially run three separate app projects. The Android, iOS, and web tech stacks are so different that pretty much nothing is shared between the three apps: one's written in Kotlin, one in Swift, and the website is in TypeScript.

Senior management demanded that we cut some FTEs, fired the Android and iOS devs responsible for the e-commerce part of the apps, and told us to use a webview instead. Now the remaining app devs are just waiting to get fired too, once upper management figures out that the whole app can just be a webview.

Don't tell us devs we're lazy or don't know how to do our job. Complain about upper management that doesn't want to invest in a real solution.

[–] Rekall_Incorporated@piefed.social 2 points 2 days ago (1 children)

I can't stand Android WebView apps, especially in retail. The whole point of installing a mobile app is to get a smoother experience than the mobile web UI.

[–] squaresinger@lemmy.world 1 points 2 days ago

Believe me, all of us devs are on the same side. But convincing upper management that this warrants tripling development costs is not easy.

[–] Dojan@pawb.social 25 points 3 days ago (1 children)

Lmfao. Good luck running Windows 11 with that.

[–] imetators@lemmy.dbzer0.com 3 points 3 days ago (4 children)

For a typical browsing/YouTube/office laptop user, this is going to be enough. For the rest of us: fuck us, I guess.

[–] Dojan@pawb.social 2 points 3 days ago (1 children)

My work computer has 32 gigs; it's the only computer in my home with Windows on it. With just Teams and a browser open, it sits at 12.2 GB.

[–] boonhet@sopuli.xyz 3 points 3 days ago (1 children)

You can't measure memory consumption like that. You could run the same shit with 8 GB and I bet it'd use like 6 or 7.

Operating systems use spare memory to cache things for faster access.
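
(A minimal Linux-only sketch of that point, assuming `/proc/meminfo` is available: the "used" figure naive task managers show includes reclaimable page cache, which the kernel's own `MemAvailable` field excludes.)

```python
# Minimal sketch (Linux-only): why "used" RAM overstates real demand.
# Assumes /proc/meminfo; the field names are the standard kernel ones.

def meminfo() -> dict[str, int]:
    """Parse /proc/meminfo into {field: value-in-kB}."""
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            fields[key] = int(rest.split()[0])  # values are in kB
    return fields

def gib(kb: int) -> float:
    """Convert kB to GiB."""
    return kb / 2**20

m = meminfo()
print(f"total:     {gib(m['MemTotal']):5.1f} GiB")
print(f"free:      {gib(m['MemFree']):5.1f} GiB  (truly idle)")
print(f"available: {gib(m['MemAvailable']):5.1f} GiB  (free + reclaimable cache)")
print(f"'used':    {gib(m['MemTotal'] - m['MemFree']):5.1f} GiB  (what naive meters report)")
```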

[–] Dojan@pawb.social 2 points 3 days ago

Windows does a poor job of that faster access, then.

[–] Nindelofocho@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

It's going to be netbooks all over again.

[–] Railcar8095@lemmy.world 2 points 3 days ago

True. My typical W11 usage is to start the computer, tell copilot to fuck off 10 times and then reboot to Linux. Maybe I can do that with 4 gigs too

[–] Rekall_Incorporated@piefed.social 1 points 3 days ago* (last edited 3 days ago)

Depends on the type of office work.

If you use Excel heavily with large datasets, or data-visualization software like Power BI or Tableau, 8 GB is definitely not going to be enough.

That said, my grandma has a 6 GB RAM Windows 10 machine and it works fine for her relatively light use cases.

[–] floofloof@lemmy.ca 27 points 3 days ago* (last edited 3 days ago) (2 children)

Meanwhile, manufacturers have less flexibility with budget notebooks. Lowering their specs would make the affordable options struggle with even basic tasks in Windows 11.

Linux would help. So would keeping the old PC for as long as it still works. Microsoft didn't see this coming when they tried to force everyone onto new hardware. So much for their bloated, resource-guzzling "AI OS": no one wants it, and no one will have the money to run it.

[–] rimu@piefed.social 17 points 3 days ago* (last edited 3 days ago) (2 children)

Maybe they'll (re)start selling laptops with Linux pre-installed!

Or switch to DDR4, lol. Laptop CPUs are crippled garbage anyway; might as well use old RAM.

[–] T4V0@lemmy.pt 8 points 3 days ago

At least in my country, DDR4 prices have already increased by a factor of 2-3, so don't count on that.

[–] floofloof@lemmy.ca 3 points 3 days ago

It's not really any cheaper. The only way to save money is to not buy anything and stick with what you already have.

[–] cmnybo@discuss.tchncs.de 5 points 3 days ago

Linux still runs very well for general use with 8GB of RAM. You can also reduce RAM usage by using zswap to compress less frequently accessed data.
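
(For the curious, a minimal sketch of flipping zswap on through sysfs, assuming a kernel built with zswap support; the parameter files below are the standard ones under `/sys/module/zswap/parameters`, and writing them requires root.)

```python
# Minimal sketch: inspect and enable zswap through sysfs.
# Assumes a Linux kernel with CONFIG_ZSWAP; writes require root.
from pathlib import Path

PARAMS = Path("/sys/module/zswap/parameters")

def show_zswap() -> None:
    """Print the current zswap parameters (enabled, compressor, pool cap, ...)."""
    for p in sorted(PARAMS.iterdir()):
        print(f"{p.name}: {p.read_text().strip()}")

def enable_zswap(max_pool_percent: int = 20) -> None:
    """Turn zswap on and cap the compressed pool at a share of RAM."""
    (PARAMS / "enabled").write_text("Y")
    (PARAMS / "max_pool_percent").write_text(str(max_pool_percent))

if __name__ == "__main__":
    show_zswap()  # call enable_zswap() as root to switch it on
```

To make it stick across reboots you'd set `zswap.enabled=1` (and optionally `zswap.max_pool_percent=20`) on the kernel command line instead.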

[–] 9point6@lemmy.world 20 points 3 days ago* (last edited 3 days ago)

Are we going to enter a world where Apple's memory upgrade pricing is not completely detached from reality?

...Nah I'm sure they want to keep their margin

[–] RobotToaster@mander.xyz 11 points 3 days ago (2 children)

That's crazy: my laptop is over 5 years old, has that much, and it was considered the bare minimum even then.

[–] 9point6@lemmy.world 12 points 3 days ago (1 children)
[–] arin@lemmy.world 10 points 3 days ago (1 children)

Was just going to say this.

[–] squaresinger@lemmy.world 1 points 3 days ago

I got my first laptop with 8GB RAM in 2013. It was a €500 gaming laptop.

[–] PunnyName@lemmy.world 7 points 3 days ago (1 children)

Glad I got my Asus with 64 GB already, then.

While I am on a desktop, I am also happy that I bought 64 GB of relatively high-performance DDR4.

It looks like I will have to stay on my AM4 system for at least another 2-3 years. It works very well; the only thing I am missing is an upgrade to a 5800X3D, which unfortunately is impossible to get for a fair price since it seems to have been a one-time production run.

[–] Tollana1234567@lemmy.today 6 points 3 days ago

Because they banked all their resources on AI, and they can't switch back easily.

[–] altphoto@lemmy.today 4 points 3 days ago

Nobody will buy that stuff. Expect less when you sell it on eBay.

[–] Blackmist@feddit.uk 2 points 3 days ago

Welp, time to just steal it out of your colleagues' work PCs.