this post was submitted on 17 Feb 2026
[–] anamethatisnt@sopuli.xyz 1 points 16 minutes ago

My wife's old gaming PC lasted from 2016 to 2025, with the only upgrade being a new, larger SATA SSD. I've never had a gaming PC last that long before, and the cost per year turned out really great.

I really hope that the consumer market gets back on track before I need to make any more large purchases.

[–] ArmchairAce1944@discuss.online 16 points 11 hours ago (2 children)

As an elder millennial I grew up in an era where a few years meant the difference between bleeding edge and obsolete. This continued until the late 2010s and things just seemed to seriously stagnate after that.

[–] WalrusDragonOnABike@reddthat.com 2 points 10 hours ago

I sorta feel like there was a significant dropoff in real-world performance* gains after the first generation that included DDR4 and PCIe 4.0, at least on the higher-end consumer desktop side, except for GPUs if you specifically used the RTX features. I rarely had issues with my i7-4790K setup, and that was a DDR3/PCIe 3.0/limited-NVMe-support generation. I only replaced it last year because the mobo died.

*Based entirely on my own use cases.

[–] MBech@feddit.dk 2 points 11 hours ago

My GTX 1070 became obsolete after around 4 years. Not "can't launch games" obsolete, but more "can't keep a stable fps in newer games" obsolete. My 3070, however, is still going strong, with no real issues in anything after 5 years. I see absolutely no reason to upgrade it.

[–] elbiter@lemmy.world 8 points 10 hours ago* (last edited 10 hours ago) (1 children)

Maybe it's time to value debloated software...

Maybe a mouse driver doesn't have to take 1 GB of RAM and send usage statistics to the manufacturer...

[–] spicehoarder@lemmy.zip 4 points 10 hours ago

2026 year of the Linux desktop

[–] mavu@discuss.tchncs.de 25 points 15 hours ago (1 children)

I'm so old, i even remember when computers got faster over time.

[–] Brummbaer@pawb.social 4 points 11 hours ago (1 children)

It's the end of the Silicon Valley era. In a few years I guess new and faster stuff will come from China.

[–] jacksilver@lemmy.world 3 points 11 hours ago (2 children)

Unless there is a paradigm or materials breakthrough I wouldn't expect a major leap anytime soon.

It's kinda the same with TV quality/video game graphics. We've been squeezing a lot out of the current technology, but further enhancements will be incremental.

[–] spicehoarder@lemmy.zip 2 points 9 hours ago

The paradigm shift will be going back to low level programming. Quake type innovations like fast sqrt

[–] PhoenixDog@lemmy.world 2 points 11 hours ago

Yeah, I think we're at the point of technology plateauing. Ways to interact with that tech may change, like VR vs traditional screens, but we can only cram so many pixels onto a surface before it just starts looking marginally better for more money.

I still watch a lot of YouTube and movies and stuff at no more than 1080p, because the increase in quality beyond that doesn't improve the video enough for me to give a shit. I've seen 4K and 8K TVs in tech stores like Best Buy, and while they look awesome, it wouldn't change how I consume the same media.

[–] jballs@sh.itjust.works 15 points 14 hours ago (1 children)

I remember back in the day you could buy a hard drive whose box showed something like 300 MB of capacity, then when you opened the box you found out it actually had 500 MB, because advances in manufacturing were outpacing their ability to print new boxes.

Damn we had it good then.

[–] T156@lemmy.world 6 points 10 hours ago* (last edited 9 hours ago)

Kind of unimaginable these days, since you'd expect it to be artificially limited to 300 MB, even if the hardware could support 500.

Like how with some video cards, the 970(?), you could overclock one into being a 970 Ti if you were lucky. That was before they started lasering parts off the chips to stop people doing that to lower-grade cards, when binning was just a matter of whether a chip passed muster for the higher-grade cards. They could sometimes be pushed to that level, just not reliably for all of them.

[–] bearboiblake@pawb.social 78 points 23 hours ago (2 children)

20 years ago or so, I was at a computer parts store, pricing up parts to build a computer. I was on a limited budget and had already decided on a CPU, graphics card, and a motherboard.

"Ah, crap, I forgot about RAM", I said. "No problem", the shopkeeper replied, "RAM is cheap"

I don't remember what the CPU I got was, but the GPU was an Nvidia GeForce 6600GT.

[–] PhoenixDog@lemmy.world 5 points 10 hours ago* (last edited 10 hours ago)

I bought my current PC 13 years ago and haven't changed a part in it. At the time I bought it, it had a GTX 745, an i7-4790 @ 3.60 GHz, and 12 GB of DDR3 RAM, and I got it for less than $1000. While I can't play games like RDR2 or modern AAA games on it, it still gets the job done for the games I enjoy playing in 2026.

[–] T156@lemmy.world 3 points 10 hours ago* (last edited 9 hours ago)

"Ah, crap, I forgot about RAM", I said. "No problem", the shopkeeper replied, "RAM is cheap"

That was also the mindset in programming until very recently. Memory is cheap and plentiful, so outside of the most memory-constrained settings you didn't need to worry much about memory management.

Strange to think that, with how computers are today, we're looping back around to resource-limited compute, not because of software bloat or anything, but because people won't be able to afford to swap their computers out for something more powerful.

[–] osanna@thebrainbin.org 21 points 20 hours ago (2 children)

I'm so old, i got excited for a new couch. a few months ago, I got excited because i got a new rug.

[–] kameecoding@lemmy.world 11 points 16 hours ago (2 children)

I have a video of me being giddy that I have an air fryer with two compartments that you can set up to finish at the same time. If one thing needs to be in there for 15 minutes and the other only for 10, the 10-minute side waits 5 minutes and starts when the 15-minute side counts down to 10, so my protein and the sides get finished at the same time.

That's awesome for me.

[–] MBech@feddit.dk 3 points 11 hours ago

I have one that does that too!! It's seriously awesome!

[–] osanna@thebrainbin.org 4 points 16 hours ago

wow. That's an epic appliance! I'm jelly :)

[–] frunch@lemmy.world 7 points 19 hours ago (1 children)

Hell yeah, aging. It's funny how such simple things can bring satisfaction, but if stuff like that makes you happy, I say you're doing a-ok 🥂

[–] osanna@thebrainbin.org 7 points 19 hours ago

I don't have much (am long term unemployed/disabled), so the few things I can afford really do get me excited

[–] Arghblarg@lemmy.ca 33 points 22 hours ago* (last edited 22 hours ago) (1 children)

I 'panic bought' (OK, not out of panic, but mild concern) a 22TB drive since the price seemed not too astronomical, and the local store had a few left. Just in case.

Seems the supplies really are drying up. Fuck these AI companies. Doesn't matter if they actually intended to wage a war on personal computation; their hoarding of the supply chain for years to come really is an assault on our ability to afford local self-hosted computing. I hope the bubble bursts, soon and hard.

[–] GalacticSushi@lemmy.blahaj.zone 6 points 11 hours ago

Doesn't matter if they actually intended to wage a war on personal computation

I think they are intentionally waging war on personal hardware. They want everything cloud based so they have direct access and ownership of all user activity. Their wet dream is everyone's computer being a streaming stick plugged into a monitor.

[–] RoidingOldMan@lemmy.world 22 points 23 hours ago (5 children)

I know RAM and graphics cards are up, but computers are still getting cheaper. Compare a $400 laptop today vs. a $400 laptop 10 years ago: the newer one is slightly faster, and after inflation it's also effectively cheaper.

[–] CheeseNoodle@lemmy.world 5 points 11 hours ago

That only works if any regular person's pay has actually increased in that time.

[–] paultimate14@lemmy.world 2 points 12 hours ago

The pricing shocks are slower to hit OEMs, but rest assured they will. A $400 laptop from July 2025 is going to cost $800 in July 2026.

[–] Taldan@lemmy.world 3 points 13 hours ago

10 years ago, sure. But what about 2 years ago? Progress has slowed significantly, with core parts only very recently exploding in price

[–] Quetzalcutlass@lemmy.world 35 points 22 hours ago* (last edited 22 hours ago) (1 children)

RAM, graphics cards, SSDs, and HDDs are all up, some multiple-fold in price, and the disruption looks like it may continue for years. Combined with the death of Moore's Law, a $400 computer at today's prices might actually be worse than what you'd get for that money a decade ago.

[–] FlordaMan@lemmy.world 8 points 20 hours ago (1 children)

I find that hard to believe. Do you have an example of a laptop from 2016 that was sold for $400 that would perform better than a 2026 laptop costing $400?

[–] village604@adultswim.fan 3 points 15 hours ago

The 2026 laptop would need to be $550 to account for inflation.

[–] varnia@lemmy.blahaj.zone 4 points 19 hours ago (1 children)

Computers are still getting cheaper, as long as you bring your own RAM and storage. /s

[–] rumba@lemmy.zip 3 points 15 hours ago

That's only because Intel/AMD haven't figured out how to cash in on the artificial BS yet :)

Give 'em a week, they'll tell us that all their processors are purchased through 2030.

[–] takeda@lemmy.dbzer0.com 5 points 22 hours ago

More energy efficient as well.