this post was submitted on 08 Jan 2026
1041 points (99.2% liked)

[–] Clent@lemmy.dbzer0.com 22 points 6 days ago

What a trash clickbait headline. That's not how the expression "saying the quiet part out loud" works. This isn't a secret, it's not unspoken, and it certainly doesn't reveal some hidden underlying motive.

[–] UsoSaito@feddit.uk 32 points 1 week ago (1 children)

It doesn't confuse us... it annoys us with blatantly wrong information, e.g. that glue is a pizza ingredient.

[–] EndlessNightmare@reddthat.com 26 points 6 days ago

I actually do care about AI PCs. I care in the sense that it is something I want to actively avoid.

[–] Electricd@lemmybefree.net 15 points 6 days ago* (last edited 6 days ago) (1 children)

I want to run LLMs locally, or things like TTS or STT, so it's nice, but there's no real support rn

Most people won’t care nor use it

LLMs are best used when it’s a user choice, not a platform obligation
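
For what it's worth, local text generation is already workable without NPU support. A minimal sketch using the llama-cpp-python bindings (the model path and prompt are placeholders; any GGUF model works):

```python
from llama_cpp import Llama

# Load a quantized GGUF model from disk. n_gpu_layers=-1 offloads as many
# layers as possible to whatever accelerator the build supports,
# falling back to plain CPU otherwise.
llm = Llama(model_path="./model.gguf", n_ctx=2048, n_gpu_layers=-1)

out = llm("Q: Why run an LLM locally? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```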

[–] ulterno@programming.dev 1 points 5 days ago (1 children)

I guess an NPU is better off as a PCIe peripheral then?
And it could then have its own specialised RAM too.

[–] Electricd@lemmybefree.net 1 points 5 days ago (1 children)

Sorry, I’m not a hardware expert at all

When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?

I guess the main point of NPUs is that they're tiny and built in

[–] ulterno@programming.dev 1 points 5 days ago (1 children)

> When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?

Yes, similar to what a PCIe graphics card does.
A PCIe slot is the slot on a desktop motherboard that lets you fit various expansion cards: networking (Ethernet, Wi-Fi, even specialised RTC stuff), sound cards, graphics cards, SATA/SAS adapters, USB adapters and all kinds of other stuff.

> I guess the main point of NPUs is that they're tiny and built in

GPUs are also available built in. Some of them are even tiny.
Go back 11-12 years and you'll see video processing units embedded on the motherboard instead of in the CPU package.
Eventually, some people will want more powerful NPUs with RAM better suited to neural workloads (GPUs have their own type of RAM too), won't care about the NPU in the CPU package, and will feel like they're uselessly paying for it. Others won't need an NPU at all and will feel the same way.

So it's much better to have NPUs made separately in different tiers, similar to what's done with GPUs rn.

And even external (PCIe) graphics cards can be thin and light instead of a fat package. It's usually just (i) the extra I/O ports and (ii) the cooling fins + fans that make them fat.
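
As an illustration of how software would treat such a card: runtimes like ONNX Runtime already expose accelerators as interchangeable "execution providers", so a PCIe NPU would just be one more entry in the list. A minimal sketch (the model filename is hypothetical):

```python
import onnxruntime as ort

# Ask the runtime which accelerators it can see; a discrete NPU on a
# PCIe card would show up here as just another execution provider.
print(ort.get_available_providers())  # e.g. ['CPUExecutionProvider', ...]

# Prefer any accelerator that's present, falling back to CPU otherwise.
preferred = [p for p in ort.get_available_providers()
             if p != "CPUExecutionProvider"]
session = ort.InferenceSession(
    "model.onnx",  # hypothetical model file
    providers=preferred + ["CPUExecutionProvider"],
)
```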

[–] Electricd@lemmybefree.net 1 points 3 days ago

Thanks for your answer

> So it's much better to have NPUs made separately in different tiers, similar to what's done with GPUs rn.

Overall yeah, but built-in graphics are remarkably efficient, and they have the added benefit of being there even if you didn't plan for that use initially. I'm glad to be able to play video games on a laptop that was meant to be used for work only.

Similarly, I had no interest in getting an NPU for this laptop, but I've found some use for it (well, once it finally supports what I want to do)

Manufacturers will never include a niche option, or they'll overprice it. Built-in lets you get it directly.

[–] Blackmist@feddit.uk 10 points 6 days ago (1 children)

Yeah, I'm not sure what the point of a cheap NPU is.

If you don't like AI, you don't want it.

If you do like AI, you want a big GPU or to run it on somebody else's much bigger hardware via the internet.

[–] rumba@lemmy.zip 3 points 6 days ago

A cheap NPU could have some uses. If you have a background process that runs continuously, offloading the work to a low-cost NPU can save you both power and processing. Camera authentication: if you get up, it locks; if you sit down, it unlocks. No reason to burn a core or a GPU for that. Security/nanny camera recognition. Driving systems monitoring a driver for loss of consciousness and pulling over. We can accomplish all of this now with CPUs/GPUs, but purpose-built silicon that doesn't drain other resources isn't a bad thing.

Of course, there's always the downside that they use that chip for Recall. Or malware gets hold of it for recall or ID theft. There's a whole lot of bad you can do with a low-cost NPU too :)
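
The "lock when you get up" idea is cheap to prototype even without an NPU. A rough sketch using OpenCV's stock Haar-cascade face detector on Linux (the 10-second threshold and the loginctl call are arbitrary choices, not anything Dell or Microsoft ship):

```python
import subprocess
import time

import cv2  # OpenCV, which bundles the Haar cascade data files

# Classic Haar-cascade face detector; light enough that a small NPU
# (or even one CPU core) could run it continuously.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cam = cv2.VideoCapture(0)

absent_since = None
while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        absent_since = absent_since or time.time()
        if time.time() - absent_since > 10:  # nobody for 10 s -> lock
            subprocess.run(["loginctl", "lock-session"])  # systemd lock
            absent_since = None
    else:
        absent_since = None
    time.sleep(1)  # poll once a second; an NPU offload would cost even less
```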

[–] bytepursuits@programming.dev 8 points 6 days ago

They said they're still adding all of it. They're adding AI, just not talking about it. Which is probably correct 😂

[–] ZILtoid1991@lemmy.world 5 points 6 days ago (1 children)
> be me
> installed VSCode to test whether the language server is just unfriendly with KATE
> get bombarded with "try our AI!" type BS
> vomit.jpg
> managed to test it, but the AI turns me off
> immediately uninstalled this piece of glorified webpage from my ThinkPad

It seems I'll be doing more of my work in KATE. (Does the LSP plugin for KATE handle things differently from the standard in some known way?)

[–] ZILtoid1991@lemmy.world 1 points 3 days ago

If you're still reading this: I modified the code of the language server, so it now works with KATE...

[–] Darkness343@lemmy.world 5 points 6 days ago (1 children)
[–] musubibreakfast@lemmy.world 1 points 6 days ago (1 children)

I'm readying myself for some new bullshit. I just hope it's not tech related.

[–] samus12345@sh.itjust.works 4 points 6 days ago

Does a third world war count as tech related? It certainly uses a lot of tech!

[–] TheBat@lemmy.world 219 points 1 week ago (3 children)
[–] edgemaster72@lemmy.world 4 points 6 days ago

This is extra funny to me since I just re-watched this episode the other day

[–] Holytimes@sh.itjust.works 93 points 1 week ago (16 children)

Weirdly, Dell always seems to understand what normal users want.

The problem is normal users have beyond-low expectations, no standards, and are ignorant of almost everything tech related.

They want cheap, easy-to-use computers that require no servicing, and if there is a problem, a simple phone number to call for help.

Dell has optimized for that. So hate 'em or not, while their goods have gone to shit quality-wise, they understand their market and have done extremely well in serving it.

Thus I'm not surprised at all that Dell understood this. If anything, I'd have been more surprised if they didn't.

[–] artyom@piefed.social 32 points 1 week ago (2 children)

I think they all understand what we want (broadly), they just don't care, because what they want is more important, and they know consumers will tolerate it.

[–] mctoasterson@reddthat.com 85 points 1 week ago (3 children)

What people don't want is black-box AI agents installed system-wide that use the carrot of "integration and efficiency" to justify bulk data collection, which the end user implicitly agrees to by logging into the OS.

God forbid people want the compute they are paying for to actually do what they want, and not work at cross purposes for the company and its various data sales clients.

[–] Gsus4@mander.xyz 29 points 1 week ago

Unveiling: the APU!!! (ad processing unit)

[–] stoy@lemmy.zip 77 points 1 week ago (9 children)

I'd much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.

There are very few people who would benefit from an added NPU. Ok, I hear you say, what about local AI?

Ok, what about it?

Would you trust a commercial local AI tool to not be sharing data?

Would your grandmother be able to install an open source AI tool?

What about having enough RAM for the AI tool to run?

Look at the average computer user, if you are on lemmy, chances are very high that you are far more advanced than the average computer user.

I am talking about those users who don't run an adblocker, don't notice the YT ad-skip button, and who in the past would have installed a minimum of five toolbars in IE without noticing the reduced view of the actual page.

These people are closer to the average users than any of us.

Why do they need local AI?
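
On the "enough RAM" point, the arithmetic is unforgiving: a model's weights alone take roughly parameter count × bytes per parameter, before any KV-cache or runtime overhead. A quick back-of-the-envelope in Python (the model sizes are illustrative):

```python
# Rough rule of thumb: weights ~= parameter count * bytes per parameter.
# KV cache and runtime overhead come on top of this.
def model_ram_gb(n_params_billions: float, bytes_per_param: float) -> float:
    return n_params_billions * 1e9 * bytes_per_param / 2**30

print(f"{model_ram_gb(7, 2):.1f} GiB")    # 7B model at fp16   -> ~13.0 GiB
print(f"{model_ram_gb(7, 0.5):.1f} GiB")  # 7B model, 4-bit     -> ~3.3 GiB
print(f"{model_ram_gb(70, 0.5):.1f} GiB") # 70B model, 4-bit    -> ~32.6 GiB
```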

[–] unexposedhazard@discuss.tchncs.de 34 points 1 week ago (4 children)

Just offer NPUs as PCIe extension cards. That's how computers used to be and should be: modular and versatile.

[–] tal@lemmy.today 47 points 1 week ago* (last edited 1 week ago) (7 children)

Not the position Dell is taking, but I've been skeptical that building AI hardware directly into laptops specifically is a great idea unless people have a very concrete goal, like text-to-speech, and existing models to run on it, probably specialized ones. This is not to diminish AI compute elsewhere.

Several reasons.

  • Models for many useful things have been getting larger, and you have a bounded amount of memory in those laptops, which, at the moment, generally can't be upgraded (though maybe CAMM2 will improve the situation, move back away from soldered memory). Historically, most users did not upgrade memory in their laptop, even if they could. Just throwing the compute hardware there in the expectation that models will come is a bet on the size of the models that people might want to use not getting a whole lot larger. This is especially true for the next year or two, since we expect high memory prices, and people probably being priced out of sticking very large amounts of memory in laptops.

  • Heat and power. The laptop form factor exists to be portable. They are not great at dissipating heat, and unless they're plugged into wall power, they have sharp constraints on how much power they can usefully use.

  • The parallel compute field is rapidly evolving. People are probably not going to throw out and replace their laptops on a regular basis to keep up with AI stuff (much as laptop vendors might be enthusiastic about this).

I think that a more-likely outcome, if people want local, generalized AI stuff on laptops, is that someone sells an eGPU-like box that plugs into power and into a USB port or via some wireless protocol to the laptop, and the laptop uses it as an AI accelerator. That box can be replaced or upgraded independently of the laptop itself.

When I do generative AI stuff on my laptop, for the applications I use, the bandwidth that I need to the compute box is very low, and latency requirements are very relaxed. I presently remotely use a Framework Desktop as a compute box, and can happily generate images or text or whatever over the cell network without problems. If I really wanted disconnected operation, I'd haul the box along with me.
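
To make the compute-box idea concrete: if the box runs an Ollama-style HTTP server (an assumption; the hostname and model name below are placeholders), the laptop-side client is trivial, which is why link bandwidth barely matters for text generation:

```python
import requests

# Hypothetical address of the compute box on a LAN/VPN;
# 11434 is Ollama's default port.
BOX = "http://compute-box.local:11434"

resp = requests.post(
    f"{BOX}/api/generate",
    json={"model": "llama3",
          "prompt": "Summarise this paragraph: ...",
          "stream": False},
    timeout=300,
)
print(resp.json()["response"])  # a few KB over the wire, round trip
```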

EDIT: I'd also add that all of this is also true for smartphones, which have the same constraints, and harder limitations on heat, power, and space. You can hook one up to an AI accelerator box via wired or wireless link if you want local compute, but it's going to be much more difficult to deal with the limitations inherent to the phone form factor and do a lot of compute on the phone itself.

EDIT2: If you use a high-bandwidth link to such a local, external box, bonus: you also potentially get substantially-increased and upgradeable graphical capabilities on the laptop or smartphone if you can use such a box as an eGPU, something where having low-latency compute available is actually quite useful.

[–] manxu@piefed.social 46 points 1 week ago (1 children)

> Dell is the first Windows OEM to openly admit that the AI PC push has failed. Customers seem uninterested in buying a laptop because of its AI capabilities, likely prioritizing other aspects such as battery life, performance, and display above AI.

Silicon Valley has always had the annoying habit of pushing technology-first products without much consideration of how they would solve real-world problems. It always has, but it's getting worse. When Zuck unveiled the Metaverse it was already verging on ludicrous, but with the AI laptop wave it turned into Onion territory.

[–] Lucidlethargy@sh.itjust.works 1 points 6 days ago (1 children)

What do you mean? Do you even have ANY foundation for this accusation?

Hold on, I need to turn off my heater. 22211123222234663fffvsnbvcsdfvxdxsssdfgvvgfgg

There it is. The off button. Touch controls are so cool guys.

[–] manxu@piefed.social 1 points 6 days ago

Ha! Enjoy your off button while they still make them. Once our AI Overlords have won the War, you'll only be able to politely ask your laptop to temporarily quiet itself, please and thank you, if it's not too much to ask.

[–] Sam_Bass@lemmy.world 36 points 1 week ago* (last edited 1 week ago) (6 children)

Doesn't confuse me, it just pisses me off by trying to do things I don't need or want done. It creates problems to find solutions to.

[–] SethTaylor@lemmy.world 25 points 1 week ago* (last edited 1 week ago) (2 children)

Holy crap, that Recall app that "works by taking screenshots" sounds like such a waste of resources. How often would you even need that?

https://www.windowscentral.com/software-apps/windows-11/the-verdict-is-in-windows-recall-is-great-actually

Virtually everything described in this article already exists in some way...
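
For a rough sense of the resource cost: even before any OCR or indexing, periodic full-screen capture churns through a surprising amount of pixel data. A sketch using the mss library (the 5-second interval is an assumption for illustration, not Recall's actual cadence):

```python
import time

from mss import mss  # cross-platform screen capture

# Grab one screenshot every 5 seconds for a minute and tally the raw
# pixel data, to get a feel for what a Recall-style recorder ingests.
total_bytes = 0
with mss() as sct:
    for _ in range(12):
        shot = sct.grab(sct.monitors[1])  # primary monitor
        total_bytes += len(shot.rgb)      # uncompressed RGB payload
        time.sleep(5)

print(f"{total_bytes / 2**20:.0f} MiB of raw pixels per minute")
```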
