What a trash clickbait headline. That's not how the phrase "saying the quiet part out loud" works. This isn't a secret, it's not unspoken, and it certainly doesn't reveal some underlying motive.
It doesn't confuse us... it annoys us with blatantly wrong information, e.g. glue as a pizza ingredient.
I actually do care about AI PCs. I care in the sense that it is something I want to actively avoid.
I want to run LLMs locally, or things like TTS or STT locally, so it's nice, but there's no real support right now.
Most people won't care about it, nor use it.
LLMs are best used when it’s a user choice, not a platform obligation
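For the local-only use case described above, a minimal sketch using the llama-cpp-python bindings; the model path is a placeholder for any GGUF model you've already downloaded:

```python
# A minimal sketch of local-only LLM use: nothing leaves this machine.
# The model path below is a placeholder assumption, not a real file.
from llama_cpp import Llama

llm = Llama(model_path="./models/some-model.gguf", n_ctx=2048)

# Prompt and completion both stay local; no cloud API involved.
out = llm("Q: What is an NPU? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```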
I guess an NPU is better off as a PCIe peripheral, then?
And it could then have its specialised RAM too.
Sorry, I’m not a hardware expert at all
When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?
I guess the main point of NPUs is that they are tiny and built in.
> When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?
Yes, similar to what a PCIe Graphics Card does.
A PCIe slot is the slot on a desktop motherboard that lets you fit various things: networking cards (Ethernet, Wi-Fi and even RTC specialised stuff), sound cards, graphics cards, SATA/SAS adapters, USB adapters and all kinds of other stuff (see the sketch at the end of this comment).
> I guess the main point of NPUs is that they are tiny and built in.
GPUs are also available built-in. Some of them are even tiny.
Go 11-12 years back in time and you'll see video processing units embedded in the motherboard instead of in the CPU package.
Eventually some people will want more powerful NPUs with RAM better suited for neural workloads (GPUs have their own type of RAM too), won't care about the NPU in the CPU package, and will feel like they're uselessly paying for it. Others won't need an NPU at all and will also feel like they're uselessly paying for it.
So it's much better to have NPUs made separately in different tiers, similar to what is done with GPUs right now.
And even external (PCIe) Graphics Cards can be thin and light instead of being a fat package. It's usually just the (i) extra I/O ports and (ii) the cooling fins+fans that make them fat.
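To make the "various things on the bus" point above concrete, a small sketch (assuming Linux, where sysfs exposes this) that lists PCI/PCIe devices much like `lspci` does under the hood:

```python
# Enumerate devices on the PCI/PCIe bus by reading sysfs (Linux-only assumption).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()  # e.g. 0x030000 = VGA controller
    vendor = (dev / "vendor").read_text().strip()
    device = (dev / "device").read_text().strip()
    print(f"{dev.name}  class={pci_class}  vendor={vendor}  device={device}")
```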
Thanks for your answer
> So it's much better to have NPUs made separately in different tiers, similar to what is done with GPUs right now.
Overall yeah, but built-in graphics are remarkably efficient, and they have the added benefit of being there even if you didn't plan on that use initially. I'm glad to be able to play video games on my laptop that was meant to be used for work only.
Similarly, I had no interest in getting an NPU for this laptop, but I found some use for it (well, once it finally supports what I want to do).
Manufacturers will never include a niche option, or will overprice it. Built-in lets you get it directly.
Yeah, I'm not sure what the point of a cheap NPU is.
If you don't like AI, you don't want it.
If you do like AI, you want a big GPU or to run it on somebody else's much bigger hardware via the internet.
A cheap NPU could have some uses. If you have a background process that runs continuously, offloading the work to a low-cost NPU can save you both power and processing. For example:
- Camera-based presence detection: if you get up, it locks; if you sit down, it unlocks. No reason to burn a CPU core or the GPU for that (a rough sketch of this is below).
- Recognition on security/nanny cameras.
- Driver-monitoring systems that detect a driver losing consciousness and pull the car over.
We can accomplish all of this now with CPUs/GPUs, but purpose-built systems that don't drain other resources aren't a bad thing.
Of course, there's always the downside that they use that chip for Recall. Or malware gets hold of it for Recall-style scraping or ID theft. There's a whole lot of bad you can do with a low-cost NPU too :)
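A rough sketch of the camera-based lock/unlock loop from the list above, using OpenCV's bundled Haar face detector as a stand-in for whatever model a real NPU would run (camera index 0 and the one-second duty cycle are assumptions):

```python
# Always-on presence detection loop; exactly the kind of background work
# you'd rather offload to an NPU than keep on a CPU core or the GPU.
import time
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cam = cv2.VideoCapture(0)  # assumed default webcam

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    print("unlock" if len(faces) else "lock")  # stand-in for the real action
    time.sleep(1)  # low duty cycle; an NPU would make even this nearly free
```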
They said they're still adding all of it. They are adding AI, just not talking about it. Which is probably correct 😂
> be me
> installed VScode to test whether language server is just unfriendly with KATE
> get bombarded with "try our AI!" type BS
> vomit.jpg
> managed to test it, but the AI turns me off
> immediately uninstalled this piece of glorified webpage from my ThinkPad
It seems I'm having to do more work with KATE. (Does the LSP plugin for KATE handle stuff differently from the standard in some known way?)
If you're still reading this: I modified the code of the language server, so it now works with KATE...
The world is healing
I'm bracing for some new bullshit. I just hope it's not tech-related.
Does a third world war count as tech related? It certainly uses a lot of tech!

Stolen from BSKY
This is extra funny to me since I just re-watched this episode the other day
Weirdly, Dell always seems to understand what normal users want.
The problem is normal users have beyond-low expectations, no standards, and are ignorant of almost everything tech-related.
They want cheap and easy-to-use computers that require no service, and if there is a problem, a simple phone number to call for help.
Dell has optimized for that. So hate 'em or not, while their goods have gone to shit quality-wise, they understand their market and have done extremely well in serving it.
Thus I am not surprised at all that Dell understood this. If anything, I would have been more surprised if they didn't.
I think they all understand what we want (broadly), they just don't care, because what they want is more important, and they know consumers will tolerate it.
What people don't want is blackbox AI agents installed system-wide that use the carrot of "integration and efficiency" to justify bulk data collection, that the end user implicitly agrees to by logging into the OS.
God forbid people want the compute they are paying for to actually do what they want, and not work at cross purposes for the company and its various data sales clients.
Unveiling: the APU!!! (ad processing unit)
I'd much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.
There are very few people who would benefit from an added NPU. OK, I hear you say: what about local AI?
Ok, what about it?
Would you trust a commercial local AI tool to not be sharing data?
Would your grandmother be able to install an open source AI tool?
What about having enough RAM for the AI tool to run?
Look at the average computer user, if you are on lemmy, chances are very high that you are far more advanced than the average computer user.
I am talking about those users who don't run an ad blocker, don't notice the YT ad-skip button, and who in the past would have installed a minimum of five toolbars in IE yet wouldn't have noticed the reduced view of the actual page.
These people are closer to the average users than any of us.
Why do they need local AI?
Just offer NPUs as PCIe extension cards. That's how computers used to be and should be: modular and versatile.
Not the position Dell is taking, but I've been skeptical that building AI hardware directly into laptops specifically is a great idea unless people have a very concrete goal, like text-to-speech, and existing models to run on it, probably specialized ones. This is not to diminish AI compute elsewhere.
Several reasons.
- Models for many useful things have been getting larger, and you have a bounded amount of memory in those laptops, which, at the moment, generally can't be upgraded (though maybe CAMM2 will improve the situation and move back away from soldered memory). Historically, most users did not upgrade memory in their laptop even if they could. Just throwing the compute hardware in there in the expectation that models will come is a bet that the size of the models people might want to use won't get a whole lot larger. This is especially true for the next year or two, since we expect high memory prices, with people probably being priced out of sticking very large amounts of memory in laptops (see the back-of-the-envelope sketch after this list).
- Heat and power. The laptop form factor exists to be portable. Laptops are not great at dissipating heat, and unless they're plugged into wall power, they have sharp constraints on how much power they can usefully use.
- The parallel-compute field is rapidly evolving. People are probably not going to throw out and replace their laptops on a regular basis to keep up with AI stuff (much as laptop vendors might be enthusiastic about this).
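To put numbers on the memory-bound point in the first bullet, a back-of-the-envelope sketch: weight memory is roughly parameter count times bytes per parameter, before KV cache and activations (the model sizes and quantization levels below are illustrative assumptions):

```python
# Rough weight-memory footprint: params x bytes-per-param, in GiB.
def model_weight_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 2**30

for params in (7, 13, 70):
    for name, bpp in (("fp16", 2.0), ("int8", 1.0), ("4-bit", 0.5)):
        print(f"{params}B @ {name}: ~{model_weight_gb(params, bpp):.1f} GiB")
# A 70B model at fp16 (~130 GiB) is simply out of reach of a soldered-RAM laptop.
```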
I think that a more-likely outcome, if people want local, generalized AI stuff on laptops, is that someone sells an eGPU-like box that plugs into power and into a USB port or via some wireless protocol to the laptop, and the laptop uses it as an AI accelerator. That box can be replaced or upgraded independently of the laptop itself.
When I do generative AI stuff on my laptop, for the applications I use, the bandwidth that I need to the compute box is very low, and latency requirements are very relaxed. I presently remotely use a Framework Desktop as a compute box, and can happily generate images or text or whatever over the cell network without problems. If I really wanted disconnected operation, I'd haul the box along with me.
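A minimal sketch of that remote-compute-box setup, assuming the box runs an Ollama server; the hostname and model name are placeholders, not anything from my actual setup:

```python
# Laptop-side client for a remote compute box. Bandwidth needs are tiny:
# a short prompt goes out, generated text comes back.
import requests

resp = requests.post(
    "http://compute-box.local:11434/api/generate",  # assumed hostname/port
    json={"model": "llama3", "prompt": "Hello from my laptop", "stream": False},
    timeout=120,  # generation is slow; latency requirements are relaxed
)
print(resp.json()["response"])
```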
EDIT: I'd also add that all of this is also true for smartphones, which have the same constraints, and harder limitations on heat, power, and space. You can hook one up to an AI accelerator box via wired or wireless link if you want local compute, but it's going to be much more difficult to deal with the limitations inherent to the phone form factor and do a lot of compute on the phone itself.
EDIT2: If you use a high-bandwidth link to such a local, external box, bonus: you also potentially get substantially-increased and upgradeable graphical capabilities on the laptop or smartphone if you can use such a box as an eGPU, something where having low-latency compute available is actually quite useful.
Dell is the first Windows OEM to openly admit that the AI PC push has failed. Customers seem uninterested in buying a laptop because of its AI capabilities, likely prioritizing other aspects such as battery life, performance, and display above AI.
Silicon Valley has always had the annoying habit of pushing technology-first products without much consideration of how they would solve real-world problems. It always had it, but it's becoming increasingly bad. When Zuck unveiled the Metaverse it was already starting to be ludicrous, but with the AI-laptop wave it turned into Onion territory.
What do you mean? Do you even have ANY foundation to this accusation?
Hold on, I need to turn off my heater. 22211123222234663fffvsnbvcsdfvxdxsssdfgvvgfgg
There it is. The off button. Touch controls are so cool guys.
Ha! Enjoy your off button while they still make them. Once our AI Overlords have won the War, you'll only be able to politely ask your laptop to please temporarily quiet itself, please and thank you, if it's not too much to ask.
Doesn't confuse me, it just pisses me off trying to do things I don't need or want done. It creates problems to find solutions to.
Holy crap, that Recall app that "works by taking screenshots" sounds like such a waste of resources. How often would you even need that?
Virtually everything described in this article already exists in some way...