this post was submitted on 21 Jan 2026
1263 points (98.6% liked)

Workers should learn AI skills and companies should use it because it's a "cognitive amplifier," claims Satya Nadella.

in other words please help us, use our AI

top 50 comments
[–] boaratio@lemmy.world 3 points 5 hours ago

Did they ever have social permission in the first place?

[–] SaveTheTuaHawk@lemmy.ca 1 points 5 hours ago

Cute... like they need permission for anything.

[–] abbiistabbii@lemmy.blahaj.zone 2 points 6 hours ago

So you admit it. You admit AI isn't useful.

[–] AniZaeger@lemmy.world 1 points 6 hours ago

I'm all for an AI tax to offset the jobs replaced, with that tax going towards providing a universal basic income to those displaced workers, for life. Maybe then AI will actually be useful for something.

[–] EightBitBlood@lemmy.world 1 points 6 hours ago

Buy my snake oil. It's a cognitive amplifier. I just need several neighborhoods' worth of electricity to make a bottle. Better find a use for this oil, otherwise I'll get lynched!

  • Microsoft CEO 2025.
[–] AlexLost@lemmy.world 25 points 23 hours ago (1 children)

You already don't have social permission to do what you are doing, and that hasn't stopped you. The world is bigger than the 10 people around your boardroom table.

[–] Canconda@lemmy.ca 5 points 22 hours ago (1 children)

Yeah, but what he means is that nobody is cutting power lines or driving trucks through walls yet.

load more comments (1 replies)
[–] Ruigaard@slrpnk.net 13 points 23 hours ago (3 children)

Isn't there plenty of research showing it's the opposite of a cognitive amplifier? People get cognitively lazy using AI.

[–] BarneyPiccolo@lemmy.today 2 points 21 hours ago

Geez, CEOs don't need any more excuses to be lazier. Are they gonna farm out the pointing, the ordering people to carry out underlings' ideas, and the tantrum-throwing to AI?

load more comments (2 replies)
[–] RabbitBBQ@lemmy.world 6 points 21 hours ago

He could set an example by replacing himself with AI

[–] zebidiah@lemmy.ca 10 points 1 day ago (3 children)

Honestly, this is the most reasonable take I have heard from tech bros on AI so far... Use it for something useful and stop using it for garbage!

AI has a million great uses that could make so many things so much easier, but instead we are building AI to undress women on Twitter.

[–] eestileib@lemmy.blahaj.zone 4 points 23 hours ago

"we" aren't doing that. The tech bros are.

[–] kiagam@lemmy.world 3 points 1 day ago

Honestly that is one of the only things it is doing well

[–] Fedizen@lemmy.world 2 points 1 day ago

Women and children

[–] MehBlah@lemmy.world 8 points 1 day ago

It's an admission that it isn't doing anything useful.

[–] H1AA6329S@lemmy.world 30 points 1 day ago

I hope all parties responsible for this garbage, including Microsoft, will pay a huge price in the end. Fuck all these morons.

Stop shilling for these corporate assholes or you will own nothing and will be forced to be happy.

[–] HaraldvonBlauzahn@feddit.org 50 points 1 day ago* (last edited 1 day ago) (1 children)

Literally burning the planet with power demand from data centers but not even knowing what it could possibly be good for?

That's eco-terrorism, for lack of a better word.

Fuck you.

load more comments (1 replies)
[–] BioDriver@lemmy.world 14 points 1 day ago

AI can absolutely be useful. But it's been wildly oversold, and the actual beneficial use cases are not nearly as profitable as the marketing around it suggests.

[–] ScoffingLizard@lemmy.dbzer0.com 5 points 1 day ago (1 children)

It would be more useful to replace CEOs with AI. Or maybe even my dog.

load more comments (1 replies)
[–] SaveTheTuaHawk@lemmy.ca 4 points 23 hours ago

"But we're rendering naked children as fast as we can"

[–] GuyIncognito@lemmy.ca 3 points 22 hours ago (1 children)

Quick! Jam it into toasters! Put LLMs into keyboards so they rewrite everything you type! AI-powered screwdrivers!

[–] qyron@sopuli.xyz 2 points 21 hours ago

Toasted AI! The best kind of AI!

[–] BilSabab@lemmy.world 3 points 22 hours ago

Just investing in better tech instead of stacking GPUs might suffice too

[–] Sunflier@lemmy.world 5 points 1 day ago

We're not replacing workers fast enough.

-This jackass, essentially.

[–] Aceticon@lemmy.dbzer0.com 16 points 1 day ago* (last edited 1 day ago) (3 children)

AI isn't at all reliable.

Worse, its failures are uniformly distributed across the seriousness of their consequences - i.e. it's just as likely to make small mistakes with minuscule consequences as major mistakes with deadly ones - which is worse than even the most junior of professionals.

(This is why, for example, an LLM can advise a person with suicidal ideation to kill themselves.)

Then on top of this, it will simply not learn: if it makes a major deadly mistake today and you try to correct it, it's just as likely to make a major deadly mistake tomorrow as it would be if you didn't try to correct it. Even if you have access to actually adjust the model itself, correcting one kind of mistake just moves the problem around and is akin to trying to stop the tide on a beach with a sand wall - the only way to succeed is to have a sand wall for the whole beach, by which point it's in practice not a beach anymore.

You can compensate for this by having human oversight of the AI, but at that point you're back to paying humans for the work being done. So now, instead of just the cost of a human doing the work, you have the cost of the AI doing the work plus the cost of a human checking the AI's work - and the human has to check the entirety of it, since problems can pop up anywhere and take any form. Worse, unlike a human's, the AI's output is not consistent, so errors are unpredictable. On top of that, the AI will never improve: it will never pick up the kinds of improvements that humans doing the same work discover over time to make later work, or other parts of the work, easier (i.e. how increased experience means you learn to do little things that make your own work and even the work of others easier).

This seriously limits the use of AI to things where the consequences of failure can never be very bad (and for businesses, "not very bad" includes things like "doesn't significantly damage client relations", which is much broader than merely "isn't life-threatening" - which is why, for example, lawyers using AI to produce legal documents are getting into trouble when the AI cites made-up precedents). So it's mostly entertainment, plus situations where the AI alerts humans to something potentially present in a massive dataset: if the AI fails to spot it, that's alright, and if the AI spots something that isn't there, the subsequent human validation can dismiss it as a false positive (for example, face recognition in video streams for general surveillance, where the humans watching those streams are just as likely or more likely to miss it and an AI alert just results in a human checking it, or scientific research where one tries to find unknown relations in massive datasets).

So AI is a nice new technological tool in a big toolbox, not a technological and business revolution justifying the stock market valuations around it, the investment money sunk into it, or the huge amount of resources (such as electricity) used by it.

Specifically for Microsoft, there doesn't really seem to be any area where MS's core business value for customers gains from adding AI, in which case this "AI everywhere" strategy is an incredibly shit business choice that just burns money and damages brand value.

[–] Schadrach@lemmy.sdf.org 1 points 6 hours ago

So AI is a nice new technological tool in a big toolbox, not a technological and business revolution justifying the stock market valuations around it, the investment money sunk into it, or the huge amount of resources (such as electricity) used by it.

Specifically for Microsoft, there doesn't really seem to be any area where MS's core business value for customers gains from adding AI, in which case this "AI everywhere" strategy is an incredibly shit business choice that just burns money and damages brand value.

It's a shiny new tool that is really powerful and flexible, and everyone is trying to cram it everywhere. Eventually, most of those attempts will collapse in failure, probably causing a recession, and afterward the useful use cases will become part of how we all do things. AI is now where the internet was in the late 80s - just past the point of being only some academics fiddling with it in research labs, but not in any way a mature technology.

Most gaming PCs from the 2020s can run a model locally, though it might need to be a pruned one, so maybe a little farther along than that.
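
A minimal sketch of what running a model locally on that kind of hardware can look like, assuming the llama-cpp-python bindings and a quantized GGUF model; the model path and generation settings are placeholders, not anything from the thread:

```python
# Rough sketch: load a quantized model with llama-cpp-python and generate text.
# The .gguf file below is a placeholder; any small quantized model downloaded
# separately would do, and a mid-range gaming GPU (or even CPU-only) suffices.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-7b.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,       # modest context window to fit consumer RAM/VRAM
    n_gpu_layers=-1,  # offload as many layers as fit to the GPU; 0 = CPU only
)

result = llm(
    "In one sentence, why can quantized models run on consumer hardware?",
    max_tokens=96,
    temperature=0.7,
)
print(result["choices"][0]["text"].strip())
```

Quantization (4-bit in this sketch) is what shrinks the memory footprint enough for consumer hardware; the tradeoff is some loss in output quality.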

load more comments (2 replies)
[–] utopiah@lemmy.world 21 points 1 day ago* (last edited 1 day ago)

"bend the productivity curve" is such a beautiful way to say that they are running out of ideas on how to sell that damn thing.

It basically went from:

  • it's going to change EVERYTHING! Humanity as we know it is a thing of the past!

... to "bend the productivity curve". It's no longer about how it will "radically increase productivity", no, it's a lot more subtle than that - to the point that it can actually bend that curve down. What a shit show.

[–] LittleBorat3@lemmy.world 6 points 1 day ago (1 children)

Isn't it alleged that China goes for specific use cases and not general intelligence?

Maybe that's the way to go, not the gamble that the US and Western companies are taking.

[–] zebidiah@lemmy.ca 2 points 1 day ago

It's much easier to build an expert AI: you don't need it to have general knowledge, and you can feed it the relevant data and manuals. It takes a lot less processing power and generates much better, more accurate, and more relevant results. It also means there is a set quantity of data the AI can/needs to consume to know everything about a given subject.

Think about it like this: the Spotify library is something like 350 TB, but you won't ever need all of that. Exclude all the foreign-language content, podcasts, and genres you don't care about, and that 350 TB can get slimmed down to a much more relevant 35 TB. Specialist AI is kinda like that...
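
A minimal sketch of that kind of narrow "expert" setup, in the spirit of the comment above: retrieve the relevant chunk of a small, fixed corpus (the manuals) and only then hand it to whatever model sits on top. The manual snippets, the question, and the final model step are all made-up placeholders:

```python
# Rough sketch: retrieve from a small fixed corpus of "manuals" with TF-IDF,
# then build a tightly scoped prompt for whatever specialist model is used.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

manuals = [  # placeholder domain corpus
    "Model X toaster: hold the lever for 3 seconds to enable bagel mode.",
    "Model X toaster: the crumb tray slides out from the rear panel.",
    "Model Y kettle: descale monthly with a 1:1 vinegar and water mix.",
]

question = "How do I turn on bagel mode?"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(manuals)           # index the small corpus
query_vector = vectorizer.transform([question])           # vectorize the question the same way
scores = cosine_similarity(query_vector, doc_vectors)[0]  # similarity to each manual chunk
best_chunk = manuals[scores.argmax()]

# A narrow prompt for the (placeholder) specialist model that sits on top.
prompt = f"Answer using only this excerpt:\n{best_chunk}\n\nQuestion: {question}"
print(prompt)
```

In practice the retrieval step is usually done with embeddings rather than TF-IDF, but the shape is the same: a bounded corpus, a lookup, and a tightly scoped prompt.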

[–] RamRabbit@lemmy.world 28 points 1 day ago (2 children)

Just make Copilot its own program that can be uninstalled, remove it from everywhere else in the OS, and let it be. People who want it will use it, people who don't want it won't. Nobody would be pissed at Microsoft over AI if that's what they had done from the start.

[–] filcuk@lemmy.zip 18 points 1 day ago (1 children)

No, it will be attached to every application, as well as the Start menu, Settings, Notepad, Paint, Regedit, Calculator, and every other piece of Windows, you AI-hating swine.

[–] Stern@lemmy.world 14 points 1 day ago (3 children)

We attached it to the clock, in case you need it to get the time wrong.

load more comments (3 replies)
load more comments (1 replies)
[–] Johnnyboy7781@lemmy.world 2 points 22 hours ago

Funny seeing this post during a big Microsoft outage lmao

[–] Spazz@lemmynsfw.com 2 points 22 hours ago

The only thing it produces is heat

[–] saimen@feddit.org 16 points 1 day ago (3 children)

Eeh, didn't you pay attention in Economics 101? If you generate more supply than demand, that's a you problem. The free market will take care of it.

load more comments (3 replies)
[–] Doomsider@lemmy.world 24 points 1 day ago (3 children)

Delusional. They created a solution to a problem that doesn't exist in order to usurp power away from citizens and concentrate it in the hands of a minority.

This is the opposite of the information revolution. This is information capture. It will be sold back to the people it was taken from, distorted by special interests.

load more comments (3 replies)
[–] kescusay@lemmy.world 396 points 2 days ago (48 children)

"Cognitive amplifier?" Bullshit. It demonstrably makes people who use it stupider and more prone to believing falsehoods.

I'm watching people in my industry (software development) who've bought into this crap forget how to code in real time, while producing the shittiest garbage I've laid eyes on as a developer. And students who are using it in school aren't learning, because ChatGPT is doing all their work - badly - for them. The smart ones are avoiding it like the blight on humanity that it is.

load more comments (48 replies)
[–] itistime@infosec.pub 47 points 1 day ago

The oligarch class is again showing why we need to upset their cart.

[–] llama@lemmy.zip 45 points 1 day ago (7 children)

As far as I can tell, there hasn't been any tangible reward - pay increases, promotions, or external recruitment - from using the cognitive amplifier.

load more comments (7 replies)
[–] Siegfried@lemmy.world 48 points 1 day ago (2 children)

Social permission? I don't remember us having a vote or anything on this bullshit.

load more comments (2 replies)
[–] rustydrd@sh.itjust.works 52 points 1 day ago
[–] FreddiesLantern@leminal.space 96 points 2 days ago (13 children)

How can you lose social permission that you never had in the first place?

load more comments (13 replies)
[–] OshagHennessey@lemmy.world 66 points 1 day ago (1 children)

"Microsoft thinks it has social permission to burn the planet for profit" is all I'm hearing.

load more comments (1 replies)
[–] BarneyPiccolo@lemmy.today 1 points 21 hours ago

Yeah, I'm already there.

load more comments