this post was submitted on 06 Jan 2026
34 points (97.2% liked)

Technology

41184 readers
277 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 4 years ago
top 11 comments
[–] Midnitte@beehaw.org 18 points 4 days ago (2 children)

There's a great talk here where he discusses using local models in ways I could see actually being useful.

Hopefully we get there and the ridiculous 5000% markup on memory goes away.

[–] Flax_vert@feddit.uk -1 points 4 days ago

There are a bunch of useful ways. For example, I was toying with a Minecraft server where people start a country, and a local LLM can come up with a country code for their country's name, following the ISO 3166 standard, in a few minutes.
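A minimal sketch of that idea: ask a locally hosted model for an unassigned ISO 3166-1 alpha-2 code, falling back to a deterministic pick if no model server is running. The Ollama-style endpoint, the model name, and the fallback heuristic here are all assumptions for illustration, not details from the comment.

```python
# Hypothetical sketch: ask a local LLM (Ollama's /api/generate endpoint
# assumed) for an ISO-3166-style alpha-2 code for a fictional country,
# with a deterministic fallback when no server is reachable.
import itertools
import json
import string
import urllib.request

# Codes already assigned by ISO 3166-1 (abbreviated here; the real
# standard defines ~250 entries).
ASSIGNED = {"US", "GB", "DE", "FR"}


def fallback_code(name: str, assigned: set) -> str:
    """Derive an unassigned alpha-2 code from the country name itself."""
    letters = [c for c in name.upper() if c in string.ascii_uppercase]
    # Prefer pairs of letters drawn from the name, in order.
    for a, b in itertools.combinations(letters, 2):
        if a + b not in assigned:
            return a + b
    # Otherwise take the first free code from the full AA..ZZ space.
    for a, b in itertools.product(string.ascii_uppercase, repeat=2):
        if a + b not in assigned:
            return a + b
    raise RuntimeError("all 676 alpha-2 codes are taken")


def suggest_code(name: str,
                 url: str = "http://localhost:11434/api/generate") -> str:
    prompt = (f"Suggest an unused ISO 3166-1 alpha-2 code for a fictional "
              f"country named {name!r}. Reply with two capital letters only.")
    try:
        req = urllib.request.Request(
            url,
            data=json.dumps({"model": "llama3", "prompt": prompt,
                             "stream": False}).encode(),
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            code = json.load(resp)["response"].strip().upper()[:2]
            if len(code) == 2 and code not in ASSIGNED:
                return code
    except OSError:
        pass  # no local model available; fall back deterministically
    return fallback_code(name, ASSIGNED)
```

For a fictional "Freedonia", the fallback would skip "FR" (taken by France) and settle on "FE"; the model call, when available, just replaces that heuristic with something more thematic.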

[–] msage@programming.dev 0 points 4 days ago (1 children)

Canonical talking about UX AI?

LoL.

LMAO even.

[–] Midnitte@beehaw.org 6 points 4 days ago

It's actually a really good talk from someone with decades of UX experience. The focus was more on innovation in UX (with the example that Microsoft got AI in Windows very... very wrong).

[–] kungen@feddit.nu 12 points 4 days ago (1 children)

How's that good news? It sounds like they are just double-dipping...

[–] CalcProgrammer1@lemmy.today 11 points 4 days ago

My only takeaway that could be seen as good news is that they at least expect consumers to have access to local computing power strong enough to run local AI, and that computing power is very likely in the form of GPUs that can also be used for PC gaming. Hopefully this means there's still some focus on consumer GPUs somewhere out there rather than just selling them all to OpenAI.

[–] artyom@piefed.social 10 points 4 days ago* (last edited 4 days ago)

Doesn't really make much sense. I mean yeah, privacy and all of that but think of the environmental impact of 1000 inefficient PCs vs. 1 efficient PC shared by 1000 people. Maybe just open source models hosted by a community would be better.

Or better yet just forget about it entirely.

[–] Megaman_EXE@beehaw.org 6 points 4 days ago* (last edited 4 days ago) (1 children)

The only way I would be comfortable with AI is if I could craft it myself, run it locally, keep it from feeding me bullshit results, keep it energy efficient, stop it from phoning home, and know it wasn't built off of stolen data and didn't give profit to big companies.

[–] t3rmit3@beehaw.org 4 points 3 days ago (1 children)

I mean... You can. You can train and run models yourself. Lots of people and orgs do.

[–] Megaman_EXE@beehaw.org 2 points 3 days ago

Yeah, that's true! I hope the world shifts toward that rather than what most people have been doing.

[–] Quexotic@beehaw.org 3 points 3 days ago

Fantastic. They'll make US pay for it. There's no way they don't turn this into something more evil.