this post was submitted on 10 Apr 2025
134 points (96.5% liked)

Games

18446 readers

Video game news oriented community. No, NanoUFO is not a bot :)

Posts.

  1. News-oriented content (general reviews, previews, or retrospectives allowed).
  2. Broad discussion posts (preferably not only about a specific game).
  3. No humor/memes, etc.
  4. No affiliate links.
  5. No advertising.
  6. No clickbait, editorialized, or sensational titles. State the game in question in the title. No all caps.
  7. No self-promotion.
  8. No duplicate posts; the newer post will be deleted unless it has more discussion than the older one.
  9. No politics.

Comments.

  1. No personal attacks.
  2. Obey instance rules.
  3. No low-effort comments (one or two words, emoji, etc.).
  4. Please use spoiler tags for spoilers.

My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.

Other communities:

Beehaw.org gaming

Lemmy.ml gaming

lemmy.ca pcgaming

founded 2 years ago
[–] [email protected] 55 points 2 days ago* (last edited 2 days ago) (3 children)

There’s what AI could’ve been (collaborative and awesome), and then there’s what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we’ve had forced upon us today.

Yesterday, Copilot hallucinated four different functions when I asked it to refactor a ~20-line TS function, despite my handing it two helper files that contained everything it needed. If I can't confidently ask it to do anything, it's immediately useless to me. It's like being stuck with a compulsive liar that you have to get the truth out of.
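
To make that concrete, here's a purely hypothetical sketch of the failure mode (every name in it is invented for illustration, not from my actual code):

```typescript
// What the helper file actually exports (hypothetical example):
export function parseScore(raw: string): number {
  const n = Number(raw.trim());
  return Number.isFinite(n) ? n : 0;
}

// What a hallucinated "refactor" tends to come back with: calls to
// helpers that exist nowhere in the files it was given, e.g.
//
//   const n = normalizeScore(raw);   // never defined anywhere
//   return clampScore(n, 0, 100);    // never defined anywhere
//
// The suggestion won't even compile, despite parseScore() above
// already doing the job.
```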

[–] [email protected] 3 points 1 day ago

Dude, I couldn't even get Copilot to generate a picture at the size I wanted, despite specifying the exact pixel height and width.

[–] [email protected] 12 points 2 days ago (1 children)

A guy I used to work with would, I'd swear it, submit shit code just so I would comment about the right way to do it, no matter how many times I told him how to do something. Sometimes it was code that didn't actually do anything. Working with Copilot is a lot like working with that guy again.

[–] [email protected] 18 points 2 days ago

Funny enough, here’s a description of AI I wrote yesterday that I think you’ll relate to:

AI is the lazy colleague that will never get fired because their dad is the CTO. You’re forced to pair with them on a daily basis. You try to hand them menial tasks that they still manage to get completely wrong, while dear ol’ dad is gassing them up in every all-hands meeting.

[–] [email protected] 5 points 2 days ago (1 children)

It's fundamentally a make-shit-up device. It's like pulling words out of a hat. You cannot get mad at the hat for giving you poetry when you asked for nonfiction.

Get mad at the company which bolted the hat to your keyboard and promised you it was psychic.
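
The hat metaphor is closer to the mechanics than it sounds; the output step really is a weighted random draw. A minimal sketch with toy numbers and invented tokens (not any real model's API):

```typescript
type Candidate = { token: string; prob: number };

// Weighted random draw: the "hat". Probabilities sum to ~1.
function sampleToken(candidates: Candidate[]): string {
  let r = Math.random();
  for (const c of candidates) {
    r -= c.prob;
    if (r <= 0) return c.token;
  }
  return candidates[candidates.length - 1].token;
}

// The weights encode plausibility, not truth, so fluent nonsense
// can win the draw a large fraction of the time:
console.log(sampleToken([
  { token: "accurate answer", prob: 0.6 },
  { token: "confident fabrication", prob: 0.4 },
]));
```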

[–] [email protected] 2 points 1 day ago

I think that's exactly who they're mad at.

[–] [email protected] 93 points 2 days ago* (last edited 2 days ago) (44 children)

Considering that the AI craze is what's fueling the shortage and massive increase in GPU prices, I really don't see gamers ever embracing AI.

[–] [email protected] 69 points 2 days ago

> [...] I really don’t see gamers ever embracing AI.

They've spent years training to fight it, so that tracks.

[–] [email protected] 12 points 2 days ago* (last edited 2 days ago) (6 children)

The Nvidia GPUs in data centers are separate from gaming GPUs (they're even fabbed on different nodes, with different memory chips). The sole exceptions are the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD.

…No, it’s just straight-up price gouging and anticompetitive behavior. It’s Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction, even though Battlemage is excellent.

For local AI, the only things that get sucked up are 3060s, 3090s, and (for the rich/desperate) 4090s/5090s; anything else is a waste of money with too little VRAM. And this is a pretty small niche.
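
For a ballpark sense of why VRAM is the cutoff (rough arithmetic with assumed quantization levels, not vendor specs): a model's weights alone need about parameters × bits-per-weight ÷ 8 bytes.

```typescript
// Back-of-envelope: weights-only memory for a quantized model.
function weightsGiB(paramsBillions: number, bitsPerWeight: number): number {
  return (paramsBillions * 1e9 * bitsPerWeight) / 8 / 1024 ** 3;
}

console.log(weightsGiB(7, 4).toFixed(1));  // ~3.3 GiB: fine on a 12 GB 3060
console.log(weightsGiB(70, 4).toFixed(1)); // ~32.6 GiB: needs 24 GB cards
                                           // paired up, or unified memory
// KV cache and activations add more on top, which is why low-VRAM
// cards are a waste of money for this.
```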

[–] [email protected] 43 points 2 days ago (2 children)

Chip fab allocations are limited; whatever capacity AI datacenter chips take up, desktop GPUs don't get made with. And what's left of the desktop chips gets sold for workstation AI use, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile, they still sell 8 GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.

[–] [email protected] 10 points 2 days ago (1 children)

The fabs still have limited wafers. The chips going to datacenters could have been consumer stuff instead. Besides, they (Nvidia, Apple, AMD) all fabricate at TSMC.

Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI Max 300 chip (or whatever they call it) take off. Framework lets you configure a machine with that chip with up to 128 GB of RAM, IIRC. I believe that's the main reason Apple's memory upgrades cost a ton: so that Macs aren't a financially viable option for local AI applications.

[–] [email protected] 12 points 2 days ago

Carmack is an AI sent from the future, so he's a bit biased.
