
Some video games have been trying to use generative AI for years now, and for the most part people simply have not been having it. Why would we? It's lazy, it's ugly, it's an ethical black hole and it's being driven by an executive class desperate to lay off even more workers. While earlier and more brazen attempts at employing the tech were obvious, lately it's becoming more common for studios to slide a little AI-generated content in without drawing attention to it.

Jurassic World Evolution 3 launched with some AI-generated character portraits, then got bullied into removing them. Clair Obscur, which will be a lot of people's game of the year, appeared to quietly launch with some AI-generated art, then just as quietly patched it out. I was going to review the city-building grand strategy game Kaiserpunk until I saw they were using AI-generated images for their dialogue sections, after which I promptly uninstalled it.

The latest culprit is The Alters, which has been found to have shipped not only with AI-generated placeholder text in-game, but also with AI-generated translations in some of its side content. None of this was disclosed prior to the game's release; it was all discovered later, by players, and has prompted an explanation of sorts from the developers that tries to calm everyone down but has only made things worse, because if it took players discovering these specific instances to reveal that 11 Bit had used AI-generated content in the game's development, how do we know there isn't more of it?

hedgehog@ttrpg.network:

Fair point; I should have asked about commercial games in general.

That said, I didn’t mean that the game studio itself would do the AI training and own the models in-house; if it did, I’d expect that to go just as poorly as you would. Rather, I’d expect the model to be created by an organization that specializes in that sort of thing.

For example, “Marey” is one GenAI model I found whose creators say it was trained ethically.

Another is Adobe Firefly, which Adobe says was trained only on licensed and public domain content. It also sounds like Adobe is paying the artists whose work was used for training. I believe Canva is doing something similar.

Stability AI is doing something similar with Stable Audio 2.0: they partnered with a music licensing company, AudioSparx, to ensure that artists are compensated, AI opt-outs are respected, etc.

I haven’t dug into any of those too deeply, but at a surface level they at least seem to be heading in the right direction.

One of the GenAI scenarios that’s most terrifying to me is the idea of a company like Disney training its own proprietary GenAI image, audio, and video tools on all the material it holds copyright to… not because I think the outputs would be bad, but because of the impact that would have on creators in that industry.

Fortunately, as long as copyright doesn’t apply to purely AI-generated outputs, even from a model trained entirely on a company’s own content, I don’t think Disney specifically will do this.

I mention that as an example because that use of AI, regardless of how ethically the model was trained, would still be unethical in my opinion. Likewise in game creation, an ethically trained and operated model could still be used unethically to eliminate many people’s jobs solely in the interest of higher profits.

I’d be on board with AI use (in game creation or otherwise) if a company were to say, “We’re not changing the budget we have for our human workforce, including contractors, licensed art, and so on, other than increasing it as inflation and wages rise. We will use ethical AI models to create more content than we otherwise would have been able to.” But I feel like in a corporate setting, its use is almost always going to result in cutting jobs.