this post was submitted on 01 Jan 2024
182 points (92.9% liked)

Games

[–] [email protected] 1 points 10 months ago (1 children)

There are some fundamental obstacles to that. I don't want, for instance, a game AI that only does what I tell it to do; I want to be surprised and presented with situations I hadn't considered. LLMs, however, replicate the language and symbol patterns they were trained on. Their tendency is toward cliché, because cliché is the most statistically expected outcome of any narrative situation.

There is also the matter that LLMs ultimately have no real understanding of, or opinions about, the world and its themes. They can give us descriptions of trees, and diffusion models can give us pictures of trees, but they don't know what a tree is. They lack the experiential and emotional capacity to make up their own minds about what a tree is and represents; they can only use and remix our words. For them to say something unique about trees, they are essentially trying random variations until something sticks, with no real basis of their own. We do not have true general AI with that level of understanding and introspection.

I suppose that sufficiently advanced and thorough modelling might give them the appearance of these qualities... but at that point, why not just have the developers write those worlds and characters? Sure, that content is far more limited than the potentially infinite output of an LLM, but as you wring endless content from an LLM, you will most likely drift outside the scope of whatever parameters it was given and back into clichés and nonsense.

To be fair, though, that depends on the type of game we are talking about. I doubt an LLM-driven Baldur's Gate would come anywhere close to the real thing. But I suppose it could work for a game like Animal Crossing, where we don't mind the little characters constantly rambling catchphrases and nonsense.

[–] [email protected] 2 points 10 months ago

I mostly agree, but I think that in some cases cliché is exactly what we need. Used well, AI could generate the background dialogue that generic NPCs have in open-world games.
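A minimal sketch of what that could look like (all names here are hypothetical, not from any real engine or model API): a bark function for background NPCs that falls back to pre-written clichés, with an optional pluggable generator where an LLM call could slot in.

```python
import random

# Canned clichés keyed by location; an LLM could augment or replace
# this pool at runtime (hypothetical integration point).
BARKS = {
    "market": ["Fresh fish, caught this morning!", "Best prices in town!"],
    "tavern": ["Another round over here!", "Did you hear about the dragon?"],
}

def npc_bark(location: str, llm=None) -> str:
    """Return one line of throwaway background dialogue.

    `llm` is a hypothetical callable (prompt -> str). When no model is
    available, we fall back to pre-written lines, which for generic
    crowd chatter is often good enough.
    """
    if llm is not None:
        return llm(f"One short line of chatter for an NPC in a {location}.")
    return random.choice(BARKS.get(location, ["Nice weather today."]))
```

The point of the fallback is exactly the cliché argument above: for ambient chatter, players barely listen, so a cheap canned line and an expensive generated one are nearly interchangeable.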

Overall I think AI is nowhere near advanced enough to be used at large scale in gaming, but it will probably get there in 5 to 10 years if it keeps advancing at this rate.

The main issue I see is that running neural networks natively takes specialized hardware that most personal PCs don't have, so you are stuck with always-online inference, classical machine learning, or pre-processed data.
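The pre-processed-data option can be sketched like this (the `generate` callable stands in for a hypothetical model run on the developer's hardware): dialogue is baked offline at build time and shipped as plain data, so the player's machine does no inference at all.

```python
import json

def bake_dialogue(generate, contexts, lines_per_context=3):
    """Run a (hypothetical) generator offline at build time and return
    the results as plain data the game can ship, so the shipped game
    needs neither a network connection nor neural hardware."""
    return {c: [generate(c) for _ in range(lines_per_context)]
            for c in contexts}

# Build time, on a machine that can actually run the model:
baked = bake_dialogue(lambda c: f"[line about {c}]", ["village", "forest"])
blob = json.dumps(baked)  # written into the game's data files

# Runtime, on the player's PC: just load the JSON and pick lines.
loaded = json.loads(blob)
```

The trade-off is the one described earlier in the thread: baked content is finite and fixed at build time, which is precisely what you give up the "infinite LLM responses" for.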