Back in early 2024, Valve introduced new rules requiring game developers on Steam to disclose some information about their use of generative AI, and they now appear to have tweaked them.
As spotted by GameDiscoverCo in a post on Bluesky, the form developers have to fill out has seen a few tweaks in the wording, mainly to clarify that it covers content that is actually seen and consumed by players - from marketing materials on the Steam page to content in the game itself - but not AI tool helpers used in the development environment. As Valve now say on the form:
"Efficiency gains through the use of these tools is not the focus of this section. Instead, it is concerned with the use of AI in creating content that ships with your game, and is consume by players. This includes content such as artwork, sound, narrative, localization, etc."
So it's all about what we actually see, split between pre-generated and live-generated content, which have separate sections for developers to tick, and developers are still required to write a statement on what's used, which is displayed on the Steam store page.
To me, it seems like a pretty sane clarification to make. And you can still use the AI browser extension to better highlight games with generative AI on Steam.
"This includes content such as artwork, sound, narrative, localization, etc."It's so funny how "code" is conveniently absent in that list (or is it "etc."?) I wonder if that is because you can't enforce what you can't prove is in the product, anyway? Or because they realized that the vast majority of programmers is using at least some AI-generated content these days? Or simply because consumers don't "consume" code, because they can't see what it does? Or because vibe-coding is a legitimate efficiency tool in their view, while efficiency gains by generating AI art assets are not?
In any case, as a person whose code has very likely been used for AI training, I call hypocrisy on it. Apparently the "poor artists" are entitled to protective measures, while coders aren't. Which reflects the vibe I'm getting from the anti-AI crowd, really. One would think that people who hate AI should hate all of it equally, at least.
Let's translate Valve's message:
"Efficiency gains through the use of these tools is not the focus of this section."
Translation: "Basically everyone is using AI."
"Instead, it is concerned with the use of AI in creating content that ships with your game, and is consumed by players. This includes content such as artwork, sound, narrative, localization, etc."
Translation: "Let's stop being transparent & declare only stuff we obviously can't deny."
But reading the article it seems to work - ignorance is bliss.
Quoting: poiuz: Let's translate Valve's message: [...]
Of course. It's a tool. Nobody stops you from taking the horse, but I'll probably be faster than you using my car.
I really don't get the negative resentment from a lot of people towards AI. It's a tool. It can't replace people, although many seem to think so. Right now, it simply can't. I mean, you probably could for some smaller things, but quality-wise it's not the best idea.
What it does is aid you, let you iterate faster, do the annoying things... *shrugs*
Related to the code, I couldn't care less if someone uses AI to aid in coding, as long as the result is proper, clean code. And as a gamer, a stable game that doesn't crash or have weird bugs. How that is achieved, I really don't care, so why tick boxes for that?
Man, I can't believe we're still defending genAI. As I've pointed out in many other comments, the top reasons I hear for the "negative resentment" are, in no particular order:
1. Negative impact on the environment, slap bang in the middle of a climate crisis.
2. Driving job losses based on exaggerated claims of "efficiency".
3. Lowers IQ (I can't be bothered digging out the link yet again)
4. Slows down development (even in cases where developers claimed it sped them up, evidence showed otherwise)
5. Driving a nuclear age (Meta, Google and Microsoft have now all commissioned their own reactors)
6a. Societal impact - talking people into hurting others and/or themselves, sometimes leading to deaths.
6b. Societal impact - driving non-consensual nudity on Grok, including child pornography. When Musk learned of this, he paywalled the "feature". He paywalled it... not removed... paywalled it. FFS. Also see deepfakes of politicians, or fraud using social engineering techniques.
6c. Societal impact - genAI "slop" now devalues everything on the internet. When you see something cool, you think "meh, it's probably just AI shite". Or it actually IS shite, in which case genAI is in a race to the bottom, since the next generation of genAI will be trained on today's internet - mistakes will be compounded, biases reinforced.
7. Loss-leading pricing - hoping to hook consumers/enterprises, then putting prices up (see OpenAI adding adverts to ChatGPT).
8. Hallucination (multiple cases of invented bullshit, including court filings, leading to lawyers being disbarred).
9. Obnoxious marketing (see MS especially).
10. Diverting investment away from targeted solutions, and into a financial bubble (because of #7).
11. All genAI engines are built on plagiarised work, for which the original authors/artists got no recognition, nor commission. Same with code - all code was scraped, regardless of license, and that code can be regurgitated in new or snippet form by genAI, without recognition of that license.
12. Impact of website scraping by multiple companies building genAI models. Wikipedia in particular has had to actively block enormous IP ranges to prevent the scraping from leading them into financial ruin. Again, can't be bothered to find the link, but there's a Wikimedia blog talking about it.
Anyone offering the "it's just a tool" argument is being deliberately obtuse. They're basically arguing that the ends absolutely justify the means, no matter the cost.
And the cost is high. Big tech has absolutely no morals, and this is a race to the bottom, fueled by literally hundreds of billions of investment that could have made so much difference elsewhere.
But hey, it's just a tool, right?
Quoting: scaine: Man, I can't believe we're still defending genAI. As I've pointed out in many other comments, the top reasons I hear for the "negative resentment" are, in no particular order: [...]
You forgot point 13.
13. The complete and total destruction of the consumer home PC market from inflated parts costs & the move towards subscription-based 'AI' cloud gaming (and an AI Windows cloud OS).
14. Might end x86 Linux because of point 13.
Oh, and probably, if not regulated...
15. Surveillance capitalism, powered by AI.
But it can put a funny cape on your dog!
Quoting: scaine: 12. Impact of website scraping by multiple companies building genAI models. Wikipedia in particular has had to actively block enormous IP ranges to prevent the scraping from leading them into financial ruin. Again, can't be bothered to find the link, but there's a Wikimedia blog talking about it.
Wikipedia recently signed API access deals for AI training, so I guess getting paid for the training data is preferable to it just getting scraped.
The hate against all AI usage without distinction between usage as a tool and usage to replace people is something I just mostly ignore nowadays. As do most people, thankfully.
It is mostly Reddit-level & social media brainrot and can be discarded. Eventually people on those platforms move on to the next thing to hate on, as they always do.
Quoting: Kimyrielle: It's so funny how "code" is conveniently absent in that list (or is it hiding in "etc."?). I wonder if that is because you can't enforce what you can't prove is in the product anyway?
Well, yes.
You'd have to enforce open sourcing everything and even THEN you could almost never be certain.
Unenforceable rules are pointless.
Quoting: Kimyrielle: In any case, as a person whose code has very likely been used for AI training, I call hypocrisy on it. Apparently the "poor artists" are entitled to protective measures, while coders aren't.
You are mixing different things here.
Almost nobody got any money for having their "thing" used for AI training. At least coding-wise there were/are "AI trainers" - it is unclear to me how much of a share those have nowadays, but to my knowledge such a thing does not exist for artists or musicians.
So if anything, coders have it (a little bit) better here.
Quoting: scaine: Man, I can't believe we're still defending genAI. As I've pointed out in many other comments, the top reasons I hear for the "negative resentment" are, in no particular order: [...]
Practically all of these are examples of misuse and of the current Wild West, lawless state of the area.
Not arguments against AI use itself.
You are right about the bubble, of course, but that is just the normal hype cycle we've seen with all bigger technologies (just think of the internet & dotcom bubble).
What happens now really doesn't matter as much as what will happen after the bubble pops, which I assume will be a much more regulated and purpose-driven state (again, just like after the dotcom bubble).
Quoting: scaine: Anyone offering the "it's just a tool" argument is being deliberately obtuse.
I'd call it being level-headed and informed - as opposed to panic-mongering apocalyptic nonsense and giving in to such.
Quoting: MrBelles: Wikipedia recently signed API access deals for AI training, so I guess getting paid for the training data is preferable to it just getting scraped.
Of course. If what you have is going to be taken anyway, the best you can do is demand some compensation.
AI can be great, but THIS AI as it is used today is a smoking pile of bullshit.
I hope this will get corrected soon, so we all can benefit from this new technology without destroying the planet and human culture.
Quoting: poiuz"Basically everyone is using AI."
Quoting: KROM: Of course. It's a tool.
So what's stopping them from disclosure? Are they ashamed of using "a tool"? I don't see developers hiding the fact that they use Unity, and some consumers will avoid Unity, for example, because of specific problems Unity games tend to have on their computers. So what? It's our right. It's also our right to know if the code was produced using genAI, because, for example, some people have a stance against supporting such products due to the social or environmental costs, etc. Devs are free to choose their tools, consumers are free to choose their purchases.
We have obligatory lists of ingredients on food products, cosmetics, detergents - not only because of allergens, but simply because we (the consumers) have the right to know what kinds of shit went in there, before we buy. I'm not necessarily saying it should be obligatory to disclose the full toolbox used to make a game, but it would certainly be well received and I hope it will become a good practice.
Quoting: pb: So what's stopping them from disclosure?
Nothing. And plenty do.
Quoting: pb: Are they ashamed of using "a tool"?
Shame has nothing to do with that.
But you can see from the absolutely braindead "outrage" over Swen Vincke's very reasonable stance on AI usage why a developer wouldn't even want to talk about it at all.
Anyway, this is about Steam enforcing disclosure when it comes to generating content that is "consumed" with the product.
And as said already, that only makes sense with parts where that can even be reasonably checked.
You can't check whether someone uses Mistral in their IDE to help with debugging, as a more efficient "Google" with more context, not at all, or to write half their code with it - the dev can tell you exactly what they do, or not.
Quoting: pb: It's also our right to know if the code was produced using genAI
There is no such right.
You are hallucinating harder than ChatGPT on its worst day.
Quoting: pb: We have obligatory lists of ingredients on food products, cosmetics, detergents - not only because of allergens, but simply because we (the consumers) have the right to know what kinds of shit went in there, before we buy.
Apples and oranges.
You have a right to know what is in the product.
You have no right to know what brand of tool was used to harvest it, nor could that be reasonably checked.
Quoting: pb: I'm not necessarily saying it should be obligatory to disclose the full toolbox used to make a game, but it would certainly be well received and I hope it will become a good practice.
I agree, disclosure would be nice and IMO beneficial to devs.
But there is no right to that, and some will do it while others won't.
Quoting: pb: We have obligatory lists of ingredients on food products, cosmetics, detergents - not only because of allergens, but simply because we (the consumers) have the right to know what kinds of shit went in there, before we buy. I'm not necessarily saying it should be obligatory to disclose the full toolbox used to make a game, but it would certainly be well received and I hope it will become a good practice.
That example seems quite accurate to me:
The tools used to write the code are not an ingredient of the game. Unlike, e.g., the assets. Which seems quite close to what Valve is asking to be declared.



