Let’s talk about video games, creativity, and that buzzing, often thorny issue of Artificial Intelligence. Specifically, let’s talk about a cracking-looking game called The Alters, from the well-regarded folks at 11 bit studios – yes, the same crew behind emotionally heavy hitters like This War of Mine. You might have seen trailers; it’s a sci-fi tale about a chap named Jan Dolski trying to survive on a hostile planet by creating alternate versions of himself from different life choices. A clever premise, and one that really makes you think about paths not taken.
But lately, the chatter around The Alters hasn’t been about its existential themes or intriguing gameplay mechanics. Oh no. It’s been about something rather less philosophical and a lot more… silicon-based. Fans have been poking around, doing what dedicated communities do best – scrutinising every pixel, every line of text, every translated phrase. And what they’ve reportedly found has kicked off a bit of a kerfuffle: evidence suggesting the game has been making rather extensive use of generative AI, seemingly without telling anyone upfront.
So, What Exactly Did They Find? And Why the Fuss?
The reports coming from the community forums and social media circles point to AI involvement across several areas of the game’s development. We’re talking about the visual side – images, perhaps concept art or even textures – where people claim to see those tell-tale glitches or stylistic quirks that often signal AI generation. You know the sort: slightly off anatomy, weirdly repeating patterns, things that just don’t look quite right when you peer closely. It’s like spotting the join on a bad bit of wallpaper.
Then there’s the writing. Fans are reportedly pointing to certain passages of in-game text, dialogues, or descriptions that read a bit… flat. A touch repetitive, perhaps, or lacking that certain human spark, that nuanced turn of phrase you expect from a narrative-driven game. It’s the linguistic equivalent of elevator music – functional, maybe, but utterly forgettable and devoid of personality. And the whispers also include translation – suggesting some localised versions might have leaned heavily on machine translation tools without sufficient human editing, leading to awkward phrasing or outright errors.
Now, why does this matter? Isn’t AI just a tool? Well, yes, that’s part of the debate. A hammer is a tool, a fancy digital sculpting program is a tool, heck, a pen is a tool. But generative AI, particularly when used in creative fields, comes loaded with baggage. There are concerns about the data it was trained on (often scraped from human artists’ work without consent), worries about its impact on jobs for artists, writers, and translators, and a fundamental question about authenticity and artistic intent.
But perhaps the biggest flashpoint here, the real reason fans are feeling ‘slammed’, is the *undisclosed* nature of the alleged AI use. If a developer is using AI extensively to generate assets or text, shouldn’t they be upfront about it? Players buy games expecting a certain level of human craftsmanship, a product of human imagination and skill. When AI is used without disclosure, it feels, to some, like a bait-and-switch. It erodes trust. It makes people question the value they’re getting for their money, and whether the “art” they’re consuming is truly original human expression or a composite generated by algorithms.
The Developer Responds: AI as a ‘Tool’?
Facing the growing murmurs and outright accusations, 11 bit studios did respond. Their statement, as reported, acknowledges that AI was indeed used in the development of The Alters. However, they framed it very much as a *tool*, used for tasks like generating initial concepts or iterating quickly on ideas. They pushed back against the notion that AI replaced human artists, writers, or translators, insisting these crucial roles were still filled by people. They seemed to suggest AI was more of a productivity aid, a souped-up digital sketchpad or brainstorming partner, rather than the primary engine generating the final content the player sees and reads.
It’s the classic line we hear time and again in these sorts of controversies, isn’t it? “It’s just a tool.” And, to be fair, there’s truth in that. AI *can* be a powerful tool for creators. It can speed up tedious processes, help break through creative blocks, or generate variations faster than any human could. Imagine an artist needing dozens of quick concepts for a background element – AI could potentially generate a whole heap to spark ideas. Or a writer needing variations on a simple descriptive sentence. Used judiciously and transparently, as an *aid* to human creativity, it’s not inherently evil.
The Sticking Point: Was it *Just* a Tool, and Was it Transparent?
Ah, but here’s where the fan evidence clashes with the “just a tool” narrative. If the AI output is reportedly showing up in final, customer-facing assets – the actual images players see, the text they read, the translations they experience – then it moves beyond being merely a brainstorming tool. It becomes part of the finished product being sold. And if those parts look or read like typical AI output, it suggests either the AI was used more extensively than implied, or the human oversight wasn’t sufficient to refine it into something indistinguishable from purely human work.
This is where the transparency issue really bites. If 11 bit studios had perhaps stated early on, “Look, we’re experimenting with AI tools in our pipeline to help with X, Y, and Z, but all final assets are human-approved and polished,” the reaction might have been different. But the impression given, right up until fans started digging, was that this was a traditional, human-led development process. The lack of prior disclosure feels, at best, like an oversight, and at worst, like a deliberate attempt to avoid potential backlash by keeping quiet about something they knew was controversial.
It’s a bit like a baker selling a cake. If they use a top-of-the-line mixer, nobody cares. That’s a tool. If they use a pre-made cake mix but market it as “artisanal, made from scratch,” well, that’s misleading. The AI in The Alters situation feels closer to the pre-made mix appearing in the final product, but without it being listed on the ingredients.
The Broader Canvas: AI, Creativity, and Trust
This situation with The Alters isn’t isolated. It’s another brushstroke on the much larger, often messy, painting of how generative AI is intersecting with creative industries. We’ve seen similar debates rage in art communities, in voice acting, in stock photography, and in publishing. Creators are grappling with how to use these powerful new tools ethically and effectively, while consumers are grappling with what they mean for the value and authenticity of the art they love.
For game developers, there’s a clear temptation. Budgets are tight, development cycles are long, and the pressure to produce vast amounts of content is immense. AI promises speed and efficiency. But this incident with The Alters serves as a stark reminder that there’s a human element often overlooked: player expectation and trust. Gamers are a discerning bunch, often deeply invested in the craft that goes into their favourite medium. They appreciate human artistry, clever writing, and unique vision. When they suspect AI has taken shortcuts, particularly when it wasn’t disclosed, they feel cheated.
The potential financial details here, if we could see them, would be fascinating. Was AI used primarily to cut costs in specific areas? Was it a Hail Mary to meet deadlines? Or was it genuinely seen as a way to augment human creativity and enable new possibilities? Without knowing the specifics of 11 bit studios’ development budget and resource allocation, it’s hard to say definitively. But the outcome – fan backlash and questions of integrity – is a tangible cost, perhaps far exceeding any potential savings.
Ultimately, this comes down to transparency and respect for the audience. Many argue that developers have a responsibility to be clear about how their games are made, especially when the technology involved is controversial and has implications for human creators – and that players, in turn, have a right to know what they are buying. Hiding AI usage, even if used as ‘just a tool’, seems like a gamble on player ignorance or indifference. And as the reaction to The Alters shows, that’s a gamble that might not pay off.
Where do we go from here? Will game studios adopt clear “AI disclosure” policies? Will players start demanding “human-made” labels on games? This feels like just the beginning of a long and complicated conversation.
What do you make of all this? Does the use of undisclosed AI in games bother you? Or is it just the natural evolution of development tools? Let us know in the comments below.