When Replacing a Person with AI is the Symptom, Not the Cause


Warhorse Studios fired its translator and replaced him with AI to 'save costs'. That announcement reveals more about leadership psychology than about AI's future.

Simón Arce · March 29, 2026 · 7 min


A former employee of Warhorse Studios, the Czech studio behind Kingdom Come: Deliverance 2, publicly revealed that he had been fired from his role as a translator and that his position would be filled by artificial intelligence. The justification he received, according to his account reported by Eurogamer, was straightforward: to save costs. Two words. No context, no discussion, no transition.

This story grabs headlines in gaming and technology media for obvious reasons: it’s another chapter in the narrative of AI displacing jobs. But if one pauses for just a few more seconds, what emerges is not a debate about automation. Instead, it is a portrait of an organization that made a people-focused decision using the same vocabulary it uses to adjust a budget line. And that has consequences that no language model can calculate in advance.

The Efficiency Not Reflected on the Balance Sheet

I understand the logic. Warhorse Studios is a medium-sized studio competing in an industry where the production costs of an AAA video game can reach figures that would make any manufacturing-sector CFO pale. The pressure on margins is constant. If an artificial intelligence tool can process translations at a fraction of the cost of a human professional, the arithmetic seems simple.

The problem is that this arithmetic omits several lines from the balance sheet. Organizational trust does not appear on financial statements, but its absence does show up in results. When a team sees that a person can be eliminated overnight with a justification of just two words, what sets in is not efficiency; it’s caution. Caution has a very concrete operational price. Professionals with options begin to explore other opportunities. Those without immediate options start to manage their energy differently. Creativity, which in a video game studio is not an intangible asset but the core product, requires at least a minimum dose of psychological safety to exist.

This mechanism does not need speculation: it is one of the most replicated findings in organizational behavior literature over the past thirty years. Creative teams under the perceived threat of replacement do not produce better. They produce more cautiously, which, in industries where narrative and cultural innovation are the competitive differentiator, amounts to a poorer product.

The savings in a translator’s salary have a number. The cost in the cultural density of the team does not, and that’s precisely why it is the cost that most leaders choose to ignore.

The Conversation That Never Happened

What interests me most about this case isn’t the firing itself. What intrigues me is the form it took. A decision of this nature, made without an apparent transitional process, without exploring alternative roles, and without internal communication before the affected individual took the story public, is the signature of an organization that has not developed the muscle to sustain difficult conversations.

The difficult conversations in this context would have been several. First, the discussion about where the studio’s localization strategy is headed and what role AI plays in it, before it affects specific individuals. Second, the conversation about whether there is a different profile for the professional working alongside AI tools rather than being replaced by them. Third, the discussion about how to manage such a transition without the narrative reaching the media first from the voice of the affected individual.

None of these conversations appear to have taken place. And when conversations don’t happen within the organization, they happen outside. In this case, they occurred in Eurogamer. That is not an external communication problem. It’s an internal leadership problem that found its expression in a channel the studio does not control.

What managers often call an “operational decision” is frequently the substitution of an uncomfortable conversation for an administrative action. The administrative action has the advantage of feeling decisive. The disadvantage is that it leaves unresolved exactly what the conversation would have addressed.

What the Market Judged without Appeal

Warhorse Studios released Kingdom Come: Deliverance 2 in early 2025. The game received praise for its narrative ambition, historical depth, and localization, which in a title deeply rooted in 15th-century Bohemia is not a cosmetic detail: it’s part of the product. The cultural coherence of a work of this type depends on linguistic decisions that have nuances that current AI models can approximate but rarely capture with the precision required by a text with dramatic weight and historical context.

This is not an argument against AI in the entertainment industry. It is a warning about the cost of applying a tool without having first precisely defined the problem you want to solve. If the problem is “to reduce the volume of low-impact text that needs to be processed manually,” AI is a reasonable answer. If the problem is “to maintain the cultural and tonal coherence of a complex historical narrative,” eliminating the human professional is a response that may work in the short term and degrade the product in the next production cycle.

The video game market has memory. The community surrounding a title like Kingdom Come is precisely the type of audience that notices the difference in localization quality and verbalizes it in forums, reviews, and word of mouth that can build or erode a studio’s reputation. No payroll savings can cover the reputational cost of a poorly localized release. That calculation also does not appear in the two-word statement.

Leadership Measures What It Can Measure

The decision by Warhorse Studios is, in its barest form, a reflection of an organization that comfortably measures what has numbers and avoids confronting what does not. The salary of a translator has a number. The quality of a conversation that never happened does not. Neither does the team’s trust. The reputational cost of a narrative that escapes management’s control has no number either, until it appears in the sales metrics of the following quarter, and by then it is too late to attribute it accurately.

This is the pattern that repeats in organizations across all sectors when financial pressure mounts. Leaders cut where they can measure the cut, and avoid conversations where the impact is real but deferred. It is not malice. It is the consequence of evaluation systems that reward speed of decision over quality of process.

Artificial intelligence will infiltrate almost every organizational function. That is a fact that makes little sense to debate. What does make sense to discuss, and what separates leadership teams that navigate these transitions with the organization intact from those that experience fragmented teams, is the quality of the process by which these decisions are made. Not how much is saved, but how the decision is made, by whom, with what information, and with what honesty about the consequences.

The culture of an organization is not the result of its values declared on its corporate website. It is the exact residue of all the decisions its leaders made when no one was watching, and of all the conversations they chose to replace with an administrative statement.
