
Archived Article — The Daily Perspective is no longer active. This article was published on 27 March 2026 and is preserved as part of the archive.

Politics

Welsh Government Reveals AI Tool Decided Fate of State-Owned Body

Copilot was used to analyse closure review, raising questions about accountability in government decisions

Image: The Register
Key Points
  • Welsh government used Microsoft Copilot to analyse interviews for a review recommending closure of state-owned Industry Wales.
  • The organisation's chair said relying on AI for such decisions is 'just wrong' and questioned whether proper validation occurred.
  • The closure will affect multiple sector forums including aerospace, automotive, technology and net zero industries in Wales.
  • Experts warn that lack of transparency and oversight in government AI use undermines public trust and accountability.

When bureaucrats decide to shut down organisations, taxpayers expect the process to involve human judgment, institutional expertise, and face-to-face deliberation. Not, it appears, an algorithm in its sixth month of existence.

The Welsh government used Microsoft's Copilot to help prepare a review that concluded Industry Wales should be dissolved. The disclosure has raised uncomfortable questions about how governments use artificial intelligence in decisions that affect real people and real institutions.

Industry Wales had operated sector forums for aerospace, automotive, technology, and net zero industries since 2013. It received £837,000 in public funding for 2025-26. In January 2025, the Welsh government told it to expect closure based on a review. By October, the decision was announced. The organisation is due to cease operations on 31 March 2026.

What has emerged now, through testimony to the Senedd's Public Accounts and Public Administration Committee, is the mechanics of how that review was constructed. Professor Keith Ridgway, the chair of Industry Wales, told the committee that the unpublished review analysed 28 interviews using Copilot. He said he saw the report on 9 January and was alarmed. "I was alarmed and made a point to the board that the review refers to Microsoft Copilot as being used to evaluate the returns. I don't think you can rely on artificial intelligence to do that. It's just wrong."


Ridgway's objection was not only that the algorithm was used; it was also the reasoning it supported. The review, he argued, contained evidence for scaling back Industry Wales but not for shutting it down entirely. Interview responses backed the case for a Wales-specific body rather than reliance on UK-wide industry organisations. Those findings, however, did not make it into the review's conclusions. "I think it would have been very sensible to have brought the findings back to the board for validation and triangulation, not to use Microsoft Copilot," he said.

The Welsh government's response defended the use of the technology narrowly. In a statement, it said Copilot's use was "limited to producing full, accurate and unbiased transcripts of interviews, analysing and grouping comments into common themes." The government insisted that detailed analysis and assessment came from officials, not the algorithm.

This distinction is less reassuring than it sounds. Transcription tools can make errors, particularly with accents or technical language. Grouping comments into themes involves judgment calls that reflect the operator's priorities. If Copilot was instructed to find themes supporting closure, it may have done so faithfully, while themes contradicting closure might have been overlooked or underweighted. The government has released no detail about which themes Copilot identified or how officials weighted them.

Tom Gifford, a Senedd member on the committee, called the approach "bonkers." The criticism carries weight beyond rhetoric. When important decisions depend on tools that work by probabilistic pattern-matching rather than reasoning, accountability becomes diffuse. Who is responsible if the algorithm produces a misleading summary? The developer? The government official? The minister who signed off on the decision?

The broader context is significant. Industry Wales had already faced scrutiny from the auditor general for Wales, who found accounting irregularities including a breach of procurement rules and insufficient evidence for over £1 million in assets. That separate institutional failure gave the government genuine grounds for review. The use of Copilot did not cause those problems, but it raises a question about whether relying on an unproven tool was the right way to investigate them.

Research from the OECD on government AI adoption suggests this is not an isolated concern. While AI can improve efficiency in routine tasks, lack of transparency erodes accountability and overreliance can propagate errors that reduce public trust. Governments typically lack the in-house expertise to validate whether AI systems produce accurate outputs. They often cannot fully explain how algorithms reach their conclusions.

The Welsh case illustrates the tension between genuine institutional need and technological enthusiasm. Industry Wales needed examination. The auditor general's findings were serious. But using an unproven AI tool to conduct that examination, without publishing the review, without testing its conclusions against experienced judgment, suggests process took second place to expediency.

Technology Connected will also cease trading on 31 March, the Welsh Automotive Forum closed after two decades, and Net Zero Industry Wales faces an uncertain future. These closures affect aerospace companies, automotive suppliers, technology startups, and businesses pursuing decarbonisation opportunities. Whether they serve Wales's industrial interests or merely the government's budget targets remains genuinely unclear, particularly when the evidence base involved an algorithm nobody properly scrutinised.

The accountability problem runs deeper than Copilot itself. It is the decision to use it without transparency, to publish conclusions without the evidence, and to resist detailed scrutiny of how the tool influenced the outcome. If the government's analysis was sound, why not publish the full review? And if officials performed the detailed assessment themselves, as claimed, why was Copilot needed at all?

Riley Fitzgerald

Riley Fitzgerald is an AI editorial persona created by The Daily Perspective, writing sharp, witty opinion columns that challenge comfortable narratives from both sides of politics. Articles under this byline are generated using artificial intelligence with editorial quality controls.