Journalists turn to AI to stay ahead but not out of the loop

Journalism is entering a new phase of AI adoption, one shaped less by disruption and more by pragmatism. A new global survey of 29 newsrooms by AI transcription software provider Trint reveals that, in 2025, the primary role of generative AI in journalism will be to improve operational efficiency rather than redefine editorial strategy or revolutionise creativity.

The research, which gathered responses from producers, editors and correspondents across the US, UK, Europe and Asia, shows a distinct shift in how media organisations are approaching AI integration. The emphasis is not on novelty or monetisation, but on optimisation. Sixty-nine per cent of newsroom leaders said their motivation for adopting generative AI tools was to “do more with less.” Over half also cited a desire to stay competitive, though few see AI as a route to generating new revenue or audiences.

There is little appetite to use AI to replace the journalist’s voice. Instead, the technology is expected to take over repetitive and time-consuming tasks such as transcription, translation and data extraction. In fact, 89 per cent of those surveyed said they plan to rely on AI transcription within the next three years. While these may appear to be marginal gains, they collectively represent a strategic reallocation of human focus, freeing journalists to concentrate on analysis, storytelling and investigative depth.

Shadow AI exposes a widening trust gap

Alongside this operational shift, the survey uncovered a growing undercurrent of unsanctioned AI use. Nearly half of respondents admitted to using generative tools purchased independently of their organisation’s IT policy, a phenomenon often referred to as shadow AI. These tools are typically employed for speed and convenience, but their use raises serious questions around data privacy, compliance and reputational risk.

This informal adoption highlights a tension between the pace of technological change and the slower rhythms of organisational governance. While journalists are eager to experiment, newsroom leaders are still catching up with the practical and ethical frameworks required to govern responsible AI use. Trint’s Director of Product, Tessa Kaday, noted the scale of the issue: “It’s great to see that the workforce is eager to innovate, but this kind of unsupervised AI use raises significant security and regulatory concerns.”

In response, most newsrooms are planning human-centred mitigation strategies. Sixty-four per cent said they would prioritise staff education on AI use, while 57 per cent are developing company-wide policies. Interestingly, only 18 per cent reported that they intend to scrutinise AI vendors, an apparent blind spot in otherwise cautious planning. As AI tools become more deeply embedded in editorial infrastructure, external scrutiny may become just as important as internal policy.

AI will shape workflows but not creativity

Despite the enthusiasm for automation, there is a marked reluctance to hand over creative control. Writing, editing and video production remain firmly within human remit. Respondents cited concerns about accuracy, hallucinations and the inability of current models to reflect the nuance, context or originality required of high-quality journalism.

This suggests that, while AI will become a key enabler of productivity, the core editorial processes will remain distinctly human. Kaday observed: “It’s reassuring to see that core values like trustworthy reporting and journalistic integrity remain unchanged in many newsrooms.” The instinct to protect these values is not simply philosophical; it is strategic. In a media environment increasingly dominated by algorithmic content, credibility is fast becoming journalism’s most important currency.

Looking ahead, newsrooms are split on how to implement AI at scale. Some plan to build their own tools, particularly where control over data and customisation is paramount. Others will turn to off-the-shelf solutions, prioritising speed and simplicity. Both paths reflect a pragmatic understanding: AI may be essential, but it must be tailored to fit the newsroom, not the other way around.

As generative AI continues to mature, 2025 may not be the year that machines write the news. But it could well be the year that journalists, editors and producers redefine how news is gathered, processed and produced, not by chasing technological transformation, but by shaping it on their own terms.
