Newsrooms are talking about AI. They’re barely using it.


  • Ali Zahid

Over the past year, every newsroom conversation seems to revolve around AI. Some fear it will replace journalists. Others think it will solve everything overnight. But the reality inside most newsrooms is much simpler: we are barely using it.

Not because the tools are not there. They are. Not because journalists cannot learn them. They can. The gap is structural, and it is hiding in plain sight.

Anthropic, the company behind Claude, recently published research that caught my attention. Rather than theorising about what AI might eventually do to various professions, they went straight to the data: millions of real AI conversations, analysed to see what people are actually using the technology for at work today. Then they mapped those findings against hundreds of occupations to measure the gap between what AI could theoretically handle and what organisations are genuinely deploying it for.

Theoretical AI exposure simply means: how much of what you do in your job could AI technically assist with? A journalist spends most of the day reading stories and documents, researching backgrounds, writing copy, and reformatting content for different platforms. AI can already help with most of that. So journalism scores high on theoretical exposure. But scoring high on potential and actually using the tools are two very different things. That gap is exactly what this research is measuring.

  • Computer and math occupations: 92 percent theoretical AI exposure, 30 percent actual use.
  • Business and finance: 85 percent potential, 14 percent observed.
  • Office administration: 87 percent potential, 11 percent real-world usage.
  • Legal: 80 percent potential, 6 percent actual adoption.
  • Arts and media, the category that most directly covers journalism: 62 percent theoretical coverage, roughly 13 percent real use.*

The same gap shows up across every profession that relies on knowledge work. Construction and trades? Fourteen percent theoretical exposure, three percent actual. Nobody is losing sleep over AI replacing plumbers. But in journalism, where the exposure numbers are among the highest, the panic has completely outpaced the practice.

What AI Cannot Do and Should Not Try To

Before arguing that newsrooms should embrace AI, it is worth being clear about what it cannot do. This is where the hype usually falls apart.

AI cannot cultivate a source. It cannot sit across from a minister who is dodging a question and read the room. It cannot make the kind of call an experienced editor makes on instinct: the feeling that a story is wrong, not because the facts are off, but because something nags and says wait another day. And it cannot build the trust in a community that leads to the phone call nobody else gets.

These are not just tasks. They are the core of what journalism is. And none of them are at risk from AI.

What AI does threaten, and this is the harder conversation, is the filler that surrounds real journalism in many newsrooms. The hours spent on mechanical work. The rewriting of other outlets’ stories just to hit a quota. The content that exists to fill space, not serve readers. If AI makes that filler harder to defend, it is not undermining journalism. It is clearing a path back to it.

What AI Could Be Doing Right Now

None of this is theoretical. The tools exist. They work. And most newsrooms are barely touching them.

Think about what a single morning used to look like.

An eighty-page government report arrives. That is two to three hours of careful reading just to figure out what matters. Now? Five minutes to extract the key findings, and the journalist still verifies every claim before publishing.

A forty-five minute interview gets recorded. That used to mean three or four hours of transcription, head down, fingers moving. Now it is ten minutes. Searchable text. Ready to work with.

The same story needs to land on multiple platforms: website, social feeds, push notifications. That used to be twenty minutes of rewriting, rephrasing, reformatting. Now it is two.

Headlines? Editors can test ten variations in the time it once took to write one.

None of this replaces editorial judgment. What it removes is the weight, the hours that produce nothing a reader would ever notice or value. That time goes back to reporting. To verification. To the kind of thinking that actually moves stories forward.

The trade sounds obvious. Most newsrooms have not made it yet.

The Meeting I Have Not Forgotten

At a previous organisation, website traffic was sliding. I pulled up the publishing pattern and saw the problem immediately: the website had become a print newspaper uploaded to the internet. Stories produced for print went online. The digital team itself was putting out two or three original pieces a day, if that.

I asked them to change course. Produce more original digital content, stories that could reach people beyond our traditional readership. We were on the world wide web. We did not have to write for only one corner of it.

I also suggested using AI. Not to replace reporting. To handle the parts that were not reporting. Transcribe expert interviews. Surface angles from dense material. Turn a thirty-minute conversation with a policy expert into a publishable piece in under an hour, instead of three or four hours of manual transcription and note-taking. More subjects. Different audience segments. No corners cut on the journalism itself.

The reaction was immediate. Death stares.

Then the line I have heard, in variations, ever since: this is not real journalism.

The irony was difficult to ignore. The team most invested in defending real journalism was spending its days taking stories from other websites, rewording them, and posting them. That was the workflow AI was supposedly threatening to corrupt.

If a journalist gathers information, verifies it, analyses it carefully, and writes something that serves the reader, the journalism lives in that judgment. The tools used to organise research or surface patterns do not change that responsibility. Journalists already depend on search engines, transcription software, analytics dashboards. AI is just a more capable extension of a toolkit that has been expanding for decades.

When AI Is Used Without Editorial Discipline

Poorly used tools introduce real risks.

A recent incident in Pakistan showed exactly how. In November 2025, Dawn, the country’s most respected English-language newspaper, published a business story about automobile sales. At the bottom, readers found what appeared to be a leftover AI prompt, suggesting the writer create a snappier version of the piece.

It went live. Within hours, it was everywhere on social media.

The easy lesson was AI carelessness. The real one was older and more uncomfortable: the final copy had not been properly read before publication. That is rule one. Has been since before computers. Someone reads it before it goes out.

AI does not remove that responsibility. It raises the stakes. When editors stop reading their own pages carefully, a new kind of error enters. Not a factual mistake, but a failure of process. That is what happened at Dawn. And it will happen again at any organisation that adopts AI tools without reinforcing the basic disciplines around them.

The Gaps Nobody Is Talking About

There is a second place where adoption failure becomes visible: analytics.

Just last week, a colleague mentioned that something felt off about our website traffic. The instinct was right. But the conversation stopped there.

No one asked which search queries had shifted over the past two months. Which pages were driving organic sessions. Whether engagement time was behaving differently across story formats.

The data existed. It was sitting in our dashboards, updated daily. We just were not looking at it in a way that would tell us anything.

With the right questions and structured data, an AI assistant can surface those patterns in minutes, work that would take an experienced analyst days to do manually. But without a workflow for gathering and interpreting that data, it just sits there. The problem is not access. It is structure.

The same logic applies to the tools themselves. Most newsroom managers are not investing in AI for their teams. So individual journalists and editors are quietly subscribing on their own. Paying out of pocket. Building skills while their institutions stand still.

I tell people to do exactly that. Do not wait for someone to hand you permission. Learn it yourself.

But organisations that leave this entirely to individuals will eventually watch those individuals leave.

The Skills Gap Quietly Emerging

There is also a generational dimension to this. Newsroom leaders are beginning to feel it in hiring.

Anthropic’s research found that hiring of 22-to-25-year-olds into AI-exposed roles across knowledge industries has dropped by around 14 percent since generative AI tools became widely available. The implication is not that entry-level jobs are vanishing. It is that the job itself has changed.

The tasks that used to define junior work, such as scanning documents, summarising reports, preparing research briefs, and formatting content, can now be done with AI. Organisations are no longer looking for someone to perform those tasks manually. They want people who can work the tools.

Journalism schools are only beginning to catch up. Which means young journalists arrive in newsrooms one of two ways: they have taught themselves AI skills on the side, or they have not touched them at all. Very few have been shown how to use these tools responsibly inside an actual editorial workflow.

The journalists who will be most valuable in five years are not necessarily the strongest writers in the room today. They are the ones who understand how to combine editorial judgment with the tools that are reshaping how information is gathered and processed.

What Getting This Right Actually Looks Like

The newsrooms adapting most effectively are not the ones with the largest budgets. They are the ones that started with a simple question: which parts of our editorial process consume the most time and produce the least original value?

For most newsrooms, the answer is the same. Research aggregation. Background briefs before interviews. Podcast transcription. Document summarisation. Metadata and SEO tagging. Social media reformatting.

None of these are journalism. All of them currently consume journalist hours that could go toward work readers cannot get elsewhere.

Once you have that list, the next question is: what does AI help look like for each person in the building? A court reporter needs something different from a live bulletin producer, who needs something different from a social media editor. Generic workshops will not get you there. The training has to be tied to the work.

And then the rules. Clear guidelines on where AI is appropriate, how its use is disclosed, who is accountable when something goes wrong. Without those boundaries, individual journalists make consequential decisions alone. The Dawn incident was partly the result of exactly that absence.

Where This Leaves Us

The transition will not be clean. Newsrooms are slow to change workflows, and AI will produce some uncomfortable moments before it produces lasting improvements. The Dawn incident will not be the last of its kind.

But the essential work of journalism has never depended on the absence of tools. It has depended on the judgment applied to using them. That has not changed.

The gap in the Anthropic data, 62 percent potential and 13 percent actual, will close. The only question is whether it closes through deliberate editorial leadership or through a series of avoidable mistakes that each newsroom has to learn separately.

Do not wait for someone to force it on you. That applies to individual journalists. And it applies to the institutions that employ them.

*Source: Anthropic Economic Index, March 2026 | anthropic.com/economic-index

Author

Ali Zahid

Ali Zahid is Vice President (Digital Media) at Hum Network Limited and holds a Master’s in Media Management from Parsons School of Design. A media executive with a sports journalist’s spirit, he is passionate about digital innovation, technology and sports.
