And here's what makes this genuinely interesting: according to the Reuters Institute's Journalism, Media, and Technology Trends and Predictions 2026, only 38% of news executives feel confident about the future of journalism - down from 60% in 2022. That's not a crisis of content. That's a crisis of operational capacity meeting an accelerating news cycle, and the two are colliding hardest in the smallest newsrooms.
This post isn't for the BBC. It's for the 3-person team in Sarajevo or Larissa, with a 45-minute press conference recording sitting on a laptop, a CMS waiting for content in two languages, and a news cycle that won't pause while someone types it all out by hand.

From "AI Is Banned" to "AI Is Infrastructure": The Global Shift in AI in the Newsroom
Three years ago, the conversation was about prohibition. Major newsrooms issued moratoriums. Editorial ethics committees convened. The concern was legitimate: AI-generated content carries real disinformation risk. Researchers have documented a significant rise in such content in fact-checking queues - 16% of fact-checked claims in 2025 involved AI-generated material, up from 7% the previous year (Aos Fatos / Reuters Institute). That's not a small jump.
But the debate has moved on. The question inside well-resourced international newsrooms is no longer "should we use AI" - it's "which parts of the workflow should AI own, and how do we govern that?"
According to Pugpig's 2026 publisher survey, 97% of publishers now consider back-end automation - transcription, copyediting, content tagging - either "important" or "essential." Not experimental. Essential. That's a significant consensus to build in three years.
But here's the problem with that consensus: the version being discussed in industry press assumes you have a dedicated tech team to implement anything. It assumes you can evaluate model providers, negotiate API contracts, and run integration projects across your existing stack. For newsrooms with 50+ staff and engineers on payroll, that's a reasonable assumption. For a 4-person editorial team publishing in Serbian and Bosnian, it's a different planet entirely.
The Small Newsroom Reality Nobody's Writing About
Picture a specific Tuesday morning. A journalist gets back from a municipal press conference with 40 minutes of audio on their phone. The story is timely - it needs to be up before the afternoon cycle. But before a single word of journalism gets written, someone has to transcribe that recording. Manually. In a language that most generic AI transcription tools treat as an edge case, or lump in with related South Slavic variants at a real cost to accuracy.
So the journalist types. Or a colleague does. Either way, that's an hour-plus of skilled editorial time spent doing something that contributes zero journalistic value.
And transcription is just one task. The same team is manually tagging content for SEO and archives, copy-pasting between a CMS and a separate translation tool, and formatting video subtitles by hand for any video-first content they produce. These aren't occasional tasks. They're daily.
They're also time-fixed - a 60-minute recording takes roughly 60 minutes to transcribe manually, regardless of how fast you type - and they block everything downstream.
What gets crowded out? Original reporting. Investigative work. Audience engagement. Faster publishing cycles.
The work that actually builds a newsroom's reputation and reach keeps getting pushed back while the mechanical tasks get done first.
The infrastructure gap compounds all of this. Small editorial teams don't have in-house engineers. Any AI solution that requires complex integration, ongoing maintenance, or a technical implementation project is a non-starter - not because the team isn't capable, but because there's nobody whose job it is to run that project. Capability isn't the constraint. Capacity is.
And yes - the accuracy concerns for smaller languages are legitimate, not naive. The ROI scepticism on tight budgets is legitimate too. These aren't objections to dismiss. They're the right questions to ask before spending anything.
The Wrong Lesson From the AI Hype Cycle
Most small editorial teams, when they hear "AI in the newsroom," picture one of two things: a chatbot writing their articles, or a tool that will eventually take someone's job. Both framings cause the same outcome - teams either dismiss AI entirely or chase the wrong tools for their actual situation.
Here's a more useful frame.
Think about spell-check. Nobody debates whether spell-check "replaces" editors. Nobody context-switches to a separate spell-checking app, copies the corrected text, and pastes it back into their document.
Spell-check became invisible infrastructure - it lives inside the writing environment, it runs in the background, and editors stopped thinking about it somewhere around 1995.
The next layer of invisible infrastructure is transcription, tagging, and translation. The technology exists. The question is whether it's built into your workflow or bolted on as a side tool.
That distinction matters more than it sounds.
A standalone AI app - one you open separately, run your content through, and paste results back from - creates friction. It requires a context switch every time. And tools that require context switches get abandoned. Not because the technology is bad, but because the workflow is broken. Teams who've tried generic AI tools and given up after a month usually aren't abandoning AI. They're abandoning friction.
Integrated AI is different. It lives inside the tools editors already use every day. The transcription happens where the content management happens. The translation step is a step, not a separate project. Tagging runs automatically as content is published. Editors stop thinking about these tasks the same way they stopped thinking about spell-check. That's the goal.
What Integrated AI Actually Looks Like - LitteraWorks and mPanel in a Real Workflow
Consider a concrete hypothetical. A journalist records a 40-minute press conference on a mobile device. They upload the file - or, with a mobile app, capture and start the transcription process directly from the field. The transcript comes back in the language it was spoken, without a desktop session, without a separate transcription service, without manual typing. LitteraWorks handles transcription across 40+ languages, with specific capability for the South Slavic language variants that generic tools routinely mishandle.
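For the technically curious, here's roughly what that field-to-transcript flow looks like as code. To be explicit: the endpoint, field names, and response shape below are invented for illustration - this is not LitteraWorks' documented API, just the general upload-and-poll pattern that hosted transcription services follow.

```python
# A rough sketch of the field-to-transcript flow described above.
# NOTE: the endpoint, field names, and response shape are illustrative
# assumptions - this is NOT LitteraWorks' documented API.
import time
import requests

API_BASE = "https://transcription.example.com/v1"  # placeholder, not a real URL

def transcribe(audio_path: str, language: str = "bs") -> str:
    """Upload a recording, then poll until the transcript is ready."""
    with open(audio_path, "rb") as audio:
        job = requests.post(
            f"{API_BASE}/transcripts",
            files={"audio": audio},
            data={"language": language},  # "bs", "sr", "hr" handled as distinct languages
            timeout=60,
        ).json()

    while True:
        status = requests.get(
            f"{API_BASE}/transcripts/{job['id']}", timeout=30
        ).json()
        if status["state"] == "done":
            return status["text"]
        time.sleep(5)  # a webhook callback would replace polling in production
```

The point of the integrated version is that nobody on the editorial team ever sees or writes anything like this - the same pattern runs behind a single upload button.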
That transcript now needs to become a publishable article in two languages. In a traditional workflow, that means exporting, translating in a separate tool, and re-importing the result. With mPanel, the AI translation capability lives inside the CMS itself. Editors work in one environment. The multilingual version is a step, not a project.
Automated VTT subtitle generation - with the original timing preserved - removes an entire production step for video-first outlets and broadcasters. Previously, that work required either a specialist or a significant block of manual time. Now it's part of the same workflow.
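What "timing preserved" means in practice: WebVTT is the W3C subtitle format that browsers and video players understand, and every cue carries its own start and end timestamp. A minimal sketch of generating it from timed transcript segments - the (start, end, text) segment structure is an assumption for illustration - looks like this:

```python
# Minimal sketch: turning timed transcript segments into a WebVTT subtitle file.
# WebVTT itself is a W3C standard; the input segment format here is an assumption.

def to_timestamp(seconds: float) -> str:
    """Format seconds as the HH:MM:SS.mmm timestamps WebVTT requires."""
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{int(h):02d}:{int(m):02d}:{s:06.3f}"

def to_vtt(segments: list[tuple[float, float, str]]) -> str:
    """segments: (start_seconds, end_seconds, text) triples from a transcript."""
    cues = [
        f"{to_timestamp(start)} --> {to_timestamp(end)}\n{text}"
        for start, end, text in segments
    ]
    return "WEBVTT\n\n" + "\n\n".join(cues) + "\n"

# Example: two cues from a press-conference transcript.
print(to_vtt([
    (0.0, 3.2, "Dobro jutro i hvala što ste došli."),
    (3.2, 7.5, "Danas objavljujemo novi gradski budžet."),
]))
```

Because the timestamps come straight from the transcription step, the subtitles stay in sync without anyone nudging cue times by hand.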
Automated content tagging - for SEO, for archive discoverability, for topic clustering - runs as content moves through the system rather than as a separate editorial task someone has to remember to do before publishing.
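Conceptually, publish-time tagging is a hook in the content pipeline rather than a human step. A deliberately simplified sketch - the taxonomy, hook name, and article structure are hypothetical, not mPanel's actual API, and real systems use richer models than keyword matching:

```python
# Conceptual sketch of publish-time auto-tagging: tags are derived as the
# article moves through the pipeline, not in a separate editorial pass.
# The taxonomy and hook name are hypothetical, not mPanel's actual API.
import re
from collections import Counter

TAXONOMY = {"budžet": "finance", "izbori": "politics", "škola": "education"}

def auto_tag(body: str, max_tags: int = 5) -> list[str]:
    """Map the article's most frequent taxonomy terms to canonical tags."""
    words = re.findall(r"\w+", body.lower())
    hits = Counter(TAXONOMY[w] for w in words if w in TAXONOMY)
    return [tag for tag, _ in hits.most_common(max_tags)]

def on_publish(article: dict) -> dict:
    """Hypothetical pipeline hook: enrich the article before it's stored."""
    article["tags"] = auto_tag(article["body"])
    return article
```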
None of this requires an engineer to configure. That's the point of building AI into the CMS rather than alongside it.
AppWorks has worked with 50+ organizations across 4 continents, including partnerships on Erasmus+ and Creative Europe EU-funded projects. That track record matters for newsrooms with compliance requirements and tight reporting accountability - this isn't experimental software being tested in production. It's a platform with a real implementation history in exactly the kind of multilingual, resource-constrained media environment that Balkan newsrooms operate in.
Can a Small Team Actually Match Larger Newsroom Output?
The ROI question deserves a direct answer. What matters isn't whether AI saves time in the abstract - it's whether the specific tasks being automated are the bottleneck tasks for your specific team. Abstract time savings don't matter. Bottleneck removal does.
For a 2-5 person editorial team publishing multilingually, transcription and translation are almost always bottleneck tasks. They're time-fixed. They block downstream work. And because they're time-fixed, automating them has compounding returns: faster transcription means faster story development; faster tagging means better archive discoverability over time; faster translation means wider audience reach without additional headcount.
Nieman Lab's 2026 analysis found that 67% of newsrooms report they haven't saved any jobs as a result of AI efficiencies yet. That's worth sitting with. It suggests most implementations aren't targeting the right tasks - or aren't integrated deeply enough into daily workflow to create real capacity. The newsrooms likely to see different results are the ones treating AI as operational infrastructure, not as a productivity experiment running alongside the real work.
So what does "big newsroom output" actually require? Not more staff. Not a tech team. It requires removing the tasks that eat time without producing journalism. For a small Balkan editorial team, that means transcription handled before the journalist gets back to their desk, translation that doesn't require a separate tool, and tagging that doesn't require a separate meeting.
That's a workflow that exists. It's just not the one most small newsrooms are running yet.
Where This Leaves Small Multilingual Teams
The global conversation about AI in the newsroom will keep happening in New York and London. The governance debates, the attribution frameworks, the model selection discussions - all of it will be written for newsrooms with dedicated engineering capacity and legal teams and AI ethics committees.
But the operational problem - the 40-minute recording, the two-language CMS, the news cycle that won't pause - is solvable right now, for teams of 2 or 5 or 8, without hiring an engineer or running an integration project.
The Reuters Institute also flagged that publishers are forecasting a 43% decline in search engine traffic over the next three years as AI answer engines change how audiences find news. That's a real pressure coming toward every newsroom, large and small. The teams that have automated their operational drudge work will have more capacity to respond - more original reporting, more audience-facing work, more publishing volume. The teams still typing out transcripts by hand will be doing that as audience discovery patterns shift around them.
The gap between those two groups isn't about budget or headcount. It's about whether AI is built into the workflow or sitting on the shelf.
If your editorial team is still manually transcribing interviews or juggling separate tools for translation and publishing, it's worth seeing what an integrated workflow actually looks like. Book a demo with AppWorks and we'll show you how LitteraWorks and mPanel work inside a real newsroom setup - no sales pitch, just the actual product.