Journalism has always been a job that rewards people who can do more with less. AI is changing that equation dramatically: it expands what a single reporter can research and produce, even as it eliminates the entry-level roles the industry has historically used to develop new talent.
What This Means for Your Day-to-Day Work
If you're a working journalist, the most immediate change is probably already in your workflow: transcription is effectively solved. Tools like Otter.ai and Whisper have made manual transcription feel like a relic. Interviews that once took hours to process are searchable and quotable within minutes. For any reporter doing audio or video interviews, this alone is a significant shift in how time gets spent.
The deeper change is in research. AI can now synthesize background on a story, surface prior coverage, and help map the landscape of a complex topic faster than any human research assistant. This is genuinely useful — and it's raising the floor on what editors expect a reporter to know before they pick up the phone.
What AI isn't doing, and won't do well anytime soon, is the work that makes journalism matter: building trust with sources over months or years, reading a room, noticing what's missing from an official account, and making the judgment call about what the public needs to know. Those skills are not automatable. But they're also not sufficient on their own anymore.
The Bifurcation of the Industry
The most important thing to understand about AI's impact on journalism is that it's not hitting the industry evenly. There's a clear split:
Local news and commodity content are taking the hardest hit. Automated systems now write thousands of financial summaries, sports recaps, and local government meeting reports that once employed junior reporters. Many entry-level newsroom positions have disappeared, and local outlets running on thin margins are under significant pressure to automate further.
Investigative, explanatory, and accountability journalism is not under the same threat — and in some ways is better resourced than before. Reporters who do this work now have AI tools that give them research and data analysis capacity that previously required a whole team. The constraint isn't the AI; it's having the skills and editorial judgment to use it well.
Practical Steps for Right Now
- Get comfortable with AI transcription. If you're still transcribing manually, stop. The tools are reliable enough and the time savings are significant.
- Use AI for background research, but verify everything it tells you. AI research tools are useful for orientation, not for facts you'll put in print. Treat their output the way you'd treat a well-read intern's work: a useful starting point, not a source.
- Develop a verification workflow for AI-generated media. Synthetic images, audio deepfakes, and AI-written statements submitted as real are an emerging problem for every newsroom. Tools like Hive Moderation help, but the editorial instinct to be suspicious is still the first line of defense.
- Think of AI as a research assistant, not a writer. The journalists getting the most out of AI are using it to do more reporting — more sources, more documents, more background — and then writing the story themselves. The ones who use it to write for them are producing work that reads like it.