How AI could sabotage the next generation of journalists

A question I often get when I train editorial teams on the use of AI is, “Is using AI cheating?”

Although it’s a yes or no question, it’s obviously not a yes or no answer. The short answer is sometimes, but the key to figuring out the long answer is using the tools with an open mind. If you’re a professional in a field like journalism, you’ll generally be able to tell when it’s speeding up drudgery and when your judgment and expertise are most needed.

However, the recent viral story in New York magazine about how colleges and universities are struggling with rampant, unauthorized AI use from students got me thinking about what’s happening much earlier in the pipeline. After all, those college students who are using AI to cheat on essays and admissions interviews eventually get jobs in the workforce. How will entry-level reporters, editors, and interns regard the use of AI, and how can newsrooms guide them so they develop the critical skills good journalists need?

Prioritize people, not just output

This highlights an area of AI policymaking that often gets short shrift. Newsroom AI policies are rightly concerned primarily with the integrity of the information the publication is putting out and transparency with audiences. What AI might be doing to the skill-building of junior staffers is a tertiary concern, at best. Left unchecked, however, this problem has the potential to be existential: How do you produce competent senior staff when the junior staff is either replaced by AI or—as the New York piece suggests—replacing themselves with AI?

You start with first principles. Most AI policies begin with some kind of affirmation that humans remain at the center of what journalism is about. That lens needs to turn inward in a real way, with a commitment to balance innovation and efficiency with professional development. In a newsroom, a healthy AI policy also ensures staff in entry-level or junior roles have opportunities to build core journalistic competencies.

The policy should be clear to those workers even before they walk in the door. These days a lot of interviews happen over video conference, and many newsrooms that aren’t explicitly local have gone fully remote over the last few years. The fact is, if a candidate is on the other end of a video interview, hiring managers should be assuming they have some kind of AI helping them, even if there aren’t telltale signs like delayed answers and rote wording.

There are still ways to adapt the hiring process to this reality. Where possible, newsrooms should incorporate in-person interviews and testing. For remote candidates, real-time teamwork exercises will reveal a lot more than take-home assignments like memos and writing tests.

Why junior staff need their “reps”

A good AI policy spells out exactly which tasks are allowed to be partially or totally done by AI, while still leaving room to experiment in noncritical areas. (The New York Times’s policy is a good example.) In selecting those tasks, however, efficiency and productivity shouldn’t be the only factors. How that mix of tasks changes between junior and senior staff should be taken into account.

A good way to think about this: Just because AI can do a task doesn't mean it should do it in every instance. Yes, an AI tool can now competently turn a three-hour school board meeting into a news story, but reporting and writing "rote" stories like this is a fundamental part of learning the ropes of journalism: taking good notes, finding the story in a sea of information, checking facts, and getting the right quotes. Newsrooms need to ensure this kind of foundational exercise—essentially "getting your reps in"—is still a priority for reporters just starting out.

This approach runs the risk of emphasizing newsroom hierarchy and increasing frustration among junior staffers who know that AI could speed up their work. That’s why it’s important to have a clear path out. For instance, new hires might need to complete training modules that emphasize foundational journalistic skills before they gain broader access to AI tools. That would send the message that using AI is a privilege—one earned through demonstrating competence.

How to future-proof journalism

So, using AI might be cheating in some cases and not cheating in others—even for the same task. That might be confusing, but it also might be a sign of a thoughtful AI policy that doesn’t see increased output as the be-all and end-all of success.

Because in the end, an AI policy isn’t just a rule book that allows or forbids offloading certain tasks to robots in the name of efficiency. It should be a map for how a newsroom preserves the integrity of its journalism and the trust of its audience as it navigates one of the most impactful technological changes in history. If you try to sail into the future without thinking about the long-term health of your staff, you risk arriving at the destination with a crew of nothing but robots.