The issue of how artificial intelligence programs will affect journalism is an interesting and complicated one. Some say they could have benefits. Others say they might be harmful. It depends on how they are used.
Did you think this was yet another article about AI for which the writer cleverly asked an AI program to write the lead? Fooled ya! That was actually my attempt to write like an AI program.
Either way, pretty lame, eh?
The use of artificial intelligence in journalism is spreading rapidly, and debates over what newsrooms should and shouldn’t use it for are spreading even more rapidly. Rest assured the private equity and hedge fund owners of news chains are trying to figure out how they can use it to save on labor costs, which has led to some panic among the industry rank and file about job security and product quality.
AI software currently has a variety of uses in the field. Just a few examples: transcribing interviews; identifying trending topics online; delivering individually personalized news; establishing flexible website paywalls; and scraping for data on the internet. Let’s focus on its more controversial ability to create content.
The Associated Press uses AI to write articles from reports of corporate earnings. The Washington Post uses it to write articles from high school football statistics. These are examples of smart applications producing formulaic stories in quantity and freeing journalists for more ambitious work.
“Large language models” such as ChatGPT and Bard can also write whole stories from input data. This is not as smart. Even though the capabilities of AI are improving rapidly, the results are too often factually wrong, dully written and generic rather than localized. Journalism garbage, in other words.
Nieman Lab recently surveyed news organizations around the world that have guidelines on AI and found that most do not allow creation of stories and photos. (Accepted uses included research, headline suggestions, social media posts and creation of illustrations.) The Knight Foundation examined 130 AI-related newsroom projects and determined that only 15% of them involved automated story generation.
The future might look different. The New York Times reported in July that Google demonstrated a story-writing program to representatives of The Times, The Washington Post and The Wall Street Journal, who saw it as potential assistance to their journalists.
AI offers tremendous potential gains in the production of standardized news stories. But writing journalism that readers will pay for demands more than that: critical thinking, context, nuance, creativity, style. And no good story can be written without the good reporting and interviewing that must come first. All that comes from pros, not programs.
Of course, news owners and managers have to recognize this, and the fear that they won't is what explains the alarm among news unions and other news staff. Bosses can't afford to underestimate the value of high-quality work and what it takes to achieve it.
New tech is always scary. It can be misused. But it can also be a gift.
Tom Arenberg is an instructor of news media at the University of Alabama. He worked for The Birmingham News and the Alabama Media Group for 30 years. He published this commentary originally as a post on his blog, The Arenblog.
About News is a BirminghamWatch feature that publishes commentary by those who teach the craft and think about the values and performance of today’s journalism, a civic flashpoint. BirminghamWatch is a member of the Institute for Nonprofit News whose members generally rely on individual gifts, foundation grants and sponsorships to support their work. It also publishes About News articles on Facebook and Twitter and invites readers to join the conversation about their news in those forums.