The incorporation of AI into journalism is a double-edged sword

Artificial intelligence has become more prevalent now than ever before. I see articles in my social media and news feeds every day about the developments of AI, ranging from AI art to stories and school papers written by ChatGPT. But with the rise of AI comes concerns about how to use it properly and ethically, especially when it comes to using it professionally.

Recently, AI has been popping up in the journalism industry. In January, it was found that CNET had been publishing stories written by AI without clearly specifying whether the articles had been written by human reporters, according to an article from The Verge. This caused controversy among both readers and employees of the publication, and it raises the question: How will the growth of AI affect journalists? While I sometimes use an auto-transcription AI to help transcribe interviews, I personally would not consider using AI programs to write articles. But not every journalist may be averse to using these kinds of programs.

In the Spring 2023 edition of the Journalism & Mass Communication Educator, John Pavlik wrote an article titled “Collaboration with ChatGPT: Considering the Implications of Generative Artificial Intelligence for Journalism and Media Education,” which looks at how AI can be used in the industry. Pavlik explains that ChatGPT is a natural language processing (NLP) platform that allows users to enter text prompts and generate responses through a machine-learning model trained on text from the internet. Essentially, you can ask the AI a question and it will create a response based on the information available to it. Furthermore, AI is already regularly used in many media operations, including the Associated Press, primarily to aid in the reporting process, according to Pavlik.

Pavlik’s article is unique in that he submitted prompts to ChatGPT to test its knowledge of the media and what it thinks about using AI in the journalism industry. The responses generated included statements about how AI cannot be creative in the same way we consider humans to be creative and that AI can be used as a tool for reporters, allowing journalists to focus on other aspects of their work while the generator does the writing or information processing for them. Although the AI stated some may “argue that AI is not yet advanced enough to fully replace human journalists and media professionals” and that “it is likely to complement and enhance their work rather than replace it,” I think it is important to consider the implications of introducing this tool into the industry.

Using AI as a tool to aid in the reporting process can be greatly beneficial to journalists. At a time when newsrooms are shrinking and reporters have to cover more and more stories to fill those gaps, AI could be the answer to meeting deadlines and generating more stories. But it could also mean the downfall of the industry. In the case of CNET, whole articles could be written by AI, which introduces issues of accuracy and ethics. ChatGPT itself states in Pavlik’s article that AI can increase the risk of bias and errors in reporting, that any articles written using AI require careful oversight and editing, and that the generator operates without any copyright or legal restrictions. In short, AI in journalism is a double-edged sword. The tool can improve the reporting process if used correctly, but it takes a seasoned reporter or editor to handle any issues it may introduce.

On top of these issues, there is a certain human aspect to journalistic writing that I think cannot be replicated using AI, regardless of whether the program passes the Turing Test (a test developed by Alan Turing to determine whether a person can tell if they are interacting with a human or a machine, according to Pavlik). Part of journalism is giving a voice to people and communities, which can include sources ranging from experts to first-hand accounts of an event, and journalists often incorporate their own observations or even write stories in a more narrative format to help paint a picture for their readers. To me, good journalism, while objective and informative, is about the human nature of storytelling and everything that comes with it. It should read as though a person is telling the story, and all of the experiences a reporter has create that sense of humanity. I do not think that is something that can be achieved with AI.

The future of print journalism as technology advances is something that has been up for debate for the past couple of decades. Personally, I think print journalism will continue to prosper and evolve, just like it always has. But in order to do so, we must incorporate these new technologies into the reporting process without losing a grasp on the craft itself. AI can be a helpful tool for journalists, but it must be used with caution and expertise. Journalism is about reporting the news fairly and accurately and we cannot always count on computers to understand that. I dread the thought of a future where all news is written by computers, but until then, we must learn how to work with new technology to preserve the future of this industry.
