AP Sets Guidelines for AI in News Coverage Amid Increasing Concerns
The Associated Press (AP) has introduced guidelines for the use of artificial intelligence (AI) in news coverage, acknowledging the technology's potential while emphasizing the need for caution. The guidelines state that AI cannot be used to create publishable content or images, and any material produced by AI must be vetted carefully, just like material from any other news source.
AP is one of several news organizations that have taken steps to establish rules for integrating fast-developing tech tools like ChatGPT into their work. As part of this effort, AP will include a chapter in its influential Stylebook, providing guidance for journalists on how to cover stories related to AI. The chapter will also include a glossary of terminology to help journalists understand key concepts.
Amanda Barrett, Vice President of News Standards and Inclusion at AP, explained that the goal of these guidelines is to strike a balance between experimentation and safety. The Poynter Institute, a journalism think tank, hailed this move as a transformational moment and urged news organizations to create and share their own standards for the use of AI.
Generative AI, which can create text, images, audio, and video on demand, cannot yet reliably distinguish fact from fiction. AP therefore emphasized that AI-generated material should be treated with caution, and that AI-generated or AI-altered content should appear in coverage only when it is itself the subject of the story.
Wired, a leading tech magazine, has taken a similarly cautious approach, stating that it does not publish stories generated by AI unless the AI generation itself is the focus of the story. Nicholas Carlson, editor-in-chief of Insider, reinforced the importance of journalists taking full responsibility for the accuracy, fairness, originality, and quality of every word in their stories.
The rise of AI "hallucinations," or fabricated facts, has underscored the necessity of standards to ensure the credibility and reliability of news content. News organizations have recognized that while generative AI can be a useful tool, it should be limited to assisting editors with tasks such as creating digests, headlines, and story ideas, rather than directly producing publishable content.
AP has a decade of experience with simpler forms of AI, having used it to generate news stories from sports box scores and corporate earnings reports. In this new phase, however, the organization is proceeding cautiously, aiming to protect its journalism and credibility. To further develop its AI capabilities, AP recently reached a deal with OpenAI, the maker of ChatGPT, to license AP's archive of news stories for training purposes.
Protection of intellectual property rights is another concern for news organizations, as they want to avoid unauthorized use of their material by AI companies. The News Media Alliance, which represents hundreds of publishers, has released a statement of principles to safeguard their members’ rights.
While concerns persist about AI potentially replacing human jobs, AP is encouraging its journalists to familiarize themselves with the technology, reasoning that they will be reporting on it extensively in the years to come.
With its new guidelines, AP aims to balance AI's potential with responsible journalism, upholding standards of accuracy, fairness, and originality in news coverage while recognizing the technology's transformative impact on many aspects of society.