The Associated Press has released guidelines on generative AI for its journalists as it and other news organizations look for ways to use the technology in newsgathering.
AP vice president for standards and inclusion Amanda Barrett said in a blog post that the publication does not see AI "as a replacement of journalists in any way" but developed guidelines for reporters and editors on how to use it.
Journalists for AP can experiment with ChatGPT but are asked to exercise caution by not using the tool to create publishable content. Any result from a generative AI platform "should be treated as unvetted source material" and subject to AP's existing sourcing standards. The publication said it will not allow AI to alter photos, videos, or audio and will not use AI-generated images unless the image itself is the subject of a news story. In that event, AP said it would label the AI-generated photos in captions.
Writers cannot put confidential information into AI tools and should ensure other sources they use are "free of AI-generated content." AP staff are asked to avoid accidentally using AI content created to spread misinformation and should verify the authenticity of any content they use.
Journalists follow AP’s moves around standards closely, as the majority of the news industry uses or at least modifies the AP Stylebook to write articles like the one you’re reading now. Some people even own several (old) physical editions of the Stylebook because they’re news nerds. Its guidance could be influential in a fraught debate over journalists using AI.
Even as AP sets these standards for its staff, the publication signed an agreement with ChatGPT maker OpenAI to use its news stories to train generative AI models. AP also uses automated tools to produce quick write-ups of financial reports and minor sports leagues, joining other organizations like BuzzFeed in using AI in its workflow. It also joined other news companies and groups in signing an open letter urging transparency around the data used to train generative AI models.