13 things journalists need to know about AI

A good rule of thumb is to start from the assumption that any story you hear about using AI in real-world settings is, beneath everything else, a story about labor automation. Max Read’s blog

This new era requires that newsrooms develop new, clear standards for how journalists will — and won’t — use AI for reporting, writing and disseminating the news. Newsrooms need to act quickly but deliberatively to create these standards and to make them easily accessible to their audiences. Poynter

Any assistance provided to these (AI) companies (by news organizations) could ultimately help put journalists out of business, and the risk remains that, once the media’s utility to the world of AI has been exhausted, the funding tap will quickly be turned off. Media executives can argue that having a seat at the table is better than not having one, but it might just make it easier for big tech to eat their lunch. Columbia Journalism Review 

Google is testing a product that uses artificial intelligence technology to produce news stories, pitching it to news organizations including The New York Times, The Washington Post and The Wall Street Journal’s owner, News Corp, according to three people familiar with the matter. New York Times

“Reporters tend to just pick whatever the (AI) author or the model producer has said,” Abeba Birhane, an AI researcher and senior fellow at the Mozilla Foundation, said. “They just end up becoming a PR machine themselves for those tools.” Jonathan Stray, a senior scientist at the Berkeley Center for Human-Compatible AI and former AP editor, said, “Find the people who are actually using it or trying to use it to do their work and cover that story, because there are real people trying to get real things done.” Columbia Journalism Review

Journalists’ greatest value will be in asking good questions and judging the quality of the answers, not writing up the results. Wall Street Journal 

NewsGuard, an organization tracking misinformation, has identified 49 supposed news sites as “almost entirely written by artificial intelligence software.” The Guardian

Recently, AI developers have claimed their models perform well not only on a single task but in a variety of situations … In the absence of any real-world validation, journalists should not believe these companies’ claims. Columbia Journalism Review

If media outlets truly wanted to learn about the power of AI in newsrooms, they could test tools internally with journalists before publishing. Instead, they’re skipping to the potential for profit. The Verge

One of the main ways to combat misinformation is to make it clearer where a piece of content was generated and what happened to it along the way. The Adobe-led Content Authenticity Initiative aims to help image creators do this. Microsoft announced earlier this year that it will add metadata to all content created with its generative AI tools. Google, meanwhile, plans to share more details on the images catalogued in its search engine. Axios 
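The idea behind these provenance efforts is a small, machine-readable record attached to a piece of content that says how it was made and what was done to it afterward. The following is a minimal, hypothetical sketch in Python of what such a record might contain; the field names are illustrative assumptions, not the actual Content Authenticity Initiative (C2PA) schema or any vendor’s API.

    import json
    from datetime import datetime, timezone

    # Hypothetical provenance record: field names are illustrative only,
    # not the real C2PA / Content Authenticity Initiative schema.
    def build_provenance_record(creator, tool, generated_by_ai, edits):
        return {
            "creator": creator,                  # who produced the content
            "tool": tool,                        # software used to create it
            "generated_by_ai": generated_by_ai,  # whether a generative model was involved
            "created_at": datetime.now(timezone.utc).isoformat(),
            "edit_history": edits,               # what happened to it along the way
        }

    record = build_provenance_record(
        creator="Example Newsroom",
        tool="ExampleImageGenerator 1.0",
        generated_by_ai=True,
        edits=["cropped", "color corrected"],
    )

    # In practice such a record would be embedded in the file's metadata and
    # cryptographically signed so readers can verify it has not been altered.
    print(json.dumps(record, indent=2))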

In the newsroom, some media companies have already tried to implement generative AI to create content that is easily automated, such as newsletters and real estate reports. The tech news outlet CNET quietly began publishing articles explaining financial topics using “automated technology” – a stylistic euphemism for AI. CNET had to issue corrections on 41 of the 77 stories after errors were uncovered, despite the articles having been reviewed by humans prior to publication. Some of the errors came down to basic math. It’s mistakes such as these that make many journalists wary of using AI tools beyond simple transcription or programming a script. Columbia Journalism Review

OpenAI and the Associated Press are announcing a landmark deal for the ChatGPT maker to license the news organization’s archive of stories. Axios

AI in The Newsroom (video) International News Media Association