How an Editors’ Note Fueled Another Kate Conspiracy Theory

When Catherine, Princess of Wales, announced that she had been diagnosed with cancer last month, it seemed to quell the rumors that had swirled over her stepping back from public life.

Not for everyone. With disinformation spreading fast online, at times amplified by hostile states, some social media users were primed for skepticism. A note from Getty Images beside the video announcement, released on March 22, said it “may not adhere” to its editorial policy and fanned more conspiracy theories over the video’s authenticity.

There is no evidence, according to researchers, that the video is a deepfake, and agencies routinely attach such notes to content given to them by third parties.

With images so easy to manipulate, researchers say, news agencies are simply trying to be transparent about the source of their content.

The editors’ note, added along with other details, including that Kensington Palace had handed out the video, was short: “This Handout clip was provided by a third-party organization and may not adhere to Getty Images’ editorial policy,” it read.

That disclaimer is not unique to this video. A spokeswoman for Getty Images said on Wednesday that it added a “standard editor’s note” to any content provided by third-party organizations. Other agencies also use such notes routinely for clarity.

It was not clear when that policy came into effect, and the spokeswoman declined to comment further. Online sleuths, however, pointed out that the same note had been added to a clip, provided by a government agency, of the bridge that collapsed last week in Baltimore.

Kensington Palace also did not produce the video alone: A branch of the BBC said in a statement that it filmed the message at Windsor on March 20.

“I don’t see any compelling evidence that it’s a deepfake,” said V.S. Subrahmanian, a professor of computer science at Northwestern University who has researched deepfakes. Professor Subrahmanian ran a copy of the video through a system of 15 algorithms his team has been developing to detect manipulated videos, and he also manually examined it with another analyst.

Components such as the video’s audio and Kate’s movements appeared to be natural, and technical evidence suggested it was unlikely to be fake. “Context is a very big part of it,” he added. “The bigger context is that it was a video shot by the BBC, who is a highly reliable source.”

Photo agencies take claims of doctored images seriously and have severed ties with photographers who have altered their work.

When it is difficult to send their own photographers to a scene, the agencies may rely instead on “handout” content given out by groups involved in a story.

“They are very keen not to take handouts and have their own photographers where possible,” said Nic Newman, a senior research associate at the Reuters Institute for the Study of Journalism. News agencies, however, have concerns about the way public figures, including politicians and celebrities, are increasingly using handouts to try to “control the narrative,” he said.

The note was an example of agencies’ efforts to be more transparent with the clients who use those photos, he said, but there was a risk that such labels could fuel conspiracy theories. “People often take those labels and then blow them up out of all proportion.”

Before Catherine announced her diagnosis, photo agencies caused a furor when they said a photo of her — released by the palace and widely circulated online — had been “manipulated” and urged news organizations to withdraw it.

The Associated Press, a major agency that issued a “kill notice” for the photo, said that its staff had spotted changes that did not meet the news agency’s standards. The Princess of Wales later apologized for the confusion, saying that she had been experimenting with editing “like many amateur photographers.”

The episode prompted news agencies to look again at their policies, Mr. Newman said, and re-evaluate which sources were trustworthy. “The whole question of whether you can believe what you see is certainly not as clear as it used to be.”

“There is a trust deficit in society, at least in the United States,” Professor Subrahmanian said. “Deepfakes have the potential to widen that trust deficit.”
