Microsoft is closely scrutinizing malicious influence efforts launched by Russia, Iran and China and has found some unexpected developments in people’s response to deepfakes.
Microsoft Threat Analysis Center’s Clint Watts said his team has logged hundreds of instances of these adversaries using artificial intelligence over the last 14 months. He told the Billington CyberSecurity Summit earlier this month that Microsoft has spotted a few unexpected things, particularly involving deepfakes aimed at tricking voters into believing manipulated media.
“The story a year ago was deepfakes in the election, and candidates being deepfaked — audiences have been remarkably brilliant about detecting deepfakes in crowds,” said Mr. Watts, the center’s general manager.
“The more you watch somebody, the more you realize a fake isn’t quite right. And you’ll see this whether it’s Putin, Zelenskyy, or any of the candidates in the election, people on social media are pretty good,” he said, referring to Russian President Vladimir Putin and Ukrainian President Volodymyr Zelenskyy.
The social media mob’s wisdom disappears when the users are isolated, however.
“It’s when they’re alone that they tend to get duped,” Mr. Watts said at the summit. “So the setting matters, public versus private.”
Mr. Watts said subtle manipulations of real video tend to work better than fully AI-generated content. He said Russian influence actors all experimented with deepfakes, and “they’ve moved back to bread-and-butter, small video manipulations.”
While people have proven adept at detecting video manipulations, Mr. Watts warned that manipulated audio can better trick people if done skillfully and deployed at the proper time.
“Audio is easier to create and more impactful as compared to the other mediums,” Mr. Watts said. “If it’s done right and used in the last factor, which is at a specific condition or time — times of crisis, conflict, competition — people tend to fall for things they wouldn’t normally fall for.”
• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.