By Andrew Blake - The Washington Times - Thursday, September 5, 2019

Facebook announced Thursday that the company is investing $10 million toward tackling so-called “deepfake” technology amid concerns over its potential to manipulate public opinion.

The social networking company said in a blog post that it is partnering with fellow tech titan Microsoft and researchers from universities in the U.S. and abroad to develop ways to better detect doctored videos made with deepfake technology.

“‘Deepfake’ techniques, which present realistic AI-generated videos of real people doing and saying fictional things, have significant implications for determining the legitimacy of information presented online,” wrote Mike Schroepfer, Facebook’s chief technology officer. “Yet the industry doesn’t have a great data set or benchmark for detecting them. We want to catalyze more research and development in this area and ensure that there are better open source tools to detect deepfakes.”

Videos featuring professional actors will be filmed to serve as a data set for developers building deepfake detection tools, Mr. Schroepfer wrote of the “Deepfake Detection Challenge,” with $10 million from Facebook going toward research collaborations and prizes meant to encourage participation.

No Facebook user data will be used in the data set, the blog post said.
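
Facebook’s announcement does not say how such detection tools would be built. As a rough, hypothetical illustration of what a model trained on this kind of labeled real-versus-fake data set could look like, the following minimal Python sketch uses PyTorch to fine-tune a standard image classifier on video frames; it is not Facebook’s or the challenge’s method, and the model choice, frame sizes, labels and random placeholder data are assumptions for illustration only.

    # Hypothetical sketch: per-frame real-vs-fake classification.
    # Random tensors stand in for decoded video frames and labels.
    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    model = resnet18(num_classes=2)                      # 2 classes: real (0) vs. deepfake (1)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    # Placeholder batch: 8 RGB frames at 224x224 with made-up labels.
    frames = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 2, (8,))

    model.train()
    for step in range(3):                                # a few illustrative training steps
        logits = model(frames)
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        print(f"step {step}: loss {loss.item():.4f}")

In practice, challenge entrants would presumably decode frames from the actor-filmed videos, label them as real or manipulated, and score models on held-out clips rather than on random tensors as above.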

Partners involved in addition to Microsoft include researchers from the Partnership on AI consortium, Cornell Tech, MIT, the University of Oxford in the U.K., the University of California, Berkeley, the University of Maryland at College Park and the State University of New York at Albany, Facebook said.

Microsoft referred to the Facebook post when reached for comment.

Calls for internet companies to take action against deepfakes have intensified leading up to the 2020 U.S. presidential election amid concerns that seemingly real but doctored videos could play a part in any online disinformation campaign launched during the race.

Russian internet users weaponized social media platforms including Facebook to spread disinformation and propaganda during the 2016 U.S. presidential race, according to federal law enforcement and intelligence agencies, and senior members of the Trump administration have warned that the 2020 elections risk being attacked online as well.

“Social media platforms have a unique responsibility to identify and remove disinformation before it goes viral, and with voting in the first 2020 primaries less than six months away, the platforms must urgently prepare for increasingly sophisticated disinformation campaigns,” Sen. Mark Warner of Virginia, the ranking Democrat on the Senate Intelligence Committee, said in response to the Deepfake Detection Challenge.

“These efforts by Facebook, Microsoft, the Partnership on AI, and their academic partners will be very important to the process, and I hope these companies will continue to pursue comprehensive measures to prevent harmful deepfakes from poisoning our national dialogue online,” Mr. Warner said.

Facebook came under fire in May after the company refused to follow YouTube in blocking users from sharing a doctored video of House Speaker Nancy Pelosi, California Democrat, that had been slowed to make her speech seem slurred.

Mark Zuckerberg, Facebook’s chief executive and co-founder, was subsequently targeted in a deepfake video that spread virally the following month on Instagram, a Facebook-owned photo- and video-sharing app.

“Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures,” Mr. Zuckerberg seemed to say in the doctored clip.

Facebook and Instagram have about 2.38 billion and 1 billion users, respectively, according to a report published Wednesday by New York University’s Stern Center for Business and Human Rights.

The report, “Disinformation and the 2020 Election: How the Social Media Industry Should Prepare,” listed detecting and removing deepfake videos as a top priority for companies ahead of next year’s race.

• Andrew Blake can be reached at ablake@washingtontimes.com.
