Google and Meta are building new censorship tools for smaller platforms to use against whatever is deemed extremist content, putting the big tech companies in a position to push their preferences for content moderation across a larger swath of the internet.
Google is working with a U.N.-supported group on a project to help websites identify and take down extremist content. Meta is making free software that will help platforms jointly crack down on content by spotting copies of images and videos.
While the participating organizations all say terrorism and extremist content are their focus, Google and Meta executives also acknowledge that the new tools will have broader applications, and they have expressed concerns about other objectionable content, such as material involving COVID-19.
The U.N.-supported group Tech Against Terrorism announced its partnership this week with Google’s Jigsaw unit on the plan to flag and delete terrorist content.
Jigsaw spokesperson Shira Almeleh told The Washington Times her team understands terrorism to be “an attack on open societies” and she said hate and violence pose a threat to everyone.
“Jigsaw and Tech Against Terrorism, in collaboration with the Global Internet Forum to Counter Terrorism (GIFCT), are working to develop a new web app that enables small and medium-sized platforms to triage and action on URLs and hashes that exist in counter-terrorism databases,” Ms. Almeleh said in a statement.
Meta is set to lead GIFCT in 2023; it helped form the group alongside Twitter, Microsoft and Google’s YouTube in 2017.
The forum operates a Hash-Sharing Database that allows more than a dozen participating companies to spot and share digital fingerprints of terrorist and extremist content without transmitting users’ data.
The database holds numerical hashes that correspond to images, videos, PDFs and other content, each labeled by content type and by the terrorist entity responsible for producing it.
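The forum has not spelled out its record format publicly, but the pattern described above, a numerical fingerprint plus labels rather than the content itself, can be sketched loosely. The Python sketch below is a hypothetical illustration only: the field names, the lookup logic and the use of SHA-256 in place of the perceptual hashes such systems typically rely on are all assumptions, not GIFCT’s actual design.

    import hashlib
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class HashRecord:
        digest: str        # fingerprint of the file, never the file itself
        content_type: str  # e.g. "image", "video", "pdf"
        entity_label: str  # the group the content is attributed to

    def fingerprint(data: bytes) -> str:
        # SHA-256 stands in for the perceptual hashes real systems use;
        # a perceptual hash also catches lightly edited near-duplicates.
        return hashlib.sha256(data).hexdigest()

    # The shared database maps fingerprints to labels, not to user data.
    known = fingerprint(b"<bytes of a previously flagged image>")
    shared_db = {known: HashRecord(known, "image", "example-entity")}

    def check_upload(data: bytes) -> HashRecord | None:
        # Only the digest is compared; the upload itself never leaves
        # the platform that received it.
        return shared_db.get(fingerprint(data))

Sharing only the digest is what lets participating companies compare notes on flagged material without transmitting the underlying files or any user data.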
The new tools Google’s Jigsaw is developing appear aimed at providing similar information to a larger number of platforms that lack the resources to deal with sophisticated adversaries online.
Details about what content qualifies as extremist or promotes terrorism, and who is responsible for making that determination, are not completely clear. Ms. Almeleh said Jigsaw is working with small platforms to determine what they need and teaming with “practitioners and nonprofits” that also have databases and expertise regarding extremism.
Tech Against Terrorism was created by the U.N. and has received funding from tech companies such as Google, Microsoft and Meta’s Facebook as well as governments including South Korea and Switzerland.
Tech Against Terrorism acknowledges on its website that there is no universal definition of terrorism and says it supports an industry-led approach to regulating speech online.
“As a project, we also believe that there tends to be an overemphasis from governments on violent content compared to non-violent content, as the latter may be just as (if not more) influential in a person’s radicalization process,” the group said on its website. “However, we acknowledge that it is more difficult to identify and moderate non-violent content, and our project aims to support companies to determine appropriate approaches to content regulation.”
While the group said on Twitter it is in the early stages of its work with Google, Jigsaw chief executive Yasmin Green told the Financial Times that the problem she wants to solve involves small websites struggling to enforce rules without the necessary personnel.
Ms. Green pointed to Islamic State content overwhelming smaller websites as an example of what Google wants to address. She said she has also spotted “COVID hoax claims” appearing on smaller sites as larger tech platforms more successfully stamp out those publishers’ objectionable speech elsewhere.
Jigsaw is a division within Google that studies threats to open societies. Its website lists examples of those threats, including disinformation, violent white supremacy, toxic language and state-imposed internet shutdowns.
The Jigsaw website touts the team’s work with the New York Times to help moderate the newspaper’s online comments section. It also says Google workers helped expose coordinated disinformation campaigns in Ukraine in 2018.
Google’s Jigsaw unit has approximately 70 people primarily based in New York, according to the Financial Times, while Meta has said it has a team of hundreds of people focused on counter-terror work.
Meta, the parent company of Facebook and Instagram, said its software will help platforms identify images and videos at scale to spur collaborative crackdowns.
Meta president Nick Clegg recently wrote on the company’s blog that its tool “can be used for any type of violating content.”
“We hope the tool — called Hasher-Matcher-Actioner (HMA) — will be adopted by a range of companies to help them stop the spread of terrorist content on their platforms, and will be especially useful for smaller companies who don’t have the same resources as bigger ones,” he wrote.
According to Meta, the tool is infrastructure: it does not come with rules for how to moderate content, nor with definitions of terrorism. The company noted the tool can also be used to stop child sexual abuse material.
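Meta has published HMA as open-source software, but the sketch below does not reproduce its code or interfaces. It is a hypothetical Python illustration of the three-stage pattern the tool’s name describes, with an exact-match hash standing in for the perceptual hashes, such as Meta’s PDQ image algorithm, that real deployments use so that altered copies still match.

    import hashlib
    from typing import Callable

    def hasher(data: bytes) -> str:
        # Stage 1: fingerprint the upload. Real deployments use
        # perceptual hashes so edited copies still match; SHA-256
        # is a simplified stand-in.
        return hashlib.sha256(data).hexdigest()

    def matcher(digest: str, bank: set[str]) -> bool:
        # Stage 2: compare against a bank of previously flagged hashes.
        return digest in bank

    def actioner(matched: bool, act: Callable[[], None]) -> None:
        # Stage 3: the platform, not the tool, decides what "action"
        # means: remove, label, downrank, or queue for human review.
        if matched:
            act()

    # Each platform supplies its own hash bank and its own policy.
    bank = {hasher(b"previously flagged content")}
    upload = b"previously flagged content"
    actioner(matcher(hasher(upload), bank), act=lambda: print("removed"))

Keeping the final stage policy-free is consistent with Meta’s description of the tool as infrastructure without moderation rules: the matching machinery is shared, while each platform decides what counts as a violation and what happens on a match.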
Mr. Clegg said Meta spent approximately $5 billion on safety and security last year, with a dedicated staff of 40,000 people. He said the hundreds of workers focused on counterterrorism come from the law enforcement, intelligence and national security communities.
The new services offered by Google and Meta arrive as new rules for content regulation hit Europe via the implementation of the Digital Services Act.
Under the law, which took effect last year, online platforms face penalties for failing to comply with new rules that allow governments to order the removal of content.
Small and midsize tech companies may face a choice of adopting a more restrictive censorship approach favored by Big Tech or suffering penalties from regulators.
Platforms will face differing rules based on their size, and they have a February deadline to report their number of active users to the European Commission.
• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.