Google and YouTube announced new steps over the weekend to hide or block “extremist and terrorism-related” content from the video-sharing platform.
A blog post by Google’s general counsel, Kent Walker, laid out four steps the company is taking to identify and remove blatant terrorism propaganda, along with other “inflammatory” content that does not clearly violate YouTube’s terms of service.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” Mr. Walker said Sunday.
YouTube’s plan includes:
- Technology to identify extremist and terrorism-related videos.
- More resources for its Trusted Flagger program.
- A tougher stance on content that does not clearly violate its policies, such as “videos that contain inflammatory religious or supremacist content.”
- An expansion of counter-radicalization efforts via its “Creators for Change program.”
The company says its technology will prevent “re-uploads of known terrorist content” while leaving the same footage untouched when it appears in network news coverage.
Details regarding the kinds of religious content that will be flagged as “problematic” were scant, although the company said it will add 50 expert NGOs to 63 existing organizations that take part in the Trusted Flagger program.
“In the future, [videos that don’t clearly violate policies] will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements,” Mr. Walker said. “That means these videos will have less engagement and be harder to find. We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”
• Douglas Ernst can be reached at dernst@washingtontimes.com.