YouTube blocked users from commenting live during Tuesday’s congressional hearing about hate speech and white nationalism after the broadcast’s chat room became flooded with bigoted and anti-Semitic reactions.
“Due to the presence of hateful comments, we disabled comments on the livestream of today’s House Judiciary Committee hearing,” a YouTube spokesperson said in a statement.
“Hate speech has no place on YouTube,” said the spokesperson, adding that the Alphabet-owned video service has “invested heavily in teams and technology dedicated to removing hateful comments and videos and we take action on them when flagged by our users.”
Announced following last month’s deadly rampage targeting Muslims in Christchurch, New Zealand, the hearing was held in part to “foster ideas about what social media companies can do to stem white nationalist propaganda and hate speech online,” panel leadership said previously.
“These platforms are utilized as conduits to spread vitriolic hate messages into every home and country,” Rep. Jerrold Nadler of New York, the committee’s Democratic chairman, said in his opening statement. “Efforts by media companies to counter this surge have fallen short, and social network platforms continue to be used as ready avenues to spread dangerous white nationalist speech.”
Mr. Nadler revisited the topic roughly two hours later after learning of comments being posted in the chat room accompanying the committee’s live YouTube broadcast of the hearing.
“These Jews want to destroy all white nations,” Mr. Nadler quoted a commenter. “Anti-hate is a code word for anti-white,” the congressman quoted another.
“This just illustrates part of the problem we’re dealing with,” Mr. Nadler said.
Witnesses testifying during Tuesday’s hearing included religious and political activists, as well as public policy experts from Google and Facebook, among others.
“I want to state clearly that every Google product that hosts user content prohibits incitement of violence and hate speech against individuals or groups based on specified attributes,” testified Alexandria Walden, a Google counsel for free expression and human rights.
“We view both as grave social ills, so our policies go beyond what the U.S. requires,” she said during her opening statement.
Fifty people were killed in the March 15 mass shooting at two mosques in New Zealand. A manifesto uploaded to the internet prior to the massacre, and believed to have been written by the suspected gunman – a self-described racist and ethno-nationalist from Australia – claimed that the rampage was waged to take revenge against Islamic “invaders.”
New Zealand censors have since banned both the manifesto and video footage of the massacre that was livestreamed on Facebook.
• Andrew Blake can be reached at ablake@washingtontimes.com.