The Washington Times - Friday, September 2, 2022

YouTube’s crackdown on content for the upcoming midterm elections is underway, as the Google-owned video platform said it has started tearing down videos it dislikes and elevating news coverage it prefers.

YouTube Vice President Leslie Miller said the company has already removed videos, issued warnings and suspended users from posting new material in response to “violative content related to the midterms.”

“We’ve already removed a number of videos related to the midterms for violating our Community Guidelines and Terms of Service,” Ms. Miller said on the company’s blog. “This includes videos that violated our election integrity policy by claiming widespread fraud, errors, or glitches occurred in the 2020 U.S. presidential election, or alleging the election was stolen or rigged.”

YouTube is also hand-picking the news coverage it wants its audience to see while keeping other videos from reaching voters.

“When you search for midterms content on YouTube, our systems are prominently recommending content coming from authoritative national and local news sources like PBS NewsHour, The Wall Street Journal, Univision and local ABC, CBS and NBC affiliates,” Ms. Miller wrote on Thursday.

She said YouTube’s systems are also “limiting the spread of harmful election misinformation by identifying borderline content and keeping it from being widely recommended.”

As the November elections draw nearer, YouTube will display a panel highlighting information about federal candidates when people search for them. Ms. Miller said people will see “timely context around election results underneath videos” on Election Day, and that those panels will link to Google’s election results feature.

YouTube is the latest tech company to unveil new rules for the upcoming elections. Twitter said in August that it had begun enforcing a new midterms policy aimed at restricting the spread of information the platform deems misleading or harmful.

TikTok, Facebook and Instagram will all have restrictions on political advertising.

Meta, Facebook and Instagram’s parent company, said last month it would halt political ads in the final week before the coming election.

TikTok does not allow any paid political ads and said last month it relied on external partnerships to improve the enforcement of its election-related policies.

“To bolster our response to emerging threats, TikTok partners with independent intelligence firms and regularly engages with others across the industry, civil society organizations, and other experts,” wrote Eric Han, TikTok head of U.S. Safety, on the company’s website. “In addition, TikTok partners with accredited fact-checking organizations who help assess the accuracy of content in more than 30 languages, and while they do not moderate content on our platform, their assessments provide valuable input which helps us take the appropriate action in line with our policies.”

While the prominent platforms are announcing a series of restrictions on content, the social media platform Parler, which courts conservatives, has said it is taking a much different approach.

The company said Thursday it would allow “all legal speech,” would not ban any candidate over his or her opinions, and would impose no shadow bans or reach restrictions on candidate accounts or content.

• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.
