OPINION:
During my years at Google, down the hall from the free food, the kombucha on tap and the nap pods, I would settle down on my ergonomic swivel chair and take a deep breath, preparing myself to look at the vilest content the Internet had to offer.
I am a recovering in-house content moderator. Content moderators are the people who process all incoming requests to have posts removed from online platforms. They sift through terrorist executions, violence against women, revenge porn and dead bodies, just to name a few examples.
They perform their analysis based on federal law, company policies and a strong desire to keep users safe. They work alongside lawyers, economists, former Hill staffers and engineers to think critically about the company’s legal obligations as well as its commitment to free speech. As riddled with bad content as the Internet currently is, without these individuals it would be more akin to Dante’s Nine Circles of Hell.
The position was established by a law that, for all intents and purposes, created the Internet. Section 230 of the Communications Decency Act states that Internet platforms cannot be held liable for content generated by their users, even if the platform moderates that content. In other words, a content moderator can analyze reports of abusive content and decide whether the content stays up or comes down, without fear of the company they work for being sued and slammed with a fine.
Protection from legal retaliation that could put the company out of business was crucial in allowing me to use good judgment in making my decisions. I often thought about what my job would have looked like had that fear been constantly looming over me. Well, I no longer have to imagine, thanks to Sen. Josh Hawley.
On June 19, the junior senator from Missouri unveiled a bill ironically called the “Ending Support for Internet Censorship Act.” This legislation would deny Section 230 protections to platforms that fail to receive certification from the Federal Trade Commission. In doing so, the bill would effectively grant the government control over online speech.
Under the Hawley bill, the FTC would audit major platforms’ moderation practices every two years to determine whether those practices were “biased against a political party, political candidate or political viewpoint.” In practice, it would go something like this: A few FTC auditors would walk into a technology company and declare the beginning of the audit. They would comb through tens of thousands of removal decisions, looking for those that are “politically biased,” a process that could take weeks, at a minimum, to complete.
In the meantime, content moderators would hold back on their takedown procedures, because no one could really tell them how “politically biased” would be interpreted. In other words, disinformation, Nazi propaganda and white supremacist videos would fester on the Internet. If a moderator failed this test, not only could they be fired, but thousands of lawsuits and fines would come tumbling down on the company.
At my former job, I tried to keep in mind that while I had to look at horrific content, thanks to my efforts, many others would not have to. Yet in a world where this bill passes, I would sit down at my same desk, take a deep breath and prepare myself to look at terrorist executions, the aftermath of mass shootings and hate-motivated violence, this time with the full knowledge that I had absolutely no control over their distribution.
I would read my first complaint of the day: a woman pleading that I remove the images of her son, who was slain in a mass shooting. “We are very sorry for your loss, ma’am,” I would have to respond. “However, we are unable to remove this content at this time.” From her perspective, we would look like a heartless company, when in reality, we would sit at our computers in fear that our next removal decision would be our last.
If passed, this bill would hold the technology industry hostage, forcing companies to operate the way the government of the day pleases.
Big Brother is watching, and Josh Hawley is installing the cameras.
• Daisy Soderberg-Rivkin is a fellow at the R Street Institute, where she investigates the role of technology as it affects children.