- Thursday, May 18, 2023

“Your post goes against our Community Standards on hate speech.”

At first glance, I assumed the bewildering message popping up on my Facebook app was a mistake — or a spoof of some sort.

“What in the world did I post?” I pondered.

But then I noticed the supposedly offensive message affixed to the bottom of Facebook’s missive. A screenshot from an April 2 post on my page was highlighted.

It read, “Jesus died so you could live.”

It was a message I recalled posting early last month in an effort to summarize the central message of the Gospel: Jesus’ sacrificial death for mankind and his ushering in of hope and salvation. The message is essentially New Testament 101.

Though some might disagree with the proclamation, there’s certainly nothing about the wording that any rational person would call “hate speech.”

As I foggily stared down at my screen, I scanned the rest of the message to see what else the social media giant had to say about the purportedly inflammatory post.


Facebook proceeded to declare that its “community standards” are meant to ensure everyone feels “safe, respected and welcome,” though, in those moments of confusion over the hate speech flag, I didn’t feel any of those sentiments.

An ominous line about future infractions was included in the warning. It read, “If your content goes against our Community Standards again, your account may be restricted or disabled.”

“Wow,” I thought.

I was offered the opportunity to “disagree with the decision” and fully assumed it was some sort of artificial intelligence error — a casualty of a technological era bent on attempting to automate itself into oblivion.

To my shock, however, Facebook sent a follow-up message a few hours later, seemingly doubling down on its hate-speech claim. It read, "We have removed your post from Facebook," and said my appeal of the decision had been "reviewed" and the post found to be in violation.

“Your appeal was reviewed and your post does not follow our Community Standards for hate speech,” it read. From there, I wasn’t left with any options to further address the matter.

The entire ordeal was surreal and, frankly, disturbing. I'm not at all interested in purporting to be a victim, but as an advocate of free speech and religious freedom, the idea that a message as innocuous as "Jesus died so you could live" would be censored or banned is patently bizarre.

I made it a point to contact the Facebook press office and to message a staff member directly, but to no avail. Thus far, no one has replied. Regardless of whether it was an error, I was left with many questions.

Why would a posting from April 2 go unnoticed for more than a month before the hate-speech claim? How does Facebook pinpoint such posts — and are reviews and appeals conducted by humans rather than potentially disastrous technologies incapable of properly vetting such material?

If this was a technological error, I can have grace and understanding, though my worry is that social media platforms and other tech outfits will increasingly rely on such systems without taking into account the perils of doing so.

Of course, if this wasn’t an error, there’s a much bigger problem brewing at Facebook. It’s worth noting that the original post from April 2 was still up and active even after the warning and claim of removal, adding even more questions into the mix.

We are no doubt living in a culture that’s increasingly hostile to Judeo-Christian values. We’re also living in a time when technology is overreaching into every area of our lives. I’m not claiming Facebook intentionally was hostile toward my message, though I’m still scratching my head over it all.

Thus, with the rise of artificial intelligence and other advances — and with so few companies running so much of our online communications — we must be cautious, watchful, discerning and shrewd. That’s why I’m speaking out about this incident.

On its own, it’s a fluke, but if it’s part of a bigger pattern — even one being knitted unintentionally — it’s something that must be addressed.

At the least, if Facebook is going to accuse people of the serious offense of violating hate-speech standards, the platform should, on ethical and moral grounds, have a proper system through which people can address such claims and clear their names.

In the meantime, I have no problem continuing to post and share that Jesus died so we all could live. It’s the most transformational and powerful message humanity has ever received — and one no form of censorship will ever preclude me from proclaiming.

Billy Hallowell is a digital TV host and interviewer for Faithwire and CBN News and the co-host of CBN’s “Quick Start Podcast.” He is the author of four books, including “Playing With Fire: A Modern Investigation Into Demons, Exorcism, and Ghosts” and “The Armageddon Code: One Journalist’s Quest for End-Times Answers.” He can be reached at bhallowell@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.
