Meta is bringing its “first strikes” policy to its other platforms after what it called a successful trial run on Facebook, and after an internal report found that the tech company had over-policed content.
The policy, introduced on Facebook in August, gives users who violate Meta’s content moderation policies a way to lift their suspensions. Users who receive a first strike can take a training course that, upon completion, removes the strike from their account.
Additionally, users can take the training course again if they go a full year without further offenses.
However, the new policy will not apply to more serious violations of Meta’s content policies, like sexual exploitation or drug abuse.
According to Meta, the policy has been a success. The company said users who went through the training process were less likely to break the platform’s rules again and felt they understood the platform’s policies better.
The expansion comes just days after Meta announced that an internal report found it had over-policed content on its platform. The company said it planned to scale back its moderation policies, admitting that its previous policing of content had restricted speech.
However, Meta also said this week that its AI moderation policies significantly cut down on election misinformation.
• Vaughn Cockayne can be reached at vcockayne@washingtontimes.com.