‘The Big Delete’: Inside Facebook’s crackdown in Germany
Days before Germany’s federal elections, Facebook took
what it called an unprecedented step: the removal of a series of accounts that
worked together to spread COVID-19 misinformation and encourage violent
responses to COVID restrictions.
The crackdown, announced Sept. 16, was the first use of
Facebook’s new “coordinated social harm” policy aimed at stopping not
state-sponsored disinformation campaigns but otherwise typical users who have
mounted an increasingly sophisticated effort to sidestep rules on hate speech
or misinformation.
In the case of the German network, the nearly 150
accounts, pages and groups were linked to
the so-called Querdenken movement, a loose coalition that has protested
lockdown measures in Germany and includes vaccine and mask opponents,
conspiracy theorists and some far-right extremists.
Facebook touted the move as an innovative response to potentially harmful content;
far-right commenters condemned it as censorship. But a review of the content
that was removed — as well as the many more Querdenken posts that are still
available — reveals Facebook’s action to be modest at best. At worst, critics
say, it could have been a ploy to counter complaints that it doesn’t do enough
to stop harmful content.
“This action appears rather to be motivated by Facebook’s
desire to demonstrate action to policymakers in the days before an election,
not a comprehensive effort to serve the public,” concluded researchers at
Reset, a U.K.-based nonprofit that has criticized social media’s role in
democratic discourse.
Facebook regularly updates journalists about accounts it
removes under policies banning “coordinated inauthentic behavior,” a term it
created in 2018 to describe groups or people who work together to mislead
others. Since then, it has removed thousands of accounts, mostly what it said
were bad actors attempting to interfere in elections and politics in countries
around the world.
But that policy had constraints, since not all harmful
behavior on Facebook is “inauthentic”; plenty of perfectly authentic groups
use social media to incite violence and spread misinformation and hate. So the
company’s own rules limited what it could take down.
But even with the new rule, a problem remains with the
takedowns: they don’t make it clear what harmful material remains up on
Facebook, making it difficult to determine just what the social network is
accomplishing.
Case in point: the Querdenken network. Reset had already
been monitoring the accounts removed by Facebook and issued a report that
concluded only a small portion of content relating to Querdenken was taken down
while many similar posts were allowed to stay up.
By DAVID KLEPPER, Associated Press