Facebook now warns users if a friend is becoming an “extremist”
In one of its most controversial decisions of the past year, Facebook has rolled out a new feature that notifies users that they may have been exposed to content considered harmful, or that friends of theirs on the social network may have become extremists.
The move comes months after Facebook served as an organizing base for various pro-Donald Trump groups behind the storming of the US Capitol, in an apparent attempt to prevent the certification of Joe Biden as the new president. The company initially reacted by banning Trump from its platform, on the grounds that his posts had encouraged the attack.
But that was far from the first time Facebook’s role in radicalizing extremists came to light. Now, the company has introduced a feature intended to alert users that such groups and content exist on the platform. Some users in the US have received two types of pop-up messages, warning that they may be interacting with “extremist” content and people.
One message asks the user directly, “Are you concerned that someone you know is becoming an extremist?”, while the other warns: “You may have been exposed to harmful extremist content recently.”
In both cases, the message displays a button to get support, which redirects to resources and information about extremist content on the network. These include support groups for people involved in violent far-right movements.
This is a pilot program in which only a small number of users are currently participating. Facebook has confirmed that it is part of its commitment to the Christchurch Call to Action summit, which was created in response to the 2019 Christchurch attack in New Zealand, in which 51 people were murdered at mosques. The perpetrator live-streamed the killings on social media and gained considerable popularity among extremist groups.
It should be noted that Facebook does not reveal which content triggered the message, nor which friend is the one consuming this type of posts and videos.
In addition, there is no indication that Facebook is deleting the content it deems “violent” and that triggered the message; the company only states that it “cares about preventing extremism on Facebook.” In February, the company claimed to have taken action against 26.9 million pieces of hateful content on its platform.