Facebook test warns some users when they have seen extremist content
Facebook has launched a new test that lets some users know when they may have been exposed to extremist content, along with a feature that offers support to users who believe one of their friends may be adopting extremist views. The move comes amid growing concern that the social media platform is being used to radicalize people.
The social media platform connects people, but that isn't always a good thing. The platform has become home to increasingly extremist content, ranging from misinformation designed to manipulate a person's emotions to closed groups that actively spread extreme and dangerous views.
Some Facebook users have noticed new prompts addressing this problem, including one that tells them when they may have been exposed to potentially extremist content and another that offers help if they are worried about a friend. The company confirmed the test shortly afterward, saying it is part of a larger effort to explore ways the platform can help at-risk users.
A Facebook spokesperson told CNN that the company is working with non-profit organizations and experts on extremism as part of this effort, but had nothing more to add at this time. Users have reported two different warnings, including one that reads, "We care about preventing extremism on Facebook. Others in your situation have received confidential support."
Another message reads, "Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others." This warning lets users know they may have seen content that could be extremist, and also offers resources for getting support. It's unclear how many users are currently seeing these prompts.
Whether these warnings will be enough to reverse the tide of misinformation and extremism on the platform is a bigger question. Facebook has been home to manipulative media for years, having played a significant role in the 2016 presidential election, for example. Though Facebook has rolled out ways to counter manipulation on its platform, including placing warning labels on fake news, those efforts appear to have had little effect on users, who continue to encounter and share misinformation.