Taking another step to tackle misinformation on the platform, Facebook is moving to weed out pages that repeatedly spread fake news.
If you attempt to like such a page, you will see a pop-up warning that the page has "repeatedly shared false information" that "independent fact-checkers" have flagged. You will then be presented with the choice of going back to the previous page or following the page anyway.
There will also be a "Learn more" link offering further detail on why the page was flagged, as well as a second "Learn more" link with information about Facebook's fact-checking program.
The company also said it may extend penalties to individual Facebook accounts that repeatedly share misinformation, meaning other users would see less of their content in their news feeds.
Finally, Facebook has redesigned the notification that appears when users share content that fact-checkers have rated as false. The notification now includes the fact-checker's article explaining why the post is misleading, along with the option to share that article.
Users will also be notified that accounts that repeatedly share fake news will be demoted in the news feed, reducing their chances of being seen by other users.
Over the past few years, Facebook has introduced a number of measures to deal with misinformation on the platform. These include a message-forwarding limit on Messenger, prompts encouraging users to read an article before sharing it, warning labels on fake news, and, most notably, banning Donald Trump from the platform. Despite these efforts, the company still has a long way to go before it can claim to have truly gotten rid of fake news.