Facebook Will Stop Recommending Health Groups

It is essential that people can obtain health information from authoritative sources.


  • It is essential that people can obtain health information from authoritative sources.
  • The company’s new rules extend its existing moderation measures.
  • The company has tried other methods to limit the spread of anti-vaccine content and coronavirus misinformation, including adding contextual information to posts discussing COVID-19 and placing slogans on vaccine-related pages.
  • Facebook also stated that it will continue to restrict the content of militia groups and other violence-related organizations.

Facebook is adding new rules designed to slow the spread of misinformation and other harmful content in its Groups feature. From now on, if a group is deleted for violating Facebook policy, its members and administrators will temporarily be unable to create any new groups.

If the group does not have an administrator, it will be archived. Moreover, Facebook’s recommendations will not include any health-related groups.

Facebook groups have been blamed for spreading misinformation and conspiracy theories, especially when Facebook’s recommendation algorithm promotes them.

The company’s new rules extend its existing moderation measures. For example, administrators of banned groups were already barred from creating new groups similar to the ones that were banned.

Facebook New Update

Some of the new policies encourage groups to moderate themselves more actively. If an administrator steps down, they can invite members to replace them; if no one takes over, Facebook will apparently “suggest” the administrator role to members, and if that fails, the group will be archived.

In addition, if a group member violates the community standards, moderators must approve all of that member’s posts for 30 days. If a moderator repeatedly approves posts that violate Facebook’s guidelines, the group can be deleted.

The health guidelines take a broader approach, targeting an entire content category rather than specific rule violations.

Facebook stated that although groups can be “a positive space to provide and receive support in difficult situations,” it is “essential that people can obtain health information from authoritative sources.”

The company has tried other methods to limit the spread of anti-vaccine content and coronavirus misinformation, including adding contextual information to posts discussing COVID-19 and placing notices on vaccine-related pages. Even so, the platform’s sheer size makes it a powerful carrier for false health stories.

Facebook also stated that it will continue to restrict the content of militia groups and other violence-related organizations. Groups discussing potential violence will be deleted, and the ranking of even non-violating content will soon be lowered in the news feed.

However, the company has struggled to define the boundaries of objectionable content, including a post by a Kenosha, Wisconsin page claiming to be a militia group, after a 17-year-old militia supporter killed two people on a night of protests there.
