Facebook is “down-ranking” posts that moderators believe contain health misinformation.
“Misleading health content is particularly bad for our community,” wrote Facebook product manager Travis Yeh in a blog post. “So, last month we made two ranking updates to reduce posts with exaggerated or sensational health claims and posts attempting to sell products or services based on health-related claims.”
Facebook’s down-ranking system, which already moderates content such as clickbait and spam, will make misleading posts appear less often in users’ News Feeds. The company announced its moderation plan for health content on July 2.
A week earlier, a report in The Washington Post drew attention to the role private Facebook groups play in spreading information about questionable remedies for cancer and other illnesses.
These remedies, which many such groups claim doctors intentionally hide, include baking soda, apple cider vinegar, frankincense, and silver particles.
Similarly, some private Facebook groups encouraged parents to “heal” their autistic children by having them ingest turpentine, industrial bleach, and even their own urine.
To determine which posts to down-rank, Facebook worked with health care professionals to identify keywords and phrases that commonly appear in posts containing exaggerated or false health claims.
Users can still access these posts directly, but flagged posts from group members will surface less often in their News Feeds.
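Facebook has not published how its ranking system works, but the approach described above, matching flagged phrases and reducing a post's ranking rather than removing it, can be sketched roughly as follows. The phrase list, function name, and penalty value here are all hypothetical, chosen only to illustrate the general idea:

```python
# Hypothetical sketch of keyword-based down-ranking; Facebook's actual
# system, phrase lists, and scoring are not public.

# Illustrative examples of the kind of phrases health professionals
# might flag (not Facebook's real list).
FLAGGED_PHRASES = {"miracle cure", "doctors don't want you to know"}

def rank_score(post_text: str, base_score: float, penalty: float = 0.5) -> float:
    """Return a reduced ranking score if the post matches a flagged phrase.

    The post is never removed; a lower score simply means it is shown
    less often in News Feeds, mirroring the down-ranking approach.
    """
    text = post_text.lower()
    if any(phrase in text for phrase in FLAGGED_PHRASES):
        return base_score * penalty  # down-rank, but keep accessible
    return base_score

print(rank_score("This miracle cure really works!", 1.0))  # down-ranked to 0.5
print(rank_score("Photos from our hiking trip", 1.0))      # unchanged at 1.0
```

The key design point reflected in the sketch is that down-ranking adjusts visibility rather than deleting content: users who navigate to a group can still read every post.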