Facebook released its updated Content Distribution Guidelines on Thursday, shedding more light on how the tech giant decides what content it suppresses.
While Facebook has previously provided some details on the types of content that receive reduced distribution in its News Feed, the updated guidelines are designed to provide clarity and accessibility, Director of Product Management Anna Stepanov announced in a blog post Thursday.
“The Content Distribution Guidelines make it clear what content receives reduced distribution on News Feed because it’s problematic or low quality,” Stepanov wrote.
The guidelines list content Facebook suppresses in its News Feed such as clickbait, unoriginal or low-quality content, and articles from news publishers rated as “broadly untrusted” by Facebook users. Content from untrustworthy or suspicious domains also receives reduced distribution, as does sensationalist news and certain types of promoted health content such as claims of products providing “miracle cures.”
"The Content Distribution Guidelines outline what content receives reduced distribution in News Feed because it's problematic or low quality – things like misinfo, clickbait, and ad farms," Andy Stone (@andymstone) wrote on Twitter on Sept. 23, 2021. https://t.co/Gy8l5qhJaS
All content rated as “False, Altered, or Partly False” by fact-checkers certified by the International Fact-Checking Network is deprioritized, though content rated as having “Missing Context” does not generally receive reduced distribution.
The tech company previously ran afoul of lawmakers and the New York Post for limiting the reach of a true story related to information contained on Hunter Biden's laptop while a fact-check was pending. The International Fact-Checking Network, meanwhile, drew ire for fact-checking claims from the pro-life group Live Action.
The released guidelines follow increased scrutiny of Facebook’s content moderation practices due to the spread of perceived misinformation on its platform. The company reportedly refused to hand over user data to the White House and failed to answer specific questions regarding the amount of alleged misinformation on its platform, The Washington Post reported in August.
Facebook also initially shelved the release of a transparency report that showed the most popular content on the platform was related to the COVID-19 vaccine, and has repeatedly taken action against researchers attempting to study its algorithms. The company was grilled by lawmakers Tuesday for its lack of transparency regarding its data collection policies and the impact its services have on users.
Facebook did not immediately respond to The Daily Caller News Foundation’s request for comment.
The post Facebook Reveals How It Decides What Content to Suppress appeared first on The Daily Signal.