For years, it has been an open secret that Facebook manipulates the content published on the platform, giving some posts more relevance or visibility than others. The algorithm behind those decisions has always been something of a mystery, but in the interest of transparency, Facebook has now published the guidelines it follows to show (or hide) content.
The guidelines are available in the Facebook news center, and they detail, for example, which types of comments are most likely to be hidden from other people, either because they are offensive or because the rest of the community flags them as hostile and prefers not to see them.
Facebook also reveals how it identifies posts that explicitly ask people to interact, in what is called engagement bait. There is also content classified as sensationalist, which is generally shown less often, especially when it involves miracle cures or widely questioned health claims.
Facebook's idea is for users to know what type of content is more likely to be less visible on the platform, though that does not mean the content is automatically removed; it is simply shown to fewer users. In statements to The Verge, an executive in charge of these policies said the goal is for people to know “what kind of content is problematic, but that it is not worth removing.”
The list of problematic content is extensive, and some details remain up in the air regarding questionable publications that seem to recur very frequently, especially those related to fake news and the like. That is an area where Facebook has never stood out for its transparency.
But at least for now, users will finally know why some of their posts may receive fewer interactions than before, especially when it comes to asking for likes.