Facebook news feed echo chamber is your fault for not having enough friends, says Facebook
Does Facebook filter what views and opinions you see on your news feed? According to a study conducted by the social network, you've only got yourself to blame for not having enough friends with diverse beliefs.
As you may already know, Facebook uses an algorithm called "EdgeRank" to learn your online behaviour and decide what content you do and don't want to see. Sometimes it gets this right, and other times you see a little too much of your ex, or are reminded that you have terrible friends.
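For the curious, the publicly described version of EdgeRank scores each story as a sum of affinity × weight × time decay across its interactions. The sketch below is a toy illustration of that idea only; the specific weights, the 24-hour half-life and the function names are my own assumptions, not Facebook's actual code.

```python
# Toy sketch of the publicly described EdgeRank idea: a story's score is
# the sum, over its interactions ("edges"), of
#   affinity (how close you are to the person) x
#   weight (likes count less than comments, comments less than shares) x
#   time decay (older interactions count less).
# All numbers and names here are illustrative assumptions.

import time

EDGE_WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}  # assumed ordering

def edge_rank(edges, now=None, half_life_hours=24.0):
    """Score one story from a list of (edge_type, affinity, timestamp) tuples."""
    now = now or time.time()
    score = 0.0
    for edge_type, affinity, ts in edges:
        age_hours = (now - ts) / 3600.0
        decay = 0.5 ** (age_hours / half_life_hours)  # exponential time decay
        score += affinity * EDGE_WEIGHTS.get(edge_type, 1.0) * decay
    return score

# Usage: rank a feed by score, highest first.
stories = {
    "ex's vacation album": [("like", 0.9, time.time() - 3600)],
    "friend's hot take":   [("comment", 0.4, time.time() - 86400)],
}
feed = sorted(stories, key=lambda s: edge_rank(stories[s]), reverse=True)
print(feed)
```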
The problem is that the specifics of how Facebook filters these posts aren't well known, and the company has a documented history of tinkering with the algorithm, from boosting the number of videos you see, to compete directly with YouTube, to altering the balance of positive and negative posts in an experiment designed to change your mood. But is Facebook manipulating the content you see to the point of sheltering you from the diversity of political opinion?
In this experiment, the company used anonymised data from 10.1 million users with political affiliations listed on their profiles, to see whether their news feeds are caught in a "filter bubble" of sorts. It tracked links to "hard news" stories, recorded the political affiliation of those who shared them, and measured how often users actually saw and engaged with content from the other side of the spectrum.
The conclusion: the "filter bubble" exists, but it's your own fault that it does.
"While News Feed surfaces content that is slightly more aligned with an individual's own ideology (based on that person's actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter."
Put simply, according to Facebook, the algorithm doesn't categorise by opinion, only by numbers. If you engage with certain kinds of posts and your friends share a narrow range of political views, chances are you will see more of the same. But if you had a more ideologically diverse set of friends, you would see more diverse content, because, as Facebook pointed out, "birds of a feather flock together":
"Friends are more likely to be similar in age, educational attainment, occupation and geography. It is not surprising to find that the same holds true for political affiliation on Facebook."
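To see why friend composition would dominate, here's a toy simulation of the claim. None of it comes from the study itself: the 5% ranking penalty, the pool sizes and the share rates are all assumed numbers. It ranks a pool of shared stories, slightly buries cross-cutting ones, and checks what fraction of the resulting feed is cross-cutting:

```python
# Toy simulation: who you friend matters more for content diversity than
# the feed ranking does. All numbers here are illustrative assumptions.

import random

def seen_diversity(cross_friend_frac, ranking_penalty=0.05,
                   n_stories=10_000, feed_size=5_000, seed=42):
    """Fraction of cross-cutting stories in the top half of the ranked feed."""
    rng = random.Random(seed)
    stories = []
    for _ in range(n_stories):
        cross = rng.random() < cross_friend_frac  # shared by a cross-cutting friend?
        score = rng.random()                      # base relevance score
        if cross:
            score *= (1 - ranking_penalty)        # ranking slightly buries it
        stories.append((score, cross))
    top = sorted(stories, reverse=True)[:feed_size]
    return sum(cross for _, cross in top) / feed_size

for frac in (0.1, 0.2, 0.4):
    print(f"{frac:.0%} cross-cutting friends -> "
          f"{seen_diversity(frac):.1%} cross-cutting stories in feed")
```

In this toy model the share of cross-cutting stories you see tracks the share of cross-cutting friends you have almost one-for-one, while the small ranking penalty only shaves off a sliver. That is essentially the shape of Facebook's argument.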
To me, this study seems to have a few holes. It's limited to people who make their political views known through their profile settings, volunteering "interpretable ideological affiliations." That's a small percentage of Facebook's overall userbase, as many people don't add data like this beyond the obligatory relationship status. Facebook has answered the question it posed at the beginning, but it has also dodged it slightly.
But that's just my opinion. If you want to draw your own conclusions, take a look at the full study on Facebook Research.