Have we ever understood the way that Facebook algorithms work?
All we really know is that – at some point – our newsfeeds became more tightly curated around what Facebook thought we would be interested in. And ever since, naysayers have criticized the social network for shaping our perception of the world via its algorithmically filtered newsfeed.
Well, a new study published in Science has (sort of) debunked the long-held notion that Facebook’s algorithm leads to the creation of an “echo chamber” or “filter bubble.”
Data scientists at Facebook studied the accounts of 10 million users who volunteered their ideological affiliation in their profiles, examining how those users interact with and share news. They found that liberals and conservatives are in fact regularly exposed to at least some “crosscutting” political news – that is, stories that don’t conform to their pre-existing biases.
They found that the Facebook newsfeed algorithm leads conservatives to see only 5 per cent less liberal content than their friends share, while liberals see 8 per cent less conservative content.
The position of a link on your News Feed – which is determined by the algorithm’s interpretation of your interests – also has an effect on the likelihood that a user will click on it, the study found.
The thing is, though, apparently it’s you who’s responsible for the fate of your newsfeed – at least, according to Facebook scientists.
The study revealed that the greatest influence on what users see comes from what they’ve clicked on in the past. So, you can’t blame the Facebook algorithm for making your feed politically one-sided: it got that way through your own decisions to click on or ignore stories.
The study also found that liberals are about 6 per cent less likely to click on crosscutting content, while conservatives are 17 per cent less likely. Hmmm.
The study, however, has been met with some backlash from critics who point to the study’s relatively small sample size (yes, 10 million people is now considered small) and potential interpretation issues.