Not long after America’s 2016 presidential election, rumour had it that the votes were not rightfully earned. Not only did the Democrats accuse the Republicans of miscounting; many believed that Facebook’s algorithm, as well as fake news on the platform, had a major influence on the results.

In short: the algorithm in question is fed by the user’s behaviour. If you like pictures of baby animals, Facebook registers how often you view these pictures and will suggest pages, articles, videos, advertisements and people around that theme. The algorithm then combines your data with that of other users, becomes smarter and continuously adapts itself to your online behaviour. The more heavily Facebook is used, the more data is generated, meaning that Facebook’s suggestions to the user become better personalised. And with more data available, this personalisation can be optimised further still. This self-reinforcing cycle is called the filter bubble. More on that later in this paper.
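To make that feedback loop concrete, here is a minimal Python sketch of such a personalisation cycle. The topic names, scoring scheme and function names are illustrative assumptions for this post; they are not Facebook’s actual implementation, which is far more complex and not public.

```python
from collections import Counter

# Hypothetical, simplified model of a personalisation feedback loop.
# Topics and scores are illustrative; this is not Facebook's code.

interest_scores = Counter()  # per-topic engagement counts for one user

def register_interaction(topic: str) -> None:
    """Record that the user engaged with content about `topic`."""
    interest_scores[topic] += 1

def suggest(candidates: list[str], k: int = 3) -> list[str]:
    """Rank candidate topics by accumulated engagement, highest first."""
    return sorted(candidates, key=lambda t: interest_scores[t], reverse=True)[:k]

# Simulate a user who mostly clicks baby-animal content.
for topic in ["baby animals", "baby animals", "politics", "baby animals"]:
    register_interaction(topic)

feed = suggest(["baby animals", "politics", "sports", "cooking"])
print(feed)  # ['baby animals', 'politics', 'sports']

# Each round of suggestions invites more clicks on the same topics,
# which skews the scores further: the filter bubble in miniature.
```

Even in this toy version, the loop is visible: what the user clicks determines what is shown, and what is shown determines what the user can click next.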

The accusation is that this ongoing cycle of personalised content causes an incomplete view of reality. And if Facebook users have an incomplete or incorrect view of reality, there is a chance that this influenced their vote. Would these users have voted differently if they had been confronted with articles presenting both sides?

In this paper, research on the filter bubble and personalised content will be compared. Is the filter bubble a concept to fear? Can it indeed influence Dutch elections? This last question will be applied to the case of the Dutch parliamentary elections, which will be held in spring 2017.

Click here to read the paper, ‘Framing our next president’.
