Filter bubbles & echo chambers

Filter bubbles are a specific technical effect of the attention economy. Facebook’s news feed is a filter bubble, created by a machine-learning algorithm that draws on data from users’ networks, likes and comments, and on how much organisations are willing to pay to be present there.
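A feed of this kind can be pictured as a ranking function over candidate posts, where engagement signals and advertising spend determine what the user actually sees. The signal names and weights below are purely illustrative assumptions, not Facebook’s actual model:

```python
def rank_feed(posts, top_n=3):
    """Return the top_n posts by a weighted engagement score.

    All weights and signal names are hypothetical, chosen only to
    illustrate how engagement and paid promotion can shape a feed.
    """
    def score(post):
        return (2.0 * post["friend_likes"]   # likes from the user's network
                + 1.5 * post["comments"]     # comment activity
                + 0.5 * post["shares"]
                + 3.0 * post["ad_spend"])    # paid promotion boosts reach
    return sorted(posts, key=score, reverse=True)[:top_n]

posts = [
    {"id": "a", "friend_likes": 10, "comments": 4, "shares": 1, "ad_spend": 0},
    {"id": "b", "friend_likes": 0,  "comments": 0, "shares": 0, "ad_spend": 9},
    {"id": "c", "friend_likes": 3,  "comments": 1, "shares": 0, "ad_spend": 0},
]
print([p["id"] for p in rank_feed(posts, top_n=2)])  # → ['b', 'a']
```

Note how post "b", with no organic engagement at all, outranks everything else purely through paid spend: the user never sees the lower-ranked content, which is the bubble effect in miniature.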

Filter bubbles follow a longer-term trajectory within advertising (including political advertising and now disinformation), which has sought to collect data in order to tailor adverts to target groups; now, however, adverts can be targeted at specific individuals. This can contribute to the formation of echo chambers: the reinforcement of existing beliefs (confirmation bias) through selective exposure to information. Hence, the technical and economic drivers of filter bubbles can act to reinforce echo chambers.

Increasing numbers of automated social media ‘bots’ have been linked with the spread of political disinformation and thus with the reinforcement of echo chambers. Filter bubbles and the echo chambers they feed into are linked to a decline in trust in the ability of traditional news media to provide reliable information. They have been found to exacerbate political divisions and polarisation, and have negative implications for the mechanisms of liberal democracy: developing a broad consensus around decisions made in the public good becomes increasingly difficult.