Bente Kalsnes
PhD Candidate, Department of Media and Communication, University of Oslo
Email: bente.kalsnes@media.uio.no
Section 6: Digital Campaign
- Did Russia just hand Donald Trump the Presidency?
- Taking Julian Assange seriously: considering WikiLeaks’ role in the US presidential campaign
- Social media did not give us Donald Trump and it is not weakening democracy
- Trump and the triumph of affective news when everyone is the media
- Tweeting the election: political journalists and a new privilege of bias?
- The dissolution of news: selective exposure, filter bubbles, and the boundaries of journalism
- Two tribes go to vote: symbolism on election day
- Ideas are for sharing
- In the age of social media, voters still need journalists
- Dark magic: the memes that made Donald Trump’s victory
The discussion about filter bubbles has exploded since the 2016 US election. Evidence suggests voters access separate, ideologically homogeneous newsfeeds – the Red Feed and the Blue Feed, as demonstrated by the Wall Street Journal. It is therefore time to ask more questions about how algorithmic platforms such as Facebook and Google shape voters’ information environment during elections. As we know from numerous Facebook press releases, Facebook strives to select the most relevant and engaging content to appear in the newsfeed. But how should society deal with a ‘relevant’ newsfeed that turns into a filter bubble, often fuelled by fake news?
Quality of information is particularly important during election campaigns, when the electorate should make informed choices about candidates and policies. Obviously, social media did not give us Donald Trump, as argued by Daniel Kreiss – larger historical and cultural factors, such as frustration, polarisation and mistrust of elites and institutions, paved the way for Trump’s presidency. But I will still argue that it is worthwhile to discuss the quality of the information voters interact with in the decision-making process, and how that information is selected and presented. Thus, fake news and Facebook’s algorithm are relevant in this context.
An increasing number of citizens use social media to follow the election campaign and inform themselves about the candidates. In 2016, 62% of Americans got some news via social media, up from 49% in 2012, according to a Pew survey. Facebook is the most widely used platform in this context, alongside Reddit and Twitter. Among US adults, 44% – and two thirds of Americans aged 18 to 29 – claimed to have used social media in an ordinary week to learn more about the 2016 presidential election.
It is still too early to tell how strong the filter bubble has been for voters in this election, but the Wall Street Journal’s Blue Feed, Red Feed website gives us a pretty good idea of the sharp contrast between the two information streams.
The most relevant and engaging newsfeed might be wonderful for users and consumers, but it is concerning for scholars of democracy. When the information environment becomes this polarised and fragmented, it allows voters to live in different realities – the so-called balkanisation of the public sphere. It gets even more problematic when fake news is added to the filter bubble. Fake news circulated heavily online in the run-up to the election, and Facebook’s algorithm allowed the misinformation to be amplified and widely disseminated.
Filter bubbles are often understood as personal ecosystems of information catered by algorithms, such as Google’s or Facebook’s. In this way, users are presented with information that confirms and strengthens their own cultural or ideological bubbles. Even though the term filter bubble got its digital definition from Eli Pariser, we have long had analogue filter bubbles that skew or limit our views; historically, they have been related to our news consumption, education, social networks or geography, to mention a few. There has always been too much information in the world for us to grasp, comprehend and register, so we have filtered and excluded information based on our needs. Before the internet, editorial media helped us sort and prioritise information and news. After the internet became mainstream, algorithms became useful tools to sort and present information, whether it was related to which book to buy, which movie to see, which song to listen to, or which news story to read.
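As a rough illustration of this mechanism – and emphatically not of Facebook’s actual, undisclosed ranking system – the Python sketch below ranks a toy set of stories purely by how often a user has clicked stories with the same political lean. Every click on one side pushes that side further up the feed; the story data, field names and scoring rule are all hypothetical.

```python
# Toy illustration of engagement-based filtering, not any platform's real algorithm.
from collections import Counter

STORIES = [
    {"id": 1, "topic": "immigration", "lean": "red"},
    {"id": 2, "topic": "immigration", "lean": "blue"},
    {"id": 3, "topic": "economy", "lean": "red"},
    {"id": 4, "topic": "economy", "lean": "blue"},
    {"id": 5, "topic": "climate", "lean": "blue"},
    {"id": 6, "topic": "climate", "lean": "red"},
]

def rank_feed(stories, click_history):
    """Rank stories by how often the user has clicked items with the same lean."""
    lean_counts = Counter(s["lean"] for s in click_history)
    total = sum(lean_counts.values()) or 1
    # "Relevance" here is simply the share of past clicks with the same lean,
    # so each click narrows the feed further toward that side.
    return sorted(stories, key=lambda s: lean_counts[s["lean"]] / total, reverse=True)

if __name__ == "__main__":
    clicks = [STORIES[0], STORIES[2], STORIES[5]]  # the user clicked three "red" stories
    for story in rank_feed(STORIES, clicks):
        print(story["id"], story["topic"], story["lean"])
    # All "red" stories now rank above all "blue" ones: a minimal filter bubble.
```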
Facebook’s role in selecting and calculating the most “relevant” information also has political ramifications. The debate about whether Facebook is a media or a technology company intensified earlier this fall. The Norwegian newspaper Aftenposten protested Facebook’s censorship of the “Napalm girl” picture, arguing that Facebook was making editorial decisions that interfered with the free press. As Facebook increasingly becomes the information source for people around the world, the company has a unique responsibility to strive for information diversity and quality. In addition to “relevant” and “engaging”, “serendipity” should be built into the newsfeed. The Red Feed and the Blue Feed reinforce old filter bubbles from the party press era. Do we want filter bubbles to be reflections of the party press that liberal democracies got rid of decades ago? If Facebook is not able to diversify and fact-check the newsfeed, the most popular social network might end up with an algorithmically driven newsfeed based on fake party propaganda.
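To make the “serendipity” suggestion concrete, the toy ranker above could blend the click-based score with a bonus for the lean the user rarely engages with, so past behaviour alone does not determine the feed. This, too, is only a hypothetical sketch with an invented weight, not a description of how Facebook’s newsfeed works or should work.

```python
# Hypothetical extension of the toy ranker: a tunable "serendipity" term.
from collections import Counter

def rank_with_serendipity(stories, click_history, serendipity_weight=0.3):
    """Blend click-based 'relevance' with a bonus for under-exposed leanings."""
    lean_counts = Counter(s["lean"] for s in click_history)
    total = sum(lean_counts.values()) or 1

    def score(story):
        relevance = lean_counts[story["lean"]] / total  # share of past clicks with this lean
        serendipity = 1.0 - relevance                   # reward what the user rarely clicks
        return (1 - serendipity_weight) * relevance + serendipity_weight * serendipity

    return sorted(stories, key=score, reverse=True)

if __name__ == "__main__":
    stories = [{"id": i, "lean": lean} for i, lean in enumerate(["red", "blue", "red", "blue"])]
    clicks = [s for s in stories if s["lean"] == "red"]  # the user has only clicked "red" items
    # With no serendipity the feed is one-sided; raising the weight surfaces the other side.
    print([s["lean"] for s in rank_with_serendipity(stories, clicks, 0.0)])
    print([s["lean"] for s in rank_with_serendipity(stories, clicks, 0.7)])
```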