
How news feed algorithms supercharge confirmation bias

Big Think


According to a Pew Research poll, 45% of U.S. adults get at least some of their news from Facebook, and half of those use Facebook as their only news outlet.
Algorithms on social media pick what people read. The worry is that these algorithms are creating filter bubbles in which users never have to encounter something they disagree with, fostering tribal thinking and confirmation bias.
The Charles Koch Foundation is committed to understanding what drives intolerance and the best ways to cure it. The foundation supports interdisciplinary research to overcome intolerance, new models for peaceful interactions, and experiments that can heal fractured communities.
For more information, visit charleskochfoundation.org/courageous-collaborations. The opinions expressed in this video do not necessarily reflect the views of the Charles Koch Foundation, which encourages the expression of diverse viewpoints within a culture of civil discourse and mutual respect.

ELI PARISER:

Eli Pariser has dedicated his career to figuring out how technology can elevate important topics in the world. He is the co-founder of Upworthy and bestselling author of The Filter Bubble: What the Internet Is Hiding from You.

TRANSCRIPT:

ELI PARISER: A filter bubble is your own personal universe of information that’s been generated by algorithms that are trying to guess what you’re interested in. And increasingly online we live in these bubbles. They follow us around. They form part of the fabric of most websites that we visit and I think we’re starting to see how they’re creating some challenges for democracy.

We’ve always chosen media that conforms to our views and read newspapers or magazines that in some way reflect what we’re interested in and who we want to be. But the age of algorithmically mediated media is really different in a couple of ways. One way is that it’s not something we know we’re choosing. We don’t know on what basis, or who, an algorithm thinks we are, and therefore we don’t know how it’s deciding what to show us or not show us. And it’s often that not-showing-us part that’s the most important: we don’t know what piece of the picture we’re missing, because by definition it’s out of view. And so that’s increasingly, I think, part of what we’re seeing online: it’s getting harder and harder even to imagine how someone else might come to the views that they have, might see the world the way they do, because that information is literally not part of what we’re seeing or consuming. Another feature of the filter bubble landscape is that it’s automatic; it’s not something that we’re choosing. When you pick up a left-wing magazine or a right-wing magazine, you know what the bias is, what to expect.

A deeper problem with algorithms choosing what we see and what we don’t see is that the data they have to base those decisions on is really not representative of the whole of who we are as human beings. So Facebook is basically trying to take a handful of decisions about what to click on and what not to click on, maybe how much time we spend with different things, and trying to extract from that some general truth about what we’re interested in or what we care about. And that clicking self, which in fractions of a second is trying to decide “am I interested in this article or am I not,” just isn’t a very full representation of the whole of our human self. You can do this experiment where you look back at your web history for the last month, and obviously there are going to be some things there that really gave you a lot of value, that represent your true self or your innermost self. But there’s a lot of stuff, you know, I click on cell phone reviews even though I’ll always have an iPhone. I’m never not going to have an iPhone. But it’s just some kind of compulsion that I have. And I don’t particularly need or want algorithms amping up my desire to read useless technology reviews.
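The dynamic Pariser describes, where a ranker infers interest from past clicks and then serves more of the same, can be sketched as a toy feedback loop. Everything here (the field names, the scoring rule) is an illustrative assumption, not Facebook's actual system:

```python
from collections import Counter

def rank_feed(articles, click_history):
    """Toy engagement-based ranker: score each candidate article by how
    often the user has previously clicked stories on the same topic.
    Purely illustrative; real feed ranking uses many more signals."""
    topic_clicks = Counter(click["topic"] for click in click_history)
    return sorted(articles, key=lambda a: topic_clicks[a["topic"]], reverse=True)

# A "clicking self" that compulsively opens gadget reviews...
history = [{"topic": "gadgets"}] * 5 + [{"topic": "politics"}]
feed = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "gadgets"},
]

# ...gets gadget stories ranked first, whether or not they reflect
# what the user actually values. Each new click then reinforces the loop.
print([a["topic"] for a in rank_feed(feed, history)])
# → ['gadgets', 'politics']
```

The point of the sketch is that the ranker only ever sees the click signal, so the "compulsive" self and the "true" self are indistinguishable to it, and the feedback loop amplifies whichever one clicks more.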

The people who create these algorithms like to say they’re neutral: “We don’t want to take an editorial point of view.” And I think there’s something to that that’s important, you know. We don’t want Mark Zuckerberg to impose his political views on all of us, and I don’t think he is. But it’s also kind of a weird dodge, because every time you create a list, and that’s essentially all that Faceboo…

For the full transcript, check out bigthink.com/the-present/facebook-algoithm-filter-bubble
